Satyam Jha, Bachelor in Clinical Optometry
Student, Pailan College of Management and Technology, Kolkata, India
We live in a loud world. We don’t usually notice every sound because most of it is constantly tuned out by our brains. We have to filter out the unimportant noise that surrounds us so the world doesn’t sound like a jumbled mess. But imagine being able to pick out one particular sound and use it alone to navigate any environment, even one you’ve never been in before. To a person with a sense of vision, this seems impossible. But with the help of a teacher in an Orientation and Mobility (O&M) program, one can learn how to do it.
Figure 1: Echolocation in Bats and Whales
[Picture Courtesy: https://www.zmescience.com/medicine/genetic/convergent-evolution-echolocation-bats-dolphins-0423432/]
Can Anyone Learn Echolocation?
Echolocation is used by many animals, from whales and bats to birds, and even some shrews can do it (Fig. 1). The species that are best at it use active echolocation, which works the same way sonar works on a ship. Instead of just listening, the animal first sends out a sound, such as a click. The sound waves sweep through the environment, and when they hit something they bounce back. By reading these echoes, the brain can form a mental map: the time between when the sound is made and when it bounces back lets the brain calculate distance, and the quality of the returning echo can even carry information about an object’s texture or hardness (1,2).
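To put a rough number on that, the distance to an object can be estimated from the echo delay, because the click has to travel out to the object and back. The sketch below is only an illustrative calculation, not part of the cited studies; it assumes a speed of sound of about 343 m/s in air.

```python
# Illustrative sketch: estimating distance from an echo delay.
# Assumes sound travels at roughly 343 m/s in air (this varies with temperature).

SPEED_OF_SOUND_M_PER_S = 343.0

def distance_from_echo(delay_seconds: float) -> float:
    """Estimate the distance (in metres) to an object from the
    round-trip time between a click and its returning echo."""
    # The sound covers the distance twice (out and back), hence the division by 2.
    return SPEED_OF_SOUND_M_PER_S * delay_seconds / 2.0

# Example: an echo arriving 10 milliseconds after the click
# corresponds to an object roughly 1.7 metres away.
print(distance_from_echo(0.010))  # ~1.7
```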
Figure 2: Click Echolocation in a visually challenged person
[Picture Courtesy: https://www.dailymail.co.uk/sciencetech/article-9645473/Echolocation-help-vision-loss-study.html]
Passive and Active Echolocation
Passive echolocation detects objects using sounds already present in the environment, whereas in active echolocation a dedicated sound is emitted. Active echolocation is essentially passive echolocation taken to a more refined level. So whether we send the brain patterns of light, which is vision, or patterns of sound, the brain will still construct an image. Scientists have found that blind people tend to be somewhat better at echolocation than sighted people, because their brains have had to develop new ways of handling sensory information.
Research:
When scientists put blind echolocators into an MRI machine and played recorded echoes back to them, the regions of the brain associated with vision were activated, even though the participants were receiving no visual input. The parts of the brain that handle motion and movement were also active during active echolocation, even when the person was not moving at all. The strange part is that we don’t really understand exactly how the brain rewires itself like this, but it is another sign of how adaptable the brain is (3,4).
Figure 3: Human Echolocation Showing Visual Components
[Picture Courtesy: https://en.wikipedia.org/wiki/Human_echolocation]
Conclusion: In extremely crowded environments, guide dogs cannot help effectively with navigation, and canes are similarly limited; in such settings most blind people end up relying on a sighted guide. Every blind person should be educated about echolocation, and research should be expanded so that people with visual impairment can be rehabilitated to use this capacity of the brain and navigate as independently as possible.
References:
- Tonelli, Alessia, Claudio Campus, and Luca Brayda. “How body motion influences echolocation while walking.” Scientific Reports 8.1 (2018): 15704. https://www.nature.com/articles/s41598-018-34074-7
- Flanagin, Virginia L., et al. “Human exploration of enclosed spaces through echolocation.” Journal of Neuroscience 37.6 (2017). https://www.jneurosci.org/content/37/6/1614
- Thaler, Lore, Rosanna C. Wilson, and Bethany K. Gee. “Correlation between vividness of visual imagery and echolocation ability in sighted, echo-naïve people.” Experimental Brain Research 232.6 (2014): 1915-1925. https://www.ncbi.nlm.nih.gov/pubmed/24584899?dopt=Abstract
- Thaler, Lore, Stephen R. Arnott, and Melvyn A. Goodale. “Neural correlates of natural human echolocation in early and late blind echolocation experts.” PLoS ONE 6.5 (2011): e20162. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0020162
- Rojas, Juan Antonio Martínez, et al. “Physical analysis of several organic signals for human echolocation: oral vacuum pulses.” Acta Acustica united with Acustica 95.2 (2009): 325-330. https://www.ingentaconnect.com/content/dav/aaua/2009/00000095/00000002/art00013%3bjsessionid=22n2u0km0n1qr.x-ic-live-03