Final Project: Concept

My goal is to make a sonar headset rather than a device that literally creates a visual image from sound. My reason for this is that I want to create a sensory experience for my neuroscience students that demonstrates what it is like to use a sensory modality other than sight to navigate around objects. Many students have the misconception that echolocating bats are actually creating a visual image from sound waves rather than light waves. The phenomenon of sensory substitution (click here) suggests that this is not a completely outrageous assumption. For example, researchers have used tongue stimulation to create visual images in blind people (click here). It is fascinating that this stimulation activated many visual pathways and produced an image in otherwise blind people. I want students to experience what it is like to “see” without actually using their eyes.

Some blind individuals, such as Daniel Kish (to see his TED Talk, click here), have famously mastered the ability to use clicks for echolocation. However, it took him years of focused practice to master this skill. My goal is to create an echolocation device that will let students experience sensory substitution without the years of training.

My final project is to make echolocation goggles. Echolocation is an adaptation that uses high-frequency sound waves to obtain sensory information such as the distance and even the size of objects. Animals such as bats, dolphins, and even some species of shrews use it. To read more about echolocation, click here. The idea of echolocation is basically sonar: a sound wave is emitted, and the time it takes for the echo to return is used to determine how far away an object is. Bats do this and have very sophisticated ways of modulating the frequency and amplitude of their calls to detect small versus large prey, or to detect prey in crowded versus sparse sensory fields (click here and click here).
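To make the sonar idea concrete: the distance to an object is the round-trip echo time multiplied by the speed of sound, divided by two (the pulse travels out and back). The short C++ sketch below only illustrates that arithmetic; the function name, the centimeter units, and the 343 m/s speed of sound (air at roughly 20 °C) are my own assumptions, not part of the final design.

```cpp
#include <cstdio>

// Convert a round-trip echo time to a one-way distance.
// Assumes sound travels at about 343 m/s (air at roughly 20 °C).
// echo_us: time between emitting the pulse and hearing the echo, in microseconds.
double echoTimeToDistanceCm(double echo_us) {
    const double speed_cm_per_us = 0.0343;        // 343 m/s expressed in cm per microsecond
    return (echo_us * speed_cm_per_us) / 2.0;     // divide by 2: the pulse travels out and back
}

int main() {
    // Example: an echo returning after 5830 microseconds
    // corresponds to an object roughly 100 cm away.
    std::printf("%.1f cm\n", echoTimeToDistanceCm(5830.0));
    return 0;
}
```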

My first rotation in graduate school was doing quasi-intracellular recording in the LGN of rat brains to study the visual system. What fascinated me then, as now, was how the brain is essentially an input/output device built from neurons that use many of the same principles of electricity as the circuits we build. For my FAB Academy project I wanted to make an input/output device that would force students to see the world in this way. I hit upon the idea of making echolocation goggles that would translate echolocation information into visual information in a way that reflects the information bats obtain.

For this project I wanted to enable students to get detailed information using echolocation. Many previous projects used sound as an output; users can get a general idea of relative distance from the frequency of beeps, but not an exact distance. I also wanted to use multiple sensors, which would be very confusing for users if multiple speakers were sounding at the same time. As such, I will use an LCD screen that displays the distances measured by two ultrasonic sensors. A further advantage of using two sensors is that it also makes edge detection possible.
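Here is a rough sketch, not a final design, of how the two-sensor LCD display could work. It assumes an Arduino-compatible board, two HC-SR04-style ultrasonic sensors, and a 16x2 character LCD driven by the standard LiquidCrystal library; the pin assignments and the 30 cm edge-detection threshold are placeholder choices for illustration.

```cpp
#include <LiquidCrystal.h>

// Assumed wiring (placeholder pins): LCD on rs=12, en=11, d4=5, d5=4, d6=3, d7=2
LiquidCrystal lcd(12, 11, 5, 4, 3, 2);

// Assumed trigger/echo pins for the left and right ultrasonic sensors
const int TRIG_LEFT = 6,  ECHO_LEFT = 7;
const int TRIG_RIGHT = 8, ECHO_RIGHT = 9;

// Measure distance (cm) with one HC-SR04-style sensor.
long readDistanceCm(int trigPin, int echoPin) {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);                  // 10 us pulse starts a measurement
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long echoUs = pulseIn(echoPin, HIGH, 30000);  // give up after ~30 ms if no echo returns
  return echoUs * 0.0343 / 2;                   // round trip -> one-way distance in cm
}

void setup() {
  pinMode(TRIG_LEFT, OUTPUT);  pinMode(ECHO_LEFT, INPUT);
  pinMode(TRIG_RIGHT, OUTPUT); pinMode(ECHO_RIGHT, INPUT);
  lcd.begin(16, 2);                             // 16 columns, 2 rows
}

void loop() {
  long left  = readDistanceCm(TRIG_LEFT, ECHO_LEFT);
  long right = readDistanceCm(TRIG_RIGHT, ECHO_RIGHT);

  lcd.setCursor(0, 0);
  lcd.print("L:"); lcd.print(left);  lcd.print("cm   ");
  lcd.setCursor(0, 1);
  lcd.print("R:"); lcd.print(right); lcd.print("cm   ");

  // Crude edge detection: a large left/right difference suggests
  // one sensor is looking past the edge of an object.
  lcd.setCursor(12, 0);
  if (abs(left - right) > 30) {                 // 30 cm threshold is an arbitrary placeholder
    lcd.print("EDGE");
  } else {
    lcd.print("    ");                          // clear the marker when no edge is seen
  }

  delay(200);
}
```

The edge-detection idea here is simply that when the two sensors report very different distances, one of them is probably looking past the edge of an object; the finished device may need a smarter rule than a fixed threshold.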

I was originally hoping to model echolocation by creating the different ear types found on bats. I eventually realized that the sonic pulse emitted by the sonar sensor is too narrow to be shaped by an ear. I would eventually like to model bat echolocation using output/input device “ears,” but this would require making a custom echolocation device. Interesting work in this area is being done by Rolf Mueller in the College of Engineering at Virginia Tech (click here).

I believe this project, when completed, will be a very useful and exciting tool to use in my neuroscience class. It is one thing to tell students that animals can experience the world through different sensory modalities; it is so much better to let them experience what it is really like.