We use vision to localize objects around us with barely a second thought. However, the process is incredibly complex in ways that are still subjects of active research. For example, the brain works at a finite speed -- signals from the eyes need some time to arrive in the visual cortex for processing. During that time, objects in motion continue to move. How, then, are we still able to catch a fastball, or avoid people in a crowd?
Like so much of visual perception, visual localization happens in the brain rather than in the eyes. As a result, the perceived positions of objects and their physical, or retinal, positions are not always the same. These discrepancies, or illusions, are not "defects" in your visual system. Rather, the processes underlying them have evolved to help us see better in everyday situations. Many of these processes work together in the brain to determine where we see things.
Not everybody can use vision to sense their environment. About 45 million people are blind worldwide, and three times that number are visually impaired. Blind people use many strategies to navigate their environment and identify objects around them. Some use active echolocation, much as bats and dolphins do: they emit tongue clicks and interpret the reflected echoes to aid in navigation and object recognition. We are interested in the behavioral, perceptual, and neural processes underlying echolocation in blind and sighted humans.