ECHOLOCATION: How humans can "see" without sight

By Daniel Kish, M.A. / M.A.

[Note: The following review and accompanying reference list have been revised and updated from a Master's thesis presented in 1995 entitled "Evaluation of an Echo-Mobility Training Program for Young Blind P."]

ABOUT THE AUTHOR

Daniel Kish holds Master's degrees in Developmental Psychology and Special Education. He holds a California state credential and national certification as an Orientation and Mobility Specialist. He has served in this capacity since 1996 as an itinerant instructor for many school districts and rehabilitation centers. Dan is one of the first and very few totally blind individuals to maintain ongoing employment in this capacity. From 1997 to 2001, Dan served as Youth Outreach Coordinator for the Blind Children's Learning Center, where he coordinated the delivery of all types of blindness educational services, including assistive technology, vision instruction, and mobility. Dan has presented internationally in dozens of forums on all topics related to blindness. His main expertise lies in human sonic echolocation. In this area he has conducted experimental pilot research to investigate the design of a training program for blind children. This work includes one of the most comprehensive literature reviews of the topic, detailing the nature and utility of echolocation in humans. This, together with thousands of hours of work with blind students and a lifetime of personal echo use, is soon to become publicly available in several forums. This work forms the foundation of a systematic, comprehensive echolocation training curriculum (under development), and the design of a device to enhance sonic echolocation. Dan has begun work with Dr. Leslie Kay (inventor of the Sonic Torch, SonicGuide, and KASPA) to enhance the utility of ultrasonic sonar technology, improve instructional strategies, and expand its availability. Dan is striving to combine sonic with ultrasonic echolocation and other techniques to form a powerful and versatile approach to nonvisual spatial perception and control of movement. He has also begun consultation with the Oregon Research Institute to develop PC-driven virtual reality models for acoustic instruction, and with the National Institutes of Health to study how the human brain resolves spatial images from acoustic input. Dan and a colleague, Hannah Bleier, have recently completed a section on audition training for a textbook chapter on blindness issues for families and educators. He has also made over half a dozen national TV appearances, usually with students, to demonstrate the functionality and educational relevance of sonic echolocation as it applies to highly advanced movement without vision. In addition, Dan coordinates a mountain biking project (called TEAM-BAT) which applies echolocation and other techniques to enable independent, high-speed movement through complex, unfamiliar environments by blind youth. This project was recently featured, along with some of Dan's more advanced students, on Ripley's "Believe It or Not," "Discovery Breakthroughs Inside Science," and the Sally Show. Finally, Dan is one of the founders and a principal guiding force behind the formation of WORLD ACCESS FOR THE BLIND.

INTRODUCTION

According to the late Emerson Foulke (1971), a prominent figure in the field of perceptual psychology who was himself blind, "The ability to travel safely, comfortably, gracefully, and independently ...
is a factor of primary importance in the life of a blind individual" (p. 1). Since the mid 18th century, the ability of some blind people to perceive objects from a distance without physical contact has been of gradually mounting human interest, probably due to its apparent capacity to enhance those assets of nonvisual travel of which Emerson Foulke so eloquently wrote (Norris, Spaulding, & Brodie, 1957; Barth & Foulke, 1979; Warren & Kocon, 1974; Zemtzova, Kulagin, & Novikova, 1962). Over the centuries, anecdotes have abounded of some blind people exhibiting keen powers of awareness, and of the ability to move through their surroundings with ease and grace without guidance or the need to feel about (Lende, 1940). Examples of documented reports of such abilities can be found as far back as Diderot, who wrote in 1749 of a blind friend so sensitive to his surroundings that he could distinguish an open street from a cul-de-sac (discussed in Hayes, 1935; Griffin, 1986). Felts (1909) wrote of a totally blind acquaintance who went regularly about the crowded streets of New York with perfect ease and freedom without the use of a cane or any sort of guide. Hayes (1935) tells of a six-year-old blind boy able to ride his tricycle along the sidewalk without a blunder. More recently, newspapers and periodicals have shown this author demonstrating, with several of his students, the ability to move rapidly about complex environments with ease. The Orange County Register (Nicolosi, 1994) described a 13-year-old blind boy who skates with phenomenal agility in congested public rinks. The L.A. Times showed blind preschoolers learning to detect and find trees, parked cars, and other objects many yards away (Kaff, 1997). The cable television show Beyond Chance aired a segment showing this author riding without assistance along public roads, and successfully teaching the technique of sensing objects to young children and adults (Mendoza, 1999). Mountain Bike Action Magazine featured a mountain biking club in which blind youth have learned to navigate rugged mountain trails independently at moderate speeds (Car, in press). This same project was recently featured on Ripley's "Believe It or Not" (Cowger, 2000), Discovery Breakthroughs Inside Science (Hanson, 2001), and the Sally Show (Ferber, 2001). Even a few experimental reports attest to highly developed abilities in the blind to sense the world around them without making physical contact. McCarty and Worchel (1954), for instance, studied an 11-year-old, totally blind boy who could avoid obstacles placed in his path with almost perfect accuracy while riding his bicycle at top speed. Personal contact with this participant (B. Taylor, personal communication, April 26, 1995) revealed that he, like the man described by Felts in 1909 and those shown on recent television, traveled freely about his town, school, and college campus without the use of a cane or guide until his mid-20s. In 1974, Magruder studied a blind man who could describe with great precision the distance, direction, dimensions, and general nature of novel objects as far as 13 feet away in unfamiliar environments. Personal contact with this participant (L. Scadden, personal communication, May 5, 1993) found that he, too, blind from the age of 4, rode a bicycle on a regular basis as a boy. Frequent anecdotal reports from among those who work with the blind, as well as the blind themselves, underscore the veracity and significance of documented phenomena.
Dozens of mobility and special education instructors informally surveyed by this author have known of at least one student with skills of spatial awareness and mobility that they considered remarkable. In addition, several personal acquaintances reveal further tales of impressive ability to perceive their surroundings from a distance by nonvisual means. In Griffin's words, "... the better one becomes acquainted with blind people, or the more one reads about their abilities, the more obvious it is that some objects can be detected well in advance of actual contact" (Griffin, 1986, p. 299). Even so, it has not been until about the past six decades that this sense in the blind of the presence and position of objects around them without tactual contact has come under careful empirical study. Such study may be of incalculable value to blind people by making available the knowledge needed to improve nonvisual competence in spatial awareness and travel. A thorough understanding of the nature of this skill could have staggering implications for training and rehabilitation. This report examines thoroughly the empirical findings as well as modern theoretical perspectives concerning human echolocation, and explores the logistics of designing and implementing an effective program to train and refine echolocation abilities in the blind.

HISTORICAL OVERVIEW

An excellent review and examination of the earliest investigations into the sense of objects by the blind is provided by Hayes (1935). A brief review is given here to provide a context for understanding more modern research of the issue.

Facial Vision

The first documented consideration of the nonvisual sense of objects is found in an account by the French philosopher Diderot in 1749 about a blind friend who was reportedly "... able to judge the nearness of bodies by the action of the air against his face." [Diderot's observation is widely cited in the literature on human echolocation, but particular attention thereto is given by Griffin (1986) and Hayes (1935).] From that time to the early 20th century, two major sets of theories evolved regarding the nature of this sense. One set constituted the tactile or skin sense theories, which proposed, much as Diderot suggested in 1749 (reprinted 1951), that the blind were sometimes able to sense, through the skin of their face, some systematic change in subtle properties of nature that alerted them to the presence of objects in their vicinity. These explanations were derived in large part from the reports of many of the blind that they felt the presence of obstacles through the skin of their face. Though these remained the predominant theories until the early 1940's, little agreement was reached regarding the exact natural properties involved or, specifically, by what means these properties were perceived. These theories ranged from hyper-sensitivity to air currents and temperature, to perception of light or other electromagnetic waves through specialized nerves in the face, to a recognition of ether waves and other occult forces. A second set of theories comprised the audition theories, which implicated auditory processing as responsible for the perception of objects. These fell into two main classes - the pressure theory, which stated that the tympanic membrane was sensitive to subtle changes in air pressure caused by the presence of objects, and the auditory theories, which asserted that the auditory system can perceive subtle variations in sound waves as they bounce off objects.
Throughout the late 19th and early 20th centuries, studies on this object sense in the blind were carried out with some rigor, and, in the face of evidence for all sides, the tactile theories held sway. Thus, by the turn of the century, the term "facial vision" came to be applied most commonly to this little understood phenomenon - implying that sensory mechanisms in the face provided some pseudo-visual perception of space. It was not until the 1940's that a series of unassailable studies of this ability in humans laid the controversy squarely to rest.

Facial Vision to Echolocation

In the early 1940's, Dallenbach and his associates at Cornell University investigated the specific sensory processes responsible for the nonvisual detection of obstacles (Cotzin, 1942). This investigation took the form of three sets of studies in which auditory, tactile, and tympanic stimuli were each systematically controlled. In the first two sets of experiments (Supa, Cotzin, & Dallenbach, 1944; Worchel & Dallenbach, 1947), 2 blind, 10 deaf-blind, and 2 sighted participants, all blindfolded, walked under varying conditions toward an obstacle. This obstacle usually consisted of a Masonite screen 0.25 inches thick by 48 inches wide by 58 inches tall which was raised so that its upper edge was 82 inches above the floor. Both the position of the screen and the starting point of each participant were varied randomly throughout an 18 by 61 foot chamber. All participants were asked to indicate when they first perceived the obstacle (first perception), and to stop as close as possible to the obstacle without touching it (final appraisal). Ratios of these figures were then calculated for each participant in each trial so that performance in each condition could be measured and compared between conditions. Reliability of participant judgements was rigorously controlled by setting up the obstacle while participants were outside the chamber, and by randomly introducing check trials in which no obstacle was present. Several sets of 25 trials constituted each condition in both studies. In all experiments in which participants' hearing was left intact, performance was consistently good for the blind and fair for the sighted. When participants walked with shoes on over a hardwood floor, the 2 blind participants were readily able to perceive the obstacle at distances as far as 24 feet. After about 9 practice trials, the sighted became able to perceive the obstacle up to about 6 feet. The blind and sighted were also able to approach to within half a foot of the obstacle on most occasions without touching it. When this exercise was repeated with footsteps muffled by stockinged feet over thick carpet, all performance indices dropped somewhat for all participants, but performance still remained relatively consistent. Performance was only slightly affected when participants' faces were loosely veiled and hands covered by thick cloth that air currents could not penetrate. [In 1953, Kohler and his associates obtained similar results by anesthetizing the skin of first one, then both sides of participants' faces (reported in Kohler, 1964).] In another experiment that removed all perception of stimuli other than hearing, participants were still able to estimate obstacle distance with fair accuracy.
In this experiment the blind and sighted participants listened through headphones in a separate room to the experimenter's footsteps, transmitted via a microphone held by the experimenter as he walked with shoes on over the bare floor toward a stone wall. Under these conditions, first perceptions and final appraisals of the experimenter's approach to the wall were only slightly lower than those obtained when the participants themselves approached the same wall, and the patterns of occasions in which the participants allowed the experimenter to collide with the wall resembled participant collisions in other experiments where hearing was left intact. In those experiments in which the hearing of the participants was heavily occluded, however, the participants evidenced no ability to detect the obstacle. They collided with the screen on every one of 100 trials. [Similar results were obtained in a later investigation by Ammons, Worchel, and Dallenbach (1953) with 20 deafened participants out of doors.] Moreover, when the deaf-blind participants, all of whom had inner ear disruption leaving the tympanic membranes intact, ran through a similar series of experiments, not one could perceive the obstacle in any one of hundreds of trials. [This finding was also replicated later by Worchel and Berry (1952) with 10 deaf-blindfolded participants who failed to perceive obstacles out of doors given 210 trials.] Thus, the investigators clearly established a definitive relationship between the presence of perceptible sound and the ability to detect obstacles, and affirmed that no such relationship exists involving tactile sensation. It was concluded that auditory perception is "necessary and sufficient" for the detection of obstacles, and that sound waves such as those emanating from footsteps reflected by the obstacle comprise the primary stimuli. However, the specific components of reflected sound that make obstacle detection possible without vision still needed to be determined. In an additional series of experiments (Cotzin & Dallenbach, 1950), 2 sighted and 2 blind participants listened through headphones to a microphone-speaker assembly in a separate chamber. Participants could move this assembly remotely toward a large Masonite screen similar to that used in the previous studies. Continuous signals of various types were emitted from the speaker in the assembly, and transmitted via microphone to headphones worn by the participants. The participants were able to vary the rate of motion of the assembly, and give first perceptions and final appraisals as in the previous studies. Nine types of signals were emitted from the speaker - white noise spanning 100 Hz to 10 kHz, and eight pure tones comprised of sine waves ranging by octave intervals from 125 Hz to 10 kHz. Performances among participants using white noise were comparable to performances shown in the earlier studies in which participants themselves walked toward the obstacle. When the pure tones were used, however, participants were only able to detect the obstacle with the 10 kHz tone. Even so, performance using this tone fell greatly short of performance with white noise, and that demonstrated in the earlier studies. Though participants sensed the proximity of the screen reliably with the 10 kHz tone, they were unable to estimate distance reliably. Participants reported that, as the assembly approached the obstacle, they could judge its proximity by a change in the nature of the signal which seemed to constitute a rise in pitch.
This change was most perceptible when using the white noise, less so with the 10 kHz tone, and not at all with the other tones. These reports were similar to those given by participants in an earlier experiment (Cotzin, Worchel, & Dallenbach, 1944) in which the sounds of the experimenter's footsteps were transmitted to the participants via microphone to headphones. In light of these reports, the experimenters concluded that the perception of obstacles without vision depends on a rise in the pitch of sounds as they are reflected or echoed from approaching surfaces, and that this rise in pitch is only perceptible with frequencies around 10 kHz and above. Since these three reports, terms that refer to the perception of echoes - "echo detection," "echolocation," "echo ranging" - have come into common use in reference to the nonvisual perception of obstacles by humans.

Lessons from Hindsight

Perhaps it should not be too difficult in some respects to understand why this controversy over the perception of objects by nonvisual means raged for so long. In truth, as indicated earlier, the blind themselves are notoriously mystified as to the nature of these perceptions (Supa, Cotzin, & Dallenbach, 1944; Juurmaa, 1969). Even some with extraordinary skill are unable to report how they accomplish this feat (Felts, 1909; Shephard & Howell, 1980). Indeed, many skilled at the perception of objects report this perception as a distinct sensation or pressure on the face (Juurmaa & Järvilehto, 1969; Juurmaa, 1970a; Ono, Fay, & Tarbell, 1986; Schenkman, 1985b). Two explanations of this tactile sensation have evolved. The first implicates an increase of muscle tension in the face due to unconsciously learned anxiety responses to the proximity of objects (Dolanski, 1931; Taylor, 1962). Echolocation seems typically to be an unconscious process (Juurmaa & Järvilehto, 1969; Juurmaa, 1970a) learned primarily by random trial and error (Juurmaa, 1969; Worchel & Mauney, 1950). When objects are struck, it is typically the head and face that receive the most memorable impact. An unconscious connection is thereby made between actual object perception through unconsciously processed echo information, and an involuntary response of muscle tension in the face. This perspective need not invalidate the subjective tactile experience often associated with obstacle perception. In fact, Juurmaa and Järvilehto (1969; Juurmaa, 1970a) use this experience to justify a distinction between phenomenal experience and functional stimulation. This distinction is best exemplified in studies which report tactual sensations in participants exposed to the presentation of phantom obstacles created by sound synthesis techniques (Kohler, 1967). In such studies, the presence of a surface can be produced artificially through sonic illusions, and participants respond to these presentations similarly to presentations of real surfaces. A more recent empirical explanation involving a series of studies (Ono, Fay, & Tarbell, 1986) indicates that the experience of tactile, facial sensations is connected with vision. Although these authors did not compare people blinded early in life to those blinded later on, they found that much higher percentages of sighted than blind people reported the experience of tactile sensations in the face when objects were near. In addition, the sighted participants reported experiencing a dim light upon closed eyelids as facial pressure.
These authors suggest that those blind later in life may associate the presence of objects - once a consciously visual experience - with genuine sensations upon the face. Thus, the term "facial vision" may have, at least in part, arisen from actual phenomena. It is of interest to note in relation to these considerations that a lengthy series of obstacle perception training studies reported by Ammons, Worchel, and Dallenbach (1953) with 20 sighted-blindfolded participants failed to yield a single report of "facial vision" - i.e., the experience of tactile sensation or pressure. All of the participants became aware of the auditory nature of the perception, though many also reported imaginal visual experiences such as "black curtains" and "dark shades" that seemed to coincide with close proximity to the obstacle. At any rate, whatever the reasons for the protracted confusion of the past, Griffin (1986) points out a lesson to be learned: "In retrospect it seems clear that most of the better controlled experiments, as well as many of the most carefully collected introspective reports ... indicated a preponderant importance of hearing" (p. 303). He notes further that the most rigorous studies in the 1700's of an apparently similar ability in bats to detect and locate objects without the use of vision also found hearing to be of primary importance. Yet, these most salient examinations of this phenomenon in bats as well as in humans went unrecognized and unappreciated for almost 200 years, and the link between the related phenomena in bats and men did not become thoroughly clear until about the 1960's with the astute observations of Griffin (1958) and the insightful work of Kellogg (1962/1964). Investigations into echolocation in animals as well as humans have since united to develop a greater understanding of this ability, and how it can be applied to effective mobility without vision.

WHAT IS ECHOLOCATION?

As indicated earlier, "echolocation" is an aspect of auditory perception which may be broadly defined as the ability to perceive echoes. On the surface, such an ability may seem unremarkable and of little use - largely because echoes are not commonly believed to convey much information. Popular conception holds echoes to be a specialized phenomenon unique to specific circumstances such as firing a gun in the mountains, or calling out in caves and tunnels. But this is like saying that light reflects only from mirrors and highly polished surfaces. In actuality, the visual system is enabled to perceive its surroundings by its ability to process the complex patterns of photons of visible light as they reflect into the eye from surfaces in those surroundings. If all we could see were sources of light and not reflected light, our eyes would give us very little awareness of the nature of our surroundings. By perceiving and interpreting patterns of reflected light, extremely rich and detailed information can be gathered about the layout and characteristics of surrounding space and objects therein. Vision and audition are close cousins in that both can process reflected waves of energy. Vision processes photons (waves of light) as they travel from their source, bounce off surfaces throughout the environment, and enter the eyes. Similarly, the auditory system can process phonons (waves of sound) as they travel from their source, bounce off surfaces, and enter the ears.
Both systems can extract a great deal of information about the environment by interpreting the complex patterns of reflected energy that they receive. As Gibson put it, "There is a flow of energy, the ambient array of radiant energy reflected from every face and facet of every surface and object in the environment" (Schwartz, 1984, p. 27). Though Gibson was referring to light energy, his poetic depiction holds for sound as well. In the case of sound, these waves of reflected energy are called echoes. Echoes occur to varying degrees and forms under virtually all circumstances in all environments that support life as we know it. Echo information can be perceived and processed by the auditory system to enable a great many determinations about surrounding space and one's physical relationship to it. The functional effectiveness of echolocation in animals that possess little or no vision is legendary and little questioned. Lee, van der Weel, Hitchcock, Matejowsky, and Pettigrew (1992) point out that certain species of bats can use echoes elicited by their own ultrasonic chirps to "move as gracefully as birds through the cluttered environment" (p. 563), and to negotiate obstacles as thin as 0.65 mm. These authors further indicate that some echolocating bats can develop a precise spatial memory of previously explored environments to an accuracy within 2 centimeters. Griffin (1986) points out that the capture of insects as minute as 0.2 mm without the use of vision poses little difficulty for many species of bats. Numerous investigations such as these concerning nonvisual navigation and foraging by bats, nocturnal birds, and marine animals (Ayrapetyants & Konstantinov, 1974; Griffin, 1986) clearly demonstrate that echoes can provide detailed and consistent information about the surrounding environment that is pragmatically useful to auditory observers in the animal kingdom. With this information, sightless animals perform all essential functions of productive living just as those with sight. They mate, raise their young, hunt, avoid danger, range far and wide, and build and maintain their domiciles. Moreover, they do all this successfully and competitively in a world of intense selective pressures where vision predominates. Studies of blind humans along similar lines do not demonstrate the ability to negotiate micro-thin wires or swoop down with expert precision on the tiniest of insects, but the results are nevertheless striking in the context of practical functioning demanded by human civilization. This author was quoted as saying: "For me, personally, one of the key senses is echolocation. ... It allows me to perceive my environment from a distance. I didn't necessarily have to touch everything to know what it was, or where it was. I have been doing this ever since I can remember, so it must have been something I had from early on."
(Brim, 1997). It has been shown, for example, that the blind can sense the presence of small objects from 2 to 3 meters away (Jones & Myers, 1954; Myers & Jones, 1958; Rice, Feinstein, & Schusterman, 1965), judge the distance of a single object to an accuracy of scarce inches at close range (Juurmaa & Järvilehto, 1969; Juurmaa, 1970b; Kellogg, 1962/1964), ascertain the lateral location of a single object to within a few degrees (Rice, 1969; 1970), judge size variations to mere fractions of an inch at close distances (Juurmaa & Järvilehto, 1969; Juurmaa, 1970b; Kellogg, 1962/1964; Rice & Feinstein, 1965), determine distinct shapes of objects (Hausfeld, Power, Gorta, & Harris, 1982; Rice, 1967a, 1967b, 1967c), and identify textures of surfaces (Hausfeld, Power, Gorta, & Harris, 1982; Juurmaa & Järvilehto, 1969; Juurmaa, 1970b; Kellogg, 1962/1964). Mills (1961, 1963) demonstrated one participant's ability to detect a one meter by half a meter cardboard target as far away as 100 feet, and Rice (1969, 1970) found one blind man who could reliably detect the presence of a 1 inch disk 3 feet away. Moreover, let's not forget about the recent demonstrations of some blind people riding bicycles at respectable speeds through complex, unfamiliar environments. In order to understand fully the experimental findings and appreciate the implications of echolocation research, it is essential to have at least a basic grasp of how echolocation works.

HOW ECHOLOCATION WORKS

Approaches through physics and mathematics to the study of sound and environment, together with many behavioral studies of the use of echoes by animals and humans under varying conditions, have led to a rather comprehensive and practical understanding of the processes behind echolocation and its utility. Eloquently simple and concise examinations of human echolocation are given by Rice (1967c), Welch (1964), and Wiener and Lausen (1997). For more extended and detailed examinations of the processes involved, see Griffin (1986) and Rice (1967a). For more technical analyses, see Schenkman (1985b) and Wilson (1967). Three components must be present for the perception of echoes to take place - sound, a surface or surfaces to reflect sound, and an observer with auditory perception (Rice, 1967a, 1967c). The quality at which echoes are perceived depends upon characteristics of each of these three components, and the spatial relationship and interactions among them (Wilson, 1967). Each of these components is briefly considered, and their interactions are discussed.

Sound and Echo

All environmental spaces that support human life are pervaded by a diverse array of sounds which vary according to five basic parameters - directionality, pitch, timbre, intensity, and envelope. Directionality refers to the degree of focus of a sound as it emanates from a source. The focus may vary from unidirectional, like the narrow field of a trumpet, to omnidirectional, like the broad field of a drum or cymbal. The bell of the trumpet and other horns helps to focus its blast so that most of the acoustic energy travels in a beam-like effect. The term unidirectional refers to travel primarily in one direction. The drum has no such mechanism to "beam" the sound, so its acoustic energy radiates outward more or less evenly in all directions, or "omnidirectionally." Pitch simply refers to the dominant frequency of the sound as on a musical scale, but the "notes" are called "frequencies" and are measured in Hz or kHz.
The lowest frequency that the human ear can typically register is about 20 Hz, while the highest is usually around 20,000 Hz or 20 kHz. In musical terms, this range is equivalent to about ten octaves. The lowest notes of a large pipe organ might reach down to 20 Hz, while 20,000 Hz is about equal to the high whine that emanates from televisions. Timbre simply refers to the unique sound that something makes. We recognize a cymbal when it crashes because it sounds like a cymbal; it has the timbre of a cymbal. In technical terms, timbre refers to the spectral composition of the sound, or, in essence, chords or clusters of frequencies. Every sound is essentially composed of simple or complex clusters of frequencies. Simple timbres involve relatively few frequencies, such as in the human whistle or a tuning fork, while complex timbres involve many frequencies, as in the human voice or an automobile engine. In addition, timbres may range from narrow band, where all the frequencies occur within just a few octaves (like an "S" sound), to broad band, where the frequencies span many octaves (like a jet airplane or radio static). Intensity or amplitude merely refers to how loud the sound is, and it is usually measured in decibels or dB. The term envelope is a little more complex. It essentially has to do with how a sound starts and ends. It refers to three temporal factors - rise time, onset, or attack (the length of time for the sound to increase from zero to peak intensity), sustain time (the length of time that the sound remains at its average intensity), and decay (the length of time for the sound to decrease from average to zero intensity). A hand clap, for example, may rise quickly (about 2.5 milliseconds), sustain briefly (about 3 milliseconds), and decay just as rapidly (about 2.5 milliseconds). A gong rises much more slowly (about 1 second), sustains briefly (about 1.5 seconds), and takes a very long time to decay (perhaps 30 seconds or more). All this, of course, depends upon the size of the gong - larger gongs generally have longer values in all three components. For purposes of studying echolocation, these three values are often combined into a total temporal measure called duration. So, the hand clap would only have a total duration of about 8 milliseconds, while the gong might last at least 30 seconds. Each of these five basic parameters is determined by the physical properties of the cause or source of the sound - always an event of some sort. This source sound represents its cause, which is how we can identify something by the unique sound that it makes. When a sound is produced, it travels in the form of waves of energy that radiate linearly from the sound's origin. We can think of them as straight lines of force that emanate more or less in all directions, or according to the focus of the cause. Remember, a trumpet tends to focus its energy forward. These sound waves are called "incident" waves. These waves take on physical shape and dimension as they move, embodying the five basic parameters of sound just described. For example, high pitched sounds are carried by short wavelengths, while long waves carry lower frequencies. Complex sounds may be carried by broad wave patterns with short and long components. These waves move through the air (or other media) just like waves of water. Sound waves are most cohesive and carry the most energy at or near their origin.
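For readers who want to check the arithmetic, the numerical relationships just described can be made concrete with a short sketch. It is purely illustrative: the speed of sound figure is an assumption (air at roughly room temperature), and the hand clap values simply echo those quoted above rather than measurements from any study reviewed here. The sketch shows how the 20 Hz to 20 kHz hearing range works out to roughly ten octaves, how pitch relates to wavelength, and how the envelope components sum to a total duration:

    # Illustrative sketch of the sound parameters described above.
    # Assumes a speed of sound of roughly 343 m/s (air at about 20 degrees C).
    import math

    SPEED_OF_SOUND_M_S = 343.0

    def octaves(f_low_hz, f_high_hz):
        # Each octave doubles the frequency, so count the doublings.
        return math.log2(f_high_hz / f_low_hz)

    def wavelength_m(freq_hz):
        # Higher pitches ride on shorter waves.
        return SPEED_OF_SOUND_M_S / freq_hz

    def duration_ms(rise_ms, sustain_ms, decay_ms):
        # Total envelope duration is the sum of rise, sustain, and decay times.
        return rise_ms + sustain_ms + decay_ms

    print(octaves(20, 20_000))         # ~10 octaves of human hearing
    print(wavelength_m(20))            # ~17 m for a very low organ note
    print(wavelength_m(10_000))        # ~0.034 m, i.e., about 3.4 cm
    print(duration_ms(2.5, 3.0, 2.5))  # ~8 ms for the hand clap example

The wavelength figures also help in picturing the later discussion of target size: a surface reflects sound well when it is large compared with the wavelengths striking it, while much of the energy in longer waves simply passes around small or narrow targets, a point taken up again under Target geometry and Distance and size below.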
As sound waves travel further away from their source, their energy wanes until they either lose all cohesion and diffuse completely, or, more likely, until they encounter surfaces in their path. When the original sound waves, or incident waves, encounter surfaces, they bounce off - reflected by the surfaces - and they generally return to the original source or cause of the sound. The parameters of the reflected energy are altered from those of the original sound by the characteristics of the surfaces that the sound waves bounce off. We might think of the surface as a "cause" of the reflection or echo. Thus, the parameters of the original sound or incident wave are affected by the physical properties of the cause or event, and the parameters of the echo are, likewise, affected by the physical properties of the cause of the reflection - i.e., the nature of the surface. Again, it is just like waves of water. When a wave of water strikes a stone wall, it rebounds quickly and totally in the direction from which it came. If it encounters a sandy bank or cliff, some of the water is reflected while some is absorbed by the cliff, or deflected along its face. If the surface is slanted like that of a beach, the wave washes around and over the surface - part being reflected and part passing over the shore and remaining. It is the same with sound. The process may also be analogous to bouncing a ball off a stone wall versus a chain-link fence. Different surfaces affect the way the ball bounces off, but the ball generally rebounds back toward the hand that threw it. In the case of sound waves, the waves themselves may be altered according to the nature of the surfaces that they encounter. Reflected energy may occur in the form of discrete echoes of specific source sounds, such as when a call is heard to reflect off the mountains or a distant building, or in the form of sustained echoes called reverberations, such as the result of yelling in a gymnasium or stairwell (W. De l'Aune, personal communication, May 6, 1993). Reverberations are formed from many echoes resulting from one or more sounds cascading about and around many surfaces or surface features. Reverberations from the ongoing array of ambient source noise set up standing reflections, called reverberant fields, that are more or less continuous. This effect is well known even to those who do not depend upon echoes through the "ocean in the seashell" phenomenon. When one places a seashell near one's ear, it is said that one can "hear the ocean," as though a piece of ocean actually remains within the shell. In fact, this effect is produced by sounds in the environment which reverberate within the shell's chamber - causing a continuous "whoosh" of sound. One finds a similar phenomenon in all containers with solid surfaces such as a glass jar, a stairwell, and, to a lesser extent, hallways and rooms. The ambient source noise that elicits reverberant fields may be of very high or low intensity, and can be found just about anywhere (Wilson, 1967). Except when specifically referring to discrete echoes, the term echo can be used to include all forms of reflected sound, including reverberant fields (Schenkman, 1985b). The total array of original energy patterns and patterns of echoes comprises the "acoustic field," which is analogous to the optical field studied by Gibson (Schwartz, 1984).

The Auditory Observer in the Acoustic Field

The auditory observer abides in a sea of information communicated by sound and echo.
Acoustic fields pervade both urban settings, where sounds of traffic, air conditioners, and milling crowds abound, and rural settings, where the lighter sounds of birds, trees rustling, and footsteps upon the gravel path may predominate. They pervade even spaces generally thought to be silent - arising from combinations of the subtlest sounds such as the gentle hum of electrical wiring, the all but diffused sounds from distant spaces, the brush of a person's clothing, the ebb and flow of breath, the merest trickle of saliva, even the soundless sounds of heart beating and blood pulsing. Myers and Jones (1958) found that 18 blind children could reliably detect a four by one foot wooden panel at a distance of four-and-a-half feet in a soundproof, anechoic chamber under environmental conditions believed to be completely silent (no objective sound pressure levels were taken). Five out of eight blind children from a separate group under identical environmental conditions were able to detect six foot cardboard strips as narrow as four inches at distances up to 8 feet. According to Wilson (1967), the occasions are most rare that ambient noise levels approach the absolute silence of zero. The ocean depths of the seashell may be heard in even the most silent places. Perceptions such as those demonstrated by Myers and Jones' participants (1958) are made possible by the interpretation of the arrays of even the subtlest ambient noise that form delicate collages of discrete echoes and reverberations which fill spaces and connect all surfaces therein by a webwork of reflected acoustical energy. De l'Aune and his colleagues demonstrated this by analyzing stereo spectrograms of straight vs. T-intersecting segments of a corridor which was unoccupied and devoid of obvious sound (De l'Aune, Gillespie, Carney, & Needham, 1974; also reported in De l'Aune, Scheel, Needham, & Kevorkian, 1974). These recordings were taken through a set of artificial ears. It was found that frequencies under 200 Hz were more intense in the T-intersection, and frequencies of 800, 1000-1300, and 1800 Hz were more intense in the straight segment of hall - with differences being most pronounced in the ear facing the side of the corridor with the T-intersection. By these subtle changes, De l'Aune, Scheel, Needham, and Kevorkian (1974) found that many blinded veterans could use these recordings to learn to distinguish reliably between the straight segment and the T-intersection of this corridor.

The Nature of Echo Information and Perception

Once a sound has been reflected, it becomes an echo. The characteristics of echoes are defined largely by the same five parameters that define source sound (i.e., directionality, pitch, timbre, intensity, and envelope). As with source sound, the parameters of an echo are shaped by the physical properties of its cause - i.e., the nature of the reflecting surface. Thus, the characteristics of echoes correspond to the physical characteristics of the surfaces reflecting them. Because of this correspondence, it is possible to determine the nature of reflecting surfaces by interpreting the variations in the parameters of the echoes coming from these surfaces. It is very much like identifying an event by the sound it makes - only in this case, we are identifying the cause of an echo, rather than a source sound.
Near surfaces reflect differently from those far away, high surfaces differently from low surfaces, those to the right differently from those to the left, large differently from small, hard from soft, flat from curved, rough from smooth, and so on.

Surface Detection

Surface detection - the ability to distinguish between the presence or absence of a surface - is the most basic element of echolocation. It may also be the most important, since no other information such as distance, location, orientation, size, or composition of surfaces can be gleaned unless the mere presence of the surface is detected. The ability to detect surface presence or absence simply relies on the observer's ability to perceive and recognize the presence of the echo cast by the surface. If an echo is present, then a reflecting surface must also be present. If there is no echo, then there is either no surface present, or the surface is only capable of casting echoes that are too weak to be heard under the given circumstance. As such, this simple ability to detect surfaces through echoes might be said to depend mostly - if not entirely - on the parameter of intensity, since the mere presence of an echo is defined by some measure of intensity. Empirical investigations into simple, nonvisual surface detection have been largely concerned with the effect of echo intensity on detection performance. The intensity of an echo depends upon the amount of sound energy reflected back to the ears of the observer. The four factors involved in varying echo intensity primarily concern target parameters (how well the target reflects sound), the intensity and duration of the sound sources used to elicit echoes, the spatial relationship between target, sound source, and observer's ears, and the amount of other background noise that might mask echoes.

Target Parameters

The more reflective a surface is, the more energy it reflects, and the more intense the echo. Target geometry and composition are probably the key factors that contribute to its quality of reflectivity, and, therefore, to the intensity of the returning echo.

Target geometry. Targets of different dimensions and curvatures affect echo strength or intensity by reflecting varying proportions of acoustic energy back to the observer. Rice and Feinstein (1965b) varied the ratio of target length to width, and curvature, at a constant distance of 4 feet from four blind participants. In half the trials, no target was presented. The participants reported whether or not they detected the target when prompted. All targets were sixteen square inches in area, but the dimensions varied among 4 by 4, 8 by 2, and 16 by 1 inches. Detection became poorer at this distance as the ratio of length to width increased. The thinner the target, the more difficult it was to detect, even though the distance and surface area of the target remained the same. Thinner targets tend to scatter or diffract more energy than they reflect. Thus, a smaller proportion of the echo returns to the observer. In an attempt to reduce the amount of lost energy and thereby increase that returned to the observer, the longer targets were curved to an arc matching a radius of four feet - the observer's head marking the center. This created a kind of partial reflecting dish to focus rather than scatter the energy. All participants were able to detect even the thinnest targets more frequently when they were curved to reflect more energy.

Target composition. Targets of lesser density tend to reflect poorly.
Soft surfaces, for example, tend to absorb much of the energy, and sparse surfaces such as chain-link fences pass rather than reflect most of the energy, in the same way that narrow surfaces do (Twersky, circa 1950). Juurmaa and Järvilehto (1969; Juurmaa, 1970b), for instance, spectrum-analyzed the audible output of an ultrasonic echo receiver. [Such devices emit ultrasonic waves, receive the returning echoes, and electronically translate the ultrasonic echoes into audible tones and timbres that correspond to the parameters of the echoes received.] The translated output of echoes from metal, pasteboard, and cloth were analyzed. The signal quality was distinct between all three materials - particularly between the harder surfaces and cloth. One of the key distinctions involved intensity, where echoes from cloth were the least intense. Similarly, targets of extreme smoothness such as glass or acrylic tend to reflect less energy back to the observer than do coarser surfaces such as wood or pasteboard (Twersky, 1950; 1951a). Twersky indicates that glass surfaces such as store windows proved somewhat more difficult for sighted-blindfolded participants to localize (Twersky, 1951a). Sound waves tend to slide off highly polished surfaces - causing a larger quantity of energy to be scattered away from the observer. Eighteen sighted-blindfolded and one blind participant studied by Hausfeld, Power, Gorta, and Harris (1982), for example, found it difficult to distinguish 20 centimeter diameter disks of Plexiglas and low pile carpet from each other, and from wood or cotton fabric, but wood and fabric were readily distinguished from each other. Dolanski (1930; 1931) similarly found that the distance and size at which disks of iron, glass, and cloth were perceivable did not vary according to material among 42 blind participants. Apparently smooth glass, plastic, and even iron may scatter about as much energy as cloth absorbs - causing them to resemble each other under certain circumstances. It should also be considered that the targets used in these investigations were quite small, and may have been more difficult to discern than larger targets. Juurmaa and Järvilehto (1969; Juurmaa, 1970b) found that 7 blind participants were generally able to make clearer distinctions between metal, pasteboard, and cloth panels when the sizes exceeded 40 centimeters on a side. Kohler (1964) found very clear relationships between the absorption properties of object surfaces and their detectability when ultrasmooth surfaces were not used. Distances at which cardboard, rubber, felt, or wadding were first detectable diminished as absorption increased.

Source Sound. A more detailed discussion of the effect of source sound variables on echolocation is reserved for a later section. Suffice it to say for now that, in order for an echo to occur, there must be a sound source to generate it. In order for a sound to be reflected, there has to be a sound to reflect. As seen earlier, very little energy is needed to generate some form of echo. However, it is not unreasonable to suppose that greater amounts of source sound would serve to generate echoes of greater amount or intensity. If echoes of greater intensity are more easily heard, then stronger source sounds may facilitate surface detection by producing stronger echoes. Supa, Cotzin, and Dallenbach (1944) conducted a series of studies in which a 48 by 58 inch Masonite screen raised 2 feet off the floor was placed before 2 sighted-blindfolded and 2 blind participants.
The screen was placed at distances varying randomly between 6 and 30 feet. In an unspecified number of trials in each series, the screen, without participant knowledge, was not present. Participants walked down the path, and indicated when they first perceived the screen. Echo intensity was controlled here by varying the level of the sound of participants' footsteps as they walked. Two series of 50 trials each were run. In the first, participants walked over the hardwood floor with shoes on. In the second, they walked in stockinged feet over a strip of very thick carpet. In neither condition was the obstacle falsely detected when it was absent. When it was present under the condition of greater sound intensity (shoes over hard floor), one of the blind participants was able to detect it reliably at a little more than 17 feet; the other could sense it about 4 feet away. The two sighted participants, both of whom had received previous training for this experiment, were able to perceive the screen at a little over three feet. When walking under the less echo-intensive condition where the sound of footsteps was muffled by stockinged feet over carpet, the distance at which the screen was first detected diminished by about 53 to 68 percent among all of the participants, and all detections were less certain. This finding was replicated almost without exception in three additional experiments conducted under similar conditions by these authors. Myers and Jones (1958) presented a wooden panel one foot wide by four feet tall to 18 blind participants at a distance of about four feet. Echo intensity was controlled by removing all possible noise from the test environment, and varying the amount of noise that participants made deliberately. Experiments were conducted in a soundproof, anechoic chamber under two conditions - each involving a group of nine participants. In one condition, participants had to indicate whether the panel was present or absent without making a single sound or movement, including breathing. In the other, participants could make whatever noises they wished before deciding. Results show that participants detected the surface more often and more reliably when they could generate their own sounds.

Spatial Relationship Between Target and Observer

Distance. As a general rule, echo intensity decreases as the distance that the echo travels increases. It is the same with sound sources; the further away the cause of the sound is, the weaker the sound is when it reaches the observer. Likewise, the further away the surface is when it reflects a sound, the weaker that reflected sound (echo) is when it reaches the observer. Kohler (1964), for example, found through spectrum analysis that the intensity of white noise and pure tones of upper frequencies decreased as a cardboard disk was moved away from the sound source. An investigation by Jerome and Prochanski (1947; 1950) varied the distance, in one foot increments from three to nine feet, between four blind participants and a Masonite panel three feet wide and six feet tall. No panel was presented in half of the 60 trials. Results clearly show that the panel became more difficult for all participants to detect reliably as its echo strength was diminished by the increase in distance. Detection errors involved both falsely detecting the panel when it was not present, and failing to detect the panel when it was. Correct detections fell from between 73 and 100 percent at 3 feet, to between 34 and 80 percent at nine feet.
Thus, the increase in distance from three to nine feet decreased echo intensity sufficiently to impair object detection for even the most proficient of the participants.

Distance and size. The effect of both target geometry (namely size) and distance on surface detection has been examined in several studies. A thin target reflects less energy because it scatters a large part of the energy away from the observer. A small target delivers a similar effect by presenting a smaller surface area to the oncoming sound wave. Most of the wave, therefore, tends to pass around the target rather than being caught by it and returned to the observer. Dolanski (1930; 1931) measured the effect of size on the maximum distance at which an object was detectable. Disks decreasing in diameter from 500 to 20 millimeters were quietly moved toward 42 blind participants until the participants reported detection. Experiments were conducted in which the disks were moved frontally (directly toward the face), and laterally (directly toward each ear). The results of both conditions show a clear relationship between diameter of target and distance of detection - with larger disks being necessary for detection at further distances. The smallest disk that could be detected at close range was about 100 millimeters frontally, and about 40 millimeters at either side. [The relationship between horizontal target position and detectability is discussed later.] Although Dolanski failed to include blank trials regularly, the relationship between size and distance of targets in echolocation has been widely reported. Rice, Feinstein, and Schusterman (1965) used stimuli similar to those of Dolanski. Aluminum disks of varying sizes were presented at distances of 2 to 9 feet from 5 blind participants. The target was omitted in half of the trials at each distance, and participants were asked to indicate whether the target was present or not. A linear relationship similar to that in Dolanski's investigation was found between size and distance. As the distance increased, disks of greater size were required for detection to remain reliable. Jones and Myers (1954) found comparable results using very different stimuli. They tested the ability of over 30 blind participants to detect six foot cardboard strips ranging in width from 2 feet to 1 inch, and varying in distance from 3 to 6 feet. Blank trials were included in 25% of the 40 trials for each participant. Though detection of the larger strips was only slightly impaired by increasing distance, the smaller strips were generally much more difficult to detect as distance increased. Finally, in a program designed to train three participants with progressive vision loss, Juurmaa, Suonio, and Moilanen (1968; Juurmaa, 1968b) found that it took longer for participants to learn to perceive a pasteboard panel 20 centimeters wide than one 40 centimeters wide, though a difference in height from 1 to 2 meters seemed not to affect detection performance.

Lateral target position. Four studies have examined the effects of lateral target position on echo detection ability. In these studies of lateral position, the targets were always presented at the level of the ears. In a study by Kohler (1964) in which a 50 cm cardboard disk was presented in many locations around the heads of 20 participants, detection was most accurate when the disk was presented directly in front of the participants.
Detection performance worsened gradually with presentation to side positions, and diminished further with presentation behind the head. Rice (1969, 1970) also found, with 8 blind participants and 3 sighted-blindfolded participants, that detection reliability rolled off as target presentation shifted from the frontal position to side positions. In Schenkman (1983), the detection performance of 4 blind participants presented from the side with cardboard rectangles ranging from 1.03 x 0.73 m to 0.365 x 0.515 m was compared to that of six blind participants presented with a 0.38 m aluminum disk from the front. None of the participants in the side presentation condition were able to detect any of the targets reliably, but detections were consistent among those participants presented with targets from the front - even as far away as four meters. A study by Dolanski (1930; 1931) contradicts the findings of the studies cited above. In Dolanski's study, 42 blind participants were presented with disks made of different materials and varying in size from 20 to 500 mm in diameter. These participants were able to detect all of the targets at about 50 percent greater distances from the side than in front. There are not enough data available to enable a clear understanding of the contradictory nature of these findings. Different sound sources used at different positions may have affected results. For example, the participants in the Schenkman (1983) study used cane taps as echo signals, while the echo signals used by Dolanski's participants (1930, 1931) were not specified. It may be that cane taps are not optimal for the detection of targets elevated to head level. A sound emitting device was used in the Kohler (1964) study. Its nature is also unclear, however, though other facets of the study utilized the device at chest level. It may be that lateral position of objects facilitates echolocation over frontal position under certain conditions, but those conditions are not known.

Vertical target position. Studies are contradictory concerning the accuracy of echolocation as a function of vertical position. The Kohler (1964) study presented in the previous section also charted detection accuracy for positions below and above the head, and found that detection accuracy fell off as the cardboard disk moved below or above the level of the ears. However, Schenkman (1983) found with 8 blind participants that detection was more accurate for objects placed at waist than at head level. Interestingly, the difference in detection relative to object height was greater for objects placed 4 m away than for those placed at 2 m distance. Again, signal characteristics may be responsible for the apparent contradiction in these findings. It may be that cane taps, as were used in Schenkman (1983), optimize detection of objects at waist level. This possibility is examined in a later section.

Target obliquity. In previous sections it was made clear that target dimension greatly affects echolocation ability. Smaller or narrower surfaces scatter acoustical energy so that much of the returning energy is lost. A study by Clarke, Pick, and Wilson (1975) investigated the degree to which target obliquity also affects echolocation. With 12 blind and four sighted-blindfolded participants, the ability to detect flat surfaces of different sizes and distances tapered off sharply as the angle of rotation was increased with respect to the participants.
For example, at a distance of one meter a board 90 cm wide became undetectable at an angle of approximately 20 degrees. Two elements seem to contribute to this effect. First, as surfaces become more oblique, they divert more acoustic energy away from the observer. Also, as targets are presented more obliquely, they effectively grow thinner, presenting more nearly edge-on to the observer. This results in a scattering of much of the acoustic energy so that, depending on the thickness of the target, little energy may be returned to the observer.

Effects of Sound Source Position

Only Schenkman (1985a; 1985b) has specifically examined the effect of sound source positioning on echolocation. With five blind participants, detection of a 2 x 0.5 M surface at distances of 1, 3, and 5 M was tested with the noise generator located near the head, waist, and feet. It was found that detection was generally most accurate with the sound source located at the waist, and least accurate with location at the head. Reasons for this are unclear. This sound source was a continuous noise generator. It is possible that the presence of the noise too near the ears masked echoes.

Object Perception

The term "object perception" is generally used in the literature to refer to the assimilation of object features through tactual exploration. Here, the term refers to assimilation through echo interpretation.

Distance Perception

According to Schenkman (1985b), features of both envelope and pitch parameters seem to be the primary components of the perception of distance for humans using echoes. Concerning the envelope parameter, there is an additional component in echoes called "time delay." This refers to the time interval between the onset of the source sound and the onset of the perceived echo. This delay increases directly with distance between the source sound and the surface returning the echo. Conversely, as the distance decreases, so does the time delay between the source sound and the echo. As the distance becomes very small (about 2 to 3 meters) the time delay decreases to a point at which the human ear can no longer tell the sound and its echo apart; sound and echo merge. At this point, the ear comes to rely on the pitch parameter for distance judgements. As the distance decreases between the surface and the observer and/or sound source, the pitch of the echo is perceived to rise with respect to the source pitch. This perceptual change in pitch is best demonstrated by Bassett and Eastmond (1964). By spectrographic analysis they showed that the spectral characteristics of white noise changed systematically as a microphone was moved from the sound source toward a surface at which that source was pointed. This change resulted from cancellation of certain frequencies and augmentation of others in direct relation to the proximity of the surface to either the speaker (i.e., the origin of the source sound), or the microphone (i.e., the observer). These changes are explained by interference patterns between the reflected wave and the incident wave, which are heard as a rise in pitch as the surface is approached. While participants throughout the literature report this rise in pitch to be a primary cue in distance perception - particularly in tasks that involve movement - Clarke, Pick, and Wilson (1975) present evidence which indicates that intensity may play a role in static distance perception.
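As a rough quantitative sketch (an illustration only, using the approximate speed of sound in air, c ≈ 343 m/s, rather than figures drawn from the studies cited), the delay and pitch cues behave as follows:

\[ t_{\text{delay}} = \frac{2d}{c} \]

so a surface at a one-way distance of d = 3 meters returns its echo about 17.5 milliseconds after the source sound, while at 1 meter the delay shrinks to about 5.8 milliseconds. Once the source and its echo overlap and fuse, their interference imposes peaks and notches on the spectrum spaced at

\[ \Delta f = \frac{c}{2d} \]

and the associated repetition pitch, roughly the reciprocal of the delay, rises as the surface is approached - consistent with the systematic spectral shifts that Bassett and Eastmond (1964) documented.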
A role for intensity seems very reasonable, since the further away a surface is from the observer, the weaker the echo will be relative to the intensity of the source sound. By listening for these parameters, impressive feats of surface detection and distance perception may be accomplished. One of the 2 blind participants in Supa, Cotzin, and Dallenbach (1944) was able to detect the presence of a masonite screen more than 20 feet away. All four participants were usually able to move to within half a foot without touching the screen. Such findings have been widely replicated under similar procedures involving 27 blind adolescents (Worchel, Mauney, & Andrew, 1950), 20 sighted-blindfolded college students (Ammons, Worchel, & Dallenbach, 1953), three blindfolded adults with progressive vision loss (Juurmaa, Suonio, & Moilanen, 1968; Juurmaa, 1968b), and ten blind children between five and 12 years (Ashmead, Talor, & Hill, 1989). In a study of motion detection, Juurmaa and Järvilehto (1969; Juurmaa, 1970b) moved square panels of pasteboard 50 centimeters on a side toward or away from 7 blind participants from distances of 70, 120, and 200 centimeters. Participants were asked to indicate when they perceived the target to be approaching, or receding. Levels of performance decreased linearly with distance. At 70 centimeters, most of the participants detected the target's movement within 20 to 30 centimeters - somewhat more than a third the total distance. At 2 meters, most participants fell between 70 and 90 centimeters - somewhat less than half the total distance. These authors found much better performance in a distance recognition task in which these participants had to estimate when a 60 centimeter square metal sheet reached a prescribed distance of 90 centimeters as it was moved toward each participant from a distance of 200 centimeters. Estimates typically fell within one to nine centimeters of the prescribed distance. These results are similar to those found by Kellogg (1962/1964) wherein one of two blind participants could perceive a change in distance of as little as four-and-a-half inches with a 1 foot wooden disk at about 2 feet away.

Perception of Size

Studies in size discrimination have all followed a similar paradigm - a system of paired stimuli. The smallest and largest in a set of stimuli are presented consecutively where the size difference is greatest and most likely detectable, then the next smallest to the next largest are presented, and so on until the size difference becomes so minute as to be undetectable. Using this method, studies have generally found size discrimination to be possible to minute thresholds. For example, Rice and Feinstein (1965a; Rice, 1965) found a 95 percent success rate in the ability of four blind participants to distinguish a 10 mm difference in the diameter of a 90 mm disk presented at 60 cm distance. Juurmaa and Järvilehto (1969; Juurmaa, 1970b) found that seven blind participants could reliably distinguish a difference of five square cm in targets of about 60 square cm presented as far away as 2 M. Kellogg (1962/1964), using a slightly different but comparable procedure involving paired comparisons, found that one of 2 blind participants was able to distinguish a 2.5 cm difference in disks of about 22.5 cm presented at 30 cm distance. These studies also demonstrated that size discrimination ability is directly related to the distance of the object from the observer.
The perceptual discrimination ability of the participants in Rice and Feinstein (1965a; Rice, 1965) fell as distance was increased. For example, at 60 cm, participants were able to discriminate 10 mm changes in a 90 mm disk 95 percent of the time, whereas at 120 cm, their discrimination ability fell to 20 mm changes in a 215 mm disk 90 percent of the time. Similar trends were found by Juurmaa and Järvilehto (1969; Juurmaa, 1970b) and Kellogg (1962/1964). A study conducted by Clarke, Pick, and Wilson (1975) suggests that the intensity parameter may be largely responsible for perception of size as it is for distance. This would make sense, given that smaller surfaces reflect less sound, therefore less intensity, just as the intensity is less for surfaces further away. Indeed, these authors show that size and distance differences can be difficult to discern from each other under just the right circumstances. In this study, 12 blind and four sighted-blindfolded participants were presented with two pipes, one twice the radius of the other, at equivalent and different distances, one twice the other. While the participants could distinguish which pipe was which when presented at the same distance, they could not tell the difference between the small pipe presented at the closer distance and the large one presented at the further distance. Other parameters that might theoretically be implicated in the perception of size differences would include timbre and directionality. Since higher frequencies reflect from smaller objects more readily than lower frequencies, the timbre of echoes from small objects would lack lower frequencies compared to echoes from larger objects. As for directionality, larger objects would return a broader spread of wave fronts than smaller objects. This larger spread would be perceived by the listener as reflecting from a larger surface, or, simply, a surface occupying a greater amount of space. Some studies (reported in the following section) have shown that surface shape and location can be detected using echoes. For example, Rice (1967c) found that several participants could identify a large triangle, circle, and square, and could discern the location of surfaces to within 5 degrees. Participants vocalized continuous "hisses" and moved their heads to trace the edges of the surfaces. This example makes clear that directionality can play a significant role in perception of surface location and geometry, even though studies have not specifically addressed its relevance to perception of size.

Perception of Shape

In theory, directional characteristics of reflected energy combined with intensity variations should allow the perception of shape through the use of echoes. Rice (1967c) found that several blind participants could distinguish a large triangle, a circle, and a square from each other with fair reliability. This ability has been replicated in a later study by Hausfeld, Power, Gorta, and Harris (1982) which involved 18 sighted-blindfolded participants. The trick for both sets of participants involved generating an oral signal and then moving the head so that the emitted sound could be used to trace the edges of the shapes presented. No investigations have been reported concerning the effect of size and distance on shape perception. However, since increasing distance and decreasing size have a negative effect on other perceptions such as size discrimination and surface presence vs. absence, it may be supposed that shape perception would suffer as well.
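The intensity account of the size-distance confound reported by Clarke, Pick, and Wilson (1975) can be given a simplified form (an idealization assuming a compact reflector and spherical spreading of the sound both outward and back, not a result taken from the studies above):

\[ I_{\text{echo}} \propto \frac{\sigma}{d^{2}} \cdot \frac{1}{d^{2}} = \frac{\sigma}{d^{4}} \]

where \( \sigma \) is the effective reflecting area of the target and \( d \) its one-way distance. Because a larger reflecting area can offset a greater distance, a small target nearby and a large target farther away can return echoes of comparable intensity; intensity alone cannot separate the two without a supplementary cue such as timbre or directional spread.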
Perception of Composition

As discussed earlier, spectrographic analyses of coded, ultrasonic reflections indicate that the ability to perceive surface composition through echoes is determined largely by echo timbre - the systematic emphasis and de-emphasis in the return of certain frequencies (Juurmaa & Järvilehto, 1969; Juurmaa, 1970b). Different surface textures and compositions seem to reflect certain frequencies better than other frequencies - causing the return of distinct wave patterns that denote the composite nature of surfaces. In Juurmaa and Järvilehto's study (1969; Juurmaa, 1970b), echo recognition of texture was examined with four blind participants. Three 50 centimeter square targets of cloth, pasteboard, and metal were individually presented to each participant at a distance of 120 centimeters. Participants were able to recognize the materials as much as 61 percent of the time. Cloth and metal were most easily distinguished from the other materials, while pasteboard proved somewhat more difficult. These results are somewhat comparable to those of other studies of texture recognition. Using 12 inch disks of different materials presented at a 12 inch distance, Kellogg (1962/1964) found that 2 blind participants with reputedly good echolocation skills could readily distinguish between hard and soft surfaces. Wood, glass, and metal, though virtually indistinguishable from each other, were easily distinguished from denim and velvet. Denim and velvet were distinguished from each other 86.5 percent of the time. In a similar investigation by Hausfeld, Power, Gorta, and Harris (1982) in which 20 centimeter disks of Plexiglas, wood, low pile carpet, and cotton were presented at 25 centimeters distance to 18 sighted-blindfolded participants, the participants quickly learned to recognize the wood and cotton reliably. One blind participant could distinguish wood from cotton with a reliability of 90 percent, but, like the sighted participants, was unable to distinguish the other materials.

Surface Location

Surface location here refers to the lateral and vertical localization of surfaces, not the distal location as has already been covered. This ability must certainly arise from the perception of the directional parameters of the reflected energy. Although studies have shown that localization of source sounds is possible in the vertical plane (see Middlebrooks & Green, 1991, for a review), no reports could be found that study the ability to localize surfaces in a vertical plane using echoes. Studies have examined surface localization in the horizontal plane. Clarke, Pick, and Wilson (1975) found that 12 blind and 4 sighted-blindfolded participants could localize a wide variety of objects in a surrounding space. Rice (1967c) found that two blind participants could localize an 8 cm disk at 1 M distance to within 5 degrees. In later studies (Rice, 1969, 1970) involving 11 participants, it was found that localization accuracy fell off as the target was moved closer to 90 degrees left or right. These findings seem consistent with some echo detection studies which have shown that detection ability drops off as objects are moved from the frontal position (Kohler, 1964; Rice, 1969, 1970; Schenkman, 1983).

Integrating Echo Perception Variables

In order for echolocation to be of use to the auditory observer, two factors must come into play.
First, the auditory observer must be capable of integrating the echo information about various characteristics of space and objects within space into a gestalt of spatial awareness. "It is one thing to distinguish among a small set of previously agreed targets, and quite another to make out the features of a totally unknown environment" (Mills, 1963, p. 135). In addition, the integration of this information must allow freedom of motion. It must provide an active gestalt that presents continuous, dynamic information about changing relationships between an auditory observer in motion and the complex network of surrounding surfaces. As Rieser (1990) put it, "During locomotion, an observer's network of self to object distances and directions changes, and the accuracy of perceptual/motor coordination depends on the precision with which one keeps up-to-date on the changes" (p. 379). Unfortunately, few studies exist that begin to approach echolocation as a dynamic, complex process. In the 1960's Juurmaa (1965, 1967a, 1967b, 1969) conducted a series of studies involving over 50 blind participants to determine the relationship between echolocation and spatial orientation ability. The echolocation tasks involved surface detection at different distances, and obstacle avoidance. The orientation measure involved such tasks as having to find one's way back to a starting point after being led circuitously away, and returning to an original orientation after being spun about. Juurmaa found that echolocation (what he called obstacle sensing) correlated very highly with participants' ability to establish and maintain their orientation. This finding suggests that participants were able to use echoes from the walls of the test site to assist them in their orientation tasks. Another study (Mickunas & Sheridan, 1963) examined the application of echolocation to the negotiation of an obstacle course. It was found that the blind participants encountered much greater difficulty negotiating the course when their hearing was fully blocked than when their ears were free. No such difference was found in a group of sighted-blindfolded controls, indicating that echo information was being utilized by the blind to facilitate their travel. In the mid 1970's, Magruder (1974) investigated the integration of echo information in natural settings. While this was not a study of motion per se, such skills of integration would seem highly salient to successful mobility. A blind adult was positioned in about a dozen distinct, outdoor locations - split up between two separate days. The participant was asked to estimate the distance, direction, and height of every object that he could perceive, and to identify each object. Each estimate was compared to discrete measurements. Out of approximately 60 possible objects, distance estimates were off by about 53%, and height estimates by about 47%. Angle estimations were only off about 20% on average, with 54 out of 56 angles estimated to within 5 degrees of true direction. The participant was able to correctly identify 74% of all objects. The accuracy of all judgements fell sharply with increasing distance. For example, distance judgements rose to about 90% accuracy with objects closer than 7 feet. Although some judgements were correct as far as 20 feet away, inaccurate judgements seemed most predominant beyond 13 feet. Also, the close presence of large objects to either side such as buildings made judgements about other objects difficult.
It would have been useful in this particular case if a sighted norm had been established for comparison. Most recently, Kish (1995), as part of a larger study, examined the degree to which echo cues could facilitate straightness of travel over a distance of 36 feet. Twenty-one participants with total blindness ranging in age from 4 to 15 traveled in two conditions. One involved strong echo cues wherein participants traversed a corridor 7 feet wide. The other involved weak echo cues, where participants traversed a stretch of open cement with the nearest solid wall standing 20 feet to one side. Participants received no formal echo training before testing. Nonetheless, participants traveled significantly straighter when strong parallel echo cues were present. Although the research is scant on this point, it seems likely that the interpretation of echo information can provide a complex, dynamic awareness of surrounding space. Such an awareness would seem invaluable to the process of orientation and mobility. As Ashmead, Hill, and Talor have observed, "...this perceptual ability is manifested in functionally important behavior such as goal directed locomotion, and awareness of the positions of objects in nearby space" (p. 21). If this is so, then it seems essential to examine the conditions under which the interpretation of this nonvisual information can be optimized.

Interpreting Echo Information

If the best possible use of conventional echoes is to be sought, the variables involved in maximizing their perceptibility under the widest possible circumstances must be understood. The degree to which meaningful interpretation of echoes can be made depends on the characteristics of the echo information, the nature of the environment in which it occurs, and the physical and psychological capacities of the observer to perceive and process that information. The signals used to generate echoes are only as useful as the observer's ability to perceive the information. The parameters of sound must be interpretable by the observer, or that information is lost or meaningless. As already noted, the human auditory system can receive sounds ranging in frequency from about 20 Hz to about 20 kHz. Within this range, humans can distinguish about 1400 steps in pitch. In terms of amplitude sensitivity, the human ear ranges from a sound pressure level of 0.0002 dynes per square centimeter (20 micropascals) to about 130 dB above this, and it can distinguish around 350 steps in intensity within this range (Juurmaa & Järvilehto, 1969; Juurmaa, 1970a). This should speak well of the human auditory system's ability to perceive the subtle nuances of echoes and variations of echo parameters, but the human auditory system also possesses a mechanism that decidedly hampers echolocation - the refractory period. This auditory mechanism attenuates or lowers the ear's ability to perceive a sound about 2 ms after the onset of that sound, particularly where strong or intense sounds are concerned (Wiener & Lausen, 1997). Thus, the parameters of the signal must accommodate these characteristics of the human auditory system if that signal is to be of use to human auditory observers - namely the blind.

Signal Parameters

Considerable research and some measure of controversy surround the application of echo signal parameters to the elicitation of echoes useful to humans. Different investigations employ different perceptual tests, and measure the results in different ways.
Also, different researchers come from different backgrounds (natural scientists, psychologists, educators), and hold different perspectives on the matter. Very few have, themselves, been users of echoes. Nevertheless, some sense can be made of the varied results if all the information is considered holistically, and with a degree of good sense. In understanding these findings, we will draw some inferences from our knowledge of the signal characteristics used successfully by echolocating mammals. For this purpose, of course, we stick to airborne sonar, because sound waves behave somewhat differently in air as opposed to water (Ayrapetyants & Konstantinov, 1974; L. Kay, personal communication, December 4, 2000). Still, there are two factors that distinguish how echolocation might be used most effectively by animals vs. humans which prevent us from drawing hard analogies between them. First, the environmental demands upon animals may differ from those faced by modern humans. Griffin (1986), for example, argues that the environments occupied by humans may be considered more complex and place more rigorous demands upon humans than those occupied by bats. While a bat may have to dive into a bush for a particularly scrumptious sounding insect, it does not need to spend day in and day out negotiating the press of crowds, automotive traffic, shopping malls, etc. Second, the auditory systems of animals may differ in important respects from that of humans. Some species of bats, for example, are capable of registering sound into the ultrasonic spectrum beyond 70 kHz (Griffin, 1986), whereas humans typically cannot hear beyond 20 kHz (Carlson-Smith & Wiener, 1996; Wiener & Lausen, 1997). This gives bats access to the advantages offered by very small sound waves such as detection of very minute objects and details. The echolocation ability in bats is also not hampered by a refractory period. However, as Griffin also points out, the auditory cortex of the human brain is many times larger and more complexly designed than the entire brain of a bat - suggesting that humans might be capable of processing more complex auditory information. Either way, the point is simply that, while much can be learned from our knowledge of echolocation in mammals, we must maintain caution.

Frequency

As already noted, the echo signals emitted by bats are composed of frequencies from about 30 to about 70 kHz (Griffin, 1986; Ifukube, Sasaki, & Peng, 1991). Even within this range, bats vary the frequencies that they emit according to the task at hand - using lower frequencies (between 30 and 50 kHz) for orientation and cruising flight, and higher frequencies (between 40 and 70 kHz) for the interception of tiny targets (Griffin, 1986). This indicates that bats find lower frequencies sufficient for detection of larger objects such as cave openings, trees, and ground topography. These lower frequencies may also be preferred by bats as they will carry somewhat further - allowing them to detect objects at greater distances for orientation purposes. However, they seem to prefer higher frequencies for tracking tiny insects. Many have argued in favor of the need for high frequencies to carry the most pertinent echo information for humans. Riley, Luterman, and Cohen (1964) found strong positive correlations between mobility performance and frequency sensitivity from 500 Hz to 8 kHz in 27 blind participants.
This positive relationship grew stronger concerning frequencies up to 14 kHz in 13 of these participants who were specially selected for high frequency sensitivity. This makes theoretical sense. Though high frequencies don't travel as far as low frequencies, the energy that they carry reflects more completely from surfaces that they encounter. Higher frequencies correspond to smaller sound waves, and small sound waves are necessary for good reflections from small objects and small features of surfaces. This is one of the reasons that bats are able to detect and intercept objects smaller than a millimeter. Ifukube, Sasaki, and Peng (1991) found that even humans could detect and localize acrylic poles as thin as 2 mm when ultrasonic echoes between 40 and 70 kHz were brought down into the audible range by a down-coding device. For detection of a 17 mm object, frequencies near 20 kHz might be needed for an adequate amount of information to be reflected. Kohler (1964), for example, presents oscillograms which show that a 50 Hz pure tone changes very little in intensity as a 50 centimeter cardboard disk is moved away from it, but the intensity level drops notably when a 1 kHz tone is used, and still further with a 16 kHz tone. Cotzin and Dallenbach (1950) found that only pure tones of 10 kHz could be used to perceive a large obstacle with any reliability. Rice (1967a) points out that 3 of his participants with moderate hearing loss in the upper frequency regions showed poor performance where detection of small targets and fine discriminations between targets were required. In an investigation by Ammons and Worchel (1953) of the ability of sighted-blindfolded participants to learn to perceive obstacles while walking, all of the several participants with hearing losses at upper frequencies took longer to learn the task. However, the role of pitch in the perception of obstacles is more complicated than a simple relationship between wavelength and performance. Rice's participants with hearing deficits, for example, were able to perform nearly as well as unimpaired participants where larger targets were used (Rice, 1967a, 1967c). Likewise, Clarke, Pick, and Wilson (1975) found that, of a group of 16 participants, 2 who were mildly hearing impaired at higher frequencies did not demonstrate significantly poorer performance in the detection of a wide variety of objects. In the Ammons and Worchel investigation (1953) the participants with hearing loss were able to perceive the obstacle as well as the others, even though it took them longer to learn the task. In a recent study by Carlson-Smith and Wiener (1996), participants with hearing loss above 8 kHz were able to perform better than some participants with no hearing loss in detecting a 4 by 7 ft board, and identifying open doorways in a hall. Participants in Supa, Cotzin, and Dallenbach (1944) performed quite well listening through headphones to the experimenter walking toward a wall, even though the microphone had a reported upper frequency cut-off at 9 kHz. Laufer (1946) found that a sighted-blindfolded participant using an oscillator to detect plywood panels of various widths and heights performed equally well with frequencies of 250 Hz and 15 kHz. A similar result was reported by Myers and Jones (1958) concerning a blind participant using pure tones ranging in ten steps from 250 Hz to 14 kHz. The ability to detect a 6 by 2 foot target at four-and-a-half feet distance was unaffected by the frequency.
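As a point of reference for the size and frequency figures quoted in this section, the wavelength of a sound in air is simply (an illustrative calculation assuming c ≈ 343 m/s, not a figure taken from the studies cited):

\[ \lambda = \frac{c}{f} \]

which gives roughly 17 mm at 20 kHz, about 29 mm at 12 kHz, and about 17 cm at 2 kHz. Reflections generally remain strong only when the target is comparable to or larger than the wavelength, which is why the highest audible frequencies are implicated in the detection of the smallest targets and finest surface details.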
Finally, research with bats shows that bats, under optimal conditions, can detect a target even smaller than the length of the sound waves used (Griffin, 1986). Griffin further suggests that a human using frequencies as low as 12 kHz might be able to detect a wire as thin as an 8th of an inch (3 mm) at close range, even though according to Rice (1967a) the physical properties of this frequency would seem to correspond more suitably to a disk slightly more than an inch (27 mm) across. Investigations thus far have not demonstrated the ability in humans to detect surfaces as minute as Griffin suggests, but Rice, Feinstein, and Schusterman (1965) did find a few participants able to detect a segment of quarter-inch metal square-rod at 18 inches distance with the corner or apex of the rod oriented toward them. In this connection, four investigations have indicated that minimum intensity threshold sensitivity at high frequencies does not have a marked effect on many echo detection tasks. Juurmaa (1965), in an examination of 52 blind participants, found that echolocation correlated much more highly with pitch discrimination ability than with stimulus intensity threshold measures from 125 Hz to 8 kHz. Kohler (1964) found in 48 participants that their awareness of fluctuating frequency and intensity correlated highly with the obstacle sense. Kohler (1964) found in an additional study of 267 participants that detection of 50 cm cardboard panels did not correlate with absolute threshold data in tests that ranged up to 8 kHz, or with age in participants 4 to 85 years old. Furthermore, De l'Aune, Scheel, and Needham (1974) found no correlation, in a group of high school students and elderly veterans, between age and the ability to detect a T-intersecting corridor. De l'Aune and Gillespie (1974) also found no correlation between absolute threshold sensitivity up to 8 kHz and the ability of the veterans to perceive the T-intersection (also reported in De l'Aune, Scheel, Needham, & Kevorkian, 1974). These findings concerning age are relevant, because high frequency hearing in the elderly is almost invariably poor compared to that in younger people. All these findings were replicated by Carlson-Smith and Wiener (1996). With nine sighted-blindfolded participants, the ability to detect small variations in stimulus frequency and intensity correlated highly with the detection of a 4 by 7 ft masonite board and the identification of open doors, while no correlation was found with sensitivity thresholds measured from 250 Hz to 12 kHz. From these reports, it appears that the ability to distinguish small variations in sound (namely frequency and intensity) is more salient to echolocation than whether or not a sound can actually be heard at a given frequency. In interpreting these seemingly contradictory results, it must be remembered that different tests of echolocation were performed under different circumstances for different purposes. Cotzin and Dallenbach (1950), for example, used a dynamic task with the sound transmitted to the participants under highly artificial conditions. All of the other studies were conducted under more natural conditions, and the specific tasks involved have been quite variable. The Carlson-Smith and Wiener (1996) study, which showed no correlation between echo performance and hearing sensitivity to high frequencies, was also conducted under conditions where high frequencies were scarcely present, and therefore might not have been usable to any effect.
Sound pressure levels in this study showed most of the acoustical energy predominant below 2,000 Hz, both with and without footsteps. Also, the participants were requested not to make any vocalizations of their own. Yet, many blind individuals are reported to use various tongue clicks. Of these, this author has noticed the alveolar and palatal clicks to be the most common. According to Ladefoged and Traill (in press), alveolar clicks produce high intensity energy predominantly between 2 and 4 kHz - palatal clicks at 4 to 6 kHz. It is possible that high frequency hearing can be exploited for echolocation only when the signal itself contains high frequencies. Bats cannot fly or catch insects if they are not permitted to chirp (Griffin, 1986), yet their use of and need for ultrasonic signals to enable their performance is indisputable. It may also simply be that high frequencies are more efficient for performance in some tasks such as the detailed perception of small targets or target features, but that they are less efficient or less necessary for performance in other kinds of tasks such as perception of larger features, or perception at greater distances. Though the processing of high frequencies has certainly shown its advantages, there are limitations as well. The short sound waves that correspond to high frequencies tend not to reflect well from tilted surfaces for purposes of providing clear echo cues. Kohler (1964) found by the use of oscillograms that much less tilt of a cardboard panel was required to negate the intensity fluctuations of high frequency reflections than those of low frequencies. In other words, a slight tilt of the cardboard caused it to disappear from high frequencies, but much more tilt was necessary before the cardboard could no longer be detected by low frequencies. Smaller sound waves may be more likely to be scattered or diffracted by tilted surfaces. Also, as Kohler (1964) and Juurmaa and Järvilehto (1969; Juurmaa, 1970a) point out, high frequency sounds are much more likely to be obscured or buried by low frequency sounds than the other way around (Wegel & Lane, 1924). This means that echo signals of low frequency may be more effective than high frequencies in situations of high ambient noise such as traffic or construction. Further, pitch and intensity discrimination, the most salient processes enabling echolocation, tend to be poor at high frequencies. Kohler (1964), for example, found that discriminability of sound fluctuations such as those caused by the presence of objects was greatest at about 1.5 to 3 kHz. Lastly, as Kohler (1964) and De l'Aune, Scheel, Needham, and Kevorkian (1974) point out, absolute threshold sensitivity and discrimination sensitivity become poorer with age at the higher frequencies, so it may be fruitless for older people to try to depend solely on high frequency information for echolocation. The use of midrange frequencies for echolocation seems reasonable when one considers that standard orientation and mobility tasks rarely require detection of the minutest of details. Studies such as Carlson-Smith and Wiener (1996) and others mentioned have clearly established that individuals can accomplish very functional echolocation tasks quite well with poor sensitivity to high frequencies. Griffin (1986) and Rice (1967b) nevertheless argue that the echo image of the environment is made sharpest and most clear by the presence of higher frequencies in source sounds.
Wiener & Lausen (1997) point out that frequencies above 4 kHz may be easiest to localize vertically. Laufer (1946) reports the worst performance for a sighted-blindfolded participant at frequencies of 1 and 4 kHz, as did Cotzin and Dallenbach (1950). This finding was not replicated by Myers and Jones (1958) with their blind participant, but theirs was an entirely static task of presence vs. absence detection, while those of Cotzin and Dallenbach (1950), and Laufer (1946) were dynamic tasks wherein participants made judgements of obstacle distance and location as they walked. It may be that simple detection of medium or large obstacles is little affected by frequency, but that more complex tasks such as localization and distance measurement are.

Timbre

According to Griffin (1986) and Ifukube, Sasaki, and Peng (1991), bats compose their signals of complex, broad band timbres. Usually, their chirps span almost an octave, which means that they combine lower and upper frequencies in the same signal. Studies of timbre seem to agree that complex, wide band timbres yield more useful echo information than simple wave forms of narrow band. When comparing the use by a sighted-blindfolded participant of a buzzer vs. pure tones ranging from 250 Hz to 16 kHz as source signals, Laufer (1946) found that the buzzer allowed fewer collisions and more detections of various sized panels at further distances than did the pure tones. The participant also reported that the buzzer was easier and more pleasant to work with. Dallenbach and his associates found performance with pure tones transmitted to participants through a microphone and headphones to be greatly inferior to footsteps (Supa, Cotzin, & Dallenbach, 1944) and wide band noise (Cotzin & Dallenbach, 1950). Finally, Kohler (1964) found that oscillograms of pure tones vs. white noise aimed at a receding cardboard panel clearly show intensity decreases that are much more marked with the noise than the pure tones. Kohler explains that the advantage of complex over simple timbres probably lies in the fact that they combine properties of many frequencies into one composite signal. This may elicit the sharp detail that high frequencies afford while allowing maximum intensity discriminability with the midrange frequencies.

Intensity

According to Dr. Jonathan Fritz of the National Institute of Health (personal communication, May, 2000), bats generally emit their pulses at intensities exceeding 120 dB. By human standards, this is considered extremely loud. Twersky (1953) has reported that sounds of medium intensity yield better object perception than sounds of high intensity. On the surface, this would seem counter-intuitive, since louder sounds should produce louder and therefore more audible echoes. There are two factors, however, that explain why very intense sounds may not allow good echolocation for humans. The first involves the fact that echo information is always much quieter than the sound or signal that produces it - particularly echoes from small or far away objects. If the signal is too loud, the echo cannot be heard over the volume of the signal. The signal blots out the echo; it is said to "mask" the echo. The second issue is more complex. It has to do with the auditory constraints of the human observer. Unlike bats, humans possess mechanisms in the auditory system which dampen reception immediately after the beginning or onset of a sound. These include the stapedius reflex and the neural refractory period (Carlson-Smith & Wiener, 1996).
This means that a sound seems to get quieter right after it starts - particularly very loud sounds. The actual intensity of the sound does not change, just the perception of its intensity. These mechanisms serve to protect the ear from damage resulting from very loud sounds, and also to increase speech intelligibility by causing each phonetic articulation to seem discrete and somewhat distinct from the others. Otherwise, all speech would seem to blur together. Unfortunately for the human echo user, these sound dampening mechanisms tend to diminish the extent to which echoes - which always occur after the onset of a sound - can be received and processed. In view of these problems, it is essential that other parameters be considered carefully so that a maximum of useful echo information is made available to those who need it.

Envelope

According to Griffin (1986), bats and nocturnal birds emit chirps that have very sharp attacks, and last no more than a few milliseconds, or even less than a millisecond. It seems that in order for a signal to elicit useful echoes, it should allow the majority of the echo to be heard by the observer. Twersky (1951a) and Kohler (1964) report that signals of brief duration (pulsed signals) were more pleasant to work with and enabled better object localization than signals of lengthy duration. Shortening the duration of the signal gets the signal out of the way quickly so that the echo information can best be heard. If a signal is intense but over very quickly, most of the echo information returns after the pulsed source signal is finished, and is therefore not masked by the source signal. The echo may still be somewhat suppressed by dampening mechanisms in the auditory system, particularly if the source signal is very loud, but the shorter the signal, the more audibly clear the echo will be in any event. Griffin (1986) suggests that a pulsed signal of less than 10 milliseconds duration would be optimal for good echolocation in humans. He points out that bats often use pulses of less than one millisecond. Most tongue clicks used by humans fall between five and ten milliseconds in duration (Ladefoged & Traill, in press). In addition to short duration, there is good theoretical support for the use of a signal with a very rapid rise and decay time (W. De l'Aune, personal communication, May 6, 1993), such as used by the bats. A signal with a rise time of under 2 milliseconds, for instance, generally yields a special component of complex frequencies that may extend high into the spectrum. This is called a "click transient". It amounts to a very brief burst of white noise at the rise time of the signal which can yield very high frequencies depending on the physical nature of the signal. Even if the signal itself is composed only of low frequencies, a very quick rise and decay time provides a complex spread of frequencies to a very high range. This is significant, because many signals that have anecdotally been found useful for echolocation such as finger snapping or tongue clicks (discussed later) would not contain especially high frequencies if it were not for their quick rise and decay. There are, however, two investigations that call the supremacy of pulsed signals into question. Rice (1967b) found no differences in performance at most tasks between participants who used orally produced click vs. hiss signals.
These findings held when electrically generated clicks of 4 milliseconds duration and electrically generated white noise were substituted for the oral signals, except that participants tended to do better with the artificial signal that most resembled their orally produced signal. However, in a shape recognition task involving several blind participants, those using an orally pulsed signal such as a tongue click did somewhat worse than the one participant who used an oral hiss sound. Rice (1967b) conjectures that the use of a continuous signal allowed the participant to trace the edges of the target more effectively than with pulsed signals like those used by the other participants. Unfortunately, Rice does not provide specific data as to the types of tongue clicks used by his participants, except that they had slow rise times, and ranged from about 25 to 75 milliseconds in duration. Also, it should be noted that the participant who did so well on the shape discrimination task by using a hiss signal later indicated that he might have improved his performance on distance perception and size discrimination tasks if he had used a tongue click instead (W.A. Gerrey, personal communication, April 12, 1993). With five blind participants, Schenkman (1985a) compared electronic clicks of 1.5 milliseconds with white noise signals of one second in detecting a two by one-half meter masonite board at distances of one, three, and five meters. The white noise was generally found to be somewhat superior, but these results are not clear-cut. The difference seems dependent on individual participant performance and the distance to the target. One participant showed better performance with the click signal, and, interestingly, this one was the most proficient of the five at object detection for all distances. Perhaps the more proficient one is at echolocation, the better use one is able to make of "ideal" information and optimal cues.

Directionality

In order for a signal to elicit useful echoes, it must allow the greater portion of the reflected energy to return to the ears of the observer. Bats, for example, have evolved a mouth shaped to deliver highly focused echo signals, and they move their ears to catch the minutest reflections (Griffin, 1986). For purposes of echolocation, directionality can be divided into two related components with respect to the ears of the observer - the primary direction of the source signal, and the primary direction of the reflected energy. Concerning the direction of the signal, Laufer (1946) and Twersky (1953) found highly directed signals to yield better performance than undirected signals. Directed signals should be most useful, because the primary energy of the signal is focused away from the ears of the observer. The signal remains the same volume as if it were undirected, but the ears receive it at a lower intensity because most of the signal's energy is directed away - much as the sound of a trumpet seems quieter when standing behind the trumpeter than directly at the mouth of the trumpet. By thus shielding the ears from the primary energy of the signal, a more intense signal may be used to elicit stronger echoes. These echoes are then quite audible, because the ears are shielded from the bulk of the source signal's energy, and therefore more exposed to the reflected energy. Because the energy of the source signal is quieter to the ears, the auditory system is less apt to engage suppressive mechanisms that interfere with hearing the echoes.
The primary direction of the reflected energy is determined by the direction of the source signal relative to the reflecting surface or surfaces (Wilson, 1967). In turn, the degree of reflected energy reaching the ears of the observer depends upon the relative position and orientation of the observer's ears to the position and direction of the source signal and to those of the reflecting surfaces. Thus, a signal emitted at or near the ears of the observer and directed at a perpendicular surface is expected to yield the strongest and most detailed perception of that surface - provided that the echo signal does not mask the echo. This theoretical perspective is borne out in practice. According to Griffin (1986), bats seem to rely primarily on the chirps that they produce rather than noise emanating from elsewhere in the environment. When bats are prevented from chirping by gags, no known force can convince them to fly. When they are required to fly in the presence of artificially produced ultrasonic noise, they continue to use their chirps for navigation. There is no evidence to suggest that bats flying in swarms give their chirps a rest while gathering environmental information from the many chirps of their airborne neighbors. Two investigations into the relationship of source signal to listener have found mixed but interpretable results. Schenkman (1983, 1985) studied object detection with the object and the echo signal varying in their locations with respect to the listener. In the first study (1983) Schenkman found that, using cane taps, a group of 6 blind participants was able to detect a small target placed in front of them much better than a group of 4 blind participants to whom the target was presented to either side. Also, this group of 6 was able to detect a 38 cm aluminum disk more easily with oral signals such as clicks and hisses than with cane taps. This latter result finds corroboration in a later study by Schenkman and Jansson (1986) of the effectiveness of different types of cane taps in producing echoes. In this study, the authors had to exclude the data from one participant who would not refrain from using tongue clicks, and who consistently scored well above the other participants. While cane taps and hisses share few spectral characteristics, the spectral characteristics of cane taps and tongue clicks are not dissimilar (Schenkman & Jansson, 1986). Taking these two findings together, it seems fairly reasonable to attribute some of the discrepancy in performance to the different relationships between echo signal, target, and observer. When targets were presented to the side rather than in front, much of the acoustic energy radiating from the cane taps simply missed the target, allowing little energy to be reflected. When the target was in front, more of the acoustic energy struck the target, and was, therefore, returned. When signals were produced orally, the amount of reflected energy was further increased. The acoustic energy traveled more or less straight from the participant's head, struck the target, and returned more or less directly to where it originated. When canes were used the acoustic energy followed different lines. It radiated in all directions from the cane tip, which delivered an entirely unfocused signal - sending only a small portion to the target located somewhere above the source. Much of the energy would pass beneath the target.
The angle at which the acoustic energy struck the target was oblique, causing that energy to be deflected rather than reflected. As in the experiment wherein targets were presented laterally, relatively little of the reflected energy would ultimately have reached the ears of the observer. This interpretation is somewhat borne out by an additional study in this investigation. Using cane taps only, 8 blind participants were able to detect a small target more easily when it was presented at waist level than at head level. In this scenario, the acoustic energy emanated from the cane tip as before, but much more of it struck the target in the lower position than in the higher position. Thus, more of it had an opportunity to return to the ears of the participants. In fact, detections of the lower target were even a little better with the target 3 M away than 1 M away. With the closer distance, it seems that much of the acoustic energy passed beneath the target, and could, therefore, not be reflected. When the target was further away, the path of travel of the acoustic energy was more direct, since the angle between the incident and reflected paths was wider. These findings bear resemblance to those of Clarke, Pick, and Wilson (1975), who found in a study of 16 participants that detection of curbs less than 20 cm high decreased markedly as the distance decreased to less than 50 cm from the participants. A later investigation by Schenkman (1985) into the issue of directionality found results which seem on the surface to contradict those just reported. This investigation examined the ability of 5 blind participants using artificial signals originating at head, waist, and ground level to detect the presence of a target. The target measured 2 M tall by 0.5 M wide, and was presented at 1, 3, and 5 M distance. For all distances, detection reliability was highest when the signal originated from the waist, and lowest when the signal originated from the head. This would seem contradictory to both theoretical predictions and empirical findings in favor of source signals originating near the ears, but two factors must be considered. First, this report does not make clear the directional characteristics or volume of the signal. It may be that the signal, when presented too near the head, served to mask or otherwise dampen the perception of returning echoes. Also, and perhaps more importantly, the nature of this target was different from that used in other studies. The other targets were quite small - occupying only a small region of vertical and horizontal space. They were especially susceptible to acoustic energy passing around or beneath them. The target in this later investigation was quite tall and relatively wide. Though the patterns of returning acoustic energy differed depending on the location of the signal, signals aimed at the target from any vertical position always struck the target. In this scenario, the least energy striking the target would emanate from the head position, since much of the energy would pass over the target. Signals presented from the ground might have been largely absorbed by the ground or deflected away from the target. The location offering the most returned energy would logically have been the waist, where energy would not pass too freely over the target, or be deflected from it. It should also be remembered that, since this was a continuous signal, unlike cane taps as discussed earlier, it is possible that the signal itself masked echoes.
When the signal originated further from the ears (i.e., at waist or feet), the signal would be directed away from the ears, and would therefore sound quieter. This would reduce the masking of echoes, thereby improving performance.

Signal Consistency

In discussing the conditions that optimize echolocation, a brief note is needed concerning the consistency of a signal. Rice (1967a, 1967b) found that blind participants were able to use a variety of artificial signals to accomplish given tasks, but performance was always highest when those artificial signals resembled those to which the participants were accustomed through long, previous practice. This is an important consideration, because echoes of some sort are elicited to varying degrees by just about any type of sound. It behooves a blind listener to know what signals can be relied upon for the best information. For this, it would seem reasonable to suppose that familiarity with the use of certain combinations of parameters would increase the reliability of such a signal. If the blind observer should be inclined to elicit echoes by deliberate means, it would seem prudent to develop such a familiarity. In other words, familiar feedback should be easiest to interpret. This also implies that the signal should be self-produced and under the control of the user, rather than haphazardly produced by random events in the environment. Just as we learn to recognize a familiar voice in a crowd or the familiar sounds of particular utterances against a background of noise (Ladefoged & Traill, in press), so the echo user learns to pull echoes of familiar signals out of the environment. Recall the bat's preference to travel by its own signal rather than relying on the signals of others.

The Ideal Echo Signal

The ideal signal should quickly and easily provide useful information about the greatest variety of objects and surfaces under the widest possible circumstances - noisy or quiet, cluttered or open. It should be clear from the foregoing discussion of signal parameters that it is fruitless to consider a single parameter isolated from all other parameters, since all of them integrate to provide optimal conditions for echolocation. Taken as a whole, the ideal signal would incorporate acoustic parameters that make use of frequencies throughout the audio spectrum, and maximize the return of echo information to the ears. The preponderance of experimental evidence on both humans and bats together with theoretical considerations implicates a pulsed, directed, complex signal of variable intensity and quick duration originating near the ears to be optimal for use by humans. Further, the signal should be an active or deliberately produced signal that is relatively consistent in its acoustic characteristics. Active signals fall into two categories - artificial and organic. Artificial signal production requires the use of an external signaling device. Such devices tend to be cumbersome and obtrusive. They typically require a free hand to operate, and the noise they emit calls attention to the user (Beurle, 1951; Greystone & McLennan, 1968). However, producing signals by artificial means can offer the advantage of allowing signal parameters to be designed with precision to optimize echo information. Signals designed by electronic or mechanical means can incorporate many of the optimal characteristics.
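To make these parameters concrete, the following minimal sketch (an illustration only - the sampling rate, decay constant, and distances are assumed values, and the code describes no actual device from the literature) generates a brief, broadband pulse with an abrupt onset and then checks, for a few one-way distances, whether the echo would return before or after the pulse itself has ended:

import numpy as np

fs = 44100          # sampling rate in Hz (assumed for illustration)
c = 343.0           # approximate speed of sound in air, m/s

# A short, complex (broadband) pulse with an abrupt onset:
# a noise burst shaped by a rapidly decaying envelope.
duration = 0.005    # 5 ms, within the range Griffin (1986) suggests for human use
n = int(fs * duration)
t = np.arange(n) / fs
rng = np.random.default_rng(0)
pulse = rng.standard_normal(n) * np.exp(-t / 0.0015)   # sharp attack, ~1.5 ms decay

# The noise content and abrupt onset spread energy across a wide band;
# here we simply report how much of that energy lies above 2 kHz.
spectrum = np.abs(np.fft.rfft(pulse, 8192)) ** 2
freqs = np.fft.rfftfreq(8192, 1.0 / fs)
print("fraction of energy above 2 kHz:", spectrum[freqs > 2000].sum() / spectrum.sum())

# For a pulse this brief, echoes from all but the nearest surfaces arrive
# after the pulse has ended, so the source itself does not mask them.
for d in (0.5, 1.0, 3.0):          # one-way distance to a reflecting surface, meters
    delay = 2.0 * d / c            # round-trip travel time, seconds
    status = "overlaps the source" if delay < duration else "arrives after the source ends"
    print(f"{d} m: echo delayed {delay * 1000:.1f} ms ({status})")

Under these assumptions, an echo from a surface half a meter away would still overlap a 5 millisecond pulse, while surfaces a meter or more away return echoes that are clear of it - one illustration of why brief, sharply rising signals are argued to leave echo information unmasked.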
Many types of electronic signals have been used for echolocation, including buzzes and high frequencies (Cotzin & Dallenbach, 1950; Laufer, 1948; Myers & Jones, 1958; Witcher & Washington, 1954), pulsed and continuous white noise (Clarke, Pick, & Wilson, 1975; Cotzin & Dallenbach, 1950; Mills, 1963; Rice, 1967a, 1967b; Schenkman, 1985a), and transient clicks (Beurle, 1951; Greystone & McLennan, 1968; Rice, 1967a, 1967b; Schenkman, 1985a, 1985b). Electronic generation offers the broadest flexibility in signal design, but this method of production tends to be costly, and requires a power source and periodic maintenance. Mechanical devices typically take the form of snappers and clickers. Such devices have been used occasionally to train the blind in echolocation. The first was developed by Griffin in 1944 (reported in Witcher & Washington, 1954). It was a metallic snapper housed in a parabolic shell to focus the sound and direct it away from the ears, and it was used successfully to train blinded veterans. A similar but smaller device was developed by Twersky in the 1950's (reported in Griffin, 1986), which found similar success. Recently, Boehm (1986) found that five blind participants could use a hand-held clicker without prior training to identify correctly most of 25 pre-designated features in a 160 by 20 foot hallway. The particular clicker that they used is marketed in the form of toys shaped as frogs or insects. Mechanical devices such as these are less costly than electronic devices. However, they require frequent replacement, and they cannot be designed with maximum flexibility. User control over intensity, for example, is typically limited. Furthermore, in the most portable, least cumbersome devices, the emitted signal is not well focused. Cane taps and footsteps might fall into the category of mechanically produced sounds. While possessing none of the disadvantages of other forms of artificial signal production such as expense, maintenance, etc., they do not necessarily possess any of the advantages either. While such signals can and do generate useful echoes (Carlson-Smith & Wiener, 1996; Schenkman, 1983; Schenkman & Jansson, 1986; Supa, Cotzin, & Dallenbach, 1944), neither cane tips nor shoe soles necessarily generate signals that optimize echo information. In particular, they are nondirectional, they originate far from the ears, the energy they emit is largely absorbed or deflected by the ground and changes according to random ground variation, and the spectral components cannot be effectively optimized. Footsteps, for example, may not produce energy in the upper portion of the spectrum (Carlson-Smith & Wiener, 1996), so blind observers would not be able to make use of high frequency information (if they were so inclined) by relying on footsteps alone. Also, cane taps over grass, carpet, or soil are likely to produce little in the way of usable echoes. Organic signals hold few of the disadvantages of artificial production. They need not require extra manipulation, they are always available to the user, they need not be cumbersome or unwieldy, servicing requirements are minimal, and they are free of charge. They may not offer the full flexibility that electronic signals may deliver, but organic signal generation does constitute a broad array of parameters nonetheless. Blind echo users are known to self-generate a wide variety of signals, from hand claps and finger snaps to vocal and oral signals.
Hand clapping and finger snapping have the advantages of strong intensity, medium spectral complexity, and quick onset and duration, but these signals are unfocused, and they require the use of the hands, which are often not conveniently available. Oral signals, on the other hand, require no extra manipulation, are more directional, and are quite flexible. The most common type of signal referred to in the human echolocation literature is the oral click. Nearly every work that deals with echolocation in the blind mentions the oral click as a common signal (e.g., Kellogg, 1962/1964; Kish, 1995; Magruder, 1974; McCarty & Worchel, 1954; Myers & Jones, 1958; Rice, 1967a; Schenkman & Jansson, 1986). Information is rarely provided as to the type of click, but the scant information that is available suggests that a variety of clicks are used. Jones and Myers (1954) and Myers and Jones (1958), for instance, mention "lip clicks", and Rice (1967a, 1967c, circa 1970) indicates that the tongue clicks used by his participants varied in duration from about 25 to 75 milliseconds. McCarty and Worchel (1954), who studied a blind boy's ability to ride a bicycle with great facility, indicate that the click he used to accomplish this feat resembled that of a toy cricket. Phoneticians classify oral clicks into five distinct types according to how the click is physically produced (Ladefoged & Traill, in press). Each type of click has different envelope, intensity, and spectral characteristics. Theoretically, clicks in general should form good signals for eliciting echoes, and empirical evidence demonstrates that they can be used quite effectively (Rice, 1967a, 1967c). They may be of fairly short duration (down to around 5 milliseconds), they are complex and fairly directional, and their intensity can be varied by the user from very quiet to quite strong. Ladefoged and Traill show clicks to be more intense than other normally spoken sounds. In addition, these authors report a study in which 10 native speakers of African languages found tongue clicks to be more easily distinguished than other consonants against a background of white noise presented through headphones. These findings hold special significance for echo users in light of a study by Kohler (1964) which showed that high background noise drastically reduced echo performance for 20 participants. It is clear that an echo signal must possess sufficient intensity, distinction, and consistency to elicit echoes that are distinguishable from background noise. Depending on the oral click used, primary spectral frequency is reported to vary from about 0.9 kHz to about 8 kHz. Rise times range from about 1.2 ms to about 8 ms, with durations ranging from about 6.6 ms to about 20 ms. Theoretical considerations would implicate the click with the sharpest rise time, shortest duration, greatest intensity, and highest mean frequency as having the greatest utility for echolocation. However, little empirical evidence is available on this point. In fact, the only study that may be applicable does not actually examine differences between oral clicks, but rather differences between the spectral characteristics of taps from different canes (Schenkman & Jansson, 1986). With two blind participants, no differences were found on an obstacle detection task relative to the differing spectral characteristics of 10 distinct canes. Hard conclusions regarding the relationship between spectral characteristics and echo performance are impossible to draw from this study.
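The parameter ranges just cited can be made more tangible with a small synthesis sketch (again in Python, for illustration only). It builds a short tone burst with a fast rise and a rapid decay, using a center frequency, rise time, and duration chosen from within the reported ranges; the specific values are assumptions for the sake of the example and are not intended to model any particular click type described by Ladefoged and Traill.

import numpy as np

# Illustrative synthetic "click": a short tone burst with a fast onset and
# an exponential decay. Parameter values are assumed, not measured.
def synth_click(fs=44100, center_hz=3000.0, rise_ms=2.0, duration_ms=10.0):
    n = int(fs * duration_ms / 1000.0)
    t = np.arange(n) / fs
    onset = np.minimum(t / (rise_ms / 1000.0), 1.0)    # fast linear rise
    decay = np.exp(-t / (duration_ms / 1000.0 / 4.0))  # rapid decay
    return onset * decay * np.sin(2.0 * np.pi * center_hz * t)

if __name__ == "__main__":
    fs = 44100
    click = synth_click(fs=fs)
    spectrum = np.abs(np.fft.rfft(click))
    freqs = np.fft.rfftfreq(click.size, d=1.0 / fs)
    print("samples:", click.size,
          "peak spectral frequency (Hz):", float(freqs[spectrum.argmax()]))

Such a sketch is only a rough stand-in for a real oral click, whose spectrum is considerably more complex, but it shows how a single signal can combine a sharp rise, a brief duration, and energy concentrated in the frequency region reported above.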
It may be that spectral differences in echo signals must be greater in order for the impact on echo performance to be appreciable. Or, much more sensitive measures of performance may be necessary to detect a differential impact. The spectrograms presented in that report do not differ strikingly from those of various oral clicks (Ladefoged & Traill, in press). If broader spectral differences in echo signals are necessary for echo performance to be appreciably affected, then the use of different oral clicks may result in little variation in performance. Generally speaking, the pulsed, complex, and directional nature of oral clicks would seem to make them highly effective echo signals. The spectral and parametric differences between them may further enhance their utility. The control of parameters such as intensity, timbre, and directionality makes oral clicks easily adjustable to fit the requirements of varying situations. An increase in intensity, for instance, can help cut through heavy ambient noise so that echoes from distant or small objects may be elicited and perceived. Decreasing intensity may be needed to eliminate extraneous echoes in highly reverberant environments, or to keep the click unobtrusive in quiet, close environments where others do not wish to be disturbed. The click may be directed downward to locate curbs, steps, or grass lines, or upward for overhangs. If the effective use of echolocation is to be optimized by an active, deliberately produced signal, there is good reason to consider the oral click as a prime candidate. While oral clicks have not been directly compared to other sounds in terms of effectiveness, an excellent example of their use can be found in the oil bird which, according to Griffin, skillfully navigates the absolute darkness of deep caves (cited in Witcher & Washington, 1954). According to these authors' reports, the acoustic parameters of the click produced by the oil bird strongly resemble those of some oral clicks commonly produced by humans - perhaps most resembling the palatal click in duration and mean frequency (Ladefoged & Traill, in press). Among humans, McCarty and Worchel's (1954) examination of a blind boy's bicycling skill serves as a most impressive demonstration of echo-mobility by oral clicking. Likewise, this author and his students, shown bicycling at moderate speeds through complex and unfamiliar terrain, emitted intense, sharp tongue clicks at a rate of more than one per second. When interviewed, one of the students said, "the click is used just to focus in on an object" (Cowger, 2000). While the environmental demands on a blind human probably surpass those of the oil bird by a fair margin, the preponderance of theoretical support and empirical evidence, together with apparent examples of success, points to the oral click as instrumental in facilitating the mastery of echolocation.

ACQUISITION OF ECHOLOCATION SKILL

Studies of hundreds of humans strongly suggest that all hearing persons can learn to perceive and interpret echoes to some degree - either by active or passive learning. It is not, as once believed (Hayes, 1938), a special endowment that may be appreciated by only a fortunate few. In fact, though it is commonly found that the ability to perceive and interpret echoes is highly variable among the blind, it has nevertheless been shown to manifest to some degree in the majority, and to a high degree in many.
In a study of 52 blind participants in Helsinki, Finland, for instance, Juurmaa (1965) found 87 percent able to demonstrate some ability to sense the presence or absence of panels of various sizes at various distances, and six of these showed perfect performance at a distance of 2.5 meters. Although few investigations have been reported concerning the specifics of training echolocation, most investigations have indicated improvement in the participants studied on the given task. Training and practice trials are common, and they consistently show improvement. For example, Hausfeld, Power, Gorta, and Harris (1982) report considerable improvement for all 18 of their sighted-blindfolded participants on both the shape and texture discrimination tasks. Magruder (1974) found that, in a two day study of distance, direction, and object perception, her participant's estimates of distance improved over 38% from one day to the next given practice and feedback. Those investigations that do specifically examine the issues behind training echolocation have generally found very positive results. Among the first of these is the work of Worchel and Mauney (1950), who studied the effects of practice on the ability of seven blind children to perceive a Masonite board like that used by Dallenbach and his associates (Supa, Cotzin, & Dallenbach, 1944). The procedure was also the same as in the Dallenbach studies, with first perceptions and final appraisals of target distance being used as indices of perception, together with the frequency and force of collisions. Initially, participants' perceptions of the target were erratic and inconsistent. Collisions were frequent and forceful. Over the course of 210 trials spread over four days, all participants showed markedly increased consistency in the perception of target proximity. Final appraisals dropped from as high as 150 cm down to less than 30 cm for all participants, and the frequency of falsely perceiving the target decreased by more than 75 percent. The frequency of collisions between the pre- and post-test runs decreased from 56 to 19, and the force of collisions decreased very markedly as well. All of the participants showed the majority of their improvement over the first 30 to 60 trials, indicating an asymptotic learning curve. The asymptotic nature of echo training was replicated a few years later by Ammons, Worchel, and Dallenbach (1953). This experiment involved 20 sighted-blindfolded participants, and used the same classic procedure as the other Dallenbach studies. Again, participants' ability to localize the target and avoid collision with it improved substantially over the course of a few days' practice. With these participants, however, progress was quite slow for the first few trials, then picked up suddenly. Participants indicated a sudden awareness of the parameters of the task - of what to pay attention to and how. Once this insight was achieved, learning progressed rapidly before tapering off. These trends are similar to those found by Kohler (1964), in which 20 participants learned to increase their ability to judge distance over a six week training period. Juurmaa, Suonio, and Moilanen (1968; Juurmaa, 1968b) trained three individuals with progressive vision loss in several skill areas - avoidance of differently sized and multiple obstacles, and determination of height and breadth. The participants walked down a path on which one, two, or zero obstacles of varying size were placed.
The participant was instructed to indicate when he first perceived each obstacle, to stop 0.5 m before reaching the obstacle, and to provide an estimate of the obstacle's dimensions. Sessions ran about 30 minutes a day for four weeks. Participants learned to avoid collisions quickly, and in the same insightful manner demonstrated in previous studies. However, the distance of first perception increased more evenly and gradually over the course of training. Perception of dimension was the most difficult skill of all to learn. While there was improvement for all participants in all skill areas, it was noticed that the participant who had the best initial performance made the least progress relative to the others. This study thus suggests that those who have less to learn, learn less. This phenomenon was borne out in a study by Greystone and McLennan (1968) of 26 blind children. Participants were instructed to navigate an obstacle course with the assistance of an electronic clicker. The obstacle course consisted of a series of walls with an opening at a different point along each wall. The effect was a maze of offset openings through which the participants had to travel. After the participants had completed the task, they were given the electronic clicker and told to practice at home over the summer. When the school year resumed, the children were tested again. It was found that participants who had done well to begin with did not improve, but those who had done poorly to start with improved markedly. Collisions and hesitant stops were reduced by about 50 percent, and time to complete the course was reduced by about 16 percent. No data were available regarding the nature of the practice that took place over the summer. Finally, Clarke, Pick, and Wilson (1975) studied 16 participants in a course of training to improve their ability to negotiate a complex obstacle course with and without the use of a signaling device. Forty-minute training sessions took place twice weekly for eight weeks. Participants were introduced to a variety of object perception tasks involving a diversity of objects including curbs, furniture, pipes, etc. For example, in one task, participants were asked to rotate about a room full of objects and describe any object they sensed around them. Feedback was provided regarding accuracy. All participants improved on all tasks, with and without the signal generator, between pre- and post-assessments of skill. The research is clear that anyone without severe hearing loss can learn at least basic echolocation, and many appear to be able to learn more complex skills as well. Moreover, much insight into how echolocation might best be learned can be gleaned from this information. As Ammons, Worchel, and Dallenbach put it in 1953, "The implications ... are far reaching ... that all persons, blind but otherwise normal, are capable of learning to perceive obstacles, and that there is no reason other than the lack of courage or the will to learn for any of them leading a vegetative existence in which he has to be led about" (p. 551). If echolocation can be passively or actively learned under appropriate conditions, then it stands to reason that, given the right conditions, echolocation can be actively taught.

Developing A Comprehensive Training Program

The research to date yields clues that can be used to facilitate the development of an echolocation training program. The primary issues include what needs to be taught, and how the teaching is to take place.
In order for a training program to be worthwhile, it must be practical. Exploring the limits of echolocation and establishing psychophysical measurements certainly has its place, but if a training program cannot teach perceptual skills that will apply to the enhancement of a person's functioning, that program has little immediate, practical utility. The most useful application of echolocation for a bat is in the facilitation of its ability to survive - i.e., to hunt, roam, and find shelter. In essence, for bats, echolocation serves the same purpose that vision does for other animals; it enables them to carry out their lives. The same may be said for humans. In order to survive, people must be able to meet their needs, or see that their needs are met. One of the most instrumental aspects of this process is the ability to transport oneself from one place to another. The inability to move can be said to curtail sharply a person's ability to obtain and apply needed resources. Therefore, an echolocation training program should hold its primary focus on the development of skills that will enhance the processes of movement and navigation. Two key aspects of movement and navigation may be argued - security and efficiency. According to Jansson (1989), the process of blind movement can be divided into two functions: walking toward, and walking along. Walking toward involves the process of maintaining one's orientation toward a goal. This may be a proximate or distant goal. Walking along refers to the ongoing process of controlling one's locomotion - processing environmental features and acting in accordance with them. The ability to maintain one's orientation and good control over one's locomotion constitutes efficient travel, but efficient travel must also be secure travel. Studies in blind mobility have pointed to three factors that constitute secure travel (Leonard, 1972; Armstrong, 1975): the ability to stay on a path without accidental departure, the ability to avoid bodily contact with objects, and the ability to cross streets quickly and directly without incident. Barth and Foulke (1979) discuss variables of security and efficiency in terms of "preview" - the ability to perceive adequately the features of an environment in advance of one's position. They argue compellingly that advance awareness allows for effective planning and appropriate responses to conditions ahead. Given these elements, it seems reasonable that, if an echo training program is to be practical, it must develop skills that facilitate the maintenance of orientation to and among near and distant objects and environmental features, the ability to identify surrounding objects and features, and the ability to control locomotion among objects. Although there is some precedent for the inclusion of echolocation in mobility curricula (Amendola, 1991; Carlson-Smith & Wiener, 1996), very few specific techniques for teaching it are available. It is clear that the development of echo skills can occur through practice and feedback, but that is about all that is clear. The development of specific training techniques is, therefore, much needed, and the field is wide open. In devising techniques for training echo skills, it would seem essential to keep in mind the unique needs of the population being served.
For example, while deficits in spatial awareness and comprehension are not necessarily pervasive among those blinded early in life (Jones, 1975; Loomis, Klatzky, Golledge, Cicinelli, Pellegrino, & Fry, 1993), they are, nonetheless, common (Hart, 1980; Hill, Rieser, Hill, Hill, Halpin, & Halpin, in press; Warren, Anooshian, & Bollinger, 1973). It is, therefore, necessary that a program specializing in the apprehension of space be sensitive to such issues. For example, many of the blind, particularly the young, establish manual groping or sweeping gestures as their fundamental means of contacting or acquiring objects (Martinez, 1977). In the preliminary implementation of an echo training program for young blind children, it may, for some students, be necessary to devote some attention to the instruction of directed reaching, or to design alternate exercises that do not require reaching responses. Moreover, this author has observed that head centering behavior is often lacking in the blind, particularly the early blind. The head is often oriented obliquely to sound, favoring one ear. Other postural anomalies are also common (Martinez, 1977), and these may make head orientation difficult. The elicitation of head pointing responses may not be appropriate at first. It may be best to instruct students to turn their chest or back to the relevant stimuli by way of response. This is not to suggest that blind people normally exhibit these characteristics. Blind people can and often do exhibit highly coordinated movements and keen body awareness. However, the characteristics mentioned above occur with sufficient frequency to warrant appropriate accommodations to optimize instruction. Another respect in which instruction must be sensitive to student factors concerns age. It seems reasonable to suppose that different skills might be appropriate to different ages, and that forms of instruction would have to vary in order to optimize instruction across a wide age range. For example, younger students may not possess a grasp of basic spatial concepts such as right vs. left, above vs. below, near vs. far, and so on (Garry & Ascarelli, 1960; Warren, 1989). Some blind children may not understand "facing" or "reaching for" something, or their performance at such tasks may simply be poor. Juurmaa (1967a, 1967b) indicates that the development of spatial skills continues to occur after the age of 10. Techniques should be designed to at once circumvent and develop comprehension of such spatial concepts. For example, spatial terminology (right, left; up, down; near, far; etc.) may be used in conjunction with tactual cues (touching the corresponding body part - shoulder, top of head, leg, etc.) and interaural and distal cues (positioning the experimenter's voice in space to correspond to the spatial vocabulary). For some students in the beginning, it may also be helpful to pair source sounds with echo stimuli. A student may find it easier to respond to something that seems concrete by virtue of its source noise than to something that seems abstract by virtue of its reflective properties. Though echolocation alone tends to be a phenomenon that is consciously "felt" more than "heard", echo users nonetheless use auditory scanning techniques for orientation (Kellogg, 1962/1964), so skills learned in this way may generalize with practice to genuine echo tasks. They may also help to acquaint students with lesson parameters and procedures.
EVALUATION OF THE CURRENT PROGRAM

Although the specific mechanisms underlying the technical aspects of echolocation in humans have been fairly well studied and are well understood, particularly concerning blind humans, no systematic study of comprehensive training for complex echo-mobility has been reported. Most of the studies in this area are based on simple trial and error methods that concern very basic skills. They may address the question of whether or not echolocation can be learned. However, the application of echo skills to complex mobility, and the question of how such skills should be actively taught for optimal effect, remain to be answered. Kish (1995) explored these avenues through the implementation of a pilot program of echo instruction. The study involved 23 totally or functionally blind students, aged 4 to 15, in 4 regular public schools. Causes of blindness were varied. All students were mainstreamed for at least part of their day, and all were functioning at or near grade level. The students received between 6 and 8 hours of echo training over a 12 to 16 week period according to a carefully designed program that was applied consistently to all. Due to attrition, only 12 of the students received both pre- and post-tests. The test examined improvement on two tasks. The first was straight line travel down a 7 foot by 36 foot transparent corridor. The second was the approach and location of two transparent targets from a distance of 10 feet. The larger target was 4 feet by 4 feet, and elevated about 2 feet above the ground. The smaller target was 1 foot by 4 feet, and likewise elevated. Although this study did not reach statistical significance, it allowed the collection of comprehensive qualitative and quantitative data relevant to the teaching of echo skills. This information has enabled the establishment of stronger, more effective programs of echo instruction, which have since been tried by Kish and colleagues in their daily practice. It has also enabled the conceptualization of testing procedures that are believed to be much more sensitive and robust. A systematic, comprehensive training manual is underway which will be open to further field testing and refinement. The qualitative findings of this project are detailed in "ECHOLOCATION: What It Is, and How It Can Be Taught and Learned," found on this web site. The complete study can be requested from California State University, San Bernardino.

REFERENCES

Amendola, R. (1991). Vidiation and Spatial Orientation. Carroll Center for the Blind: Newton, Massachusetts.
Ammons, C.H., Worchel, P., & Dallenbach, K.M. (1953). "Facial Vision": The perception of obstacles out of doors by blindfolded and blindfolded-deafened subjects. American Journal of Psychology, 66(4), 519-553.
Armstrong (1975). Evaluation of Man-Machine Systems in the Mobility of the Visually Handicapped. In R.M. Pickett & T.J. Triggs (Eds.), Human Factors in Health Care. Lexington Books: Lexington.
Ashmead, D.H., Hill, E.W., & Talor, C.R. (1989). Obstacle Perception by Congenitally Blind Children. Perception and Psychophysics, 46, 425-433.
Ayrapetyants, E.Sh. & Konstantinov, A.I. (Eds.) (1974). Echolocation in Nature. Joint Publications Research Service: Arlington, Virginia.
Barth, J.L. & Foulke, E. (1979). Preview: A neglected variable in orientation and mobility. Journal of Visual Impairment and Blindness, 73(2), 41-48.
Bassett, I.G. & Eastmond, E.J. (1964). Echolocation: Measurement of pitch vs. distance for sounds reflected from a flat surface. Journal of the Acoustical Society of America, 36, 911-916.
Beurle, R.L. (1951). Electronic Guiding Aids for Blind People. Electronic Engineering, January, 2-7.
Bilsen, F.A., Frietman, E.E., & Willems, W. (1980). Electroacoustic Obstacle Simulator (EOS) for the Training of Blind Persons. International Journal of Rehabilitation Research, 3(4), 553-557.
Boehm, R. (1986). The Use of Echolocation as a Mobility Aid for Blind Persons. Journal of Visual Impairment and Blindness, 80(9), 953-954.
Brim, V. (1997). Sincerity in Action. Times Communicator, May.
Clarke, N.V., Pick, G.F., & Wilson, J.P. (1975). Obstacle Detection with and without the Aid of a Directional Noise Generator. American Foundation for the Blind, Research Bulletin, (29), 67-85.
Cotzin, M. (1942). The Role of Audible Frequencies in the Perception of Obstacles by the Blind. Psychological Bulletin, 39, 503-504.
Cotzin, M. & Dallenbach, K.M. (1950). Facial Vision: The role of pitch and loudness in the perception of obstacles by the blind. American Journal of Psychology, 63(4), 485-515.
Cowger, E. (2000). Ripley's "Believe It or Not" - batman. August 16.
De l'Aune, W. & Gillespie, G.M. (1974). Examination of Possible Pure Tone Audiometric Correlation with Successful Acoustic Environmental Analysis by Blinded Veterans. IRCS Journal of International Research Communications, 2, 1110.
De l'Aune, W., Gillespie, G.M., Carney, J., & Needham, W. (1974). Analysis of Ambient Noise Frequencies Used by Blind Patients in the Detection of Open and Closed Spaces. 45th Annual Meeting of the Eastern Psychological Association: Philadelphia, April.
De l'Aune, W., Scheel, P., & Needham, W. (1974). A Comparison of Acoustic Environmental Analysis by Blinded Veterans and Sighted High School Students. IRCS Journal of International Research Communications, 2, 1213.
De l'Aune, W., Scheel, P., Needham, W., & Kevorkian, G. (1974). Evaluation of a Methodology for Training Indoor Acoustic Environmental Analysis in Blinded Veterans. Proceedings of the Conference on Engineering Devices in Rehabilitation. Tufts University, New England School of Medicine.
Diderot, D. (1749). Lettre sur les aveugles. Textes litteraires francais: Geneva, Droz, 1951, 10-11.
Dolanski, V. (1930). Les Aveugles Possedent-ils le "Sens des Obstacles"? Annee Psychologique, 31, 1-51.
Dolanski, V. (1931). Do the Blind Sense Obstacles. In And There Was Light, 1, 8-12 (reference to be completed).
Erismann, T.H. & Kohler, I. (1953). Forschungsfilm über den Fernsinn der Blinden: Warnung im Dunkeln. Innsbruck, Austria: Filmproduktion Pacher & Peithner.
Felts, W. (1909). Supersensitiveness of the Blind. Scientific American, (101), 215.
Ferber, L. (2001). The Sally Show. February 23.
Foulke, E. (1971). The Perceptual Basis for Mobility. American Foundation for the Blind, Research Bulletin, 23, 1-8.
Garry, R.J. & Ascarelli, A. (1960). Teaching Topographical Orientation and Spatial Orientation to Congenitally Blind Children. Journal of Education, 143(2), 1-48.
Graham, M.D., Robinson, R.L., Lowrey, A., Sarchin, M.M., & Tims, F. (1968). Eight Hundred Fifty-one Blinded Veterans: A success story. American Foundation for the Blind: New York.
Greystone, P. & McLennan, H. (1968). Evaluation of an Audible Mobility Aid for the Blind. American Foundation for the Blind, Research Bulletin, (17), 173-180.
Griffin, D.R. (1953). Acoustic Orientation in the Oil Bird (Steatornis). Proceedings of the National Academy of Sciences, 39(8), 884-893.
Griffin, D.R. (1986). Listening in the Dark: Acoustic orientation of bats and men. Comstock Publishing Associates: Ithaca, New York.
Hanson, S. (2001). Seeing with Sound. Discovery Break-throughs Inside Science.
Hausfeld, S., Power, R.P., Gorta, A., & Harris, P. (1982). Echo Perception of Shape and Texture by Sighted Subjects. Perceptual and Motor Skills, 55(2), 623-632.
Hayes, S.P. (1935). Facial Vision or the Sense of Obstacles. Perkins Publications, 12, Perkins Institution: Watertown, Mass.
Hayes, S.P. (1938). The Psychology of the Blind. In H. Lynn (Ed.), What Are the Blind. (to be completed)
Henson, O.W. & Schnitzler, H.-U. (1980). Performance of Airborne Sonar Systems II: Vertebrates other than Microchiroptera. In R.G. Busnel & J.F. Fish (Eds.), Animal Sonar Systems. Plenum: New York, 183-195.
Hill, E.W., Rieser, J.J., Hill, M.-M., Hill, M., Halpin, J., & Halpin, R. (in press). How Do Persons with Visual Impairments Explore Novel Spaces? A study of strategies used by exceptionally good and exceptionally poor performers. Journal of Visual Impairment and Blindness.
Ifukube, T., Sasaki, T., & Peng, C. (1991). A Blind Mobility Aid Modeled After Echolocation of Bats. IEEE Transactions on Biomedical Engineering, 38(5), 461-466.
Jansson, G. (1989). Early Mobility Training of Children Born Blind: Needs and prerequisites. In M. Brambring, F. Losel, & Skowronek (Eds.), Children at Risk: Assessment, Longitudinal Research, and Intervention. de Gruyter: Berlin, 416-426.
Jerome, E.A. & Prochanski, H. (1947). Research on Guidance Devices and Reading Machines: Final Report. Haskins Laboratories: New York, Dec. 31.
Jerome, E.A. & Prochanski, H. (1950). Factors in the Assay and Use of Guidance Devices. In P.A. Zahl (Ed.), Blindness: Modern approaches to the unseen environment. Hafner Press: New York, 462-494.
Jones, B. (1975). Spatial Perception in the Blind. British Journal of Psychology, 66(4), 461-472.
Jones, C.G.E.F. & Myers, S.O. (1954). Obstacle Experiments. The Teacher of the Blind, 42(6), 153-158.
Juurmaa, J. (1965). An Analysis of the Components of Orientation Ability and Mental Manipulation of Spatial Relationships. Reports from the Institute of Occupational Health, Helsinki, Finland, (28).
Juurmaa, J. (1967a). The Ability Structure of the Blind and the Deaf: Final report. American Foundation for the Blind, Research Bulletin, (14), 109-121.
Juurmaa, J. (1967b). The Interrelations of Auditory, Tactual, and Visual Spatial Performances. Reports from the Institute of Occupational Health, Helsinki, Finland, (54).
Juurmaa, J. (1968a). The Effects of Training on the Perception of Obstacles without Vision, Part I. New Outlook for the Blind, 64, 65-72.
Juurmaa, J. (1968b). The Effects of Training on the Perception of Obstacles without Vision, Part II. New Outlook for the Blind, 64, 104-118.
Juurmaa, J. (1969). Analysis of Orientation Ability and Its Significance for the Rehabilitation of the Blind. Scandinavian Journal of Rehabilitation Medicine, 1(2), 80-84.
Juurmaa, J. (1970a). On the Accuracy of Obstacle Detection by the Blind, Part 1. New Outlook for the Blind, 64(3), 65-72.
Juurmaa, J. (1970b). On the Accuracy of Obstacle Detection by the Blind, Part 2. New Outlook for the Blind, 64(4), 104-118.
Juurmaa, J. (1972). The Spatial Sense of the Blind: A plan for research. American Foundation for the Blind, Research Bulletin, (24), 57-64.
Juurmaa, J. & Järvilehto, S. (1969). On the Accuracy of Obstacle Detection by the Blind. Reports from the Institute of Occupational Health, Helsinki, Finland, (67).
Juurmaa, J., Suonio, K., & Moilanen, A. (1968). The Effect of Training in the Perception of Obstacles without Vision. Reports from the Institute of Occupational Health, Helsinki, Finland, (55).
Kaff, J. (1997). Clicking Tongue Helps Blind Man See the Way. Los Angeles Times, July 21.
Kellogg, W.N. (1962). Sonar System of the Blind. Science, 137(3528), 399-405.
Kellogg, W.N. (1964). Sonar System of the Blind. National Federation for the Blind, Research Bulletin, (4).
Kish, D. (1995). Evaluation of an Echo-Mobility Training Program for Young Blind People. Unpublished master's thesis, California State University, San Bernardino.
Kohler, I. (1964). Orientation by Aural Clues. American Foundation for the Blind, Research Bulletin, (4), 14-53.
Kohler, I. (1967). Facial Vision Rehabilitated. In R.G. Busnel (Ed.), Animal Sonar Systems. Biology and Bionics: Jouy-en-Josas, Laboratoire de Physiologie Acoustique, volume I, 89-114.
Ladefoged, P. & Traill, A. (1993). Clicks and Their Accompaniments. Journal of Phonetics.
Laufer, H. (1948). The Detection of Obstacles with the Aid of Sound Directing Devices. The Biological Review, 10, 30-39.
Lee, D.N., van der Weel, F.R., Hitchcock, T., Matejowsky, E., & Pettigrew, J.D. (1992). Common Principles of Guidance by Echolocation and Vision. Journal of Comparative Physiology, 171(5), 563-571.
Lende, H. (1940). Books About the Blind: A bibliographical guide to literature relating to the blind. American Foundation for the Blind: New York.
Leonard, J.A. (1972). Studies in Blind Mobility. Applied Ergonomics, March, 37-46.
Magruder, M. (1974). The Use of Sound in the Perception of Objects. Unpublished master's thesis, California State University, Los Angeles.
Martinez, F. (1977). Does Auditory Information Permit the Establishment of Spatial Orientation? Experimental and clinical data with the congenitally blind. Annee Psychologique, 77(1), 179-204.
McCarty, B. & Worchel, P. (1954). Rate of Motion and Object Perception in the Blind. New Outlook for the Blind, 48, 316-322.
Mendoza, R. (1999). Beyond Chance. Triage Entertainment, December 6.
Mickunas, J. & Sheridan, T.V. (1963). Use of an Obstacle Course in Evaluating Mobility for the Blind. American Foundation for the Blind, Research Bulletin, 3, 35-54.
Middlebrooks, J.C. & Green, D.M. (1991). Sound Localization by Human Listeners. Annual Review of Psychology, 42, 135-159.
Mills, A.W. (1961). Sonic Object Sensing. Proceedings of the Mobility Research Conference. Massachusetts Institute of Technology: Washington, D.C.
Mills, A.W. (1963). Auditory Perceptions of Spatial Relations. In L.L. Clark (Ed.), Proceedings of the International Congress on Technology and Blindness. American Foundation for the Blind: New York, Volume II, 111-139.
Myers, S.O. & Jones, C.G.E.F. (1958). Obstacle Experiments: Second report. The Teacher of the Blind, 46(2), 47-62.
Nicolosi, M. (1994). Sound Technique Aids the Blind. Orange County Register, December.
Norris, M., Spaulding, P.J., & Brodie, F.H. (1957). Blindness in Children. University of Chicago Press: Chicago.
Ono, H., Fay, A., & Tarbell, S.E. (1986). A "Visual" Explanation of Facial Vision. Psychological Research, 48(2), 57-62.
Potuschak, B.M. (1956). Über die Entstehung der Hautempfindungen beim Fernsinn der Blinden. Dissertation, Philosophische Fakultät, Universität Innsbruck.
Rice, C.E. (1967a). The Human Sonar System. In R.G. Busnel (Ed.), Animal Sonar Systems. Biology and Bionics: Jouy-en-Josas, Laboratoire de Physiologie Acoustique, volume II, 719-755.
Rice, C.E. (1967b). Human Echo Perception. Science, 156, 656-664.
Rice, C.E. (1967c). Quantitative Measures of Unaided Echo Detection in the Blind: Auditory echo localization. In R. Dufton (Ed.), Proceedings of the International Conference on Sensory Devices for the Blind. St. Dunstan's: London, England, 89-102.
Rice, C.E. (1969). Perceptual Enhancement in the Early Blind. Psychological Record, 19, 1-14.
Rice, C.E. (1970). Early Blindness, Early Experience, and Perceptual Enhancement. National Federation for the Blind, Research Bulletin, (22), 1-20.
Rice, C.E. & Feinstein, S.H. (1965a). Sonar System of the Blind: Size discrimination. Science, 148, 1107-1108.
Rice, C.E. & Feinstein, S.H. (1965b). The Influence of Target Parameters on a Human Echo Detection Task. Proceedings of the American Psychological Association: Washington, D.C., 65-66.
Rice, C.E., Feinstein, S.H., & Schusterman, R.J. (1965). Echo Detection Ability of the Blind: Size and distance factors. Journal of Experimental Psychology, 70(3), 246-251.
Rieser, J.J. (1990). Development of Perceptual/Motor Control While Walking without Vision: The calibration of perception and action.
Riley, L.H., Luterman, D.M., & Cohen, M.F. (1964). Relationship Between Hearing Ability and Mobility in a Blinded Adult Population. The New Outlook for the Blind, 58, 139-141.
Schenkman, B.N. (1983). Human Echolocation as a Function of Kind of Sound Source and Object Position. Uppsala Psychological Reports, (363).
Schenkman, B.N. (1985a). Human Echolocation with Different Types of Sound and Different Positions of Sound Source. Uppsala Psychological Reports, (373).
Schenkman, B.N. (1985b). Human Echolocation: A review of the literature and a theoretical analysis. Uppsala Psychological Reports, (379).
Schenkman, B.N. & Jansson, G. (1986). The Detection and Localization of Objects by the Blind with the Aid of Long-Cane Tapping Sounds. Human Factors, 28(5), 607-618.
Schwartz, M. (1984). The Role of Sound for Space and Object Perception in the Congenitally Blind Infant. Advances in Infancy Research, 3, 23-59.
Shephard, R.H. & Howell, D.A. (1980). "I Can No Longer Hear the Silence of Lamp Posts". Lancet, 2(8196), 706.
Shingledecker, C.A. (1983). Measuring the Mental Effort of Blind Mobility. Journal of Visual Impairment and Blindness, 77(7), 334-339.
Strelow, E.R. (1985). What is Needed for a Theory of Mobility: Direct perception and cognitive maps - lessons from the blind. Psychological Review, 92(2), 226-248.
Strelow, E.R. & Brabyn, J. (1982). Locomotion of the Blind Controlled by Natural Sound Cues. Perception, 11, 635-640.
Supa, M., Cotzin, M., & Dallenbach, K.M. (1944). "Facial Vision": The perception of obstacles by the blind. American Journal of Psychology, 57(2), 133-183.
Taylor, J.G. (1962). The Behavioral Basis of Perception. Yale University Press: New Haven.
Twersky, V. (circa 1950, to be determined). On the Scattered Reflection of Scalar Waves from Absorbent Surfaces. Reports from the Mathematics Research Group, Washington Square College of New York University.
Twersky, V. (1950). On the Theory of Non-Specular Reflection of Sound. Journal of the Acoustical Society of America, 22, 539-546.
Twersky, V. (1951a). On the Physical Basis of the Perception of Obstacles by the Blind. American Journal of Psychology, 64, 409-416.
Twersky, V. (1951b). Flashsounds and Aural Constructs for the Blind. Physics Today, 4, 10-16.
Twersky, V. (1953). Auxiliary Mechanical Sound Sources for Obstacle Perception by Audition. Journal of the Acoustical Society of America, 25, 156-157.
Warren, D.H. (1989). Issues in Assessment and Intervention with Blind Infants and Children. In M. Brambring, F. Losel, & Skowronek (Eds.), Children at Risk: Assessment, Longitudinal Research, and Intervention. de Gruyter: Berlin, 119-135.
Warren, D.H. (1974). Early vs. Late Vision: The role of early vision in spatial referent systems. New Outlook for the Blind, April, 157-163.
Warren, D.H., Anooshian, L.J., & Bollinger, J.G. (1973). Early vs. Late Blindness: The role of early vision in spatial behavior. American Foundation for the Blind, Research Bulletin, 26, 151-170.
Warren, D.H. & Kocon, J.A. (1974). Factors in the Successful Mobility of the Blind: A review. American Foundation for the Blind, Research Bulletin, (28), 191-218.
Wegel, R.L. & Lane, C.E. (1924 or 1925 - to be determined). The Auditory Masking of One Pure Tone by Another and Its Probable Relation to the Dynamics of the Inner Ear. Physical Review, 23, 266-285.
Welch, J. (1964). A Psycho-acoustic Study of Factors Affecting Human Echolocation. American Foundation for the Blind, Research Bulletin, (4), 1-13.
Wiener, W.R. & Lawson, G.D. (1997). Audition for the Traveler Who Is Visually Impaired. In B.B. Blasch, W.R. Wiener, & R.L. Welsh (Eds.), Foundations of Orientation and Mobility. A.F.B. Press: New York, 104-169.
Wilson, J.P. (1967). Psychoacoustics of Obstacle Detection Using Ambient or Self-generated Noise. In R.G. Busnel (Ed.), Animal Sonar Systems. Biology and Bionics: Jouy-en-Josas, Laboratoire de Physiologie Acoustique, volume I, 89-114.
Witcher, C.M. & Washington, L., Jr. (1954). Echo-location for the Blind. Electronics, 27, 136-137.
Worchel, P. & Berry, J.H. (1952). The Perception of Obstacles by the Deaf. Journal of Experimental Psychology, 43, 187-194.
Worchel, P. & Dallenbach, K.M. (1947). "Facial Vision": Perception of obstacles by the deaf-blind. American Journal of Psychology, 60, 502-553.
Worchel, P. & Mauney, J. (1950). The Effect of Practice on the Perception of Obstacles by the Blind. Journal of Experimental Psychology, 41, 170-176.
Worchel, P., Mauney, J., & Andrew, J. (1950). The Perception of Obstacles by the Blind. Journal of Experimental Psychology, 40, 746-751.