by Botao Hu, Jiabao Li, Danlin Huang, Jianan Johanna Liu, Xiaobo Aaron Hu


EchoVision is an immersive art installation that allows participants to experience the world of bats using sound visualization and mixed reality technology. With a custom-designed, bat-shaped headset from the open-source HoloKit project, users can simulate echolocation, the natural navigation system bats use in the dark. They do this by using their voices and interpreting the returned echoes with the mixed-reality visualization.

The exhibit adjusts visual feedback based on the pitch and tone of the user's voice, offering a dynamic and interactive depiction of how bats perceive their environment. This installation combines scientific learning with empathetic engagement, encouraging an ecocentric design perspective and understanding between species. EchoVision educates and inspires a deeper appreciation for the unique ways non-human creatures interact with their ecosystems.

The EchoVision mixed-reality experience

In the EchoVision project, the visualization of sound waves is meticulously designed to mimic bat echolocation, illustrating the process of sound propagation in a mixed reality (MR) environment. Utilizing the open-source HoloKit MR SDK and hardware, this experience allows participants to engage with the world through the sensory perspective of bats.

The visual effects are primarily influenced by three key attributes: the volume of the sound, its pitch, and the perception of the environment. These attributes have been translated into the visual domain to create a dynamic and immersive experience:

  • Propagation and Brightness: The visualized sound waves propagate forward over time, and their behavior changes based on volume. Louder sounds result in a wider propagation angle and increased brightness of the wave lines. This simulates the expansive nature of intense sound waves in a visual format.
  • Pitch and Line Thickness: Higher-pitched sounds travel farther and are represented by finer wave lines. This correlation helps in demonstrating the relationship between pitch and the reach of sound waves.
  • Particle Effects and Color Gradients: Particle effects have been incorporated to compensate for the challenge of depicting sound wave reflections. When sound waves collide with objects, particles burst in the direction of the reflected wave, providing an intuitive visual cue of the wave's progression. The color of these particles transitions from red to blue over time, signifying a shift from bright to dark. Particles on human bodies are uniquely colored in gold to distinguish them from particles on other surfaces.
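The mapping above can be sketched in a few lines. This is an illustrative simplification, not the project's actual Unity implementation; the function name, parameter ranges, and the 80–1000 Hz vocal range used for normalization are our assumptions.

```python
import math

def wave_params(volume: float, pitch_hz: float) -> dict:
    """Map a voice sample's volume (0..1) and pitch (Hz) to visual parameters."""
    volume = max(0.0, min(1.0, volume))
    # Louder sounds -> wider propagation angle and brighter wave lines.
    spread_deg = 30.0 + 60.0 * volume            # 30 deg .. 90 deg
    brightness = 0.2 + 0.8 * volume              # 0.2 .. 1.0
    # Higher pitch -> longer reach and finer lines.
    # Normalize pitch over an assumed vocal range (~80..1000 Hz, log scale).
    p = (math.log(pitch_hz) - math.log(80.0)) / (math.log(1000.0) - math.log(80.0))
    p = max(0.0, min(1.0, p))
    reach_m = 2.0 + 8.0 * p                      # 2 m .. 10 m
    line_width = 0.05 * (1.0 - 0.8 * p)          # finer with higher pitch
    return {"spread_deg": spread_deg, "brightness": brightness,
            "reach_m": reach_m, "line_width": line_width}
```

A shout thus yields a wide, bright burst, while a high whistle sends a thin wave far into the scene.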

In designing these visual parameters, we aimed to adhere to physical intuition while also incorporating artistic enhancements for visual appeal. Based on user feedback, we designed the golden color and an additional particle effect when sound waves interact with human bodies. This feature not only outlines the human form but also encourages users to engage with their surroundings by shouting towards nearby individuals, thereby enriching the interactive and experiential quality of the installation.

By merging these elements, EchoVision provides a scientifically grounded yet artistically enriched visual representation of echolocation, fostering a deeper understanding and appreciation of the echolocation phenomenon.

"What Is it Like to Be a Bat?"

In "What Is it Like to Be a Bat?" Thomas Nagel explores the unique, subjective experience of consciousness, using bats as a primary example due to their use of echolocation, a sensory perception vastly different from human experiences. He argues that understanding what it is like to be another creature is inherently limited by our inability to fully grasp their subjective experiences. This concept directly relates to EchoVision that allows participants to simulate the echolocation of bats using sound visualization and mixed reality technology. By adjusting visual feedback based on the user's voice, EchoVision bridges the gap between human and bat perception, offering a deeper, empathetic understanding of non-human sensory worlds and emphasizing the profound differences in how various species interact with their environments.

Misconceptions about bats

People often harbor misconceptions about bats, viewing them as dangerous, dirty, or associated with negative superstitions. Common myths include the belief that bats are blind, that they commonly get tangled in human hair, or that they are primarily carriers of diseases (especially during the COVID-19 pandemic). These misconceptions contribute to a general fear and misunderstanding of bats, overshadowing their ecological importance and unique biological features.

Bats are ecologically vital, providing essential benefits such as pest control by consuming large quantities of insects, pollination of numerous plants, and seed dispersal that aids in forest regeneration. Their guano enriches soil, contributing to nutrient cycling. These roles help maintain ecosystem balance and biodiversity.

EchoVision aims to challenge and transform these misconceptions by providing participants with an immersive, empathetic experience of the world through a bat's perspective. By simulating echolocation using sound visualization and mixed reality technology, EchoVision allows users to understand how bats navigate and perceive their environment with remarkable precision. This interactive experience highlights the sophistication and elegance of bat echolocation, fostering a greater appreciation for their role in ecosystems, such as pest control and pollination.

Through EchoVision, participants can see bats not as menacing creatures but as fascinating and integral parts of the natural world. The installation encourages scientific learning and empathetic engagement, promoting an ecocentric design perspective that values the diverse ways non-human creatures interact with their surroundings. By offering a new perspective, EchoVision helps dispel myths, reduces unfounded fears, and inspires a deeper connection to and respect for bats and their vital ecological contributions.


Bat echolocation is a sophisticated biological process that allows bats to navigate and hunt in the dark by emitting high-frequency sound waves and listening to the echoes that bounce back from objects in their environment. When a bat emits a sound, it travels through the air and reflects off surfaces, returning to the bat as an echo. By analyzing the time delay and changes in the frequency of the returning echoes, bats can determine the distance, size, shape, and texture of objects, as well as the speed and direction of moving prey.
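The core distance computation described above follows directly from the physics of sound in air. A minimal sketch (a simplification: real bats also weigh echo intensity, spectral changes, and Doppler shift, and the speed of sound varies with temperature):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def target_distance(echo_delay_s: float) -> float:
    """Convert a round-trip echo delay into a one-way distance.

    The emitted call travels to the object and back, so the one-way
    distance is half the total path covered during the delay.
    """
    return SPEED_OF_SOUND * echo_delay_s / 2.0

# A 10 ms delay corresponds to a target about 1.7 m away.
print(target_distance(0.010))  # 1.715
```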

In "An Immense World," Ed Yong describes how bats' echolocation abilities are finely tuned and highly specialized. He explains that bats can emit sounds at frequencies higher than the human ear can detect, often using calls that range from 20 to 200 kHz. The bat's brain processes these echoes with remarkable speed and accuracy, allowing it to create a detailed auditory map of its surroundings. This process is so efficient that bats can detect tiny insects in complete darkness and avoid obstacles with precision. Yong also highlights the diversity among bat species in their echolocation strategies, noting that different bats have evolved various adaptations to suit their ecological niches. Some bats have highly directional calls for pinpointing prey, while others use more omnidirectional calls to navigate cluttered environments like dense forests. These adaptations demonstrate the incredible versatility and effectiveness of echolocation as a sensory tool for survival.

In addition to bats, several other animals use echolocation to navigate and hunt, including dolphins, porpoises, whales (such as sperm and beluga whales), certain species of shrews, and swiftlets. One notable human who uses echolocation is Daniel Kish. Blind since infancy, Kish developed the ability to navigate his surroundings by making clicking sounds with his tongue and interpreting the echoes that bounce back from objects around him. This technique, known as human echolocation, allows him to "see" the world through sound, enabling him to walk, ride a bike, and engage in various activities independently. Kish has also founded the organization World Access for the Blind to teach echolocation to other visually impaired individuals.

LiDAR sensors vs. echolocation

The relationship between LiDAR sensors and bat echolocation lies in a shared fundamental principle: both detect and map their surroundings through the reflection of emitted signals.

Both LiDAR sensors and bats emit a signal that interacts with objects in the environment and then capture the reflected signal to gather information: LiDAR emits laser pulses, while bats emit ultrasonic sound waves. In each case, the time it takes for the signal to return provides data about the distance and shape of objects. Both use the returned signals to build a map of the surroundings. LiDAR computes detailed 3D models from the measured sensor-to-object distances, while bats interpret the time delay and intensity of the echoes to construct a mental map that aids navigation and hunting in the dark. Both methods achieve high accuracy in detecting objects and their positions, with LiDAR producing high-resolution 3D models and bats detecting tiny insects and avoiding obstacles in complete darkness.
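The shared time-of-flight principle, and the vastly different timescales involved, can be illustrated with a small sketch (the numbers and function name are ours, not drawn from any LiDAR SDK):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s: LiDAR laser pulse
SPEED_OF_SOUND = 343.0          # m/s: bat ultrasonic call in air

def round_trip_delay(distance_m: float, wave_speed: float) -> float:
    """Time for a pulse to reach an object and reflect back to the emitter."""
    return 2.0 * distance_m / wave_speed

# Same 5 m target, wildly different timescales:
lidar_delay = round_trip_delay(5.0, SPEED_OF_LIGHT)  # ~33 nanoseconds
bat_delay = round_trip_delay(5.0, SPEED_OF_SOUND)    # ~29 milliseconds
```

The million-fold difference in timescale is why LiDAR needs nanosecond-precision electronics, while a bat's auditory system can resolve its echoes neurally.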

EchoVision leverages this similarity by combining sound visualization with a LiDAR sensor's reconstruction of the environment to simulate the echolocation experience, drawing a parallel between human-made technology and natural biological processes.

The Contemporary Austin - Host: Fusebox Festival

We premiered EchoVision at The Contemporary Austin as part of the Fusebox Festival. The Congress Avenue Bridge in Austin, Texas, famously hosts the largest urban bat colony in the world, with roughly 1.5 million Mexican free-tailed bats residing beneath it every summer. Our performance started under the bridge with our collaborator, bat conservationist Merlin Tuttle, who saved those 1.5 million bats from extermination. About 500 people came. They listened to Merlin's captivating stories about bats, watched the vast colony stream out from under the bridge after sunset, and experienced EchoVision themselves, gaining a rare glimpse into the bat's perceptual universe.

Then, the adventure progressed with a walk straight down Congress Avenue to The Contemporary Austin Museum, where Jiabao Li and Matt McCorkle performed a live bat-inspired soundscape event called "Nocturnal Fugue." Nocturnal Fugue uses bat vocalizations, from social calls to mating rituals, to artistically create a mesmerizing sonic experience. The soundscape is combined with visual projections that represent the environment in which each bat call is made, or the environment it responds to. On the rooftop, we served tequila: bats are agave pollinators, a key part of the process that turns agave flowers into tequila.

Photo credit: Merlin Tuttle's Bat Conservation


Media: Headworn AR

Technologies: HoloKit SDK, Unity3D

Research Topics: Democratizing Access to Mixed Reality


Director: Botao Amber Hu

Director: Jiabao Li

Interaction Engineer: Botao Amber Hu

Interaction Designer: Jiabao Li

Technical Artist: Xiaobo Aaron Hu

Product Designer: Jianan Johanna Liu

Product Designer: Danlin Huang