GravField

2024

By Botao Amber Hu, Yuemin Huang, Mingze Chai, Yilan Tao, Xiaobo Aaron Hu

Overview

We present "Gravitational Field" (GravField), an experimental and participatory live performance set in a co-located mixed reality environment. Participants wearing AR headsets are invited to join and collectively create real-time music through their collaborative body movements, guided by settings programmed by live-coders. This experiment harnesses intercorporeal signals derived from participants' body movements, including factors like distance, area formation, relative height differences, and head motion synchronization. Live-coders dynamically map these signals into metaphorical audiovisual patterns in augmented reality, providing participants with cues about the relationships between players. The live-coded patterns shape and influence participants' improvisation of body movements, while adjustments in live coding dynamically alter how these movements generate sound, influencing the overall improvisational experience. In addition, spectators outside the live performance area can enjoy the mixed reality performance through projection screens and TVs, enhancing audience engagement. GravField aims to explore the vast potential of intercorporeal signals, creating a communicative, playful, and co-creative space where players' interconnected bodies become the instruments of expression.
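To make the signal pipeline concrete, the sketch below illustrates one possible way the intercorporeal signals named above (pairwise distance, formation area, relative height difference, head motion synchronization) could be derived from tracked head poses. This is not the project's actual code; every name in it (HeadPose, intercorporeal_signals, and the individual signal functions) is hypothetical, and the actual system runs on HoloKit headsets with Unity3D.

```python
# Minimal illustrative sketch (assumed names, not GravField's implementation):
# deriving intercorporeal signals from participants' tracked head poses.
from dataclasses import dataclass
from itertools import combinations
import math


@dataclass
class HeadPose:
    x: float      # horizontal position
    y: float      # head height above the floor
    z: float      # horizontal position
    vx: float     # head velocity components
    vy: float
    vz: float


def pairwise_distances(poses):
    """Horizontal distance between every pair of participants."""
    return [math.hypot(a.x - b.x, a.z - b.z) for a, b in combinations(poses, 2)]


def formation_area(poses):
    """Shoelace area of the polygon the participants form on the floor plane.
    Treats the given participant order as the polygon's vertex order; a convex
    hull would be more robust for arbitrary orderings."""
    pts = [(p.x, p.z) for p in poses]
    n = len(pts)
    if n < 3:
        return 0.0
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0


def height_spread(poses):
    """Relative height difference: highest head minus lowest head."""
    heights = [p.y for p in poses]
    return max(heights) - min(heights)


def motion_synchrony(poses):
    """Crude head-motion synchronization: mean pairwise cosine similarity of
    head velocity vectors (1 = moving together, -1 = moving in opposition)."""
    def cosine(a, b):
        dot = a.vx * b.vx + a.vy * b.vy + a.vz * b.vz
        na = math.sqrt(a.vx ** 2 + a.vy ** 2 + a.vz ** 2)
        nb = math.sqrt(b.vx ** 2 + b.vy ** 2 + b.vz ** 2)
        return dot / (na * nb) if na > 0 and nb > 0 else 0.0

    pairs = list(combinations(poses, 2))
    return sum(cosine(a, b) for a, b in pairs) / len(pairs) if pairs else 0.0


def intercorporeal_signals(poses):
    """Bundle the signals a live-coder might map onto audiovisual patterns."""
    return {
        "distances": pairwise_distances(poses),
        "area": formation_area(poses),
        "height_spread": height_spread(poses),
        "synchrony": motion_synchrony(poses),
    }
```

In the performance, values like these would be streamed to the live-coders, who decide on the fly how each signal modulates sound and the augmented-reality visuals shared by all headsets.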

Publications

Botao Amber Hu*, Yuemin Huang, Mingze Chai, Xiaobo Aaron Hu, Yilan Elan Tao, and Rem RunGu Lin. 2024. "GravField: Live-Coding Bodies through Mixed Reality". In Adjunct Proceedings of SIGGRAPH Asia 2024 (SA '24). XR.

Botao Amber Hu, Yuemin Huang, Mingze Chai, Xiaobo Aaron Hu, and Yilan Elan Tao. 2024. "GravField: Towards Designing an Inter-bodily Live-Coding Performance System within Collocated Mixed Reality Field". In Companion Proceedings of the Annual Symposium on Computer-Human Interaction in Play (CHI PLAY Companion '24). Work-in-Progress.

Botao Hu*, Yuemin Huang, Mingze Chai, Yilan Tao, and Xiaobo Hu. 2024. "GravField: A Participatory Performance Exploring Intercorporeality as Live-Coding Instruments within a Co-located Mixed Reality". In Proceedings of the 2024 International Conference on Live Coding (ICLC '24). Live Performance.

Exhibitions

SIGGRAPH Asia 2024 XR. "GravField". Tokyo, Japan.

The International Conference on Live Coding 2024 Live Performance. "GravField". Shanghai, China.

Citation

@inproceedings{10.1145/3665463.3678807,
  author    = {Hu, Botao Amber and Huang, Yuemin and Chai, Mingze and Hu, Xiaobo Aaron and Tao, Yilan Elan},
  title     = {GravField: Towards Designing an Inter-bodily Live-Coding Performance System within Collocated Mixed Reality Field},
  year      = {2024},
  isbn      = {9798400706929},
  publisher = {Association for Computing Machinery},
  address   = {New York, NY, USA},
  url       = {https://doi.org/10.1145/3665463.3678807},
  doi       = {10.1145/3665463.3678807},
  abstract  = {In this Work-in-Progress (WiP), we present GravField (short for "Gravitational Field"), an experimental and participatory inter-bodily live-coding performance system in a collocated Mixed Reality (MR) environment. Participants wearing MR headsets are invited to join and collectively create real-time music through their collaborative body movements, guided by the programmed settings from live-coders. This innovative experiment harnesses intercorporeal signals derived from participants’ body movements, including factors like distance, area formation, relative height differences, and head motion synchronization. Live-coders dynamically map these signals into metaphorical audiovisual patterns in MR, providing participants with cues about the relationships between players. The live-coded patterns shape and influence participants’ improvisation of body movements, while adjustments in live coding dynamically alter how these movements generate sound, influencing the overall improvisational experience. Additionally, spectators outside the live performance area can enjoy the mixed reality performance through projection screens and TVs, enhancing audience engagement. This WiP aims to explore the vast potential of intercorporeal signals, creating a communicative, playful, and co-creative space where players’ interconnected bodies become the instruments of expression.},
  booktitle = {Companion Proceedings of the 2024 Annual Symposium on Computer-Human Interaction in Play},
  pages     = {124–130},
  numpages  = {7},
  keywords  = {Augmented Reality, Collocated Multiplayer Mixed Reality, Dance Improvisation, Embodied Interaction, Intercorporeality, Live Coding},
  location  = {Tampere, Finland},
  series    = {CHI PLAY Companion '24}
}

Metadata

Media: Headworn AR

Technologies: HoloKit Headset, Unity3D, Multipeer Connectivity

Research Topics: Collocated Mixed Reality, Intercorporeality

Credits

Concept Designer: Botao Amber Hu

Concept Designer: Yuemin Huang

Interaction Designer: Botao Amber Hu

Sound Designer: Mingze Chai

Technical Artist: Xiaobo Aaron Hu

Project Assistant: Yilan Tao