This interactive audio-visual performance takes "bugs" as its theme: observing moths drawn to a flame, it asks the audience to reflect on the psychology of blind obedience. Cell phones are the main medium of the interaction. With the phone as the bug's body and the human as its soul, audience and performers tug at each other for control and fall together into a new world of chaos. The interplay designed between the phones and the projection leads the audience's consciousness from the virtual to the real, and finally back to their own world view.
At the entrance, audience members connect to the local intranet and scan a QR code to open the interactive website.
After entering the website, each phone is assigned its own bug ID and plays a different bug sound. As the audience gradually files in, the built-in speakers of the many phones layer into a forest soundscape that fills the performance site. The music rises to a small peak on the chirping of the phones' insects and leads straight into the performance. In this part, the audience watches a 360-degree film of the forest through their phones, like real bugs entering the forest.
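A minimal sketch of this per-phone setup follows, assuming a hypothetical /register endpoint that hands out bug IDs and a small set of insect loops served under /sounds/; neither is the project's actual backend.

```javascript
// Sketch of the per-phone setup; the /register endpoint and the
// /sounds/ file layout are assumptions for illustration.
async function joinSwarm() {
  // Ask the (assumed) server for this phone's bug identity.
  const { bugId } = await (await fetch("/register")).json();

  // Each ID maps to a different insect recording, so many phones
  // playing at once layer into the forest soundscape.
  const ctx = new AudioContext();
  const file = await fetch(`/sounds/bug-${bugId % 8}.mp3`);
  const buffer = await ctx.decodeAudioData(await file.arrayBuffer());

  const src = ctx.createBufferSource();
  src.buffer = buffer;
  src.loop = true; // keep chirping until the next scene
  src.connect(ctx.destination);
  src.start();
}

// Browsers only allow audio to start after a user gesture.
document.addEventListener("click", joinSwarm, { once: true });
```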
The bugs then lead the audience out of the 360 videos and into the real, flat world of the projection. At this point, the phone prompts each viewer to upload a photo. The images are taken through the camera, like the eyes of bugs, capturing each viewer's own perspective and consciousness. As more and more images overlap, performers and audience build a self-contained world view together. This database of consciousness keeps expanding until it gradually decomposes into scattered pixels.
The consciousness, now reduced to particles, contracts and churns, becoming a membranous substance that envelops the human form.
The viewers gradually take control of every interface, virtual and real, becoming the medium through which the collective consciousness is steered; in projecting and controlling the forms of the masses, that control is released, revealing the pathology of the bugs. Every 10 seconds, one phone gains the ability to rotate the viewing angle of the projection through its gyroscope. The selected person is prompted by a vibration and flash, while every other phone's screen shows the xyz coordinates of the current view. Finally, all sounds and images return to the forest soundscape that opened the performance.
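One way to realize the ten-second turn-taking is a small relay server; the sketch below assumes a Node.js process using the `ws` package, and the message names ("selected", "show-coords") are illustrative rather than the production protocol.

```javascript
// Sketch of the ten-second selection loop, assuming a Node.js relay
// built on the `ws` package; message shapes are illustrative.
import { WebSocketServer } from "ws";

const wss = new WebSocketServer({ port: 8080 });
const phones = new Set();

wss.on("connection", (socket) => {
  phones.add(socket);
  socket.on("close", () => phones.delete(socket));
});

setInterval(() => {
  if (phones.size === 0) return;
  const list = [...phones];
  const chosen = list[Math.floor(Math.random() * list.length)];

  for (const phone of list) {
    if (phone === chosen) {
      // The chosen phone vibrates/flashes and starts steering the
      // projection with its gyroscope.
      phone.send(JSON.stringify({ type: "selected" }));
    } else {
      // All other phones display the current xyz view coordinates.
      phone.send(JSON.stringify({ type: "show-coords" }));
    }
  }
}, 10000);
```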
1. This work is characterized by two-way feedback between the audience and the performers.
2. Using the A-Frame framework, a web page lets the audience watch 360 videos; performers can switch between videos by command (see the first sketch after this list).
3. The audience can upload images to the server; images selected by the performers appear on the live projection in real time (second sketch below).
4. Using the Web Vibration API, performers can trigger the audience's phone vibration by command (currently Android only; third sketch below).
5. Using the motion-related Web APIs (DeviceOrientation/DeviceMotion), the audience can send tri-axis sensor values that influence the live projection content in real time (final sketch below).
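For point 2, the 360 page might look roughly like the following; the video file names, the WebSocket address, and the "switch" command are assumptions, not the project's actual assets or protocol.

```html
<!-- Sketch of the 360-video page; assets and protocol are assumed. -->
<script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
<a-scene>
  <a-assets>
    <video id="forest" src="/videos/forest.mp4" loop muted playsinline></video>
    <video id="chaos" src="/videos/chaos.mp4" loop muted playsinline></video>
  </a-assets>
  <a-videosphere src="#forest"></a-videosphere>
</a-scene>
<script>
  // A performer command arriving over a WebSocket swaps the sphere's
  // video source, switching every phone's view at once.
  const sphere = document.querySelector("a-videosphere");
  const ws = new WebSocket("ws://performance.local:8080");
  ws.onmessage = (event) => {
    const msg = JSON.parse(event.data);
    if (msg.type === "switch") {
      document.querySelector("#" + msg.video).play(); // e.g. "chaos"
      sphere.setAttribute("src", "#" + msg.video);
    }
  };
</script>
```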
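For point 3, the upload itself can be as small as a FormData POST; the /upload route is hypothetical, and the performer-side tool that pulls selected images onto the projection is not shown.

```javascript
// Sketch of the photo upload for point 3; the /upload route is assumed.
const input = document.querySelector('input[type="file"]');
input.addEventListener("change", async () => {
  const photo = input.files[0];
  if (!photo) return;

  const form = new FormData();
  form.append("photo", photo);

  // The server pools the uploads for the performers to choose from.
  await fetch("/upload", { method: "POST", body: form });
});
```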
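Point 4 leans on navigator.vibrate(), the standard Web Vibration API; iOS browsers ignore it, which matches the Android-only caveat above. The command transport is again an assumed WebSocket.

```javascript
// Sketch of the remote vibration cue for point 4; the WebSocket address
// and the "vibrate" message are assumptions.
const ws = new WebSocket("ws://performance.local:8080");
ws.onmessage = (event) => {
  const msg = JSON.parse(event.data);
  if (msg.type === "vibrate" && "vibrate" in navigator) {
    // Buzz 200 ms, pause 100 ms, buzz 200 ms.
    navigator.vibrate([200, 100, 200]);
  }
};
```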
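For point 5, a deviceorientation listener is one way to stream tri-axis values to the relay; throttling and the iOS permission prompt (DeviceOrientationEvent.requestPermission) are omitted for brevity.

```javascript
// Sketch of streaming tri-axis readings for point 5, over the same
// assumed WebSocket relay. alpha/beta/gamma are the phone's rotations
// about its z, x, and y axes.
const ws = new WebSocket("ws://performance.local:8080");
window.addEventListener("deviceorientation", (event) => {
  if (ws.readyState !== WebSocket.OPEN) return;
  ws.send(JSON.stringify({
    type: "orientation",
    alpha: event.alpha,
    beta: event.beta,
    gamma: event.gamma,
  }));
});
```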