Creation Team: Sin-Yu Deng, Jr-Ling Chen, Yu-Chin Chen
Instructor: Chih-Yung Aaron CHIU
University graduation project for the graduation exhibition 破ㄩ.
Performed at 濕地|Venue in Taipei.

Core Concept

This interactive audio-visual performance takes “bugs” as its theme, observing moths drawn to flames and asking the public about the psychology of blind obedience. The cell phone is the main medium of the interaction: with the phone as the bug’s body and the human as the bug’s soul, the audience and the performers tug at each other for control and sink deep into a new world of chaos. The content designed across cell phone and projection leads the audience’s consciousness from the virtual into the real, and then into a worldview of their own.

Insects, creatures governed by tropism, act on latent instinct and respond passively to environmental stimuli, resulting in blind, controlled behavior. Humans, long invisible within social groups, are used to measuring their worth against the crowd, pursuing social attention and a sense of belonging. Consciousness and body cannot be separated, and you, too, have intervened in the alternation of a new generation of species…

Teaser

Documentary

Performance Commentary

Entrance

At the entrance, the audience connects to the local intranet and scans a QR code to enter our interactive website.

Section 1 : #Soundscape #360video

After entering the website, each phone is given its own bug ID and plays a different bug sound. As the audience gradually enters, the performance site fills with a forest soundscape built from the built-in speakers of many cell phones. The music rises to a small peak with the sound of insects chirping on the phones and leads straight into the performance. In this part, the audience can watch a 360-degree film of the forest through the phone, like a real bug entering the forest.

Pre-performance soundscape

Web 360 immersive video
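
As a rough illustration of how Section 1 could work technically, the sketch below assumes a WebSocket server on the venue intranet that assigns each phone a bug ID, which the page maps to an insect recording. The endpoint, message shape, and file paths are all hypothetical, not the production code of the piece.

```ts
// Sketch only: assumed WebSocket endpoint and message shape.
type AssignMessage = { type: "assign"; bugId: number };

const socket = new WebSocket("ws://192.168.0.10:8080"); // hypothetical intranet host

socket.onmessage = (event: MessageEvent) => {
  const msg = JSON.parse(event.data) as AssignMessage;
  if (msg.type === "assign") {
    // Each bug ID maps to a different insect recording, so many phones
    // together form the forest soundscape.
    const audio = new Audio(`/sounds/bug-${msg.bugId}.mp3`); // hypothetical path
    audio.loop = true;
    // Mobile browsers require a prior user gesture before audio playback.
    audio.play().catch(() => console.warn("tap the screen to enable audio"));
  }
};
```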


Section 2 : #Worldview Reconstruction #Upload_Image

The bugs lead the audience out of the 360 videos and into the real (projection), flat world. At this point, the phone prompts the viewer to upload a photo. The images are taken through the camera, like the eyes of bugs, and are captured from the viewer’s own perspective and consciousness. As more and more images overlap, the performers and the audience build a self-contained worldview together. Afterwards, this database of consciousness expands and gradually decomposes into scattered pixels.
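
The upload step could be implemented with a file input and a multipart POST. The sketch below is a minimal illustration under assumed names: the `/upload` endpoint, the `image` and `bugId` fields, and the `#photo-input` element are all hypothetical.

```ts
// Sketch only: assumed element IDs, field names, and endpoint.
// Markup: <input id="photo-input" type="file" accept="image/*" capture="environment">
const input = document.querySelector<HTMLInputElement>("#photo-input")!;
const myBugId = 0; // placeholder for the ID assigned in Section 1

input.addEventListener("change", async () => {
  const file = input.files?.[0];
  if (!file) return;

  const form = new FormData();
  form.append("image", file); // the camera shot, the "bug's eye" view
  form.append("bugId", String(myBugId));

  // The server collects uploads; performers choose which appear on screen.
  await fetch("/upload", { method: "POST", body: form });
});
```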

Section 3

The particleized consciousness contracts and reshapes itself, becoming a membranous substance that envelops the human form.

Section 4 : #Gyroscope

The viewer gradually takes control of all the virtual and real interfaces, becoming the medium through which the consciousness of the whole group is controlled; in projecting and steering the forms of the masses, the pathology of the bugs is revealed and released. Every 10 seconds, one phone can rotate the viewing angle of the projection screen through its gyroscope. The selected person is prompted by vibration and a flash on their phone, while everyone else’s screen shows the x/y/z orientation data of the current view. Finally, all sounds and images return to the forest soundscape that opened the performance.
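
A plausible sketch of the gyroscope hand-off: the selected phone listens for `deviceorientation` events and streams the three orientation angles to the server, which rotates the projection view. The WebSocket address and message format are assumptions; the iOS permission step is real browser behavior.

```ts
// Sketch only: assumed server address and message format.
const socket = new WebSocket("ws://192.168.0.10:8080"); // hypothetical intranet host

async function startStreaming(): Promise<void> {
  // iOS 13+ gates orientation events behind an explicit permission prompt,
  // which must be triggered from a user gesture.
  const ctor = DeviceOrientationEvent as any;
  if (typeof ctor.requestPermission === "function") {
    const state = await ctor.requestPermission();
    if (state !== "granted") return;
  }

  window.addEventListener("deviceorientation", (e: DeviceOrientationEvent) => {
    // alpha/beta/gamma are the three rotation angles; these are also the
    // x/y/z values shown on the non-selected phones' screens.
    socket.send(
      JSON.stringify({ type: "orientation", a: e.alpha, b: e.beta, g: e.gamma })
    );
  });
}
```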

Technical Flow

1. This work is characterized by two-way feedback between the audience and the performers.

2. Using the A-Frame framework, a web page lets the audience view 360 videos; performers can switch between videos with commands (see the videosphere sketch after this list).

3. The audience can upload images to the server; performers select among them, and the chosen images appear on the live projection screen in real time (see the upload sketch in Section 2).

4. Using the Web Vibration API, performers can trigger vibration on the audience's cell phones with commands (currently supported only on Android; see the vibration sketch after this list).

5. Using the device motion and orientation Web APIs, the audience can send three-axis sensor values that influence the live projection content in real time (see the gyroscope sketch in Section 4).
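
For step 2, an A-Frame page typically renders a 360 video with `<a-videosphere>` bound to a `<video>` asset; the sketch below assumes a performer command arrives over a WebSocket and swaps the video source. The scene markup (shown in the comment) and the `switchVideo` message are illustrative assumptions, not the project's actual code.

```ts
// Sketch only: assumed scene markup and command format.
//
// <a-scene>
//   <a-assets>
//     <video id="video360" src="/videos/forest.mp4" loop crossorigin="anonymous"></video>
//   </a-assets>
//   <a-videosphere src="#video360"></a-videosphere>
// </a-scene>

const socket = new WebSocket("ws://192.168.0.10:8080"); // hypothetical intranet host
const video = document.querySelector<HTMLVideoElement>("#video360")!;

socket.onmessage = (event: MessageEvent) => {
  const msg = JSON.parse(event.data);
  if (msg.type === "switchVideo") {
    video.src = msg.url; // performer command carries the URL of the next 360 clip
    video.play();
  }
};
```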
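For step 4, the Web Vibration API exposes `navigator.vibrate()`, which Android browsers support and iOS Safari does not, consistent with the note above. The sketch below assumes the same hypothetical WebSocket command channel.

```ts
// Sketch only: assumed command channel; navigator.vibrate() is a real API.
const socket = new WebSocket("ws://192.168.0.10:8080"); // hypothetical intranet host

socket.onmessage = (event: MessageEvent) => {
  const msg = JSON.parse(event.data);
  if (msg.type === "vibrate" && "vibrate" in navigator) {
    // Pattern: vibrate 200 ms, pause 100 ms, vibrate 200 ms.
    navigator.vibrate([200, 100, 200]);
  }
};
```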

Final Audio Visual Version