SoundSculpt

Perception Enhancement through Playful AR MIDI

SoundSculpt is a generative MIDI tool for perception-enhanced performance that transforms the practice of sculpting into audio AR technology.

The tradition of sculpting dates back to ancient Greece and ancient China, and over its long evolution, many tools and methodologies have been developed. From hand shaping to digital sculpting with 3D software, sculpting has become easier and more convenient, but the sensations of volume, movement, and massing tend to be lost. To regain that rich, pleasant experience, the AR MIDI tool SoundSculpt gives the user a chance to sculpt in a novel medium: shaping a virtual bubble multi-sensorially from the inside out while giving a MIDI performance. In this experience, the user performs sculpting movements spatially, with hearing and touch, while generating live music and visual art. SoundSculpt, which can be deployed as an AR app for individual use or as web-based software for larger visual projection, is developed using the Bose AR Frames.
KEYWORDS
Audio AR; Cognitive Enhancement; User Interface; Interaction; MIDI; Experience Design; Digital Media; Responsive Environment; Game; Product Design; Visual Art; Performing Art; User Experience; Sculpt; Sound Research; Game Design
Yiqi Zhao
Note: this research was supported by BOSE and the MIT Game Lab.


1   BACKGROUND

This research opens a dialogue among humans, design, and tools. The product targets a digital-age audience: designers and music and art lovers who are under high pressure or daily anxiety and would appreciate a light generative interface for their resting time. The research proposes an XR MIDI tool that can be used not only in large-stage live music performance but also in individuals' daily lives. The broader goal of the research is to provide convenient, playful digital products for emotion regulation and anxiety relief. The theoretical approach rests on two parts: synesthesia simulation and emotion-regulation strategy.

1.1 Synesthesia Simulation

Synesthesia refers to the production of a sense impression relating to one sense or part of the body by stimulation of another sense or part of the body. Multiple studies have found a weaker, unconscious degree of synesthesia to be widespread among non-synesthetes in music perception [1]. Vision, hearing, and the other senses are strongly connected. Many musicians also claim that they can hear the color of a melody and describe its details as thick, dark, sweet, or white. Based on the theory of synesthesia, sound cues can be intimately effective at enhancing the act of touching and the perception of motion, volume, and acceleration.

1.2 Immersive Experience for Positive Reappraisal

Games with immediate outputs and visual rewards are considered to have significant psychological impacts on the user. During gameplay, the user often engages various types of cognitive emotion regulation, such as reappraisal and acceptance. The experience also builds and fulfills self-control and self-confidence. A highly multi-sensory game can therefore be effective for emotion regulation toward specific mental-health and well-being goals. The main purpose of this research is to examine new possibilities for perception enhancement within digital-platform design, specifically how we can enhance the experience of performing by transforming tangible notions into an intangible medium [2].

DIGITIZING SCULPTING ASPECTS INTO AN AUDIO INTERFACE

2   CONCEPT DEVELOPMENT

The initial concept of digitizing sculpting into an audio interface comes from breaking the practice down into four aspects: cross-cultural appreciation, the creation of individual imagination, materiality as medium and methodology, and focus as a mental aspect. The concept engages multiple senses in a closed-loop interface, preserving the user's attachment to art practice while transforming the materiality and medium of the creation process. During play, the loop cycles constantly among "listen", "sculpt", and "art-generation feedback".

‍

3 DESIGN IMPLEMENTATION | IDEATION

Overview

In this game, the performer sculpts a virtual bubble from the inside out according to sound cues. At the initial stage, the performer is placed inside a small translucent bubble, which is visible in the app, although the performer cannot see the visuals during the performance. Once the game starts, the performer can push out the bubble surface by pressing the sculpt button in the app and tilting the phone; the bubble then displays a manipulated surface with pattern gradients that follow the live soundtrack playing on the Bose Frames. As the performer navigates the space and manipulates the bubble, live music is generated from the action.

3.1 Experience Ideation

Action Design:

The surface-manipulation modeling (Table 3) was studied in Rhino with Grasshopper and C#. Several forms of visual output were generated using mesh, vertex, and move commands. These studies were deployed as simple mouse-click interactions. In Table 3, each interaction's visuals are captured at 20 ms/frame. A minimal sketch of the underlying vertex-move logic follows the list below.

A. Slide over
B. Push slowly
C. Poke punch
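
As a rough illustration of the vertex-move logic behind these studies (the original Grasshopper definitions are not reproduced here), a minimal Unity C# sketch of a "push" deformation might displace mesh vertices within a falloff radius of a contact point. The component name and the radius/strength values are hypothetical:

    using UnityEngine;

    // Sketch of a "push" deformation: displace mesh vertices near a contact
    // point, with smooth falloff toward the edge of the affected region.
    [RequireComponent(typeof(MeshFilter))]
    public class PushDeform : MonoBehaviour
    {
        public float radius = 0.5f;   // falloff radius (hypothetical value)
        public float strength = 0.1f; // push distance per call (hypothetical value)

        public void Push(Vector3 worldPoint, Vector3 worldDirection)
        {
            Mesh mesh = GetComponent<MeshFilter>().mesh;
            Vector3[] vertices = mesh.vertices;
            Vector3 localPoint = transform.InverseTransformPoint(worldPoint);
            Vector3 localDir = transform.InverseTransformDirection(worldDirection);

            for (int i = 0; i < vertices.Length; i++)
            {
                float d = Vector3.Distance(vertices[i], localPoint);
                if (d < radius)
                {
                    // Full strength at the center, zero at the falloff edge.
                    vertices[i] += localDir * strength * (1f - d / radius);
                }
            }
            mesh.vertices = vertices;
            mesh.RecalculateNormals();
        }
    }

Calling Push repeatedly while input is held approximates "push slowly"; a single strong call approximates "poke punch".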


User scenarios for daily app use:

• lightweight daily play on a smartphone, between working sessions or during a coffee break

• effective positive reappraisal for emotion regulation: reduce anxiety, then get back to work

• refresh body and brain with multi-sensory activity in the space, which doubles as body-stretching time

User scenarios for performance use:

• indoor live music performance with a projection of live visual art generated by the performer

• cross-location and cross-time viewing of the performance in virtual reality

• revisiting recorded performance art pieces

Sound Testing:

A group of five designers and three non-designers was presented with a list of 55 sound cues and three simple surface-manipulation visuals. Testers were asked to pick five sound cues to represent the surface-manipulation visuals.

• Clothes Friction (weakest)

• Balloon Inflate

• Bubble Pop

• Water Drop

• Squeeze (strongest)

The outcome reveals a roughly linear match between sound pitch and surface deformation: sounds with a stronger pitch and clearer uniqueness were matched with stronger deformation.
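
Read as a mapping rule, this result suggests selecting a cue by deformation strength. A hedged Unity C# sketch, with the five tested cues ordered from weakest to strongest (the selection logic is an assumption, not the project's published code):

    using UnityEngine;

    // Sketch of the observed linear match: stronger surface deformation
    // triggers a "stronger" sound cue, ordered per the sound-testing list
    // (clothes friction, balloon inflate, bubble pop, water drop, squeeze).
    public class CueByDeformation : MonoBehaviour
    {
        public AudioClip[] cues = new AudioClip[5]; // assign weakest -> strongest
        public AudioSource source;

        // deformation01: normalized deformation strength in [0, 1].
        public void PlayCue(float deformation01)
        {
            int index = Mathf.Clamp(
                Mathf.FloorToInt(deformation01 * cues.Length), 0, cues.Length - 1);
            source.PlayOneShot(cues[index]);
        }
    }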


4 DESIGN IMPLEMENTATION | INTERACTION

4.1 Interaction Layer A: MIDI Orientation

Layer A presents a constantly playing soundtrack (Figure 1). The Bose Frames orient the performer using head-rotation data from the gyro sensors. The coordinates of the performer's head rotation are mapped to sound pitch and volume with C# scripts in Unity: [Head Up & Down] is mapped to [Pitch Up & Down], and [Head Left & Right] is mapped to [Volume/Stereo Up & Down]. On the horizontal rotation plane, true north sits at the fastest and loudest sound cue.

Fig 1. MIDI Layer A: bubble track (continuous)


The initial design mapped pitch to the horizontal plane, but during testing, testers tended to forget to move their heads up and down, since pitch, the more playful dimension to navigate, sat on the horizontal plane. The final design therefore maps pitch to the vertical plane and volume/stereo/beats to the horizontal plane, as sketched after the list below.

Frames orientation, final decision:

• Head Up & Down: Pitch Up & Down

• Head Left & Right: Volume/Stereo Up & Down
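
A minimal Unity C# sketch of this final mapping (the project's actual Unity scripts are not reproduced here; the ranges, and the use of a tracked Transform as a stand-in for the Frames' gyro data, are assumptions):

    using UnityEngine;

    // Sketch of MIDI Layer A: map head rotation to pitch (vertical plane)
    // and volume/stereo (horizontal plane), loudest when facing true north.
    public class OrientationMidi : MonoBehaviour
    {
        public Transform head;    // rotation fed from the glasses' gyro sensors
        public AudioSource track; // the constantly playing bubble track

        void Update()
        {
            // Head up & down -> pitch up & down. Looking up gives a negative
            // x angle in Unity, hence the inverted 60..-60 range.
            float headPitch = Mathf.DeltaAngle(0f, head.eulerAngles.x);
            track.pitch = Mathf.Lerp(0.5f, 2f, Mathf.InverseLerp(60f, -60f, headPitch));

            // Head left & right -> volume and stereo pan; yaw = 0 is north.
            float yaw = Mathf.DeltaAngle(0f, head.eulerAngles.y);
            track.volume = Mathf.Lerp(1f, 0.2f, Mathf.Abs(yaw) / 180f);
            track.panStereo = Mathf.Clamp(yaw / 90f, -1f, 1f);
        }
    }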


4.2 Interaction Layer B: MIDI Sculpt

The play-testing prototype was built using the Bose Frames and mouse right/left clicks on a web display, in Unity with C#. Three scripts control the sculpting manipulation of the virtual bubble: mousebeat (C#) [A], mouseinput (C#) [B], and sculpt (C#) [C] (Table 5).
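
The three scripts themselves are not reproduced here; a hedged sketch of the input side of this mapping (left mouse inflates the bubble, right mouse triggers the rocket beats, per the play-testing interface in the next section) might look like:

    using UnityEngine;

    // Sketch of the Layer B input handling: hold left mouse to inflate the
    // bubble, click right mouse to fire a "rocket beats" cue.
    public class SculptInput : MonoBehaviour
    {
        public Transform bubble;         // the virtual bubble being sculpted
        public AudioSource source;
        public AudioClip rocketBeat;
        public float inflateRate = 0.2f; // scale growth per second (hypothetical)

        void Update()
        {
            if (Input.GetMouseButton(0)) // left held: bubble inflation
                bubble.localScale += Vector3.one * inflateRate * Time.deltaTime;

            if (Input.GetMouseButtonDown(1)) // right click: one beat per press
                source.PlayOneShot(rocketBeat);
        }
    }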

4.3 Play testing 1

Proposed sculpt behaviors with the smartphone-based interface (sketched after the list below):

• Behavior 1 - slide over (location) - phone vertical

• Behavior 2 - stay pushing (size) - press button

• Behavior 3 - push harder (depth) - phone horizontal
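
A hedged sketch of how the phone's pose could distinguish these three behaviors, using the accelerometer's gravity direction (the thresholds and the behavior hooks are assumptions):

    using UnityEngine;

    // Sketch of the proposed smartphone mappings: gravity direction tells
    // phone-vertical (slide over) from phone-horizontal (push harder), and
    // a held UI button drives "stay pushing".
    public class PhonePoseBehaviors : MonoBehaviour
    {
        public bool sculptButtonHeld; // set from a UI button's down/up events

        void Update()
        {
            Vector3 g = Input.acceleration; // gravity in device space
            bool phoneVertical = Mathf.Abs(g.y) > 0.7f;   // held upright
            bool phoneHorizontal = Mathf.Abs(g.z) > 0.7f; // held flat

            if (sculptButtonHeld) { /* Behavior 2: grow the push size */ }
            else if (phoneVertical) { /* Behavior 1: move the push location */ }
            else if (phoneHorizontal) { /* Behavior 3: deepen the push */ }
        }
    }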

Play-testing interface based on Web/VR:

• On Mouse Left Down - /bubble inflation/

• On Mouse Right Down - /rocket beats/

Four participants took part in play testing based on the web interface displayed on a projector, with a Bluetooth mouse as the push input tool. While one participant sculpted, the other three examined the viewing experience. All four participants indicated that a patterned bubble matches the sound better. Participants also indicated that they would like to manipulate more bubbles, or more complex visual materials, so as to be rewarded with a more complex visual outcome at the end.

Fig 2. The transparent solid bubble has a strong poking effect, but the view from outside is poor.
Fig 3. The patterned pass-through bubble has more dynamic pattern distortion, while also allowing see-through while pushing.

Fig 4. Starting UI


5 DESIGN IMPLEMENTATION | DISPLAY

5.1 Viewer Side

Camera 1: rotates around the outside of the bubble

Camera 2: tracks the center of manipulation, displaying from inside (Figure 5)

Fig 5. Camera 2 displaying from inside the bubble
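
A minimal Unity C# sketch of Camera 2's tracking behavior (the actual camera rig is not published; the smoothing approach is an assumption):

    using UnityEngine;

    // Sketch of Camera 2: sit at the bubble's center and smoothly turn
    // toward the current manipulation point, showing the push from inside.
    public class InsideCamera : MonoBehaviour
    {
        public Transform bubbleCenter;      // the bubble's center
        public Transform manipulationPoint; // current sculpt contact point
        public float smoothing = 5f;        // rotation smoothing (hypothetical)

        void LateUpdate()
        {
            transform.position = bubbleCenter.position;
            Quaternion look = Quaternion.LookRotation(
                manipulationPoint.position - transform.position);
            transform.rotation = Quaternion.Slerp(transform.rotation, look,
                                                  smoothing * Time.deltaTime);
        }
    }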

The virtual bubble display can be viewed in VR or recorded together with the soundtrack generated by the user, combining visual sharing, performing, and performer-viewer interaction (Figure 6).

Fig 6. The virtual bubble experience from inside to outside
Fig 7. Distortion of the pattern based on the user's motion behavior


6 CONCLUSIONS

In summary, engaging multi-sense performance in a closed-loop interface with audio augmented reality can successfully enhance people's perception of spatial volume, sound, motion, and orientation. The experiment also indicates that users orient themselves more easily on the horizontal plane than on the vertical plane, and more readily associate material deformation with sound pitch. Future play testing may measure playfulness and the sound-form association with data monitoring. The testing methodology for determining emotional activity might include before-and-after biometric measurement using pulse sensors, GSR sensors, or an EEG headband.


REFERENCES

1. Bragança, Guilherme Francisco F., et al. "Synesthesia and Music Perception." Dementia & Neuropsychologia, vol. 9, no. 1, 2015, pp. 16–23, doi:10.1590/s1980-57642015dn91000004.

2. Jerčić, Petar, and Veronica Sundstedt. "Practicing Emotion-Regulation through Biofeedback on the Decision-Making Performance in the Context of Serious Games: A Systematic Review." Entertainment Computing, vol. 29, 2019, pp. 75–86, doi:10.1016/j.entcom.2019.01.001.

3. Stanger, Nicholas, et al. "The Role of Preperformance and In-Game Emotions in Cognitive Interference During Sport Performance: The Moderating Role of Self-Confidence and Reappraisal." The Sport Psychologist, vol. 32, no. 2, 2018, pp. 114–124, doi:10.1123/tsp.2017-0001.

4. Troy, Allison S., et al. "Cognitive Reappraisal and Acceptance: Effects on Emotion, Physiology, and Perceived Cognitive Costs." Emotion, vol. 18, no. 1, 2018, pp. 58–74, doi:10.1037/emo0000371.

5. Aldao, Amelia, et al. "Emotion-Regulation Strategies across Psychopathology: A Meta-Analytic Review." Clinical Psychology Review, 2010. Retrieved May 17, 2019, from https://www.ncbi.nlm.nih.gov/pubmed/20015584.

6. Domaradzka, Ewa, and Małgorzata Fajkowska. "Cognitive Emotion Regulation Strategies in Anxiety and Depression Understood as Types of Personality." Frontiers in Psychology, vol. 9, 2018, p. 856.
