Cubing Sound Ensemble

Yichen Wang, Charles Martin

Details

Abstract

Head-mounted mixed reality offers new opportunities for natural, three-dimensional musical control that can facilitate musical creativity, yet existing works are limited to hand-held controllers. At OzCHI this year, we present a free-improvised performance featuring a novel 3D interface for musical expression in AR. Audiences will see how a NIME (new interface for musical expression) in AR allows the performer to activate mobility, space and sound in musical performance.

Bio

Yichen Wang

Yichen Wang is a PhD candidate in computer science at The Australian National University, where she explores the relationship between HCI, art and augmented reality. Her recent work focuses on new interfaces for musical expression in augmented reality. You can check out her previous works here: https://yichenwangs.github.io/yichen/work. Photo Credit: Qichao Lan

Charles Martin

Charles Martin is a computer scientist specialising in music technology, musical AI and human-computer interaction at The Australian National University, Canberra. Charles develops musical apps such as MicroJam and PhaseRings, researches creative AI, and performs music with Ensemble Metatone and Andromeda is Coming. At the ANU, Charles teaches creative computing and leads research into intelligent musical instruments. His lab focuses on developing new intelligent instruments, performing new music with them, and bringing them to a broad audience of musicians and performers.