An old way to interact with new musical interfaces
Empathic Computing Lab UoA: Ryo Hajika (JP/NZ), Prasanth Sasikumar (IN/NZ), Amit Barde (IN/NZ), Yun Suen Pai (MY/NZ)
The Radarmin is an experimental musical interface that uses the Google Soli as a means to control a musical track in real time. It enables the user to “play” a virtual musical instrument, and is intended as a demonstration of the capabilities of emerging touch-less interaction interfaces to facilitate new means of musical expression. The Radarmin uses a touch-less interaction technique made popular by a unique musical instrument developed in the 1920s, the Theremin. While the Theremin works on the principle of change in capacitance, the Soli is a millimetre-wave radar that works on the Doppler shifts it detects as objects move around it. This project is an attempt to showcase the abilities of the Soli as a creative, musical/auditory interface.
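To illustrate the principle, the sketch below shows one way radar readings could drive a Theremin-style mapping. It is hypothetical, not the installation's actual code: the Soli's real SDK is not shown, so `distance_m` and `doppler_hz` stand in for whatever range and Doppler estimates the sensor pipeline provides, and the pitch/volume ranges are illustrative.

```python
# Hypothetical mapping from radar-derived readings to musical
# parameters (not the actual Radarmin implementation).

C = 3.0e8          # speed of light, m/s
F_CARRIER = 60e9   # Soli operates in the 60 GHz band

def radial_velocity(doppler_hz: float) -> float:
    """Recover radial hand velocity from the observed Doppler shift.
    For a reflecting target, f_d = 2 * v * f0 / c."""
    return doppler_hz * C / (2 * F_CARRIER)

def theremin_mapping(distance_m: float, doppler_hz: float):
    """Theremin-style mapping: hand distance -> pitch, motion -> volume.
    All ranges are assumptions for illustration."""
    # Closer hand -> higher pitch, clamped to 220-880 Hz (A3-A5).
    d = min(max(distance_m, 0.05), 0.50)
    pitch_hz = 880.0 - (d - 0.05) / 0.45 * (880.0 - 220.0)
    # Faster motion -> louder; clamp to [0, 1] at 0.5 m/s.
    vol = min(abs(radial_velocity(doppler_hz)) / 0.5, 1.0)
    return pitch_hz, vol
```

In practice the pitch and volume values would be sent on to a synthesiser (e.g. as MIDI or OSC messages); that plumbing is omitted here.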
Empathic Computing Laboratory, Auckland Bioengineering Institute, University of Auckland – Concept, Development and Implementation. Google ATAP – Technical Support.
Ryo Hajika: Ryo is an HCI (Human-Computer Interaction) researcher, prototype designer and programmer, who currently works as a research assistant at the Empathic Computing Lab at the University of Auckland. He explores the intersection of media art and academic research, using code to unearth hidden human factors in data that move people, and to create new value.
Prasanth Sasikumar: Prasanth is a researcher and programmer with a primary focus on cross-reality development and user experience. He is currently a PhD student at the Empathic Computing Lab. His research interests include remote training and collaboration, and incorporating empathy into remote solutions.
Amit Barde: Amit is a researcher and sound designer with extensive experience in the field of sound design for short films, games and other forms of visual media. He is currently a Research Fellow at the Empathic Computing Lab. His research interests include the role of sound in information delivery in everyday life, and also the part it plays in empathy.
Dr. Yun Suen Pai: Dr. Pai is currently a Postdoctoral Research Fellow at the Empathic Computing Laboratory, University of Auckland. His research interests include the effects of augmented, virtual and mixed reality on human perception, behaviour, and physiological state. He has also worked on haptics in AR, vision augmentation, VR navigation, and machine learning for novel input and interactions.