OSCar: A non-visual multi-touch controller (software demonstration)

Patrick McGlynn (NUI-Maynooth)

Watch the video: http://vimeo.com/41570616

Abstract: OSCar is an iOS application, currently under development, for real-time music control. The software employs a unique non-graphical approach which fully utilizes the rich data generated by multi-touch (MT) user input. The primary motivation behind this project is to enable musicians to harness the power of MT gestures, not merely as blunt tools for interacting with GUIs, but as highly expressive and nuanced actions in their own right. OSCar works by analyzing raw touch information on the mobile device itself. The resulting high-level data is sent over a wireless connection in an easily readable OSC format which describes the properties of commonly used MT gestures (parameterized multi-tap, swipe, pinch, pan), orientation sensors (accelerometer values), and other abstract properties (number of touches, centroid, discrete zones). Individual gesture recognizers may be toggled on and off to accommodate easy testing and mapping of data to musical parameters. The corresponding preferences menu is external to the application itself in order to prevent accidental changes during live performance. The visual aspect of OSCar is minimal and serves only to provide distinctive cues for the user. There are no widgets or menus: the priority is to provide rich, modeless feedback in a non-prescriptive application. This software demonstration will show OSCar being used to create some basic performance interfaces.
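To illustrate the kind of high-level data described above, the following is a minimal sketch, not OSCar's actual implementation, of deriving two of the abstract touch properties mentioned (number of touches and centroid) from raw touch points and packaging them as OSC-style address/argument pairs. The `/oscar/...` addresses are hypothetical; the abstract does not document OSCar's actual address scheme.

```python
def touch_summary(touches):
    """Summarize raw multi-touch input as OSC-style messages.

    touches: list of (x, y) points in normalized screen coordinates.
    Returns a list of (address, arguments) pairs ready to be sent
    over a wireless connection by any OSC library.
    """
    n = len(touches)
    # Centroid: the mean position of all active touch points.
    cx = sum(x for x, _ in touches) / n
    cy = sum(y for _, y in touches) / n
    return [
        ("/oscar/touches", [n]),        # hypothetical address: touch count
        ("/oscar/centroid", [cx, cy]),  # hypothetical address: mean position
    ]

# Two touches at opposite corners yield a centroid at the center.
messages = touch_summary([(0.0, 0.0), (1.0, 1.0)])
```

A receiving application (e.g. a synthesis environment with OSC support) could then map these addresses to musical parameters, in the spirit of the mapping workflow the demonstration describes.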