Wednesday, May 18, 2011

Assignment 3 - test 1 - mouse interaction

View my test sketch at openProcessing.org.

This is a low-sound-quality (web-download-friendly) version of the mouse interaction testing sketch I have up and running. It's pretty self-explanatory: each section of the screen represents a position, and the mouse replicates a person walking in and out of these positions.
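The section logic behind the sketch can be illustrated roughly like this. This is a hypothetical, simplified version in plain Java (the method and class names are my own placeholders, not the actual sketch code), assuming the screen is divided into equal vertical bands:

```java
// Hypothetical illustration: the screen is split into equal vertical bands,
// and the mouse x-coordinate selects which "position" the person is in.
public class SectionDemo {
    // Returns the index (0..sections-1) of the band containing x.
    public static int sectionAt(int x, int screenWidth, int sections) {
        int band = (x * sections) / screenWidth;
        // Clamp so x == screenWidth still maps to the last band.
        return Math.min(band, sections - 1);
    }

    public static void main(String[] args) {
        // A 400px-wide screen split into 4 sections of 100px each.
        System.out.println(sectionAt(50, 400, 4));   // prints 0 (left-most band)
        System.out.println(sectionAt(399, 400, 4));  // prints 3 (right-most band)
    }
}
```

In a real Processing sketch the same calculation would be run every frame on `mouseX`, with each band change triggering or stopping a sound.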

My next step is to start using OpenKinect to get 'interfaceless' interaction!



Tuesday, May 17, 2011

Assignment 3 - Coding...

Currently I am still trying to get my head around how my sketch will work. How will each sound lead on to the next? What will trigger the different sounds? How will the participant be able to explore the interactive work? Below is my first attempt at object-oriented programming, used to sort out the order in which all the components of the work will fit together.
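As a rough, hypothetical illustration of the kind of structure I mean (class and method names here are placeholders, not the actual sketch): each screen zone could be an object that owns a sound and knows when the participant enters it.

```java
// Hypothetical object-oriented structure: each screen zone tracks whether
// the participant is inside it and reports the moment they enter,
// which is when a sound would be triggered.
public class ZoneDemo {
    static class SoundZone {
        final int xMin, xMax;      // horizontal extent of the zone
        boolean active = false;    // is the participant currently inside?

        SoundZone(int xMin, int xMax) {
            this.xMin = xMin;
            this.xMax = xMax;
        }

        // Update from the participant's x position each frame; returns true
        // only on the frame the zone is entered.
        boolean update(int x) {
            boolean inside = x >= xMin && x < xMax;
            boolean entered = inside && !active;
            active = inside;
            return entered;
        }
    }

    public static void main(String[] args) {
        SoundZone zone = new SoundZone(100, 200);
        System.out.println(zone.update(150)); // prints true: just entered
        System.out.println(zone.update(160)); // prints false: still inside
        System.out.println(zone.update(250)); // prints false: has left
    }
}
```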


My next order of business is to write out each of the functions listed above and to get a mouse interaction demo up and running. See you next time!

Wednesday, May 11, 2011

Assignment 3 - Sound Ideas

This is what I've been working on in the sound department. Enjoy!

SOMA2607 - Interactivity: Track 02 by Timothy Watt

Assignment 3 - The 'Full Body DJ'

Last week I gave a presentation explaining my idea for my final project and this is more or less a transcript. Enjoy!

The 'Full Body DJ'

Full-body DJ-ing is about two things. Firstly, it is about ‘body music’. What I mean by this is interacting with a sampled audio track without an immediate interface: navigating it, applying effects, triggering additional tracks and adding your own samples to create a mix. Secondly, it is about ‘embracing the glitch’. Delays, pops, and other imperfect aspects of digital audio manipulation will be embraced, and their inherent faults used for musical aesthetics.

My first inspiration is ‘Subcycle’, a work developed by a guy called Christian Bannister. In my opinion, with his 'Minority Report'-style multi-touch interface, he has taken interactive electronic music to a whole other level. He is able to navigate, reverse, bit-slice and granulate sample tracks in real time. His dynamically generated visuals are also very impressive.

My other major inspiration is a work called ‘Twill’ by Matt Pearson, or ‘zenbullets’ as he likes to be known. It is one of 100 works in his 100 Abandoned Artworks series, which can be found on his website. I really like how this work looks: the simple black and white is very effective, and the flowing form in the middle is mesmerizing.

I really like the idea of a dynamic waveform that follows the participant’s movement (similar to Petra’s digital calligraphy), and I also really like the look of the black and white forms from Matt Pearson’s series, so I might aim for something along those lines for the visualisation.

The audio that I am working on is a short piece of ambient electronic music. It will be around 20 seconds in length and is based around a single key, so that the different tracks can start at different times but still work together. Once the piece is built up by the participant’s interaction, I want it to sound rhythmical, looped and glitchy.


The participant will interact with the program through their movements in front of an Xbox Kinect. Changes in the soundscape will be triggered by four parameters:
1. Presence (background subtraction) – for example, to start a rhythm cycle.
2. Position (frame differencing) – to control audio effects, scrubbing, etc.
3. Depth (infrared via Xbox Kinect) – to drive the visuals.
4. Volume (dB level) – I don’t yet know whether I can pull this off, but it’s more interactivity that could be utilised.
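Of these four, frame differencing is the easiest to sketch in code: motion is estimated as the total absolute change between successive frames. Below is only a hypothetical illustration in plain Java, using made-up names and grayscale pixel arrays rather than actual Kinect data:

```java
// Hypothetical frame-differencing sketch: each frame is a grayscale pixel
// array, and "motion" is the summed absolute difference between frames.
public class MotionDemo {
    public static int frameDifference(int[] previous, int[] current) {
        int total = 0;
        for (int i = 0; i < current.length; i++) {
            total += Math.abs(current[i] - previous[i]);
        }
        return total;
    }

    public static void main(String[] args) {
        int[] prev = {10, 10, 10, 10};
        int[] curr = {10, 50, 10, 12};
        // Two pixels changed, by 40 and 2.
        System.out.println(frameDifference(prev, curr)); // prints 42
    }
}
```

In the real sketch this total would be compared against a threshold (or computed per screen region) to decide which audio effect to drive.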

My plan from now until the project is due in four weeks is as follows:

- For the rest of this week (and maybe next) I will finalise the project’s sound library. This will really just involve deconstructing a segment from one of my current tracks and formatting the pieces into 20-second intervals.
- Starting next week, I will divide my time between the sketch’s functionality and its visuals. As for functionality, I currently have a simple mouse interaction prototype in the works, so the next step will be incorporating data collected by the Xbox Kinect into this interaction. After that comes the obvious testing and tweaking.
- As for visualisation, I plan to borrow code from Petra’s ‘Simple Digital Calligraphy’ and have a dynamic waveform that follows the movement of the participant. I haven’t started this yet, and I’m guessing a lot of experimentation is in order.