Module 3 Activity Research

Weekly Activity Template

Mikhael Ledger


Project 3


Module 3

Together with Bre, Belinda, and Zaynab, we created a working prototype that enhances live music experiences by making performances more tangible and interactive. The system lets the audience literally play with the music the performer plays on stage. We achieved this using live reactive visuals in TouchDesigner, driven by MIDI input from the instruments and body-movement data from a Kinect. The audience can interact with a digital stage environment, creating a visual dialogue between performer and audience. We centred our design on the song “A LONG DREAM” by SE SO NEON. As a larger vision, we planned to install and perform this setup in a real stage venue (preferably the Marquee at Trafalgar Campus) to invite more people to listen, interact, and play together.


Activity 1: Developing Drum Visuals

I brainstormed which visual variables could be changed by the drums.
Looking at the drum tabs for “A Long Dream,” the closed hi-hat, snare drum, and bass drum were played consistently throughout the song. I brainstormed the denotative and connotative feelings each drum part gave, then, taking visuals from the music video of “A Long Dream,” matched each part to a texture that I felt suited its qualities. I also had to consider how frequently each part was played: the more often a part was hit, the faster the visuals could become cluttered and overwhelming. For these visuals, I had to figure out how to combine Belinda and Zaynab’s existing visuals with mine without the result becoming overwhelming. I went forward with the water motif, echoing Bre’s keyboard water design.
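A rough sketch of this kind of part-to-texture mapping might look like the following. The note numbers follow the General MIDI drum map purely as placeholders (our kit reported indices around 47-54), and the texture labels are illustrative rather than the final assets.

# Hypothetical mapping from MIDI drum notes to visual treatments.
# Note numbers use the General MIDI percussion map as placeholders;
# the texture labels are illustrative only.
DRUM_VISUALS = {
    36: {"part": "bass drum",     "texture": "slow, heavy water swell"},
    38: {"part": "snare drum",    "texture": "sharp surface splash"},
    42: {"part": "closed hi-hat", "texture": "light ripple highlights"},
}

def visual_for(note):
    """Return the visual treatment for a drum note, or None if unmapped."""
    return DRUM_VISUALS.get(note)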
For the floor projection, I considered how much space a person might take up on the floor. I then created a mask SVG in Illustrator from my sketches to bring into TouchDesigner, applying the concept to the overhead projection.
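To give a sense of how the mask ends up in the network, here is a rough setup sketch that loads the mask (exported as a PNG for this example) and multiplies it over the floor visuals. The operator and file names are placeholders, and in practice this wiring is usually done by hand in the network editor.

# One-off setup sketch, run from a Text DAT inside the floor-projection
# network: load the Illustrator mask and multiply it over the existing
# floor visuals TOP. All operator and file names are placeholders.
mask = parent().create(moviefileinTOP, 'floor_mask')
mask.par.file = 'floor_mask.png'        # mask exported from Illustrator

masked = parent().create(multiplyTOP, 'masked_floor')
masked.inputConnectors[0].connect(op('floor_visuals'))  # existing visuals TOP
masked.inputConnectors[1].connect(mask)                 # person-sized floor mask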

Activity 2: Testing Drum Input

We encountered some challenges when trying to access the MIDI data coming from my drum kit. Because we were unable to use the MIDI In CHOP, we worked around it with a DAT operator, reading the values as they were registered.
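In TouchDesigner terms, that workaround can be sketched roughly as below: a MIDI In DAT with an attached callbacks DAT reading each message as it arrives. The operator name ('midiin1') is a placeholder, and the callback signature follows the default template TouchDesigner generates, so it is worth checking against your own callbacks DAT.

# Callbacks DAT attached to a MIDI In DAT (placeholder name: midiin1).
# The onReceiveMIDI hook fires for each incoming message, so the note
# number and velocity can be read here even without the MIDI In CHOP.

def onReceiveMIDI(dat, rowIndex, message, channel, index, value, input, bytes):
    # index = note/controller number of the drum pad, value = hit velocity (0-127)
    debug('drum hit: note', index, 'velocity', value)
    return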
Main roadblocks:
How do we capture drum parts that are played at the same time? Since only one message is read at a time, a channel for each drum part is not retained the way it would be with a CHOP (one workaround is sketched after this list).
How do we isolate each part of the drum kit so it affects a different variable of the visuals? I explored this by playing with the HSV Adjust TOP to shift the hue of the tide pools.
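For the first question, one possible workaround is to keep our own per-part “channels” in component storage and update them from the same callback, so a simultaneous kick and snare each overwrite only their own entry. This is a sketch under those assumptions, with placeholder names, not the exact network we built.

# Same onReceiveMIDI callback, extended to keep one entry per drum part
# in component storage so simultaneous hits are all retained, similar to
# having a channel per part in a CHOP.

def onReceiveMIDI(dat, rowIndex, message, channel, index, value, input, bytes):
    hits = parent().fetch('drum_hits', {}, storeDefault=True)  # {note: velocity}
    hits[index] = value                  # only this part's slot is overwritten
    parent().store('drum_hits', hits)    # other operators can fetch('drum_hits')
    return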
Initially, I used the index (note number) of each drum part to drive the hue, mapping the index onto the hue value range so that each part would produce its own distinct shift. However, since the index numbers were close together in value (47-54), the change in hue was barely noticeable. For the final version, the visuals were driven by the velocity of each hit instead.
This gave a more noticeable hue shift and a stronger connection between the musician’s actions and the visuals. However, the effect was no longer tied to individual drum parts: every part produced the same change, dependent only on velocity.
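A minimal sketch of that final velocity-driven behaviour, assuming an HSV Adjust TOP named 'hsvadj1' and mapping velocity onto its Hue Offset parameter (both the operator name and the 0-360 range are assumptions rather than our exact values):

# Velocity-driven hue shift: map hit velocity (0-127) onto the Hue Offset
# of an HSV Adjust TOP so harder hits push the tide-pool colours further.

def onReceiveMIDI(dat, rowIndex, message, channel, index, value, input, bytes):
    hue_offset = (value / 127.0) * 360.0       # normalise velocity, scale to degrees
    op('hsvadj1').par.hueoffset = hue_offset   # Hue Offset parameter (name assumed)
    return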

Project 3 Final Prototype

Final prototype, with floor and wall visuals, and the Marquee model.

Final floor projection prototype. Belinda further developed the model of the Marquee using the provided measurements, adding details to the windows and doors as well as the overhead truss, using wood and laser-cutting techniques. Bre worked on the visuals for the keyboard input, which would ideally be projected onto the walls of the Marquee. This visual combined the research from Zaynab and Belinda, playing off a water motif drawn from the lyrics of “A Long Dream.”