Background
The first time I used TouchDesigner was in class, where I was taught how to import movie files and map them to the geometry of various surfaces and objects.
Though I was only taught the basics of TouchDesigner, the potential this software had for projection mapping and visual creation was apparent to me. Touch seemed a more suitable tool for projection mapping than the others I had tried up to that point, Adobe After Effects for example.
Learning a new piece of software is challenging and often frustrating, but TouchDesigner's appeal was large: I knew that if I put the time in, I could create something really interesting and unlike anything I had made before. I could have completed a projection mapping project using programs I was already familiar with, such as Premiere Pro and After Effects, but the educational opportunity that would come with learning Touch, and the potential results it could provide, were hard to ignore.
Testing
Testing 1
For my projection mapping project, I had a fairly ambitious idea that required learning a multitude of tools and effects within Touchdesigner.
I knew this would likely be a difficult task, as TouchDesigner was new to me, and so was node-based workflow in general.
With this in mind, I decided to take an approach that let me learn the basics of TouchDesigner while also gaining insight into how difficult my idea for the project would be to pull off. This is how I did it:
I created a test project with the sole aim of using an audio file to animate a basic shape, as using audio to control visuals would be a key part of my project idea.
To my test project, I added an Audio File In CHOP containing a drum and bass song. I then added an Audio Device Out CHOP so I could listen to the song as I worked.
I added an EQ CHOP to isolate the song's kick drum.
I added a Math CHOP (average) to convert the song's stereo signal to mono.
I connected the mono signal to an Audio Spectrum CHOP and an Analyze CHOP to simplify the waveform data, then added an Envelope CHOP and a Math CHOP (divide) to normalise the waveform.
I connected a Trigger CHOP to the now-normalised waveform. This trigger toggled from zero to one, and back to zero again, the instant the waveform hit its highest point; in this case, that was when the kick drum hit.
I wanted this trigger to control a circle, so I added a Math CHOP to scale the trigger's output into a usable circle size.
This worked, but it gave the circle a very fast, binary expansion. To fix this, I added a Lag CHOP between the Trigger and the Math CHOP, which smoothed the circle's size change.
This presented another problem, however. The lag slowed the normalised 0-to-1 signal so much that it could no longer reach its maximum, lessening the circle's range of motion. To fix this, I added another Envelope CHOP and Math CHOP (divide) to re-normalise the signal.
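The chain above can be sketched outside TouchDesigner. This is a minimal, illustrative Python version of the same signal flow (averaging to mono, normalising, thresholding into a trigger, one-pole smoothing as a stand-in for the Lag CHOP, then re-normalising); the threshold and lag values are assumptions, not the actual CHOP settings used.

```python
import numpy as np

def kick_reactive_size(stereo, sr, threshold=0.9, lag=0.05):
    """Sketch of the Test 1 CHOP chain (illustrative, not TouchDesigner code).

    stereo: (N, 2) array of audio samples; returns a smoothed 0-1 control
    signal suitable for driving a circle's size.
    """
    mono = stereo.mean(axis=1)              # Math CHOP (average): stereo -> mono
    env = np.abs(mono)                      # crude envelope of the waveform
    env = env / env.max()                   # Envelope + Math (divide): normalise
    trig = (env > threshold).astype(float)  # Trigger CHOP: 1 on peaks, else 0
    # Lag CHOP stand-in: one-pole smoothing so the circle grows gradually
    alpha = np.exp(-1.0 / (lag * sr))
    smooth = np.empty_like(trig)
    acc = 0.0
    for i, x in enumerate(trig):
        acc = alpha * acc + (1.0 - alpha) * x
        smooth[i] = acc
    return smooth / smooth.max()            # re-normalise after the lag
```

Note how the final divide is needed for exactly the reason described above: the smoothing step stops the raw trigger from ever reaching 1.0 on its own.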
This is the completed test project.
Testing 2
In class, Anthony and I had a play with the actual projection mapping side of TouchDesigner via Kantan Mapper and Stoner. After being taught the basics, we challenged ourselves by projecting multiple video files onto a somewhat complicated shape.
After completing test one and learning the basics of working with audio waveform data, then learning how to map visuals via Kantan Mapper and Stoner, I felt I had a good base of knowledge to move forward with my main project.
Creating the Projection Mapping Project
Planning
Not long after learning what Touchdesigner is capable of, I had the idea of projecting audio-controlled visuals onto a set of speakers as a means to enhance the music-listening experience.
I took inspiration from live music shows where the artists utilise projectors and lights to bring energy to their shows; example here - https://www.youtube.com/watch?v=lvQlupjVYfE&t=174s&ab_channel=ChicagoProjectionMapping
I toyed with the idea of creating a completely automated, real-time system that would work on any song; however, this meant the visuals would have to be fairly generic. Instead, I wanted something more specific, fitting a particular song and its narrative.
The selection process for choosing a song was fairly easy. I started by looking at EDM: I liked the high energy of this kind of music, and I also thought its consistent, percussion-based sounds would make isolating the percussive data easy to achieve. This mattered because I wanted to control at least some of the visuals with the song's percussion.
In the end, I scrapped the EDM idea, not because it was a bad idea, but because I found something else that had a stronger narrative that I was more excited to get behind.
I knew this new song would be more difficult to work with, as it was heavy metal and the drums were therefore muddier than in most EDM; however, this song, and the projection mapping idea I came up with after listening to it, was too interesting not to work on.
The song I chose was 'Hip to be scared' by 'Ice Nine Kills' - https://www.youtube.com/watch?v=ozOb5FcnDf4&ab_channel=IceNineKills
After selecting a song, I moved forward with the planning.
I came up with a concept and sketched it down.
The plan was to project audio-controlled visuals onto two speakers, then for the narrative section in the middle of the song, I would project a re-enactment of the story and display that in a silhouette style. The plan was to project this narrative section onto a panel between the two speakers.
I also considered projecting onto cut-outs of furniture to add a three-dimensional element to the scene; however, in the end, I felt this might have been visually distracting, so I scrapped the idea.
After the concept was decided, I began thinking about the overall theme of the song and how to translate it visually. For this reason, I created a document to record my ideas for the theme and aesthetic of the project.
I broke the song down into its separate parts and came up with ideas for each of the visual elements I wanted and when I wanted them to occur. I organised the song using song-structure terminology: verse, bridge, chorus, etc. I also made sure to include timings for each change of part in both seconds and beats. I referred back to this table often as I built the project in Touch.
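For a constant-tempo song, the two timing columns in a table like this are related by a simple formula, seconds = beats × 60 / BPM. A small sketch (the 174 BPM figure is purely an example, not the actual tempo of the chosen track):

```python
def beat_to_seconds(beat, bpm):
    """Convert a beat count to a timestamp in seconds, assuming constant tempo.

    The bpm argument is whatever the song's tempo is; the value used in the
    example below is illustrative only.
    """
    return beat * 60.0 / bpm

# e.g. at an assumed 174 BPM, beat 87 lands 30 seconds in
print(beat_to_seconds(87, 174))  # -> 30.0
```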
Creating
Following a tutorial, I made an organic, growth-type visual. Using noise as my base, I created a feedback loop with a Blur TOP that continuously emanated from a Circle TOP at its centre. The tutorial I followed can be found at - https://www.youtube.com/watch?v=BKiK9G53WOw
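The core of this effect is the feedback loop: each frame is a blurred, slightly faded copy of the previous frame, with a bright source re-stamped on top so the pattern keeps emanating outward. A minimal Python/NumPy sketch of that idea (sizes, decay, and blur amount are made-up values, and this is an approximation of the concept, not the actual TouchDesigner network):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def feedback_growth(steps=60, size=128, decay=0.98):
    """Sketch of a blur-feedback loop (illustrative values throughout).

    Each iteration: blur the previous frame and fade it (the feedback path
    through the Blur TOP), then re-stamp a bright circle at the centre
    (the Circle TOP source) so light keeps spreading outward from it.
    """
    frame = np.zeros((size, size))
    yy, xx = np.mgrid[:size, :size]
    circle = ((xx - size // 2) ** 2 + (yy - size // 2) ** 2) < (size // 16) ** 2
    for _ in range(steps):
        frame = gaussian_filter(frame, sigma=1.5) * decay  # blur + fade feedback
        frame[circle] = 1.0                                # re-stamp the source
    return frame
```

Each pass diffuses the previous frame a little further, which is what produces the organic, continuously growing look.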
I felt this effect would fit well with the quieter/slower parts of the song, such as the intro, bridges, and verses.
I duplicated this feedback effect and replaced the noise with an image of a pentagram. This created a self-drawing image effect, which I thought would work well for the pre-choruses.
For the choruses I wanted some energy; this is where I used the song's waveform to control the visuals.
I reused the expanding and contracting circle I made in 'test 1' and modified it to work with this song by swapping the audio file and adjusting the EQ to detect only this song's kick drum.
I added a background to the beating circle using a video I found online. I adjusted the video's colours using a Ramp TOP and placed it behind the circle with a Composite TOP.
I then decided I wanted the background video to have more life, so I automated its play speed based on the beats of the song. I did this via a Beat CHOP.
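The idea behind beat-driven play speed can be sketched as a function of time: the speed spikes on each beat and eases back down before the next one, similar to feeding a beat-synced ramp into a video's speed parameter. All numbers here (tempo, base speed, boost) are hypothetical:

```python
def playback_speed(t, bpm=120.0, base=1.0, boost=0.5):
    """Sketch of pulsing a video's play speed with the beat.

    t is time in seconds. The speed jumps to base + boost exactly on each
    beat and ramps linearly back down to base just before the next beat.
    bpm/base/boost are illustrative values, not the project's settings.
    """
    beat_len = 60.0 / bpm
    phase = (t % beat_len) / beat_len  # 0 on the beat, approaching 1 before the next
    return base + boost * (1.0 - phase)
```

Driving playback this way makes the background video visibly "breathe" with the percussion instead of playing at a flat rate.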
At the centre of the song I chose is an homage to the movie American Psycho, in the form of the band recreating one of the movie's most iconic murder scenes. I wanted to include this in my project. On the speakers, I wanted blood splatters timed to the stabbing part of the scene. I created this in DaVinci Resolve using an existing video I found online of various blood splatters on a green screen. After keying out the green, I layered the splatters over one another and fitted them to the correct timings relative to the audio.
I had the idea of recreating this murder scene as a silhouette placed between the two speakers I was projecting onto. I felt a silhouette would fit the horror-movie theme of the song quite well.
To create this scene, I filmed myself as the killer and the victim separately against a green screen.
This process proved more challenging than I thought it would be. As is evident from the above photo, my green screen was not big enough to cover my whole body, and I was set on having the entire body included in the silhouette. To fix this, I ended up rotoscoping my legs manually. This was a tedious process, but one that needed to be done.
Fully keyed out, I placed my two shots side by side and created the silhouette effect.
The keying was not perfect, but it was good enough for this project, as fine detail was not a top requirement. If I had wanted a better key, I would have spent more time rotoscoping, or I would have looked into AI background-removal tools such as Gen1.
Finally, I added speed ramps between each stab to make it more impactful and give it a more animated, over-the-top look. I felt this added more energy to the performance, more in line with the intensity of the music behind it.
With the silhouette video completed, all that was left to do was export it and add it to my TouchDesigner project.
Putting it all together
I ran all of my different visual elements into a single Switch TOP. Using an Animation CHOP, I was able to control the timing of each visual element. This way I could easily place the visual elements in time with the correct song parts, referencing my pre-planning table as I went.
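Conceptually, this switch-plus-timeline setup is a cue sheet: a sorted list of (start time, visual) pairs, where the active visual is the last cue whose start time has passed. A small sketch of that lookup (the cue names and timings below are hypothetical, not the song's real structure):

```python
def pick_visual(t, cues):
    """Return the visual active at time t (seconds).

    cues: list of (start_seconds, visual_name) sorted by start time,
    like the rows of the pre-planning table. The last cue whose start
    time is <= t wins, mirroring a switch driven by a timeline.
    """
    active = cues[0][1]
    for start, name in cues:
        if t >= start:
            active = name
        else:
            break
    return active

# hypothetical cue sheet in the spirit of the pre-planning table
cues = [(0, "feedback_growth"), (45, "pentagram"), (78, "beating_circle")]
```

For example, `pick_visual(50, cues)` would land in the pre-chorus slot and return `"pentagram"`.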
At this point I was almost ready to map my visuals to my speakers and complete the project; however, I had a serious problem with lag. There were quite a lot of elements cooking, and for some unknown reason, my switch was not switching off cooking for inputs that were not currently being displayed. This meant everything was cooking at once, regardless of what appeared in the final output.
I had to come up with a workaround. I decided to render the project out as a video file, then reimport that video back into the project for mapping and performance.
This is the final sequence I would output to the projector:
One video file for the right speaker, a Flip TOP for the left speaker, and the murder-scene video for the middle screen.
The Stage
The speakers I had for projecting onto were black. This was an issue, as projections are not as vibrant against this colour. To solve it, I used masking tape and paper to make the speakers white.
I placed a piece of white foam board in the centre, between the two speakers; this is where I wanted to project the murder scene.
This is how I used Kantan Mapper to map each video to the board and speakers. Due to the simple geometry of the objects that made up my stage, this was a fairly straightforward process.
Photos of the projected setup
Here is the finished result:
'Hip to be Scared' Projected Music Visualisation
Link to full quality video and project files here: https://drive.google.com/drive/folders/1z_Kq6FqWUdFPFjtfD-NwJJDRTGc_3kNX?usp=sharing
Review
I knew going into this project that I would have a difficult time. As previously stated, node-based workflows were fairly new to me, and the vast number of things TouchDesigner can do was a little intimidating. However, through online research via tutorial videos and forums, and a lot of just playing around with the tools until I could get them to work, I was able to create something I am very happy with. This project turned out almost exactly as I pictured it during planning.
If I were to work on this project further with the knowledge I have now, I would focus on making more audio-reactive visuals and more narrative elements to fit the story being told throughout the song. I feel these would add more variety to the experience while making the narrative stronger. They would also give me more to put on the centre panel of the stage, which is left blank for the majority of the experience.
Final thoughts
The ability to generate interesting abstract visuals is something I have never known how to do and have always been fascinated by. Through this project, I feel I have gained a base understanding of how some of these abstract patterns are created. Furthermore, the ability to use various types of data to control visual elements is especially interesting due to its limitless potential. I am sure the skills I have learned from this project will be utilised later in my career as a content creator/videographer.