Guest Post

 

A detailed post about the preset-building approach for my thesis installation

TouchDesigner | Case Study | Custom Parameters and Cues

by Matthew Ragan

I recently had the good fortune of collaborating with my partner, Zoe Sandoval, on their MFA thesis project at UC Santa Cruz – { 𝚛𝚎𝚖𝚗𝚊𝚗𝚝𝚜 } 𝚘𝚏 𝚊 { 𝚛𝚒𝚝𝚞𝚊𝚕 }. Thesis work is strange, and even the best of us who have managed countless projects will struggle to find balance when our own work is on the line – there is always the temptation to add more, do more, extend the piece a little further, or add another facet for the curious investigator. Zoe had an enormous lift in front of them, and I wanted to help streamline some of the pieces that already had functioning approaches but would have benefited from additional attention and optimization. Specifically, how cues and states operated was an especially important area of focus. I worked closely with the artist to capture their needs around building cues / states, and to translate those needs into a straightforward approach with room to grow as we made discoveries and iterated during the last weeks leading up to opening.


The Big Picture

{ remnants } of a { ritual } is an immersive installation composed of projection, lighting, sound, and tangible media. Built largely with TouchDesigner, the installation required a coordinated approach for holistically transforming the space with discrete looks. The projection system included four channels of video (two walls and a blended floor image); lighting involved one overhead DMX-controlled instrument (driven by an ENTTEC USB Pro) and four IoT Philips Hue lights (driven by network commands – you can find a reusable approach on GitHub); sound comprised two channels driven by another machine running QLab, which required network commands sent as OSC. The state of each of these endpoints, the duration of the transition, and the duration of the cue all needed to be both recorded and recalled to create a seamless environmental experience.
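
To give a sense of how small each of those handoffs can be, here is a minimal sketch of telling QLab to start a sound cue over OSC from TouchDesigner. The OSC Out DAT name and the cue number are assumptions for illustration, not the names used in the installation:

```python
# a minimal sketch of firing a QLab sound cue from TouchDesigner over OSC.
# assumes an OSC Out DAT named 'oscout_qlab' whose network address and port
# parameters already point at the machine running QLab; the DAT name and
# cue number here are hypothetical.

def start_sound_cue(cue_number):
    '''Ask QLab to start a numbered cue via its /cue/{number}/start address.'''
    osc_out = op('oscout_qlab')
    address = '/cue/{}/start'.format(cue_number)
    # send the message with no arguments; QLab only needs the address
    osc_out.sendOSC(address, [])

# example - start sound cue 12 as part of a state change
start_sound_cue(12)
```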

Below we’re going to step through some of the larger considerations that led to the solution that was finally used for this installation. Before we get there, though, it’s helpful to have a larger picture of what we actually needed to make. Here’s a quick run-down of some of the big ideas:

  • a way to convert a set of parameters to a python dictionary – using custom parameters rather than building a custom UI is a fast way to create a standardized set of controls without the complexity of lots of UI building in Touch (see the first sketch after this list).

  • a reusable way to use storage in TouchDesigner to have a local copy of the parameters for fast reference – typically in these situations we want fast access to our stored data, and that largely looks like python storage; more than just dumping pieces into storage, we want to make sure that we’re thoughtfully managing a data structure that has a considered and generalized approach.

  • a way to write those collections of parameters to file – JSON in this case. This ensures that our preset data doesn’t live in our toe file and is more easily transportable or editable without having TouchDesigner open. Saving cues to file means that we don’t have to save the project when we make changes, and it also means that we have a portable version of our states / cues. This has lots of valuable applications, and is generally something you end up wanting in a wide range of situations (see the JSON sketch after this list).

  • a way to read those JSON files and put their values back into storage – it’s one thing to write these values to file, but it’s equally important to have a way to get the contents of our file back into storage.

  • a way to set the parameters on a COMP with the data structure we’ve been moving around – it’s great that we’ve captured all of these parameters, but the real need is thinking through how to apply that data once you have it captured (see the last sketch after this list).
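
As a rough illustration of the first two ideas, the sketch below pulls a COMP’s custom parameters into a plain python dictionary and keeps that dictionary in storage. The operator paths and the 'cues' storage key are hypothetical, stand-ins for whatever naming your project uses:

```python
# a sketch of capturing a COMP's custom parameters as a python dictionary
# and keeping a copy in TouchDesigner storage for fast access.
# operator paths and the 'cues' storage key are placeholder names.

def pars_to_dict(target_comp):
    '''Return a {parameter name: value} dictionary of a COMP's custom parameters.'''
    return {par.name: par.eval() for par in target_comp.customPars}

def store_cue(cue_name, cue_data, storage_op):
    '''Keep all cues in a single dictionary in storage, keyed by cue name.'''
    cues = storage_op.fetch('cues', {})
    cues[cue_name] = cue_data
    storage_op.store('cues', cues)

# example - capture the control COMP's current parameters as a named cue
controls = op('base_controls')
store_cue('look_01', pars_to_dict(controls), op('base_cue_store'))
```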
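
The round trip to disk can stay just as small. Here is a hedged sketch of writing the stored cues out as JSON and reading them back in, again with a placeholder file path and storage key:

```python
# a sketch of the write-to-file / read-from-file half of the approach.
# the file path and the 'cues' storage key are placeholders.
import json

CUE_FILE = 'cues.json'

def save_cues_to_file(storage_op, file_path=CUE_FILE):
    '''Write the cue dictionary held in storage out to a JSON file.'''
    cues = storage_op.fetch('cues', {})
    with open(file_path, 'w') as cue_file:
        json.dump(cues, cue_file, indent=4)

def load_cues_from_file(storage_op, file_path=CUE_FILE):
    '''Read a JSON file of cues and put its contents back into storage.'''
    with open(file_path) as cue_file:
        cues = json.load(cue_file)
    storage_op.store('cues', cues)
    return cues

# example - save when cues change, load again at start-up
save_cues_to_file(op('base_cue_store'))
load_cues_from_file(op('base_cue_store'))
```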
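
Finally, applying a captured cue back to a COMP might look something like the sketch below. The operator names are again placeholders; the important part is matching dictionary keys back to parameter names:

```python
# a sketch of pushing a stored cue back onto a COMP's custom parameters.
# operator names and the 'cues' storage key are placeholders.

def apply_cue(cue_name, target_comp, storage_op):
    '''Look up a cue by name and set any matching custom parameters on a COMP.'''
    cues = storage_op.fetch('cues', {})
    cue_data = cues.get(cue_name)

    if cue_data is None:
        print('no cue named', cue_name)
        return

    for par_name, value in cue_data.items():
        # pars() returns a list of matching parameters, empty if there's no match
        matching = target_comp.pars(par_name)
        if matching:
            matching[0].val = value

# example - recall a previously captured look on the control COMP
apply_cue('look_01', op('base_controls'), op('base_cue_store'))
```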

Cuing needs

One of the most challenging and most important steps in the process of considering a cuing system is to identify the granularity and scope of your intended control. To this end, I worked closely with the artist to understand both their design intentions and their needed degrees of control. For example, the composition of the projection meant that the blended floor projection was treated as a single input image source; similarly, the walls were a single image that spanned multiple projectors. In these cases, rather than managing all four projectors individually, it was sufficient to think in terms of the whole compositions that were being pulled. In addition to the images, it was important to the artist to be able to control the opacity of the images (in the case of creating a fade-in / out) as well as some image adjustments (black level, brightness, contrast, HSV offset). Lighting and sound had their own sets of controls – and we also needed to capture a name for the cue that was easily identifiable.
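
To make that granularity concrete, a single captured cue might end up looking something like the dictionary below. The keys and values here are purely illustrative, not the exact schema used in the installation:

```python
# an illustrative example, not the installation's actual schema, of what a
# single cue might hold once the control handles above are captured.
example_cue = {
    'Cuename': 'ritual_look_01',
    'Transitiontime': 5.0,      # seconds to fade into this state
    'Cueduration': 120.0,       # seconds to hold before the next cue

    # projection - whole compositions rather than individual projectors
    'Wallsource': 'walls_comp_01',
    'Floorsource': 'floor_comp_01',
    'Imageopacity': 1.0,
    'Blacklevel': 0.0,
    'Brightness': 1.0,
    'Contrast': 1.0,
    'Hsvoffset': 0.0,

    # lighting - the DMX instrument and the Hue fixtures
    'Dmxlevel': 0.6,
    'Huecolor': [0.8, 0.4, 0.2],

    # sound - which QLab cue to trigger over OSC
    'Qlabcue': 12,
}
```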

As lovely as it would be to suggest that we knew all of these control handles ahead of time, the truth is that we discovered which controls were necessary through a series of iterative passes – each time adding or removing controls that were either necessary or too granular. Another temptation in these instances is to imagine that you’ll be able to figure out your cuing / control needs on your feet – while that may be the case in some situations, it’s tremendously valuable to instead do a bit of planning about what you’ll want to control or adjust. You certainly can’t predict everything, and it’s a fool’s errand to imagine that you’re going to use a waterfall model for these kinds of projects. A more reasonable approach is to make a plan, test your ideas, make adjustments, test, rinse, repeat. An agile approach emphasizes smaller incremental changes that accumulate over time – this requires a little more patience, and a willingness to refactor more frequently, but has huge advantages when wrestling with complex ideas.


(ノ◕ヮ◕)ノ*:・゚✧