The Interactive & Immersive HQ

Advanced API Management in TouchDesigner

Do you want to get the most out of APIs to boost your creativity? Let's go through some tips for effective API management in TouchDesigner, with a cosmic twist.

Before we jump in, some basic information is needed. Starting with: what are APIs?     

API stands for Application Programming Interface and is a set of rules and protocols that enables applications to communicate with each other. Basically, APIs are the building blocks of the digital world.

APIs are a great tool for creativity as well. We can get most of them into our beloved TouchDesigner environment, so let’s start from scratch with a blank patch. Let’s go!

API management in TouchDesigner

There are several ways to use APIs in TouchDesigner. The simplest is the Web Client DAT, which allows you to send information to and receive information from another application. We can also manage APIs with Python scripts via the Script DAT, CHOP Execute DAT, or Text DAT, to name a few.

To make things real, we will develop an audiovisual generative space-themed patch.

Ingredients:

  • NASA APOD API: APOD stands for Astronomy Picture of the Day; it retrieves pictures taken from space
  • NASA NeoWs API: NeoWs stands for Near Earth Object Web Service; it retrieves information about near-Earth asteroids
  • The StreamDiffusion custom tox, which will generate images based on the descriptions contained in the NASA APOD data
  • Finally, why not add a soundtrack to our cosmic installation? For this, we will use the Stability AI Text-to-Audio API

NASA APOD API

First of all, we visit the NASA Open APIs website and generate an API key. Then we can move to the Browse APIs section, open the APOD tab, and get the API URL. There are several query parameters we can add; for our project we will include just the start and end dates. The API URL looks like this:

https://api.nasa.gov/planetary/apod?api_key=INSERTYOURAPIKEY&start_date=2025-01-01&end_date=2025-03-28
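Outside of TouchDesigner, the same request URL can be assembled programmatically, which makes it easy to change the date range from a script. A minimal sketch in plain Python (build_apod_url is a hypothetical helper; DEMO_KEY is NASA's public test key):

```python
from urllib.parse import urlencode

def build_apod_url(api_key, start_date, end_date):
    """Assemble the APOD request URL from its query parameters."""
    base = "https://api.nasa.gov/planetary/apod"
    params = urlencode({
        "api_key": api_key,
        "start_date": start_date,
        "end_date": end_date,
    })
    return f"{base}?{params}"

url = build_apod_url("DEMO_KEY", "2025-01-01", "2025-03-28")
print(url)
```

The resulting string can be pasted straight into the Web Client DAT's URL parameter.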

Now let’s go into TouchDesigner. Create a new project, create a Web Client DAT, change the Request Method to GET and insert the API URL in the parameter box. If we click on the Request button, the API retrieves the data. It works!

To parse the data and turn it into a JSON-readable format, create a Text DAT and add this line of code inside the onResponse function of the Web Client DAT's callbacks:

op('text1').text = data.decode()

The data will then be stored in the Text DAT. Next, we connect it to a JSON DAT, and to further JSON DATs, to format the information properly.
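To see what the Text DAT and JSON DAT chain is doing, here is the same parsing sketched in plain Python against a mock response. The field names url and explanation match the APOD schema; the sample data itself is made up:

```python
import json

# Mock APOD response: the real payload is a JSON array with one object per day.
raw = b'[{"date": "2025-01-01", "title": "Sample", "url": "https://example.com/img.jpg", "explanation": "A sample picture."}]'

# Equivalent of op('text1').text = data.decode() followed by the JSON DATs:
entries = json.loads(raw.decode())
links = [e["url"] for e in entries]          # image links for the Movie File In TOP
descriptions = [e["explanation"] for e in entries]  # prompt text for StreamDiffusion

print(links[0])
print(descriptions[0])
```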

Now we have the image links and descriptions. How do we use them?

  • We create a Select DAT to loop through all the images – via an LFO+Count CHOP – then we create a Movie File In TOP to display them
  • We create another Select DAT to loop through all the image descriptions – via the same LFO+Count CHOP – which we will later use to write the prompt in the StreamDiffusion tox
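The looping logic above boils down to a counter wrapped with a modulo, so the index always stays inside the list of rows. A plain-Python sketch of that index math (num_images and the operator name in the comment are illustrative):

```python
# Sketch of the LFO+Count loop: the Count CHOP increments on each LFO pulse,
# and a modulo keeps the resulting index inside the list of images.
num_images = 87  # illustrative: number of rows returned by the APOD query

def image_index(count_value, total):
    """Map a raw counter value to a valid row index, wrapping around."""
    return int(count_value) % total

# In TouchDesigner this could be a parameter expression such as
# int(op('count1')['count']) % 87   (operator name is hypothetical)
print(image_index(90, num_images))  # → 3
```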

Et voilà, the first step is complete!

Visual programming interface with multiple interconnected nodes, chains of boxes, and flow lines on a grid. A properties panel is open on the right side.

NASA NeoWs API

In the NASA Open APIs website, we can find the NeoWs API URL to be used to retrieve information about near Earth asteroids. The API URL looks like this:

https://api.nasa.gov/neo/rest/v1/feed?start_date=START_DATE&end_date=END_DATE&api_key=API_KEY

Insert your API key and select the start and end date.

We can apply the same flow we adopted for the image parsing. Create a Web Client DAT, change the Request Method to GET, insert the API URL (with your key) in the URL parameter, and click Request.

This API returns a lot of information about each asteroid. To keep things simple, we use three fields:

  • The asteroid's name
  • Its speed in kilometers per second
  • Whether it is potentially hazardous to Earth (true/false)

We use this data to add a data-visualization layer to our project. To do so, loop through the JSON file – via an LFO+Count CHOP at high speed – and parse the data into three Text TOPs. Add titles, adjust font, size and position, and composite the Text TOPs with the Movie File In TOP.
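As a plain-Python sketch of extracting those three fields: the mock payload below is trimmed and invented, but it mirrors the nesting of the real NeoWs feed response, where objects sit under near_earth_objects keyed by date:

```python
import json

# Mock NeoWs "feed" response, trimmed to the fields we actually use.
raw = json.dumps({
    "near_earth_objects": {
        "2025-01-01": [{
            "name": "(2025 AB)",
            "is_potentially_hazardous_asteroid": False,
            "close_approach_data": [{
                "relative_velocity": {"kilometers_per_second": "14.73"}
            }]
        }]
    }
})

feed = json.loads(raw)
asteroids = []
for date, objects in feed["near_earth_objects"].items():
    for obj in objects:
        asteroids.append({
            "name": obj["name"],
            "speed_km_s": float(
                obj["close_approach_data"][0]["relative_velocity"]["kilometers_per_second"]
            ),
            "hazardous": obj["is_potentially_hazardous_asteroid"],
        })

print(asteroids[0])
```

Each dictionary in the resulting list maps onto one set of values for the three Text TOPs.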

Et voilà, the Enterprise space station is taking off!


StreamDiffusion

Since AI is our new best friend, we want to integrate it inside our project. How? With StreamDiffusion.

StreamDiffusion is a pipeline-level solution for real-time interactive generation, conceived for producing visual assets in real time. It is a high-end tool with complex controls and extensive modularity. To get acquainted with StreamDiffusion, I suggest you read the dedicated tutorial article.

Integrating StreamDiffusion from scratch in TouchDesigner can be tricky and requires a lot of time and coding skills. That is why in this patch we’ll use the StreamDiffusionTD tox developed by DotSimulate. It encapsulates all the StreamDiffusion features in a single tox and it is available on the DotSimulate Patreon channel for a small fee.

After downloading the tox, we put it inside our patch, go to the Install tab, and follow the Installation Steps. It will take a few minutes.

After the installation, go to the Settings 1 tab and press Start Stream. Then we proceed as follows:

  • Go to the Settings 2 tab and select txt2img in the Sd mode box
  • Go back to the Settings 1 tab and create an expression connecting the Prompt parameter to the description retrieved from the NASA APOD API
  • Connect the StreamDiffusion tox to the Composite TOP

Feel free to experiment with all the parameters. In my case, I added some movement by driving the Denoise Schedule and Seed parameters with an LFO.
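As a sketch of that idea, an LFO value in the -1..1 range can be remapped to an integer seed range. The function names and the parameter expression in the comment are hypothetical, shown only to illustrate the remapping:

```python
import math

def lfo_value(t, frequency=0.1):
    """A sine LFO in the -1..1 range, like an LFO CHOP channel."""
    return math.sin(2 * math.pi * frequency * t)

def seed_from_lfo(v, lo=0, hi=1000):
    """Remap an LFO value from -1..1 to an integer in lo..hi."""
    return int((v + 1) / 2 * (hi - lo) + lo)

# In TouchDesigner this could be a parameter expression such as
# int((op('lfo1')['chan1'] + 1) / 2 * 1000)   (operator name is hypothetical)
print(seed_from_lfo(lfo_value(0.0)))  # → 500, mid-range at t=0
```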

Et voilà, the overall patch is here!

I think that mixing the wonderful NASA images with the AI-generated ones creates a nice visual landscape. Sometimes the descriptions retrieved from the images and sent to the AI as prompts are quite mysterious, so the generated images can be weird, but fun.

Adding sound

Let’s add the soundtrack. To do this, we can take advantage of the Stability AI Text-to-Audio API, which lets us create up to three minutes of music (at 44.1 kHz) from a text prompt. It is important to underline that the system is still in beta, so results may vary.

For this API we will use a Python script. First of all, we need to get an API key from the Stability AI Developer Platform. Then we can copy the basic script shown below and adapt it to our needs:

import requests

# Request a generated track from the Stable Audio 2 text-to-audio endpoint.
response = requests.post(
    "https://api.stability.ai/v2beta/audio/stable-audio-2/text-to-audio",
    headers={
        "authorization": "Bearer sk-MYAPIKEY",  # your Stability AI API key
        "accept": "audio/*",                    # ask for raw audio in the response
    },
    files={"none": ""},  # sends the request as multipart/form-data
    data={
        "prompt": "INSERT THE PROMPT HERE",
        "output_format": "mp3",
        "duration": 20,  # track length in seconds
        "steps": 30,     # diffusion steps
    },
)

# Save the track next to the project file, or raise with the API's error message.
if response.status_code == 200:
    with open("./output.mp3", "wb") as file:
        file.write(response.content)
else:
    raise Exception(str(response.json()))

The script creates a track and downloads it into our project folder. To use it inside the patch, create an Audio File In CHOP, select the correct file path in the File parameter, and connect it to an Audio Device Out CHOP. To make things easier, create a Text TOP to write the prompt and refer to it in the Python code.
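One way to wire the prompt in is to separate building the payload from sending it, so the prompt can come from an operator in the patch. A hedged sketch under those assumptions (build_audio_request and the operator name are hypothetical; nothing here performs a network call):

```python
def build_audio_request(prompt, duration=20, steps=30, output_format="mp3"):
    """Assemble the endpoint URL and form data for a text-to-audio request."""
    return {
        "url": "https://api.stability.ai/v2beta/audio/stable-audio-2/text-to-audio",
        "data": {
            "prompt": prompt,
            "output_format": output_format,
            "duration": duration,
            "steps": steps,
        },
    }

# In TouchDesigner the prompt could be read from the patch, e.g.
# prompt = op('text_prompt').par.text   (hypothetical operator name)
req = build_audio_request("slow ambient drones, deep space")
print(req["data"]["prompt"])
```

The returned dictionary plugs straight into the requests.post call from the script above.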

Et voilà, we have our music as well!

Wrap Up

API management in TouchDesigner paves the way for advanced integration of external sources into our projects. From data visualization to advanced AI asset generation – mastering APIs is a great way to enhance our creativity and imagination. As usual, the sky is the limit.

Download the patch here