
To me, the most beautiful aspect of generative design is the infinite wealth of unique details that can be discovered in each and every system and its iterations.

Sadly, my previous work with Processing and its underlying Java applet structure left this aspect of interactive discovery and exploration untapped, since modern browsers simply no longer allow that kind of technology to be executed (and for very good reasons, I might add).

After developing over a dozen interactive, audio-reactive generative design systems for my AV performances with my good friend INJUVIK, I felt the need to move those ideas beyond the projector and let everyone explore those generative outputs for themselves.

What you see here is my first approach to bringing those interactive systems to the web. But before greatness can arise, there are technologies to be conquered (:

This experiment thus focuses on a few primary areas of interest:

  • (Pre-)loading and playing back sound in the browser
  • Analyzing both the amplitude and the (smoothed) spectrum in real time
  • Using the analyzed audio data to create and animate objects and particle systems in 3D space
  • Creating a level of interactivity for both mouse and touch input (this area certainly needs a bit more work)
  • Creating global controls for manipulating (and layering) the audio loops
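The smoothing step in the second point can be sketched in plain JavaScript. This is only an illustration of the idea, not the experiment's actual code: the function names and the 0.8 smoothing factor are my own choices, and the 0–255 bin range mirrors what p5.FFT's byte spectrum provides.

```javascript
// Exponential smoothing of an FFT spectrum frame — the same idea
// p5.sound exposes via p5.FFT's smoothing parameter.
// `smoothSpectrum` and the 0.8 factor are illustrative, not a library API.
function smoothSpectrum(prev, current, factor = 0.8) {
  // Each bin drifts toward the fresh value; a higher factor means slower, calmer motion.
  return current.map((bin, i) => prev[i] * factor + bin * (1 - factor));
}

// Map a smoothed bin (0–255, as returned by p5.FFT.analyze())
// to a particle scale in 3D space; the output range is an assumption.
function binToScale(value, min = 0.5, max = 3.0) {
  return min + (value / 255) * (max - min);
}

// Example frame: previous smoothed state plus a fresh analyser frame.
let state = [0, 0, 0, 0];
const frame = [255, 128, 64, 0];
state = smoothSpectrum(state, frame);
const scales = state.map(v => binToScale(v));
```

Running this once per draw call keeps the particle sizes from jittering on every transient while still following the overall energy of the loop.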

The underlying framework I initially chose for this WebGL experiment (owing to my prior work with Processing) is p5.js. While it has many great aspects, I feel I need to move more toward the realms of three.js for an even more defined framework. To be continued…

Audio-loops by INJUVIK