AURAL AUTOMATA 7 | CLOUDS
This is an attempt to model clouds in HTML5 canvas using JavaScript.
There are many ways to simulate cloud movement in generative/procedural art. The most common method is to take pre-rendered images of clouds, adjust their alpha levels (transparency), place them at random locations on the canvas, and control their movement via JavaScript. The advantage of this approach is that it is computationally inexpensive. The disadvantage is that the resulting clouds rarely invite you to read patterns into them; they lack the generative qualities that make this kind of art unpredictable.
The approach used here is to generate quasi-random noise in a three-dimensional space and use it as the basis for simulating cloud patterns. Specifically, this experiment uses an algorithm for 3D Perlin noise. The advantage of this way of creating noise is that it generates quasi-random patterns that retain local similarity while also exhibiting global variation. In this demo, each frame represents gradual movement through a 3D Perlin landscape, rendering one 2D slice at a time. In addition, the 2D slices taken from the 3D space drift to the left over time, creating the illusion that the clouds are moving across the sky.
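To make the slicing idea concrete, here is a compact 3D Perlin-noise sketch (modeled on Ken Perlin's widely published "improved noise" reference, not on the code this demo actually uses) along with a hypothetical `cloudValue` helper showing how each frame could sample one drifting 2D slice. The `scale`, `zSpeed`, and `drift` constants are illustrative guesses:

```javascript
// Build a doubled permutation table from a shuffled 0..255 array.
const perm = new Uint8Array(512);
(() => {
  const p = Array.from({ length: 256 }, (_, i) => i);
  for (let i = 255; i > 0; i--) {                    // Fisher-Yates shuffle
    const j = Math.floor(Math.random() * (i + 1));
    [p[i], p[j]] = [p[j], p[i]];
  }
  for (let i = 0; i < 512; i++) perm[i] = p[i & 255];
})();

const fade = t => t * t * t * (t * (t * 6 - 15) + 10); // smoothstep-like easing
const lerp = (t, a, b) => a + t * (b - a);
function grad(hash, x, y, z) {
  // Pick one of 16 gradient directions from the low bits of the hash.
  const h = hash & 15;
  const u = h < 8 ? x : y;
  const v = h < 4 ? y : (h === 12 || h === 14 ? x : z);
  return ((h & 1) === 0 ? u : -u) + ((h & 2) === 0 ? v : -v);
}

// Classic 3D Perlin noise: interpolate gradient contributions from the
// eight corners of the unit cube containing (x, y, z).
function noise3(x, y, z) {
  const X = Math.floor(x) & 255, Y = Math.floor(y) & 255, Z = Math.floor(z) & 255;
  x -= Math.floor(x); y -= Math.floor(y); z -= Math.floor(z);
  const u = fade(x), v = fade(y), w = fade(z);
  const A = perm[X] + Y, AA = perm[A] + Z, AB = perm[A + 1] + Z;
  const B = perm[X + 1] + Y, BA = perm[B] + Z, BB = perm[B + 1] + Z;
  return lerp(w,
    lerp(v, lerp(u, grad(perm[AA], x, y, z),     grad(perm[BA], x - 1, y, z)),
            lerp(u, grad(perm[AB], x, y - 1, z), grad(perm[BB], x - 1, y - 1, z))),
    lerp(v, lerp(u, grad(perm[AA + 1], x, y, z - 1),     grad(perm[BA + 1], x - 1, y, z - 1)),
            lerp(u, grad(perm[AB + 1], x, y - 1, z - 1), grad(perm[BB + 1], x - 1, y - 1, z - 1))));
}

// Each animation frame samples one 2D slice of the 3D field: z advances
// slowly (the clouds evolve) while an x-offset drifts (the clouds move).
function cloudValue(col, row, frame) {
  const scale = 0.03, zSpeed = 0.01, drift = 0.05;   // hypothetical tuning values
  return noise3((col + frame * drift) * scale, row * scale, frame * zSpeed);
}
```

The per-pixel value in [-1, 1] can then be mapped to a cloud density or alpha level before painting the canvas.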
The Perlin algorithms used here were coded by Kas Thomas. His blog contains an excellent description of how these algorithms work and how they can be used.
Because this experiment requires calling the algorithm many times per second, it has the potential to run slowly, depending on the age of your machine.
The volume of each sound source is controlled by the proximity of that source to the drifting white dot on the canvas. The dot isn't a meaningful part of the visual scene, but I chose to make it visible rather than hide it because it makes it easier to play with the sound generation process (e.g., by moving portals closer to or farther from the dot). The small black dot controls some of the audio filters and the direction of cloud movement. I considered removing the cloud direction controls, but, again, they are fun to play with. The direction of the clouds depends on whether the small black dot is moving toward or away from the central-most portal.
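A minimal sketch of how proximity can drive per-source volume; the linear falloff and the `maxDist` value are assumptions, not the demo's actual curve:

```javascript
// Map the distance between a sound source and the white dot to a gain
// in [0, 1]: full volume at zero distance, silent beyond maxDist.
function distanceToGain(dx, dy, maxDist) {
  const d = Math.hypot(dx, dy);
  return Math.max(0, 1 - d / maxDist);   // linear falloff, clamped at 0
}

// In the animation loop, each portal's (hypothetical) GainNode could be
// updated like this in a browser:
// portal.gainNode.gain.value =
//   distanceToGain(dot.x - portal.x, dot.y - portal.y, 400);
```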
As with some of my other sound experiments, this one contains six audio sources, each of which is placed in a random location on the canvas. In this particular experiment, the sound sources are represented as rectangles. I wanted to create the appearance of portals floating in the sky. The portals drift into the distance as the volume of their sound sources decreases. The alpha value of the portals also shifts so that, when a portal is farther away, the clouds appear to pass in front of it. The sound portals can be dragged to new locations. A few other miscellaneous tweaks allow the colors and patterns to evolve over time.
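One plausible way to couple loudness to apparent depth (an assumption about how the demo might work internally, not a transcription of it) is to derive both alpha and size from the same gain value, so quieter portals look farther away:

```javascript
// Derive a portal's drawing parameters from its current gain in [0, 1]:
// lower gain => more transparent and smaller, so clouds drawn on top
// appear to pass in front of a receding portal.
function portalAppearance(gain) {
  return {
    alpha: 0.2 + 0.8 * gain,   // never fully invisible
    scale: 0.5 + 0.5 * gain,   // shrink toward the horizon as volume drops
  };
}

// Canvas usage sketch (browser only):
// const { alpha, scale } = portalAppearance(gain);
// ctx.globalAlpha = alpha;
// ctx.fillRect(x, y, w * scale, h * scale);
```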
The sound sources are acoustic guitar samples processed in real time using the Web Audio API and JavaScript. I wanted to make them sound like acoustic drones rather than discrete sounds. The primary algorithm involves taking the sound buffer and passing it through a convolution filter that contains random values. The result is that the information in the original signal gets shifted and sustained in random ways. In other words, the sounds themselves have a cloud-like quality: they drift and shape-shift slowly over time.
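The "convolution filter that contains random values" idea can be sketched with the Web Audio API's `ConvolverNode`: fill an impulse response with decaying random noise and convolve the guitar samples with it. The impulse length and decay exponent below are illustrative guesses, not the demo's actual settings:

```javascript
// Build an impulse response of decaying random values. Convolving a
// signal with this smears it in time, turning plucks into drones.
function makeRandomImpulse(length, decay) {
  const ir = new Float32Array(length);
  for (let i = 0; i < length; i++) {
    // Random samples in [-1, 1], fading out so the smeared sound
    // dies away smoothly rather than cutting off.
    ir[i] = (Math.random() * 2 - 1) * Math.pow(1 - i / length, decay);
  }
  return ir;
}

// Browser-only wiring (requires an AudioContext and a source node):
// const ctx = new AudioContext();
// const buf = ctx.createBuffer(2, ctx.sampleRate * 2, ctx.sampleRate);
// for (let ch = 0; ch < 2; ch++)
//   buf.copyToChannel(makeRandomImpulse(buf.length, 3), ch);
// const convolver = ctx.createConvolver();
// convolver.buffer = buf;
// guitarSource.connect(convolver).connect(ctx.destination);
```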
R. Chris Fraley
2017 August 17
More audio/visual experiments