Horizon

9.12.15

Initial research

- Early inspiration came from the concept of exploring the space which we occupy.


MAPS AND JOURNEYS: INSTALLATION RESEARCH

'New York City Apartment Corridor/Ground Floor plus Staircase' - Do Ho Suh, 2000, Shown at Bristol Museum (2015)
When considering initial research for Maps and Journeys, the concept of exploring the space that we are actively present in fascinated me and instantly reminded me of this exhibit by Korean artist Do Ho Suh.
I love this work because it actively immerses the audience into a physical space that has almost become a shadow, or a thinly veiled representation of a space that exists thousands of miles away.

RACHEL WHITEREAD


Untitled (Stairs) - Rachel Whiteread, 2001
- Whilst researching further into Suh's work, I came across an English sculptor, draughtsman and printmaker named Rachel Whiteread, who makes sculptures of the spaces in, under and on everyday objects.

- I wanted to experiment with the perception of space in an immersive way: using light and space to immerse someone in an environment, gently making them aware of their own presence and of how much control and power they have over the room around them.




MORE DEVELOPMENTS


This led me on to the exploration of light.

After this I began researching light, its properties and its many types.

"Light is the connection between us and the universe"

Kurzgesagt


Photons can only be created or destroyed - not split

Other forms of light are invisible to the human eye (shown on the electromagnetic spectrum)


What is Light? - Kurzgesagt

This got me thinking about the necessity of light in daily life. I thought of light as an enabler and an essential part of human life on Earth, and I wanted to explore how light can affect human emotional response. This was the catalyst for my research.
All kinds of light travel at the same speed. They only differ in their wavelengths.
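That relationship can be sketched numerically: since every kind of light travels at the same speed c, wavelength follows directly from frequency as λ = c / f. A minimal Python illustration (the example frequencies are my own, chosen to stand for visible light and radio):

```python
# Speed of light in a vacuum (m/s) -- the same for every kind of light.
C = 299_792_458

def wavelength_m(frequency_hz: float) -> float:
    """Wavelength in metres for a given frequency: lambda = c / f."""
    return C / frequency_hz

# Green visible light (~545 THz) vs an FM radio wave (~100 MHz):
# both travel at c, but their wavelengths differ enormously.
print(wavelength_m(545e12))  # roughly 5.5e-7 m (about 550 nm)
print(wavelength_m(100e6))   # roughly 3 m
```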

During my research, I came across a sonic artist called Jerobeam Fenderson, who ran audio signals through the X and Y channels of an oscilloscope, which means you can see the wave of the sound you are hearing. This visual representation of raw data can be manipulated in software and visualised into surreal patterns and drawings.
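The X/Y technique behind this can be sketched in a few lines: one signal drives the horizontal axis, another drives the vertical, and each pair of simultaneous audio samples becomes a point on screen. A stdlib-only Python sketch, with illustrative frequencies of my own (not taken from Fenderson's work):

```python
import math

def xy_points(freq_x, freq_y, sample_rate=44100, n=1000):
    """Treat two sine tones as the X and Y channels of an oscilloscope:
    each pair of simultaneous samples is one plotted point."""
    pts = []
    for i in range(n):
        t = i / sample_rate
        x = math.sin(2 * math.pi * freq_x * t)  # horizontal channel
        y = math.sin(2 * math.pi * freq_y * t)  # vertical channel
        pts.append((x, y))
    return pts

# A simple 2:3 frequency ratio traces a closed Lissajous figure.
points = xy_points(200, 300)
```

Feeding these points to any plotting tool reproduces the kind of pattern an X/Y oscilloscope draws.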



My idea was starting to take shape
At this point I wanted to explore the relationship between light and sound, and how raw data can be transformed into video art. My intention was to represent a journey of light and sound, and their interaction with and effect on the immediate environment they exist within.
My idea was to use a digital (or physical) oscilloscope and translate physical, mathematical data into sound that forms patterns of light based on the inputted data.

I also wanted to create an interactive installation where I could program each key of a keyboard to play a sound that makes a different shape on an oscilloscope, and allow people to play around with it and make their own combinations of patterns and sounds.

Or create video art using mathematical input data to visually represent sound using light.
As I couldn't get my hands on an oscilloscope, I had to find a way to do it digitally. I decided I wanted to teach myself programming tools such as Pure Data or Mathematica to create different visuals.

There are also ways to visualise these soundwaves digitally. Using a program called FL Studio and a plugin called 'Wave Candy', one can input any audio (found or created) and output it in the format of an oscilloscope.

I found open-source software called Pure Data, a music generation program that uses different objects and expressions to generate and visualise different sounds.



Below are my first experiments generating sine and cosine waves using Pure Data (Pd).

After several hours of playing around and following tutorials, I managed to come up with this rudimentary audio-visual generator.

The sliders control the volume of the frequencies: you can slide them up and down to regulate how loud your sound is. The numbers represent each oscillator's frequency in Hz, i.e. 101 Hz and 158 Hz.

The 'phasor' object is basically our playback engine, and drives our oscilloscope. The 'dac' object at the bottom stands for 'digital to analogue converter' and is the sound output. The sin and cos values are the waveforms we are using to generate the sound; their levels are both controlled by the frequency object and the volume slider mentioned above. The object next to 'dac' is how we visualise our sine and cosine waves and get the neat pattern.
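The signal flow of the patch can be approximated in code. This is only an illustrative Python sketch of the same idea, not the Pd patch itself: two oscillators at 101 Hz and 158 Hz, each scaled by a 'slider' gain, summed into the output (the dac stage), while the sin/cos pair of one phase drives the X/Y display:

```python
import math

SAMPLE_RATE = 44100

def render(freq_a=101.0, freq_b=158.0, gain_a=0.5, gain_b=0.5, n=44100):
    """Mimic the patch's signal flow: two oscillators, volume sliders,
    summed output (the dac stage), plus sin/cos pairs for the X/Y display."""
    audio, scope = [], []
    for i in range(n):
        t = i / SAMPLE_RATE
        phase = 2 * math.pi * freq_a * t                 # what phasor ramps through
        audio.append(gain_a * math.sin(phase)
                     + gain_b * math.sin(2 * math.pi * freq_b * t))
        scope.append((math.sin(phase), math.cos(phase)))  # the neat pattern
    return audio, scope
```

Because sin² + cos² = 1, the scope trace of a single phase is a circle; combining different frequencies on each axis is what bends it into more surreal shapes.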


The values of the sine are also modifiable, as shown in the diagrams below.

Before

After
I learnt so much from this program and realised the potential of interactivity, so I intend to extend my knowledge of it in the future.


After this experiment, I wanted to create some kind of interactive art piece where two individuals could use sliders or a MIDI controller to modify the pitch, frequency and values of the sine waves, collaborating and experimenting to create their own unique patterns together using sound. I wanted to portray the unique relationship we share with light and sound, and for the individuals to interact with it at one of its most primal levels.


HOWEVER, I TOOK THIS ONE STEP FURTHER

I started thinking about other ways of visualising sound as light, and found a video by flight404 on Vimeo called Audio-Generated Landscapes.
Audio-Generated Landscapes
It is a music video which uses the volumes and peaks of the waveform to generate a landscape full of plateaus, plains, ridges and mountain ranges. 
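The underlying idea (not flight404's actual code, which was written in Processing) can be sketched simply: slice the waveform into windows, take each window's peak amplitude, and use those peaks as terrain heights. A hypothetical Python illustration:

```python
import math

def landscape_from_waveform(samples, window=50):
    """Turn a waveform into a line of terrain heights:
    each window's peak amplitude becomes one ridge height."""
    heights = []
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        heights.append(max(abs(s) for s in chunk))
    return heights

# A synthetic wave that is loud at first, then quiet:
# loud passages become mountains, quiet ones plains.
wave = [math.sin(i / 8) * (1 if i < 200 else 0.2) for i in range(400)]
terrain = landscape_from_waveform(wave)
```

Rendering each height as the elevation of a strip of ground gives exactly the plateau-and-ridge effect the video achieves, just in one dimension rather than a full 3D mesh.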

The video was produced in Processing, an open-source Java-based programming environment for visual artists and graphic designers. It got me intrigued in the nuances of coding, and I was fascinated by the way expressions and lines of code could create something so aesthetically diverse.

I started reading books about Processing, and even attempted to use it myself, but it is a very complex program and the code I created was nowhere near the standard that I expected of myself.

It got me thinking about the expression of landscapes and how to use them in conjunction with sound. One of my ideas was to go into a landscape, record the sound within it, and use that sound to generate a new landscape. I looked into the work of Dr Nigel Helyer, a sonic artist and professor of sonic art in Sydney, who creates sonic objects that change and are shaped by their environment.
I also looked at immersive, interactive environmental projects such as Refik Anadol's Infinity Room and one of his other projects, LADOT | City of Bits.

This left me wanting to create some kind of audio-reactive installation that was immersive as well as reflective. Furthermore, part of my research was influenced by one of our lectures about psychogeography and the emotional impact landscapes have on us. In the end I was planning to turn a sound into a new landscape.

THE COMBINATION OF IDEAS

After discussing ideas with Klara, we found our ideas to be very similar. Klara wanted to turn a landscape into sound, whilst I wanted to turn sounds into digital landscapes. We agreed that we would explore how environments make us feel and how they express themselves.

We thought the best solution for this was to make a twin screen. One screen with a mirrored landscape which offers a new perspective of the environment, and the second screen as the visualisation of the sound.

Screenshot of landscape

SOUNDTRACK & LANDSCAPES

Of course, the soundtrack is the most important component, as it is the catalyst of the piece (it generates the visualisation). Looking at other forms of landscape art, we were recommended a few music videos from the likes of Radiohead and The Chemical Brothers, where the music is perfectly synced with the passing landscape. For example, in The Chemical Brothers' music video for 'Star Guitar', elements of the environment, such as a street light or a house, are synced to a drum clap or a specific element of the musical composition, really expressing the relationship between environment and music.



We created an individual soundtrack for each landscape, with each landscape being completely unique from the others. We chose two cityscapes and two natural landscapes: two shot in Guildford, one in Ubley Warren Nature Reserve (North Somerset), and finally one in Cheddar Gorge. Using a combination of field recordings from each location and virtual instruments in Pro Tools, we attempted to map the volume of the soundtrack to the peaks of the landscapes and capture the atmosphere that we were immersed in, and that we were attempting to immerse the audience in.
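The volume-to-peaks mapping can be illustrated with a short sketch. This is a hypothetical Python stand-in for what we did by ear in Pro Tools: compute a running RMS loudness envelope over the soundtrack and normalise it to a 0-1 scale, so the loudest moment corresponds to the tallest landscape peak:

```python
import math

def loudness_envelope(samples, window=100):
    """RMS loudness per window, normalised to 0-1 so the loudest
    moment maps to the tallest landscape peak."""
    rms = []
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        rms.append(math.sqrt(sum(s * s for s in chunk) / len(chunk)))
    peak = max(rms) or 1.0  # guard against an all-silent track
    return [r / peak for r in rms]
```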

ProTools - virtual instruments

Syncing audio and video

This was a relatively simple process, and it was interesting playing with different and unique sounds, frequencies and volumes.

VISUALISATIONS - VJ PROGRAMMES

The beautiful thing about sound is its freedom to just be, and we wanted to capture this in our visualisations, and so we wanted them to just be.
We used many programs in our attempts to visualise the sound. We started with Pure Data (which was a failure), after which we researched different VJ programmes such as vvvv, Millumin, Processing, Narratives 2.0 and Resolume. Out of all of these, Resolume proved the most successful: we managed to get a good visualisation of a bit of looped footage and sound, as shown below.

Ubley Warren | Resolume Visuals
This showed a lot of promise; however, our trial expired and the full program costs around £300, so that one was ticked off the list.

Klara then contacted her friend, who is very familiar with VJing and VJ programs and has even developed his own software called ModV. He was very kind and introduced us to his program, even giving us a tutorial video to follow to show us the basics. We spent hours experimenting with different effects, and it is definitely something I want to pursue outside of university projects. However, these experiments seemed to be straying from the simplicity of the piece. Our tutor suggested that this would not be very wise, since the very nature of the project is to see what the landscapes and sound can produce, and we had strayed into the dangerous territory of trying to shoehorn in lots of different visuals. This would have been jarring and disorienting for the audience.

ModV

ModV

ModV

We also used free open-source visualisers to try and create some audio-reactive effects; however, these were too hard to control, as the visuals and colours were randomly generated.
WinAmp


In the end we agreed on using audio-reactive effects in After Effects called 'AudioWave' and 'Wave Warp' to create the visuals. This worked tremendously well and fit in wonderfully with the message we were trying to communicate.



We then created a twin-screen composition and looped each landscape three times, giving people in the gallery time to reflect upon what is happening and to observe.


We are using a Roland R-05 recorder, a looper, a projector and a pair of headphones to display our piece. The sound on the Roland will be synced manually with the video.



