Elder Hall, University of Adelaide
Adelaide/Kaurna
For ACMC 2017, I performed a live version of Bering(forboding), combining live vocals processed through Ableton Live, a fixed-media backing track, music visualisation created in Adobe After Effects, and video processing in Max/MSP.
From the conference proceedings:
Bering(forboding) is an audiovisual piece created from a combination of field recordings and data taken from field footage. It is an example of using multiple data elements in the composition process, and of using data elements to convey narrative through the creative interaction between composer and data.
The creative process behind this work involves using data structures to drive sonification and compositional processes, through collected digital media and computer-assisted composition. The work also explores how the visual and the aural can each be used to influence the other within the creative process. For Bering(forboding), the initial field footage and recordings were taken from a night spent at Beringbooding Rock, roughly 360 km north-east of Perth.
To create the audio track, values from a histogram showing the red, blue, green and luminance channels of a photograph were converted into sound frequencies, and these frequencies were combined with a series of layered, progressively time-stretched recordings (between 200% and 1600%) of a crow recorded in the same area. Three vocal tracks from the composer were also added to this composition, and the finished audio track was used to drive a musical visualisation in which audio frequencies served as keyframes for the animation movements.
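The histogram-to-frequency step can be sketched in code. The exact mapping used in Bering(forboding) is not documented, so the following is a hypothetical sketch: it assumes each non-empty bin of an 8-bit channel histogram is scaled linearly into an audible range (the bin size, frequency range, and scaling are all assumptions).

```python
def histogram(values, bins=256):
    """Count occurrences of each 8-bit value: a simple per-channel histogram."""
    counts = [0] * bins
    for v in values:
        counts[v] += 1
    return counts

def bins_to_frequencies(counts, f_min=110.0, f_max=1760.0):
    """Map each non-empty histogram bin index linearly onto [f_min, f_max] Hz.

    f_min/f_max are illustrative choices (A2 to A6), not values from the piece.
    """
    n = len(counts) - 1
    return [f_min + (i / n) * (f_max - f_min)
            for i, c in enumerate(counts) if c > 0]

# Example: a tiny synthetic red-channel sample of 8-bit pixel values.
red_channel = [0, 0, 64, 64, 64, 128, 255]
freqs = bins_to_frequencies(histogram(red_channel))
```

In this sketch, the four occupied bins (0, 64, 128, 255) become four sine-tone frequencies that could then be layered under the time-stretched crow recordings.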
With this practice, the visual and the aural are linked in two directions: digital information taken from digital documentation is sonified, and aural information (field recordings combined with compositions created through various methods of sonification) is used to re-create visual information through music visualisations.
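The audio-to-keyframe step of the visualisation described above can also be sketched, again hypothetically: this assumes one analysed frequency per video frame is normalised and mapped onto an animation parameter (the frame rate, frequency range, and parameter range are all illustrative assumptions, not values from the piece).

```python
def frequencies_to_keyframes(frame_freqs, fps=25,
                             f_min=110.0, f_max=1760.0,
                             p_min=0.0, p_max=100.0):
    """Map one analysed frequency per video frame to (time, value) keyframes.

    Each frequency is normalised into [0, 1] against an assumed analysis
    range, then scaled into an assumed animation-parameter range.
    """
    keys = []
    for frame, f in enumerate(frame_freqs):
        t = frame / fps
        norm = (f - f_min) / (f_max - f_min)
        keys.append((t, p_min + norm * (p_max - p_min)))
    return keys

# Example: three frames of dominant-frequency analysis.
keys = frequencies_to_keyframes([110.0, 935.0, 1760.0])
```

The resulting (time, value) pairs correspond to the kind of keyframe list an animation tool can interpolate between, so louder or higher-pitched moments in the audio drive larger movements on screen.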