
SWAN PANDA

JORGE RAMIREZ - MEDIA ARTIST
  • Technique: Audio-visual performance, weather data, mixed reality, live coding
  • Year: 2015






By critically engaging with weather data, Swan Panda places an emphasis on the Anthropocene and how it can be represented in artistic practice. This research practice aims to contribute new knowledge to the important project of imagining new and critical forms of art production in the age of the Anthropocene, forms that directly engage the viewer with the topics surrounding it.


What does the Anthropocene look like? Should it be described as a singular decisive moment, or as a series of banal everyday events? How implicated can an individual feel within such an evolving and accumulative system? With the recent growth in natural and human-caused catastrophes, there seems to be a need to refine and redefine how art and culture respond to such questions. The artist collective Swan Panda explores the phenomenon of catastrophe, particularly the Anthropocene, through the lens of performative digital arts practice, presenting projects that interface the weather with digital systems. By remediating atmospheric data with audio-visual performance tools, these projects reconfigure how we access, and therefore analyse, such phenomena.


Since industrialisation, the general public has perceived weather as rather banal, except during particularly abnormal and significant events. As such events tend to be catastrophic by nature, each one demands new methods of representation and response. Henri Bergson claimed that we have to change our way of thinking when facing such new events, and this can also be contextualised to discuss how artists represent these phenomena. As with most 'new' art forms:


"The idea that, for a new object, we might have to create a new concept, perhaps a new way of thinking, is deeply repugnant to us." (Bergson, 1907)


Much of the motivation for this research stems from an artistic interpretation of common public and media perceptions of the Anthropocene, with the goal of creating objects of convergence between natural and digital systems that co-exist and evolve symbiotically. Swan Panda formed in 2014 as a transdisciplinary research collaborative with a general mandate to highlight and reconfigure public understandings of environmental phenomena such as climate change, and to explore new ways to build empathy for our impact on them. The collective uses performance, intermediality and networked digital technology as prosthetic devices to connect the human condition with our ongoing impact on ecological processes and to raise awareness of the Anthropocene.


Consisting of Julian Stadon and Jorge Ramirez, Swan Panda use a combination of open hardware, software and live-programmed audio-visual compositions to challenge contemporary weather data aesthetics, drawing on conventions such as glitch aesthetics, GIF animation, sampling and live coding, also known as algorave (McLean & Dean, 2018).







The performative practical method of Swan Panda involves taking atmospheric data from one or more locations and integrating it into live-coded musical compositions that are then projected into discrete immediate environments, or streamed live to telematic spatial scenarios. These displaced physical locations are converged and augmented through datafication and fed back into each other through audio and visual interfacing. The resulting sound and imagery refer to a range of phases within our interpretation of post-natural, post-biological and post-Internet aesthetics.


Specific locations for collecting weather data are chosen based on the location of each artist: in our first performance in 2016, one artist was in Corfu and the other in Mexico City, so these two locations were used. The weather is considered the primary medium of this artistic collective and should therefore be subjectively specific to each member of the collaboration, much as other mediums would be in more traditional collaborations. Weather data is then obtained using a combination of a bespoke weather-sensing station and the Weather Underground JSON API. This allows online and offline, as well as local and remote, weather streams to be fed simultaneously into the live-coded composition.
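The source does not publish its data-handling code, so the following Python sketch is illustrative only: it parses a JSON conditions payload of the kind a weather API such as Weather Underground returns, extracting a few fields a composition might map to synthesis parameters. The payload shape and field names are assumptions, not the actual schema.

```python
import json

def parse_conditions(payload: str) -> dict:
    """Extract a handful of weather fields from a JSON conditions
    payload. Field names here are illustrative assumptions, not the
    exact Weather Underground schema."""
    obs = json.loads(payload)["current_observation"]
    return {
        "temp_c": float(obs["temp_c"]),
        "wind_kph": float(obs["wind_kph"]),
        "humidity": float(obs["relative_humidity"].rstrip("%")),
    }

# A sample payload mimicking a conditions response:
sample = json.dumps({
    "current_observation": {
        "temp_c": 23.5,
        "wind_kph": 12.0,
        "relative_humidity": "64%",
    }
})
print(parse_conditions(sample))
```

In a live setting, a stream of such parsed dictionaries (one per polling interval, or per sensor reading from the bespoke station) could be forwarded to the composition environment.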


This data is then fed into two separate systems that communicate by sending OSC messages to a specified IP address and port. One system uses the JSON weather stream with SuperCollider; the other ports the bespoke weather sensor stream into Max/MSP. The sound composition consists of live-coded generative/mediated soundscapes in SuperCollider, mixed with samples played via a Max/MSP patch, both running through a two-track mixer. A data loop is then created using microphone-input feedback on the SuperCollider system and an application written for Max/MSP that also implements a GIF video player.
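The OSC traffic between the two systems can be sketched in Python using only the standard library. This is a minimal encoder following the OSC 1.0 wire format (null-terminated strings padded to 4-byte boundaries, big-endian float32 arguments); the address pattern used below is a hypothetical example, since the source does not name its OSC namespace.

```python
import struct

def osc_string(s: str) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    b = s.encode("ascii")
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message whose arguments are all float32.
    The resulting bytes can be sent over UDP to the receiving
    patch's IP address and port."""
    type_tags = "," + "f" * len(floats)
    args = b"".join(struct.pack(">f", v) for v in floats)
    return osc_string(address) + osc_string(type_tags) + args

# Hypothetical address pattern; sending would look like:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(msg, (receiver_ip, receiver_port))
msg = osc_message("/weather/temp", 23.5)
```

In practice, SuperCollider and Max/MSP each ship their own OSC send/receive objects, so this encoding happens inside those environments; the sketch only shows what travels over the wire.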


The GIF visualizer is embedded within the Max/MSP patch and uses the accumulated, composited audio feed to compose live and cycle through over 400 sample GIFs, all created by the artists. These visual samples are all inspired by the Anthropocene and combine filmed and found video and GIF footage, 3D animation, glitch renderings, remixed vaporwave animations, bio/digital phenomena and other post-biological, post-Internet animations relating to the Anthropocene.
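The source does not detail how the audio feed drives GIF selection inside the Max/MSP patch. As a hypothetical stand-in, one simple scheme maps a normalised amplitude onto the pool of roughly 400 GIF indices:

```python
def pick_gif(amplitude: float, n_gifs: int = 400) -> int:
    """Map a normalised amplitude (0.0 to 1.0) to a GIF index.
    This is an illustrative stand-in for the patch's selection
    logic, which the source does not describe."""
    amplitude = max(0.0, min(1.0, amplitude))  # clamp out-of-range input
    return min(int(amplitude * n_gifs), n_gifs - 1)
```

Any monotone mapping (or a randomised one weighted by amplitude) would serve the same role: louder composite audio cycles the visuals through a different region of the sample pool.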


In the physical setting for the performance, two projections are displayed: one shows the live coding taking place in SuperCollider, while the other shows the Max/MSP GIF visualisation mix. The method can be performed with either or both artists present or telepresent, and has the versatility to operate with or without either the weather sensor or the JSON weather stream.



Full documentation of this project is available in this paper.