Notes from the 1st Deep Listening Conference



On July 12-14th, the 1st International Conference on Deep Listening was held in the impressive EMPAC center on the campus of Rensselaer Polytechnic Institute in Troy, NY, and I (Adam) had the unexpected pleasure of presenting my ongoing Physics Gestures work there.

 

[Read more…]

Dance-Physics Collaboration at Yale

We found a great example of another art-science collaboration being conducted at Yale University that resonates strongly with our goals.  Emily Coates (Dance) and Sarah Demers (Physics) have performed an exploration of the recent discovery of the Higgs Boson, the last particle of the standard model.  Collaboratively they have examined the gestures physicists use to describe the Higgs, developed aesthetic compositions from these, and reflected on how these compositions feed back into the imaginations of scientists as they look to the next big problem in particle physics (which is likely to be the hypothetical dark matter particle).  Emily and Sarah have also taught a class, “The Physics of Dance”, and a book based on the course is in preparation.

You can learn more about their project, Discovering the Higgs through Physics, Dance and Photography, at the Reintegrate website and the YouTube video below.

[youtube http://www.youtube.com/watch?feature=player_embedded&v=OwZ6TIDLJSA]

TDDE 131 Final Project Showcase


We had a great turnout for our TDDE 131 Final Project Showcase; several dozen people came through to view (and interact with) the works the students created.  Here are brief summaries of all of the pieces:

[Read more…]

TDDE 131 Week 9: Final Project Work

[Notes]
[Crit]


6:00-6:30pm: Notes:

Final exam 12 June 5:30-8:30pm – this will be a public performance in SME 401

[Read more…]

TDDE 131 Week 8: Project Planetaria Devices

[Notes]
[Crit]


6:10pm – 7:30pm: Project Planetaria Devices

We made Project Planetaria Devices!  Michael led students through basic electronics and soldering skills, and talked about the “stupidest possible chip.”  Working with capacitors, watch batteries, potentiometers, photo capacitors, and toggle switches, students created their own devices to emit light and audio.  They “played” these instruments by holding hands to run a current through the whole class, and performed a Cage-inspired score with individual PPDs.
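For the curious: the notes don't name the chip, but if it was the ubiquitous 555 timer wired as a simple tone generator (a guess on my part, not something stated in class), the pitch of each device would follow the standard astable approximation from the datasheet, which makes it easy to see why sweeping a potentiometer sweeps the tone:

```python
def astable_555_hz(r1_ohm, r2_ohm, c_farad):
    """Approximate output frequency of a 555 timer in astable mode.

    Uses the standard datasheet approximation:
        f = 1.44 / ((R1 + 2*R2) * C)
    """
    return 1.44 / ((r1_ohm + 2.0 * r2_ohm) * c_farad)

# Illustrative values (not from the class): R1 = 1k, C = 100 nF,
# and a 100k potentiometer as R2 sweeps the tone across the audible range.
print(f"{astable_555_hz(1e3, 10e3, 100e-9):.0f} Hz")   # pot partway up
print(f"{astable_555_hz(1e3, 100e3, 100e-9):.0f} Hz")  # pot at maximum
```

Turning the pot changes R2, which changes the capacitor's charge/discharge time and hence the pitch — the whole instrument in one resistor.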

Then everyone got a kit to take home to make more!

 

[Read more…]

Michael Trigilio honored as a Distinguished Teacher

Our own PP PI Michael Trigilio will be receiving a UCSD Distinguished Teaching Award because, well, he is just that awesome.

http://visarts.ucsd.edu/news/michael-trigilio-receive-ucsd-distinguished-teaching-award

 

PP’s First Installation: Solar Variations

Project Planetaria had its first installation in September 2012 with “Solar Variations”, an exploration of the variability of the Sun through light and sound.  The piece was on display during the opening of the new Experimental Media Lab in UCSD’s Visual Arts Department.

In this interactive video and sound installation, we explore themes we have been investigating over the past several months, including transsensory perception, participatory experience, remote perspective, and live performance.  The piece comprised several interacting components:

  • Two photodiode sensors were placed on the west-facing window of the Lab, with red and blue filters positioned in front of them.  These diodes drove square-wave generators attached to speakers, producing dissonant sounds with differing frequencies. As the sun set and its color moved toward the red end of the visible spectrum due to atmospheric scattering, the sound produced shifted in tone and volume: an audio representation of the setting sun.
  • A repeating loop of the previous month of UV imaging data of the Sun, taken with the Solar Dynamics Observatory (SDO), was generated using the free Java program JHelioviewer.  The Sun makes one rotation every 24 (equator) to 34 (poles) days, so this cycle represented roughly one “day” of the Sun.  UV radiation is invisible to our eyes, so the visualization of the SDO data is one example of sensory transformation.  The movie was also embedded in a “sunset” scene from the surface of Mars taken by the NASA rover Spirit in 2005; as such, the audience simultaneously experienced sunsets on two worlds.
  • The television screen output was masked by a black overlay whose transparency depended on the ambient noise in the room.  Hence, the real-time setting of the actual Sun erased the appearance of the SDO UV Solar movie.
  • Naturally, any sound from observers of this piece would also reveal the UV Solar movie, so audience members were encouraged to bring the movie into being by talking, yelling, clapping, etc. after the real Sun had set.  They themselves were also embedded in the movie through a camera with a difference image filter.
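The core of the sound-reactive mask can be sketched in a few lines: measure the loudness (RMS) of the incoming audio buffer and map it to the overlay's opacity.  The thresholds below are illustrative stand-ins, not the values used in the actual installation:

```python
import numpy as np

def mask_alpha(samples, noise_floor=0.01, full_reveal=0.2):
    """Map a buffer of audio samples to overlay opacity in [0, 1].

    1.0 = fully black (movie hidden); 0.0 = fully transparent (movie revealed).
    noise_floor and full_reveal are illustrative RMS thresholds.
    """
    rms = np.sqrt(np.mean(np.square(samples)))
    # Linearly fade the mask out as the room gets louder.
    level = (rms - noise_floor) / (full_reveal - noise_floor)
    return float(1.0 - np.clip(level, 0.0, 1.0))

# A quiet room keeps the mask opaque; clapping or shouting reveals the movie.
quiet = 0.005 * np.random.randn(1024)
loud = 0.5 * np.random.randn(1024)
print(mask_alpha(quiet), mask_alpha(loud))
```

In the gallery, the same mapping ties all the components together: the photodiode-driven speakers keep the movie revealed while the real Sun is up, and after sunset only the audience's own noise can bring it back.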


The overall result was a participatory experience which embodied video and audio interaction between observer and herself, the Sun and itself, and the observer and the Sun.

This exercise represents our first exploration of the themes that arise in direct interaction and interpretation of astronomical phenomena and data. We expanded upon many of these themes with student designers in our Spring 2013 workshop class.


The Space of Now

The other night I was having dinner with my wife and my neighbor, discussing the concept of “living in the now”. Bah! My cold physicist brain immediately dismissed such new age nonsense.  We cannot possibly live in the “now” because all of the sensory input of our surroundings comes from the past, not the present. Information travels at a finite speed, be it via light (300,000 kilometers per second, or roughly 1 foot per nanosecond), sound (340 meters per second, or about 1 foot per millisecond), or someone’s very strong perfume (about 3 millimeters in an hour if it is simply diffusion, but nearly instantaneous if you are on a plane). Looking at my wife, I was seeing her as she appeared 3 nanoseconds in the past; listening to my neighbor, I heard a story she told 3 milliseconds ago; and the full moon bearing down on us was merely a mirage from the distant past – all of 1 second ago. My experience at that dinner table was a hodge-podge of the asynchronous past.
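These delays are easy to reproduce; the dinner-table distances below are my own rough guesses (about a meter across the table, and the Moon's mean distance of roughly 384,400 km):

```python
C = 299_792_458.0    # speed of light, m/s
V_SOUND = 343.0      # speed of sound in air, m/s

def light_delay(distance_m):
    """Seconds for light to cross distance_m."""
    return distance_m / C

def sound_delay(distance_m):
    """Seconds for sound to cross distance_m."""
    return distance_m / V_SOUND

print(f"wife, ~1 m away (light):    {light_delay(1.0) * 1e9:.1f} ns")
print(f"neighbor, ~1 m away (sound): {sound_delay(1.0) * 1e3:.1f} ms")
print(f"full Moon, 384,400 km (light): {light_delay(3.844e8):.2f} s")
```

Each sense channel at the table delivers its own slice of the past, which is exactly the hodge-podge described above.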

[Read more…]

Beauty of Data


Last week, Netherlands-based freelance editor and aspiring director Sander van den Berg created a beautiful video merging the image sequences of Jupiter and Saturn from the Voyager and Cassini missions to music from That Home. The images, some older than 30 years, are part of a massive repository of data collected by NASA and freely available to anyone with enough bandwidth and storage space.

In van den Berg’s hands, the raw footage becomes a mesmerizing sequence of vignettes set against the largest planets in our solar system. One cannot help but make an emotional connection in the jostling of clouds around Saturn’s North Pole to the hustle and bustle of our daily grind; the light touch of shepherding moons as they gracefully perturb Saturn’s rings elicits a heartfelt yearning for a lover or a lost friend; and the loneliness of two moons, passing each other in silence in their eternal dance, is palpable. And despite the alien grandeur, the roughness of the images reminds us that these scenes are common and ordinary, played out for eons past and eons to come, long after the human drama has been eclipsed.

Scientific data is often viewed as cold, hard facts, devoid of sentiment or subjectivity. Van den Berg shows that data that looks upon our Universe also reflects on our humanity, and rejuvenates our connection to the natural world.

Astronomical Synesthesia


Radio image of the Crab Nebula: an example of astronomical information that can be neither seen nor heard directly

The image to the left is a radio picture of the Crab Nebula, the debris field from a massive supernova explosion that was recorded by astronomers in 1054 AD. These data come from 11 hours of observations made by a large array of metal dishes called, uncreatively, the Very Large Array. In this image, we see the structure of the debris as it streams away from the explosion site at 1000 kilometers per second, energized by radiation and winds coming from the remnant of the long-dead star, a spinning ball of neutrons 10 km wide and more massive than our Sun.
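These numbers give a quick back-of-the-envelope size for the nebula, assuming the debris has coasted at a constant 1000 km/s since 1054 AD (in reality the nebula is somewhat larger, as the pulsar's wind has accelerated the debris over time):

```python
KM_TO_M = 1.0e3
SEC_PER_YEAR = 3.156e7    # seconds in one year
M_PER_LY = 9.461e15       # meters in one light-year

speed_m_s = 1000.0 * KM_TO_M   # expansion speed from the radio data
age_yr = 2013 - 1054           # years since the recorded supernova

radius_ly = speed_m_s * age_yr * SEC_PER_YEAR / M_PER_LY
print(f"estimated radius: {radius_ly:.1f} light-years")
```

Nearly a millennium of expansion, and the debris field is already several light-years across.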

Except, we don’t actually see anything.

[Read more…]
