Using Stellar Evolution to Generate Music Score: Part 2

Here’s an excerpt of Michael Trigilio’s death metal score generated from the MESA stellar model data:


Using Stellar Evolution to Generate Music Score: Part 1


As part of our upcoming performance Our Star Will Die Alone, we wanted to generate a musical score inspired by the actual evolution of a star as it evolves off the main sequence – as it "dies," so to speak. To generate the underlying data, we used an open-source software package called MESA (Modules for Experiments in Stellar Astrophysics), developed by a team of astronomers and used by an extensive community of researchers, teachers, and students to study how a star evolves with time (the code will also be used in Adam's undergraduate and graduate stellar astrophysics courses at UCSD this year).
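To give a flavor of how stellar model output can become notes, here is a minimal sketch in Python. The column names (star_age, log_Teff, log_L) match MESA's standard history output, but the pitch and loudness mapping is purely our illustration, not the mapping used for the actual score:

```python
# Illustrative sketch: map MESA "history.data" columns to musical notes.
# The mapping (cooler star -> lower pitch, more luminous -> louder) is
# an invented example, not the scheme used in the performance.

def teff_to_midi(log_teff, lo=3.4, hi=4.7, midi_lo=36, midi_hi=96):
    """Linearly map log(effective temperature) onto a MIDI pitch range."""
    frac = (log_teff - lo) / (hi - lo)
    frac = min(max(frac, 0.0), 1.0)  # clamp to the mapped range
    return round(midi_lo + frac * (midi_hi - midi_lo))

def score_from_history(rows):
    """rows: list of (star_age, log_Teff, log_L) tuples from a MESA run.
    Returns (pitch, velocity) pairs for each model timestep."""
    notes = []
    for age, log_teff, log_l in rows:
        pitch = teff_to_midi(log_teff)
        velocity = min(127, max(1, round(64 + 16 * log_l)))
        notes.append((pitch, velocity))
    return notes

# Toy track: a Sun-like star leaving the main sequence
# (cooling and brightening, so the notes descend and swell)
track = score_from_history([
    (4.6e9, 3.76, 0.0),   # main sequence
    (1.0e10, 3.70, 0.5),  # subgiant
    (1.2e10, 3.60, 2.0),  # red giant
])
```

With a mapping like this, the star's climb up the red giant branch reads musically as a descending, crescendoing line.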


Genetic Cage Scores

As we prepare possible project ideas for our class next Spring, I had a chance to explore a simple Cage score (after John Cage), using as a driver not chance but genetic code.  Gene sequencing has exploded over the past few decades thanks to advances in technology and technique, creating an entire industry (with occasionally dubious aims).  The pace of this advancement is remarkable: the Human Genome Project took 13 years, from 1990 to 2003, and $3 billion to construct a full sequence of the human genetic code; the same sequencing can now be done in a day for about $1,000.
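The idea of substituting genetic code for chance operations can be sketched in a few lines of Python. The base-to-parameter tables below are invented for illustration; any fixed assignment of bases to musical decisions would do:

```python
# Illustrative sketch of a gene-driven Cage-style score: instead of
# chance operations, successive DNA bases decide each event. The
# base -> parameter tables here are arbitrary examples.

DURATION = {"A": 1, "C": 2, "G": 4, "T": 8}  # duration in beats
ACTION = {"A": "play", "C": "rest", "G": "sustain", "T": "noise"}

def cage_score(sequence):
    """Read bases in pairs: the first base picks the action,
    the second picks its duration in beats."""
    seq = sequence.upper()
    events = []
    for i in range(0, len(seq) - 1, 2):
        action, dur = seq[i], seq[i + 1]
        events.append((ACTION[action], DURATION[dur]))
    return events

events = cage_score("ACGTGA")
```

The result is fully determined by the sequence rather than by dice or the I Ching, which is exactly the inversion of Cage's method the post describes.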


PP’s First Installation: Solar Variations

Project Planetaria had its first installation in September 2012 with "Solar Variations", an exploration of the variability of the Sun through light and sound.  The piece was on display during the opening of the new Experimental Media Lab in UCSD's Visual Arts Department.

In this interactive video and sound installation, we explore themes we have been investigating over the past several months, including transsensory perception, participatory experience, remote perspective, and live performance.  The piece was comprised of several interacting components:

  • Two photodiode sensors were placed on the west-facing window of the Lab, with red and blue filters positioned in front of them.  These diodes drove square-wave generators attached to speakers, producing dissonant sounds at differing frequencies. As the Sun set and its color moved toward the red end of the visible spectrum due to atmospheric scattering, the sound shifted in tone and volume: an audio representation of the setting sun.
  • A repeating loop of the previous month of UV imaging data of the Sun, taken with the Solar Dynamics Observatory (SDO), was generated using the free Java program JHelioviewer.  The Sun makes one rotation every 24 (equator) to 34 (poles) days, so this cycle represented roughly one "day" of the Sun.  UV radiation is invisible to our eyes, so the visualization of the SDO data is one example of sensory transformation.  The movie was also embedded in a "sunset" scene from the surface of Mars taken by the NASA rover Spirit in 2005; as such, the audience simultaneously experienced sunsets on two worlds.
  • The television screen output was masked by a black overlay whose transparency depended on the ambient noise in the room.  Hence, the real-time setting of the actual Sun erased the appearance of the SDO UV solar movie.
  • Naturally, any sound from observers of this piece would also reveal the UV Solar movie, so audience members were encouraged to bring the movie into being by talking, yelling, clapping, etc. after the real Sun had set.  They themselves were also embedded in the movie through a camera with a difference image filter.
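The noise-driven mask in the list above can be sketched as a simple mapping from microphone level to overlay opacity. The thresholds below are illustrative stand-ins, not the values used in the installation:

```python
# Illustrative sketch of the noise-driven mask: overlay opacity falls
# as ambient sound rises, so a quiet room hides the SDO movie and a
# loud one reveals it. The quiet/loud thresholds are invented values.

def mask_alpha(rms, quiet=0.01, loud=0.5):
    """Map an RMS microphone level to overlay opacity in [0, 1]:
    1.0 = fully black (movie hidden), 0.0 = fully transparent."""
    if rms <= quiet:
        return 1.0
    if rms >= loud:
        return 0.0
    return 1.0 - (rms - quiet) / (loud - quiet)

silent_room = mask_alpha(0.005)  # below threshold: movie hidden
clapping = mask_alpha(0.6)       # loud room: movie fully revealed
```

In practice the RMS level would come from a live microphone input, updated every frame, with some smoothing so the mask fades rather than flickers.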


The overall result was a participatory experience which embodied video and audio interaction between observer and herself, the Sun and itself, and the observer and the Sun.

This exercise represents our first exploration of the themes that arise in direct interaction and interpretation of astronomical phenomena and data. We expanded upon many of these themes with student designers in our Spring 2013 workshop class.


Beauty of Data

Last week, Netherlands-based freelance editor and aspiring director Sander van den Berg created a beautiful video merging the image sequences of Jupiter and Saturn from the Voyager and Cassini missions to music from That Home. The images, some older than 30 years, are part of a massive repository of data collected by NASA and freely available to anyone with enough bandwidth and storage space.

In van den Berg’s hands, the raw footage becomes a mesmerizing sequence of vignettes set against the largest planets in our solar system. One cannot help but make an emotional connection between the jostling of clouds around Saturn’s north pole and the hustle and bustle of our daily grind; the light touch of shepherding moons as they gracefully perturb Saturn’s rings elicits a heartfelt yearning for a lover or a lost friend; and the loneliness of two moons passing each other in silence in their eternal dance is palpable. And despite the alien grandeur, the roughness of the images reminds us that these scenes are common and ordinary, played out for eons past and eons future, long after the human drama has been eclipsed.

Scientific data is often viewed as cold, hard facts, devoid of sentiment or subjectivity. Van den Berg shows that data that looks upon our Universe also reflects on our humanity, and rejuvenates our connection to the natural world.
