The following code sample demonstrates how to sonify (generate music from) images via jythonMusic.
In particular, it sonifies the following image, which depicts a sunset at the Loutraki seaside resort in Greece (Fig. 7.4, p. 232).
Here is the music generated from the sonification:
A soundscape is a musical composition that incorporates sounds recorded from an environment, and/or music that depicts the characteristics of that environment (e.g., a city soundscape or a forest soundscape).
The code sample below demonstrates how sonification of image data (using Image library functions) can create interesting musical artifacts. This is done by mapping visual aspects of an image into corresponding musical aspects.
An image consists of pixels (or picture elements). A pixel is the elemental data that digitally represents a single point in the original scene (as captured by a camera). The number of pixels in an image depends on the quality (or resolution) of the digital camera.
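To make this concrete, here is a short plain-Python sketch (independent of jythonMusic) of how a pixel is typically represented and how its brightness can be computed. The pixel values below are made up for illustration:

```python
# a pixel is typically stored as an (R, G, B) tuple, where each
# component ranges from 0 (none) to 255 (full intensity)
pixel = (200, 150, 100)   # illustrative values - a warm, orange-ish color

red, green, blue = pixel

# one simple measure of brightness (luminosity) is the average
# of the three color components
luminosity = (red + green + blue) / 3.0

print(luminosity)   # 150.0
```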
There is no prescribed way to map pixels to sound when sonifying. A rule of thumb is to find what inspires you about a particular image and explore how you might convert that to sound. In other words, image sonification involves imagination and artistic exploration.
The image above has a very nice gradient that gets brighter from left to right. The sun is not shown, but can be imagined. There is a clear horizontal division between the sea and the sky. The mountains, on the left, provide a contrast to the colors of the sea and sky. Finally, the image gradient is interrupted by the (somewhat noisy) visual layers of the sea in the bottom half of the image. Clearly, there is enough structural variety in the visual domain to provide interesting analogies in the musical domain. All this can be exploited by selecting certain rows (or columns) of pixels (as shown below in red), scanning the image left-to-right (or top-to-bottom), and converting individual pixels or areas of pixels to musical notes or passages.
In this case study, we use the following sonification rules:
- Left-to-right pixel (column) position is mapped to time (actually, note start time);
- Brightness (or luminosity) of a pixel (i.e., average RGB value) is mapped to pitch (the brighter the pixel, the higher the pitch);
- Redness of a pixel (R value) is mapped to duration (the redder the pixel, the longer the note); and
- Blueness of a pixel (B value) is mapped to dynamic (the bluer the pixel, the louder the note).
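At their core, these rules are linear mappings from one range of values to another (jythonMusic provides mapValue() and mapScale() for this). The idea can be sketched in plain Python as follows; the function name map_value and the sample pixel values are illustrative, not the library's actual code:

```python
def map_value(value, in_min, in_max, out_min, out_max):
    """Linearly map value from [in_min, in_max] to [out_min, out_max]."""
    proportion = float(value - in_min) / (in_max - in_min)
    return out_min + proportion * (out_max - out_min)

# a bright, red-leaning pixel (illustrative values)
red, green, blue = 240, 180, 60
luminosity = (red + green + blue) / 3.0   # 160.0

pitch    = map_value(luminosity, 0, 255, 0, 127)   # brighter pixel -> higher pitch
duration = map_value(red,  0, 255, 0.8, 6.0)       # redder pixel  -> longer note
volume   = map_value(blue, 0, 255, 0, 127)         # bluer pixel   -> louder note
```

In the actual program below, the pitch mapping additionally snaps the result to a musical scale via mapScale().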
Using the same sonification scheme with other images will likely generate interesting results. For instance, you could select a new image with this scheme in mind. Or you could create/modify an image (e.g., via Photoshop) with the particular sonification scheme in mind.
However, the most effective approach is to pick an image first and then design a set of sonification rules that matches its features. The choice of image and the sonification rules are intimately connected.
This code sample (Ch. 7, p. 231) implements the above rules to sonify the Loutraki Sunset image.
```python
# sonifyImage.py
#
# Demonstrates how to create a soundscape from an image.
# It also demonstrates how to use functions.
# It loads a jpg image and scans it from left to right.
# Pixels are mapped to notes using these rules:
#
# + left-to-right column position is mapped to time,
# + luminosity (pixel brightness) is mapped to pitch within a scale,
# + redness (pixel R value) is mapped to duration, and
# + blueness (pixel B value) is mapped to volume.
#

from music import *
from image import *
from random import *

##### define data structure
soundscapeScore = Score("Loutraki Soundscape", 60)
soundscapePart = Part(PIANO, 0)

##### define musical parameters
scale = MIXOLYDIAN_SCALE
minPitch = 0          # MIDI pitch (0-127)
maxPitch = 127
minDuration = 0.8     # duration (1.0 is QN)
maxDuration = 6.0
minVolume = 0         # MIDI velocity (0-127)
maxVolume = 127

# start time is randomly displaced by one of these
# durations (for variety)
timeDisplacement = [DEN, EN, SN, TN]

##### read in image (origin (0, 0) is at top left)
image = Image("soundscapeLoutrakiSunset.jpg")

# specify image pixel rows to sonify - this depends on the image!
pixelRows = [0, 53, 106, 159, 212]

width = image.getWidth()    # get number of columns in image
height = image.getHeight()  # get number of rows in image

##### define function to sonify one pixel
# Returns a note wrapped in a phrase created from sonifying
# the RGB values of 'pixel' found on the given column.
def sonifyPixel(pixel, col):

   red, green, blue = pixel                # get pixel RGB values
   luminosity = (red + green + blue) / 3   # calculate brightness

   # map luminosity to pitch (the brighter the pixel, the higher
   # the pitch) using specified scale
   pitch = mapScale(luminosity, 0, 255, minPitch, maxPitch, scale)

   # map red value to duration (the redder the pixel, the longer
   # the note)
   duration = mapValue(red, 0, 255, minDuration, maxDuration)

   # map blue value to dynamic (the bluer the pixel, the louder
   # the note)
   dynamic = mapValue(blue, 0, 255, minVolume, maxVolume)

   # create note
   note = Note(pitch, duration, dynamic)

   # use column value as note start time (e.g., 0.0, 1.0, and so on)
   startTime = float(col)   # time is a float

   # add some random displacement for variety
   startTime = startTime + choice( timeDisplacement )

   # wrap note in a phrase to give it a start time
   # (Phrases have start times, Notes do not)
   phrase = Phrase(startTime)   # create phrase with given start time
   phrase.addNote(note)         # and put note in it

   # done sonifying this pixel, so return result
   return phrase

##### create musical data

# sonify image pixels
for row in pixelRows:          # iterate through selected rows
   for col in range(width):    # iterate through all pixels on this row

      # get pixel at current coordinates (col and row)
      pixel = image.getPixel(col, row)

      # sonify this pixel (we get a note wrapped in a phrase)
      phrase = sonifyPixel(pixel, col)

      # put result in part
      soundscapePart.addPhrase(phrase)

   # now, all pixels on this row have been sonified
# now, all pixelRows have been sonified, and soundscapePart
# contains all notes

##### combine musical material
soundscapeScore.addPart(soundscapePart)

##### view score and write it to a MIDI file
View.sketch(soundscapeScore)
Write.midi(soundscapeScore, "soundscapeLoutrakiSunset.mid")
```
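The one helper above whose behavior may not be obvious is mapScale(), which maps a value into a pitch range while constraining the result to a scale. The sketch below illustrates one plausible way such a scale-constrained mapping can work (map linearly, then snap down to the nearest pitch in the scale). This is an illustration of the general technique, not jythonMusic's actual implementation, which may round or snap differently:

```python
# semitone offsets of the Mixolydian mode within one octave
MIXOLYDIAN = [0, 2, 4, 5, 7, 9, 10]

def map_scale(value, in_min, in_max, out_min, out_max, scale):
    """Map value linearly into [out_min, out_max], then snap the
    result down to the nearest MIDI pitch belonging to the scale."""
    proportion = float(value - in_min) / (in_max - in_min)
    pitch = int(out_min + proportion * (out_max - out_min))

    # walk downward until we land on a scale tone
    while pitch % 12 not in scale and pitch > out_min:
        pitch -= 1

    return pitch

print(map_scale(128, 0, 255, 0, 127, MIXOLYDIAN))   # 62 (a D, which is in the scale)
```

Snapping to a scale is what keeps the sonification from sounding like random chromatic noise: every pixel's brightness still influences pitch, but only scale tones are ever heard.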