Sonification
Once we had a scoring framework in place, we used it to code the data in the Excel spreadsheet. The Case Conversion file documents how language and expressions were converted into numerical values according to the scoring framework.
We then removed the discourse analysis data from the spreadsheet, retaining only the numerical values. This resulted in the Data Breaches Sonification Data spreadsheet, which we used to create the sonifications in TwoTone.
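The coding and clean-up described above were done directly in the Excel spreadsheet, but a minimal Python/pandas sketch of the same two steps may help clarify the workflow. The filenames, column names, and score values below are illustrative assumptions, not the project's actual scoring framework, which is documented in the Case Conversion file.

```python
import pandas as pd

# Hypothetical mapping from coded language to numerical values
# (the real mappings live in the Case Conversion file).
severity_scores = {"minor": 1, "moderate": 2, "severe": 3}

# Load the coded discourse-analysis spreadsheet (hypothetical filename).
coded = pd.read_excel("case_conversion.xlsx")

# Step 1: convert a coded text column into numbers per the scoring framework.
coded["breach_severity"] = coded["breach_severity"].map(severity_scores)

# Step 2: drop the discourse-analysis text, keep only numerical columns,
# and export a file that can be uploaded to TwoTone.
numeric_only = coded.select_dtypes(include="number")
numeric_only.to_csv("data_breaches_sonification_data.csv", index=False)
```

The resulting numeric-only file plays the same role as the Data Breaches Sonification Data spreadsheet that we uploaded into TwoTone.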
Sonification Demo 1
Example 1 represents the default output in TwoTone. As detailed in the Example 1 Code Book, the default settings TwoTone applies to an uploaded dataset are the key of C major, varied instrumentation, and a tempo of 60 BPM; for this dataset, that produces a duration of 4:47. By loading the Data Breaches Sonification Data spreadsheet into TwoTone, you can generate an identical sonification without altering any variables.
I spent a significant amount of time experimenting with TwoTone. The program's constraints can feel limiting at first, particularly the finite selection of instruments. A primary frustration was that early experiments produced very conventional musical compositions.
The four tests below were created as I became familiar with TwoTone. Listening to even brief snippets of each makes it clear how I adapted to the application's constraints.
Test 1
Of the four tests, this is the most conventional composition. It was created in C major, with a variety of instruments representing the variables.
Test 2
This test explores different tempos and both major and minor keys, experimenting with various ways of manipulating the instruments and identifying which could produce intense sounds.
Test 3
This test marked a turning point in the process. The version here, set in D minor, involves significant experimentation with instrument distortion. It also features double bass for core variables such as breach and perpetrators, a choice I carried over to Example 2.
Test 4
This test showcases the extensive effort to push the limits of the instruments in TwoTone. To create the distortion and droning sound, I experimented with the track tempo, setting the harp and trumpet tracks to 12x. These discoveries were crucial and carried over into Examples 2 and 3.
By experimenting with TwoTone, I learned to work within the application's constraints. Understanding these limitations provided valuable insight into how to manipulate the instruments to produce different sounds, which is discussed in Examples 2 and 3.
Additionally, this experimentation helped me develop and refine a process. Initially, I did not create a codebook for the sonification, but it became clear that the process was more important than the output. Instead of producing a single sonification to encompass the project, it was more fitting to create several examples that demonstrate the possibilities of representing data through sound.