Selected Examples of Electronic/Computer Music
Peter Gena
Music for Passages (2007-2017), real-time, interactive computer music. Complete description.
- Passages was a permanent interactive installation by my late friend and colleague Steve Waldeck. It ran along a 96-foot-long corridor at the College of Lake County, Grayslake, IL (installation date: May 31, 2007). The piece, an homage to a local American gothic farmhouse, attempts to embody a metaphorical journey, dealing with relationships between past events and current experience. When I was asked to provide music to accompany the proposed twenty kinetic panels, the Alcotts movement of the Concord Sonata, by Charles Ives, came to mind as a tribute to Alcott's "Orchard House." (click here for a description of Passages)
Markoff in: (1992 -), real-time digital music. Algorithm (MAX) for Markoff in Brazil
- Markoff in: is an algorithmic composition that is generated from material (songs) indigenous to the area in which it is performed. It employs Markoff chains and the I Ching process in order to realize a piece, using both the pitches and rhythms from the song with assigned probabilities for each element. These probabilities are directly proportional to the frequency with which each pitch and rhythm appears in the song. The first piece, Markoff in: Milwaukee, based on George Gershwin's My Cousin from Milwaukee, was premiered in a one-man show at the Walker's Point Center for the Arts, Milwaukee, in early 1992. Markoff in: Darmstadt was presented at the Ferienkurse für Neue Musik, Darmstadt, in the summer of 1992. It was generated from the Heiliger Dankgesang of Beethoven's String Quartet, Op. 132. Markoff in: Brazil, performed in November of the same year at the Museu da Imagem e do Som, in São Paulo, used the popular song Sem Compromisso as its basis.
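The rank-ordering idea described above can be sketched in a few lines. This is a hypothetical illustration, not the actual MAX patch: transition counts from a source melody drive a first-order Markoff chain, so each successor pitch is chosen in proportion to how often it follows the current pitch in the song. The toy melody and all names here are assumptions.

```python
import random
from collections import Counter, defaultdict

def build_chain(melody):
    """Count pitch-to-pitch transitions in the source melody."""
    chain = defaultdict(Counter)
    for a, b in zip(melody, melody[1:]):
        chain[a][b] += 1
    return chain

def generate(chain, start, length, rng=random.Random(0)):
    """Walk the chain, weighting successors by their observed frequency."""
    out = [start]
    for _ in range(length - 1):
        options = chain[out[-1]]
        if not options:                 # dead end: restart from the start pitch
            out.append(start)
            continue
        pitches = list(options)
        weights = [options[p] for p in pitches]
        out.append(rng.choices(pitches, weights=weights)[0])
    return out

# Toy melody as MIDI note numbers (not drawn from any of the actual songs).
melody = [60, 62, 64, 62, 60, 64, 65, 64, 62, 60]
chain = build_chain(melody)
print(generate(chain, 60, 12))
```

In the pieces themselves, rhythms are handled the same way as pitches, each with its own rank-ordered probability table.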
- Elegy, along with Interlude for 2 People (1990) and Hoketus (1989), is a piece written for Two People Came Over; We Didn't Know What To Say So We Played With The Dog And Our Minds Wandered, a multi-media performance work by Michael Meyers, premiered at the Columbia College Dance Center, Chicago, 1990. It was realized from the piano score on a Yamaha TX802.
Unchained Melodies (1974; rev. 1981), piano solo; computer-aided. Score | Recording
- This was written with the aid of MUSICOL (SUNY at Buffalo Technical Report No. 7: Gena, Peter. "MUSICOL Manual, Version 1, [MUSical Instruction Composition Oriented Language] for the 6400 Digital Computer," 1973), a composition language that employed user-selected stochastic processes in user-constructed time-blocks. The instruction set was made up entirely of musical mnemonics, while the output could be dumped to a plotter program such as Donald Byrd's "SMUT" (System for Music Transcription) software [Byrd, Indiana University, 1984], or to the default text printout organized by successive beats and subdivisions. The original piece is transcribed verbatim from the printout. The work begins with a G major focus of pitches, chords and rhythms, and gradually evolves to a zeroth-order chromatic selection. The revised version of 1981 consists of a sampling of phrases from the piece, beginning to end. Each phrase can be repeated ad lib. The piece is divided into two sections; the first (the shortest) is repeated. The premiere, performed by the composer, was recorded live (see above) at the 1981 International Computer Music Conference, North Texas State University, Denton.
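The evolution from a G major focus to a zeroth-order chromatic selection can be pictured as a gradual interpolation between two probability distributions over pitch classes. This sketch is an assumption about the general technique, not MUSICOL itself; the specific weights are invented for illustration.

```python
import random

CHROMATIC = list(range(12))            # pitch classes 0..11, with 7 = G
G_MAJOR = {7, 9, 11, 0, 2, 4, 6}       # G A B C D E F#

def weights(t):
    """t=0: strongly favor G major pitches; t=1: uniform (zeroth order)."""
    return [(1 - t) * (5.0 if pc in G_MAJOR else 0.1) + t * 1.0
            for pc in CHROMATIC]

rng = random.Random(1)
# 32 selections, sliding the bias from tonal focus to chromatic uniformity.
piece = [rng.choices(CHROMATIC, weights=weights(i / 31))[0] for i in range(32)]
print(piece)
```

Early selections cluster on G major pitch classes; by the end every pitch class is equally likely.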
Logos I (1974), 4-channel dyadic brainwave music.
[7' 42"]
- Recently, electrophysiological processes of the human body have been used as direct source and control signals in the synthesis and performance of electronic music. Some forms of music began as attempted reproductions of the logos or internal body symphony. Modern-day composers continue to labor with the externally generated sounds they create and experience without the on-line aid of instruments. Biofeedback technology promises to achieve this end through the production of psychophysiological states necessary for optimal control and perception of biomusic. (from an abstract to a paper given at the Sixth Annual Meeting of the Biofeedback Research Society in Monterey, CA, February 1975).
Logos I is the result of the initial research that I conducted with Larry Rouse, then a graduate fellow in biophysics at CSU Fresno. The piece uses four distinct alpha waves (8-13 Hz), one from each hemisphere of our occipital lobes (four channels), which modulate the frequencies of four voltage-controlled oscillators. The alpha waves are each integrated to ensure a smoother sound continuum. The piece was premiered at the conference noted above. A subsequent piece, Piano and Biomusic (1976), was presented at the 1976 Biofeedback Research Society Meeting in Colorado Springs, CO. As I performed from the piano, my dyadic alpha was monitored and realized into electronic music in real time. Thus, a feedback loop was created as I played and listened.
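The signal path for one of the four channels can be sketched digitally: an alpha-band control signal is integrated (here, a leaky running average) and then used to modulate an oscillator's frequency. The sample rate, carrier, and modulation depth are assumptions for illustration; the original used analog voltage-controlled oscillators.

```python
import math

SR = 8000                                  # sample rate (assumed)

def smooth(signal, alpha=0.01):
    """Leaky integrator: y[n] = y[n-1] + alpha * (x[n] - y[n-1])."""
    y, out = 0.0, []
    for x in signal:
        y += alpha * (x - y)
        out.append(y)
    return out

def render(seconds=1.0, carrier=440.0, depth=100.0):
    n = int(SR * seconds)
    # Stand-in for a measured alpha wave: a clean 10 Hz sinusoid.
    alpha_wave = [math.sin(2 * math.pi * 10 * i / SR) for i in range(n)]
    control = smooth(alpha_wave)
    out, phase = [], 0.0
    for c in control:                      # frequency modulation by the control
        phase += 2 * math.pi * (carrier + depth * c) / SR
        out.append(math.sin(phase))
    return out

samples = render()
print(len(samples))
```

Without the integration stage, raw EEG jitter would produce an erratic pitch contour; the smoothing yields the "smoother sound continuum" described above.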
EGERYA (1972), computer-synthesized realization of a Rya rug pattern.
Score & excerpt | Data print-out
- This is the first digital piece to be realized in the Computer Music Studio at SUNY, Buffalo. It was generated by ElMus (Electronic Music), a FORTRAN IV program written by the composer. ElMus accepted data as Cartesian coordinates: each IBM card held up to 75 symbols (x-axis, or time domain), while the number of cards determined the length of the y-axis, or frequency domain. The symbols themselves represented the z-axis, denoting intensity. Hence, ElMus realized three-dimensional images into electronic music via quasi "additive synthesis": a sine wave is generated for each coordinate on the y-axis. The Rya "hook" rug came in kit form with yarn and a chart that graphically represented the pattern, using symbols to denote colors.
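The card-image scheme can be sketched as follows. This is not the original FORTRAN IV program; the grid, the symbol-to-intensity map, and the row-to-frequency mapping are all assumptions standing in for the rug chart's symbols and colors.

```python
import math

# Each string is one "card" row (y-axis); each column is a time step (x-axis).
GRID = [
    "..##..",          # top row: highest frequency
    ".#..#.",
    "##..##",          # bottom row: lowest frequency
]
LEVEL = {'.': 0.0, '#': 1.0}   # z-axis: symbol -> intensity

SR, COL_DUR = 8000, 0.1        # samples/sec, seconds per grid column (assumed)

def render(grid):
    rows = len(grid)
    freqs = [220.0 * (rows - r) for r in range(rows)]   # one sine per row
    samples = []
    for col in range(len(grid[0])):
        for i in range(int(SR * COL_DUR)):
            t = i / SR
            s = sum(LEVEL[grid[r][col]] * math.sin(2 * math.pi * freqs[r] * t)
                    for r in range(rows))
            samples.append(s / rows)    # normalize the additive mix
    return samples

audio = render(GRID)
print(len(audio))
```

Reading the rug chart this way turns each column of symbols into a momentary spectrum, so the image literally scrolls past as sound.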
EGERYA was premiered at Alea Encuentros, in Pamplona, Spain in July, 1972.
Homage to G. K. Zipf (1971), flute, clarinet, bassoon, vibraphone, violin, viola, violoncello and contrabass (computer-aided composition).
- In the spring of 1971, I developed MUSICOL (MUSical Instruction Composition Oriented Language), a composition language that used musical mnemonics. MUSICOL was constructed to give a great degree of flexibility to the composer as he/she set musical parameters in successive time blocks. This allowed any degree of choice between the two extremes of total control by the computer or literal selection by the composer. I designed the MUSICOL compiler and simulator to run on the CDC 6400 computer at the Computing Center of SUNY at Buffalo. Subsequently, I revised it for the CDC 6500 mainframe at Northwestern University, where I made it available to students in my computer music classes.
Homage was generated in MUSICOL by Zipf's law (see George Kingsley Zipf [1902-1950], Human Behavior and the Principle of Least Effort. NYC: Addison-Wesley, 1949), a stochastic principle developed to express syntax in the English language by frequency of occurrence of words. Each word is ascribed a rank order, and each successive rank order is selected at a probability proportional to the inverse of its order number (i.e., rank orders 1, 2, 3, 4, ..., n receive probabilities P proportional to 1/1, 1/2, 1/3, 1/4, ..., 1/n, where zeroth order implies chaos, and unity, only one rank order, would effect total control). The whole composition contains successive time-blocks where varying rank orders represent all of the musical parameters for the desired content of each time-block. The octave placements of the instruments in the piece were determined beforehand and programmed literally, without reference to rank orders (though the possibility for doing so existed). The instructions were compiled and run, and the output (notes and all other parameters) was printed out in time-sequence.
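The Zipf selection described above can be sketched directly: rank r is drawn with probability proportional to 1/r, normalized over the n ranks in play. This is an illustrative assumption about the mechanism, not MUSICOL code, and the four-pitch rank ordering is hypothetical.

```python
import random

def zipf_weights(n):
    """Unnormalized Zipf weights: rank r gets weight 1/r."""
    return [1.0 / r for r in range(1, n + 1)]

def choose(items, rng):
    """items are listed in rank order: items[0] is rank 1, and so on."""
    return rng.choices(items, weights=zipf_weights(len(items)))[0]

rng = random.Random(0)
pitches = ["G", "D", "B", "A"]          # hypothetical rank ordering
sample = [choose(pitches, rng) for _ in range(1000)]
print({p: sample.count(p) for p in pitches})
```

With four ranks the normalized probabilities are roughly 0.48, 0.24, 0.16, and 0.12, so the rank-1 pitch dominates; widening n toward the full chromatic pool flattens the hierarchy toward the "chaos" end of the scale.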