The Wheel within the Wheel: Listening around EMPAC (2014)
“The Wheel within the Wheel” is a sound art composition for custom software running on smartphones. As of July 2014, an iOS version is available, and an Android version will follow in Fall 2014. The piece is based on two kinds of wheels within wheels: spatial and temporal. The spatial wheels are geographic regions defined by latitude/longitude points. The temporal wheels are the 24-hour day and the sunrise/sunset times as the days progress through the year.
Throughout the day, the base frequency of the piece glides almost imperceptibly from its low point in the middle of the night to its high point in the middle of the day. Geographic regions are placed at points close to EMPAC, the site of the Deep Listening Conference. Each region contains a different (but complementary) sonic potential. Each frequency present in a region fades in and out with a different period length, which can be as long as twelve hours. The listener’s location within the region causes the sounds to be mixed in different ways.
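The daily glide of the base frequency can be sketched as a simple mapping from clock time to pitch. This is only an illustration, assuming a cosine curve and made-up low/high frequencies; the actual curve and values used in the piece are not published here.

```python
import math
from datetime import datetime

def base_frequency(now, low_hz=55.0, high_hz=110.0):
    """Map the time of day onto a slowly gliding base frequency.

    Lowest in the middle of the night (00:00), highest at midday (12:00).
    low_hz and high_hz are illustrative values, not those of the piece.
    """
    seconds = now.hour * 3600 + now.minute * 60 + now.second
    phase = 2 * math.pi * seconds / 86400.0   # one full cycle per day
    blend = (1 - math.cos(phase)) / 2         # 0 at midnight, 1 at noon
    return low_hz + (high_hz - low_hz) * blend
```

Over a whole day the change is slow enough (a fraction of a hertz per minute at these values) to be nearly imperceptible moment to moment.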
The piece is conceived as a massive, ever-changing sonic system that, while it should be identical if one returns to the same spot at the same time on the same day, is never exactly the same. It can act as a tool for opening one’s experience of sound. It can serve as a meditation aid. It can alter one’s perception of sound after a while…
The sound walk is a GPS-enhanced software application that senses the participant’s movement and location and plays synthesized sounds at specific points throughout the landscape. Visitors are encouraged to download the app in advance of their visit and to wear headphones for optimal listening. Be mindful of motorists, bicyclists, and other pedestrians when in proximity to roads or parking lots. You may have to adjust your volume at times; for best results, start at a moderate level.
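The core of such a location-aware sound walk is a test of whether the listener is inside a region. A minimal sketch, assuming circular regions defined by a center point and radius (the coordinates and radius below are made up for illustration, not the piece’s actual regions):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/long points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def active_regions(lat, lon, regions):
    """Return the regions whose circular boundary contains the listener."""
    return [reg for reg in regions
            if haversine_m(lat, lon, reg["lat"], reg["lon"]) <= reg["radius_m"]]
```

In a running app, this check would be fed by the phone’s GPS updates, and the returned regions would drive which sounds are mixed in.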
Further information about the project may be found at:
Important: the app is configured to continue running in the background. When you’re done, please press “stop tracking” on the TRACKING screen to preserve battery life.
Created by Tom Stoll.
Mobile app development: Tom Stoll.
“The Wheel” was created for the 2nd annual Deep Listening Conference at EMPAC, July 2014.
Corpus-based Processing for Python/SuperCollider
Available from GitHub. This is the Python implementation, intended to mirror CBPSC as much as possible. This software is under heavy development. Full documentation and lots of examples coming in December!
- Python - I use version 2.7.2, bundled with Mac OS 10.8.*
- NumPy, IPython, Matplotlib - use the Superpack
- sc-0.3.1 - SuperCollider library; not strictly necessary, but handy to have
- jsonpickle-0.4.0 - JSON library
- Bregman Toolkit - using Bregman’s imagesc function for the time being
- SuperCollider - synthesis engine
Installation and use
- cd into the appropriate directories and run sudo python setup.py install
- from corpusdb import *
- look at the examples folder for some example corpora and tasks
Here’s what I do when I have a bunch of aif files that need to become wav files. You will need SoX to do this.
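The conversion itself can be scripted; the exact command the author uses isn’t shown in the post, but here is one plausible version, driving SoX’s command-line tool from Python. It assumes `sox` is installed and on your PATH, and converts every `.aif` file in the current directory:

```python
import glob
import os
import subprocess

# Batch-convert AIFF files to WAV with SoX. This is a sketch, not
# necessarily the author's exact command; requires `sox` on your PATH.
for aif in glob.glob("*.aif"):
    wav = os.path.splitext(aif)[0] + ".wav"  # foo.aif -> foo.wav
    subprocess.check_call(["sox", aif, wav])
```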
I composed this piece for members of Tilt Brass this past spring. Instrumentation: 2 trumpets, horn, 2 trombones, tuba.
ImageDB is corpus-based image retrieval for OpenFrameworks/OpenCV. It was developed on OS X, but there is no reason it cannot be used on Linux or Windows.
At this point, I do not have time to do much with this software. I will attempt to make some documentation and tutorials/examples later this year. Get in touch with me if you have any questions, or need help getting up and running.
This is a piece I composed this past fall for Tigue, a percussion trio comprised of Amy Tigue, Matt Evans, and Carson Moody.
I am pleased to announce that the first bits of ACTION have been published here. Thus far, the code consists of two classes that analyze color histograms and optical flow for frames of video. Much more to come in the weeks and months ahead.
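To give a sense of what a color-histogram feature looks like, here is a toy, pure-Python version that quantizes each pixel’s RGB values into buckets and counts them. This is only an illustration of the kind of per-frame feature involved, not the ACTION code, which operates on real video frames:

```python
def color_histogram(frame, bins=4):
    """Toy per-frame color histogram.

    frame: iterable of (r, g, b) tuples with 0-255 channel values.
    Returns a flat list of bins**3 counts, one per quantized color bucket.
    """
    hist = [0] * (bins ** 3)
    step = 256 // bins  # width of each channel bucket
    for r, g, b in frame:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    return hist
```

Comparing such histograms frame to frame is one common way to detect cuts and color shifts in video analysis.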
Please note: this is a guide mainly intended for students in Teri Rueb’s graduate seminar at UB. I [Tom] will endeavor to keep this document as current as possible with respect to the latest version of the iOS library/app.
We are making steady progress toward making the soundScape Toolkit ready for public release. If you are interested in trying the software, let us know, and we will give you access to the GitHub code. The following is an overview of how one would develop a soundScape composition. (For now we will assume that you are using the ad hoc distribution, which requires joining our development team as a tester.)
If you were at ISEA 2012, you may have already seen or experienced this. If you are in the Santa Fe area, you can still experience this (see the link)…
I programmed the app that underlies No Places with Names, a large soundscape composition composed by Teri Rueb et al. The underlying software allows a composer to map sound files to locations, so that a file plays back when a user with a smartphone is in the corresponding place. I will be porting the software to Android this winter and adding functionality to both the Android and iOS versions as I go.
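The mapping from places to sound files can be sketched as a simple lookup table. The zone names, coordinates, and file names below are made up for illustration; the real app’s data format is not shown in this post:

```python
# Each zone: a lat/long bounding box (lat_min, lat_max, lon_min, lon_max)
# paired with the sound file the composer mapped to that zone.
ZONES = [
    {"name": "meadow", "bounds": (42.729, 42.731, -73.680, -73.678),
     "sound": "meadow_loop.wav"},
    {"name": "ridge", "bounds": (42.732, 42.734, -73.684, -73.682),
     "sound": "ridge_drone.wav"},
]

def sounds_at(lat, lon, zones=ZONES):
    """Return the sound files mapped to the listener's current location."""
    return [z["sound"] for z in zones
            if z["bounds"][0] <= lat <= z["bounds"][1]
            and z["bounds"][2] <= lon <= z["bounds"][3]]
```

In the actual app, the phone’s location updates would drive this lookup continuously, starting and stopping playback as the user crosses zone boundaries.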
An open source release of the underlying software will happen within the next few months. Stay tuned, and email me if you’re interested in finding out more!