Hey guys, welcome to the Dave Sharples Senior Design Project Blog!
Ooh, first off, here's my git repo: https://github.com/dsharps/seniordesign
So the short-term goal is to get the interface design work out of the way. I'm trying to define the ideal (within reason) interface for this instrument, though it'll likely require some tweaks due to technical limitations down the road. I mentioned in my proposal and presentation that there are three main areas for improvement that I've identified with electronic instruments. The first is the physical layout of the keys/buttons/note-activating-devices. The second is the expressiveness of the instrument - the number of channels of input that influence the sound. The third is the visual design and general charisma of the interface.
The physical layout problem is an interesting one. Many electronic instruments are designed after traditional acoustic instruments. This is explained in more detail in the proposal, but essentially, the non-isomorphism and physics-driven nature of traditional musical interfaces adds a lot of cognitive load for the user and makes learning the instrument more challenging for beginners.
Isomorphic layouts are growing in popularity. Some examples (the specifics of each instrument aren't important, but notice the hexagonal layouts):
If we step away from musical instruments for a second and just think about isomorphism in geometry, there's some interesting math we can employ here. What we're essentially trying to do is find a 2D tiling of regular shapes, which we'll then use as a lattice of musical notes. There are only three regular polygons that tile the plane: triangles, squares, and hexagons. Triangles have the disadvantage of needing to alternate orientation to tile properly, so we'll just consider squares and hexagons.
Once we have a 2D tiling, we have to assign notes to the individual cells, bringing up the question of which note goes where. The point of having an isomorphic interface is that musical structures (like a major chord) will manifest with the exact same fingering regardless of the 'root note' - the base of the chord or scale. What's cool is that we can assign arbitrary relationships between adjacent cells. As an example, think of a square grid. We could define a simple mapping of cells to notes such that if you start on an arbitrary square, the square to the right is +1 semitone, the square above is +1 semitone, and the squares to the left and below are -1 semitone. Similarly, we could define another mapping that has a horizontal interval of +/- 1 semitone and a vertical interval of +/- 7 semitones, which is convenient for playing chords.
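To make that concrete, here's a tiny sketch of the square-grid case (the function names and the choice of middle C as origin are mine, not from any particular instrument). The whole point of isomorphism falls out of the arithmetic: a cell's note depends only on its position and the two interval choices, so any chord shape transposes by simply sliding it around the grid.

```python
# Square-grid isomorphic mapping: +dx semitones per column, +dy per row.
def note_at(x, y, base=60, dx=1, dy=7):
    """MIDI note number for grid cell (x, y)."""
    return base + x * dx + y * dy

# With dx=1, dy=7, a major triad is the same finger shape everywhere:
# root at (x, y), major third at (x-3, y+1), fifth at (x, y+1).
def major_triad(x, y):
    return [note_at(x, y), note_at(x - 3, y + 1), note_at(x, y + 1)]

# The interval pattern (0, 4, 7) is independent of which cell is the root:
shape_c = [n - note_at(0, 0) for n in major_triad(0, 0)]
shape_e = [n - note_at(4, 2) for n in major_triad(4, 2)]
```

Both `shape_c` and `shape_e` come out as `[0, 4, 7]` - the same fingering, just translated.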
I found an amazing paper that exhaustively examines every possible geometric tiling combined with every possible interval assignment. They prove in the paper that the various mappings can be uniquely defined by the intervals in the horizontal and vertical directions - the rest of the grid can be derived from those simple rules (even if the grid is hexagonal). They generated every possible tiling, and examined each one for its utility in playing both melodies (based around single notes), and harmonies (based around playing chords).
The conclusion that Maupin, Gerhard, and Park came to is that some mappings are clearly easier to use than others. They identified two mappings for square isomorphisms and about four mappings for hexagonal isomorphisms that have distinct advantages. Most of the isomorphic instruments around today are hexagonally isomorphic, and most of them employ the Harmonic Table layout. I previously mentioned the Rainboard (pictured above), which pairs with the Musix iOS application to allow generation and use of arbitrary tilings that drive an onboard software synthesizer. The Rainboard is intended to allow experimentation with various isomorphisms, and having played with it, I can see some great advantages to these kinds of interfaces.
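The "derive the whole grid from two intervals" result holds for hexagons too. Here's a minimal sketch of the Harmonic Table idea using my own coordinate convention (counting steps up-right and up-left from an origin cell - not the paper's notation): up-right is a major third (+4), up-left a minor third (+3), so straight up works out to one of each, a perfect fifth (+7).

```python
# Harmonic Table sketch: address each hex by (a, b) = steps up-right, up-left.
def harmonic_table_note(a, b, base=60):
    return base + 4 * a + 3 * b

# A major triad is root, up-right neighbor, then the cell above the root -
# a compact cluster of three adjacent hexes, regardless of where the root sits.
def ht_major_triad(a, b):
    return [harmonic_table_note(a, b),
            harmonic_table_note(a + 1, b),
            harmonic_table_note(a + 1, b + 1)]
```

Starting from the origin, `ht_major_triad(0, 0)` gives `[60, 64, 67]` - a C major triad - and the same three-hex cluster anywhere else on the board is still a major triad.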
I think for simplicity, I'll select one of these existing isomorphic layouts and work to extend it. Currently these instruments are all grids on boards. I think one of the issues with the interfaces is that it's a little overwhelming to see all the keys at once, and it's difficult to play without looking at the instrument. Beyond the physical layout of the keys, I intend to employ a good deal of sensor technology to make it really expressive. These are my thoughts so far:
A pressure-sensitive touch pad would allow three dimensions of control: the X/Y finger position, and the Z variable for pressure. A near-field optical reflector could allow a sort of short-range distance measurement that would be useful in determining hit velocity. It would also allow some method of control involving waving one's hand above the instrument. It could also be interesting to modulate the sound somehow through use of a pressure-sensitive breath pipe. This is not a new technology; it's been employed to great success in the Eigenharp. Breath is a high-resolution output channel for humans: we can control it very carefully and expressively, and it doesn't tie up our hands.
I'm working on sketches of the interface now, which I'll have completed for next week's post. These are my goals for the next week:
- Finish visual design concept
- Identify the sensor components necessary to implement the design concept
- Meet with Rahul Mangharam to discuss the project and potential implementation methods (already scheduled for Monday)
- Determine whether or not the Raspberry Pi is a suitable platform for this project (whether it's capable of handling the number of inputs required and the signal processing to make use of the data)
- Potentially purchase sensor components (I'd like to start prototyping soon)
- Hook up a simple button to the Raspberry Pi and implement a 'hello world' for the MIDI protocol
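That MIDI 'hello world' in the last bullet really boils down to three bytes on the wire. Here's a hardware-free sketch of what the button handler would eventually send (on the Pi, these bytes would get written out to a serial or USB MIDI port - the exact output path is TBD):

```python
# A MIDI note-on is: status byte (0x90 | channel), note number, velocity.
def note_on(note, velocity, channel=0):
    return bytes([0x90 | channel, note & 0x7F, velocity & 0x7F])

def note_off(note, channel=0):
    return bytes([0x80 | channel, note & 0x7F, 0])

msg = note_on(60, 112)  # middle C, played fairly hard
```

So `msg` is just `b'\x90\x3c\x70'` - press the button, send that; release it, send the matching note-off.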
This week is the first real week of the project, I suppose. I spent most of my time reading papers and writing my proposal; this next week is going to be the hit-the-ground-running week.