Beatsurfing: FastCoDesign Interview
Interview done in preparation for this article in FastCoDesign: "Beatsurfing iPad App Lets You Design a Frankenstein Music Instrument".
Q: Mark Wilson
A: The Beatsurfing team.
1. What was the inspiration for beatsurfing?
Herrmutt Lobby have been developing and using modded yet powerful hardware controllers (that were already using human movement in a creative way) for years, along with programming acclaimed Live-MAX/MSP patches and bits of other software. The Beatfader was, in a way, already a 'single-axis Beatsurfing', allowing the user to create a beat, complete with different steps and variations, using a simple fader. No buttons, no pads, just a fader.
But until a few months ago, none of the available technologies enabled them to use software to create controllers from the ground up, with total freedom of movement and acceptable latency/response. This changed only recently with the arrival of the iPad and other touchscreen tablets, which opened entirely new ground for thinking about how you could create your controller as if you were drawing a map and letting your fingers explore it.
2. As I understand it, you actually design and play your own midi instrument on screen?
Yes. The Beatsurfing App has 4 objects available: the Line, the Polygon, the Fader and the Circle. Each of them has special 'features' (explained at the bottom of this document). You use those objects to create swarms, groups of objects that you can arrange in layers, duplicate, etc., enabling you to create unique MIDI controls. As you can link objects together and create interactions between them ("Behaviours", of which an example is available here: https://vimeo.com/40236917), there's also more to a scene than meets the eye.
Here is how a simple scene could be built, step by step.
- Drag objects onto an empty scene, maybe just 2 or 4, and modify them (size, color, 3-dimensional positioning); set the MIDI attributes and global settings for each one.
- Then start to add some retro-actions ("behaviours") between objects.
- Slide your fingers along the swarm you designed and turn this MIDI flow into sounds using any MIDI software or MIDI-compatible hardware.
- Return to edit mode at the tap of a finger to add objects, tweak behaviours, set colors, etc. You build the track and its Frankenstein controller in the same creative process.
An important point is that you don’t need a desktop editor to do that, it’s all embedded in the app.
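To make the scene-building steps above concrete, here is a rough data-model sketch. Beatsurfing's actual internals are not public, so every class, field, and behaviour name below is invented purely for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class MidiSettings:
    channel: int = 1      # MIDI channel 1-16
    root_note: int = 60   # middle C

@dataclass
class SceneObject:
    kind: str             # "line", "polygon", "circle", or "fader"
    position: tuple       # (x, y, layer) -- 3-dimensional positioning
    color: str = "white"
    midi: MidiSettings = field(default_factory=MidiSettings)
    behaviours: list = field(default_factory=list)  # links to other objects

# Step 1: drag a few objects onto an empty scene and set their MIDI attributes
scene = [
    SceneObject("line", (10, 20, 0)),
    SceneObject("circle", (40, 20, 0), midi=MidiSettings(root_note=36)),
]

# Step 2: add a retro-action ("behaviour"): the line resets the circle's step
scene[0].behaviours.append(("reset_step", scene[1]))
```

From here, playing the scene would mean translating finger collisions with these objects into the MIDI flow described in the next step.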
With one finger tap you switch between EDIT and PLAY modes; EDIT mode is a hybrid of sorts, because you can still play while editing. This is crucial during the creative process: you can adapt very quickly and modify every part of your controller in a split second, which affects the creative flow in a very positive way.
3. Why leave design up to users?
The application has been designed around basic principles that are very simple to grasp. The fact that it's modular makes for an enormous number of possibilities. You could use it for music production and live performance (that's what it was developed for), but you can build on these very solid foundations and use the app to drive any MIDI-enabled device (VJ-ing, light shows, etc.). The point was to deliver an app with as few constraints as possible. Pre-programming scenes would have reduced the user's freedom and probably crippled the 'ease of creation' we wanted to put forward. We believe users will explore and find a thousand ways to use the app that we wouldn't even have thought about. We aim to integrate those newly found uses into future versions of the app.
4. How else is beatsurfing distinguished from all the apps in the space?
As with most iPad music apps, you can still play by hitting objects on the screen, but here it's more appropriate to slide over them with your fingers. That's actually why we call it Beatsurfing: you can easily play a complete piece of music guided by your feelings and intuition, and reinterpret it in many ways.
The iPad is not accurate as a percussion instrument. If you want to play by hitting any type of interface, you should really be hitting pads or drums. But developing a piece of music by surfing your fingers on the screen and putting your sounds and controllers anywhere on the screen at the same time as you build your track is something really fresh. When objects can interact with each other, it becomes an organic environment. Surfing along the screen to generate music with great freedom is the main idea here.
Beyond the features already explained above, there's also the fact that this app was made by musicians (Herrmutt Lobby) and is released on a music label (Vlek; the app is 'VLEK08/APP' in their catalogue).
5. It feels like we're moving toward more and more abstract instrument design (as electronics take over from analog), would you agree?
It's a tendency in recent instrument design that, we think, was also made possible by the medium itself. The iPad is relatively new, and it provided the first usable touchscreen interface with sufficient reactivity.
Basically, we think the main goal has remained unchanged since people first began using technology to create sound: it's simply about making music. To do that, they all took advantage of the techniques available at the time. While the iPad has been used to emulate a lot of analog hardware and existing software during its short history, we thought it was important to take a step back and see it as a blank sheet.
Regarding the development of Beatsurfing, the goal remained unchanged, as we said; the question was how we could use the available technology, embed parts of everything we already knew (hardware controls, modded faders, software patches, etc.) and confront them with the new possibilities of the iPad in order to create something fresh, free of the constraints of the 'physical world'.
We think it’s the same thing that happens with all sorts of evolution, in language, life or music... Nothing comes ‘out of nothing’, and it’s not really interesting to just copy/paste old ideas in a new environment. It’s about using abstractions of existing ideas to create new cultural forms.
BEATSURFING - THE 4 OBJECTS EXPLAINED
All four objects are very simple operators if you consider them separately. It is also important to highlight that each operator can be assigned "behaviours".
The Line can trigger a single MIDI note, defined as the "root note." Each time your finger collides with a Line, the MIDI note message is sent. There is also directional detection, which you can enable or disable. When enabled, it detects the direction in which your finger passes through the Line: sliding from right to left triggers the root note, while sliding from left to right triggers the next note relative to the root.
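The Line's behaviour can be sketched as a small function that maps a finger crossing to a raw MIDI note-on message. This is an illustrative guess at the logic, not Beatsurfing's code; in particular, "next note relative to the root" is assumed here to mean one semitone up. The byte layout follows the standard MIDI note-on message (status `0x90 | channel`, note, velocity):

```python
def line_hit(root_note, direction, directional=True, channel=0, velocity=100):
    """Return raw MIDI note-on bytes for a finger crossing a Line.

    direction: "right_to_left" or "left_to_right".
    With directional detection off, the root note always fires; with it
    on, a left-to-right crossing sends the next note above the root
    (assumed to be one semitone up in this sketch).
    """
    note = root_note
    if directional and direction == "left_to_right":
        note = root_note + 1  # "next note relative to the root"
    return bytes([0x90 | channel, note, velocity])

# Crossing right-to-left fires the root note (middle C = 60)
assert line_hit(60, "right_to_left") == bytes([0x90, 60, 100])
# Crossing left-to-right fires the next note
assert line_hit(60, "left_to_right") == bytes([0x90, 61, 100])
```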
The Polygon is very similar to the Line: each segment of the polygon has the same characteristics as the Line operator.
The Circle is more or less a sequencer. You can define from 1 to 16 steps, each of which triggers a MIDI note. Each time you collide with a Circle, the current step advances and triggers the next MIDI note. Other objects in the scene can be configured to reset the Circle or set it to a particular step.
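The Circle's step-sequencer behaviour could look roughly like this (a hypothetical sketch based on the description above; whether the first collision plays the first or second step is an assumption):

```python
class Circle:
    """Sketch of the Circle as a step sequencer (1 to 16 steps).

    Each collision returns the current step's MIDI note, then advances
    and wraps around. Other objects can reset it or set a specific step.
    """
    def __init__(self, notes):
        assert 1 <= len(notes) <= 16
        self.notes = notes
        self.step = 0

    def collide(self):
        note = self.notes[self.step]
        self.step = (self.step + 1) % len(self.notes)  # wrap around
        return note

    def reset(self, step=0):
        # A "behaviour" from another object can call this
        self.step = step

c = Circle([36, 38, 42, 46])               # e.g. kick, snare, two hats
hits = [c.collide() for _ in range(5)]     # wraps after the 4th step
# hits == [36, 38, 42, 46, 36]
```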
The Fader sends MIDI Control Change or Pitch Wheel messages.
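A Fader's output can be sketched as mapping a position in the range 0.0 to 1.0 onto the two message types named above. The controller number and scaling here are assumptions for illustration, but the byte layouts are the standard MIDI Control Change (7-bit value) and Pitch Wheel (14-bit value split into two 7-bit bytes) messages:

```python
def fader_cc(position, controller=1, channel=0):
    """Map a fader position (0.0-1.0) to a MIDI Control Change message."""
    value = round(position * 127)        # CC values are 7-bit (0-127)
    return bytes([0xB0 | channel, controller, value])

def fader_pitch_wheel(position, channel=0):
    """Map a fader position (0.0-1.0) to a 14-bit Pitch Wheel message."""
    value = round(position * 16383)      # 14-bit range; 8192 is centre
    lsb, msb = value & 0x7F, value >> 7  # split into two 7-bit bytes
    return bytes([0xE0 | channel, lsb, msb])

assert fader_cc(1.0) == bytes([0xB0, 1, 127])        # fader fully up
assert fader_pitch_wheel(0.5) == bytes([0xE0, 0, 64])  # wheel centred
```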