BEATSURFING: THE CREATORS PROJECT INTERVIEW / FULL TEXT
Written interview prepared for the article "The Future of Music Production Comes Early" on The Creators Project (a VICE/Intel venture).
Q: Abdullah Saeed, The Creators Project
1) Why is an interface like this necessary for today's producers? What do you hope to achieve with Beatsurfing?
We can’t really say it’s necessary, but we believe the ‘real time’ and spontaneity aspects of the app can radically change the way you think about producing music. By ‘real time’ we mean that the music can become an extension of physical movement: the core concept of Beatsurfing is really about movement, and you play it by sliding your finger(s) across the screen.
We are very excited to discover the ways users will take advantage of the app. We’re sure it will be a great source of inspiration for our own way of thinking about music, and we expect it will also give us ideas about the direction we need to take the app in the future.
2) Can you describe what's happening onscreen in this video? Which shapes represent which types of sound? Which colors? Is there a pattern to this?
First, we think it’s important to highlight that the application is designed to produce a real-time MIDI flow. The MIDI output is the instantaneous result of the confrontation of three things:
1. The architecture of the scene (geometry)
2. The rules you set between objects (systemic, cybernetic)
3. The way you surf along the screen (time, space, interpretation)
As with most iPad music apps, you can still play by hitting objects on the screen, but here it's more appropriate to slide over them with your fingers. That's why we call it « beatsurfing »: you can easily play a complete piece of music guided by your feelings and intuition, and reinterpret it in many ways.
The iPad is not accurate as a percussion instrument; if you want to play by hitting an interface, you should really prefer hitting pads or drums. But developing a piece of music by surfing your fingers over the screen, placing your sounds and controllers anywhere on the screen as you build your track, is something really fresh. When objects can interact with each other, it becomes an organic environment. Surfing along the screen to generate music with great freedom is the main idea here.
The video we’re releasing here shows a particular use of Beatsurfing, in which basic interactions are set between objects. Herrmutt Lobby used the app to create modular synthesis on a simple beat: kick, snare, and voice (the voice of NON - ShadowHuntaz, a long-time Herrmutt Lobby collaborator, by the way).
fig.1: Beatsurfing iPad screenshot
A & B are SNARES
C & D are KICKS that turn into BASS when held.
E moves the playback head along the first voice sample WAV (fig.2).
F & G change the size of the loop around the playback head (fig.2).
H changes the PITCH on C & D (when used as BASS)
I is not used in this video; all faders should be set to the center position.
J is a fader that controls the playback head reading another voice sample WAV, launched by K.
K launches the voice sample WAV controlled by J.
fig.2: Ableton Live screenshot.
3) How can the Beatsurfing app interact with an existing DAW?
It’s been developed for music production and performance; it can control anything that is MIDI-enabled, digital or analog: synths, VJ software, lighting systems... you name it.
4) Can you describe the characteristics of the Line, the Circle, the Polygon, and the Fader?
The app features four different "operators" that we call Line, Circle, Polygon, and Fader. Each of these operators has different characteristics, which allows you to build your own MIDI instruments from the ground up by combining them in creative ways. The distribution of the objects on the scene and the trajectory of your fingers then help reintroduce variety and a human feel to your performance.
Because the idea here is to slide your finger across the surface, we can start to think about interesting ways to place operators on the scene, such as overlapping them.
This concept is very interesting because you can add many layers. For example, you might have a few Line objects that trigger drums, and a Fader underneath those Lines that drives a filter; sliding your finger from Line to Line to play a rhythm then implicitly collides with the Fader underneath at the same time. This can create interesting variations that would be harder to achieve otherwise.
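The layering idea can be sketched as a simple hit test, where one finger position collides with every stacked object it overlaps. This is an illustrative sketch, not the app's code; the dictionaries and the `colliding` helper are hypothetical stand-ins for Beatsurfing's objects.

```python
# Hypothetical sketch of overlapping layers: a single finger position can
# collide with several stacked objects at once (axis-aligned boxes here).

def colliding(objects, x, y):
    """Return every object whose bounding box contains the point (x, y)."""
    return [o for o in objects
            if o["x0"] <= x <= o["x1"] and o["y0"] <= y <= o["y1"]]

# A thin Line sitting on top of a larger Fader area.
line = {"name": "line", "x0": 0, "y0": 0, "x1": 10, "y1": 2}
fader = {"name": "fader", "x0": 0, "y0": 0, "x1": 10, "y1": 10}

# A finger at (5, 1) is over both layers, so both would react.
hits = colliding([line, fader], 5, 1)
```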
All four objects are very simple operators if you consider them separately. It is also important to highlight that each operator has a set of “behaviors” it can be assigned.
The Line triggers a single MIDI note defined as the “root note”. Each time your finger collides with a Line, that MIDI note message is sent. There is also direction detection, which you can enable or disable. When enabled, the Line detects which direction you pass through it in: it triggers the root note if you slide your finger from right to left, or the next note above the root if you slide from left to right.
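As a rough sketch of that logic (a hypothetical helper, not Beatsurfing's actual code; it assumes MIDI note numbers and a horizontal movement delta `dx`):

```python
# Illustrative sketch of the Line operator's direction detection.
# Names and signature are assumptions, not the app's API.

def line_trigger(root_note, direction_detection, dx):
    """Return the MIDI note a Line sends when a finger crosses it.

    dx is the finger's horizontal movement: negative means right-to-left,
    positive means left-to-right. With direction detection disabled, the
    root note is always sent.
    """
    if not direction_detection:
        return root_note
    # Right-to-left sends the root note; left-to-right sends the next note.
    return root_note if dx < 0 else root_note + 1
```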
The Polygon is very similar to the Line: each segment of the Polygon has the same characteristics as the Line operator.
The Circle is more or less a step sequencer. You can define from 1 to 16 steps, and each step triggers a MIDI note. Each time you collide with a Circle, the current step is incremented and, as a result, the next MIDI note is triggered. Other objects in the scene can be configured to reset the Circle or set it to a particular step.
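A minimal sketch of that step-sequencer behavior (an illustrative class, assuming each collision sends the current step's note and then advances; method names are not the app's API):

```python
# Hypothetical sketch of the Circle operator as a step sequencer.

class Circle:
    def __init__(self, notes):
        # The app allows between 1 and 16 steps; each step holds a MIDI note.
        assert 1 <= len(notes) <= 16
        self.notes = notes
        self.step = 0

    def collide(self):
        """Each collision triggers the current step's note, then advances."""
        note = self.notes[self.step]
        self.step = (self.step + 1) % len(self.notes)
        return note

    def set_step(self, step):
        """Other objects in the scene can reset or set a particular step."""
        self.step = step % len(self.notes)
```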
The Fader sends MIDI Control Change or Pitch Wheel messages.
You can define:
- a minimum and maximum value from 0 to 127.
- the “attack”, which is the time it takes to go from the current position to the new position.
- the “return value”, which lets you define the value the Fader returns to as soon as your finger releases it.
- the “release”, which is the time it takes to go from the latest position to the return value. Combining the attack, release, and return value parameters lets you, for example, recreate a pitch wheel control.
- the “definition”, which represents the number of steps shown on the Fader. Assuming a Fader ranging from 0 to 127, if you set the definition to 3, pressing the first step triggers the value 0, the second step triggers 64, and the third triggers 127.
If an attack or a release is defined, the Fader tweens smoothly from value to value.
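The “definition” quantization and the attack/release tween could be sketched like this (illustrative Python with linear interpolation, not the app's implementation; `quantize` and `tween` are hypothetical names):

```python
# Sketch of two Fader parameters: "definition" quantization and tweening.

def quantize(step_index, definition, lo=0, hi=127):
    """Map one of `definition` evenly spaced steps to a value in [lo, hi]."""
    if definition <= 1:
        return lo
    return round(lo + step_index * (hi - lo) / (definition - 1))

def tween(start, target, duration, elapsed):
    """Linearly interpolate from start toward target over `duration` seconds.

    Used for both attack (toward a new position) and release (toward the
    return value); duration 0 jumps immediately.
    """
    if duration <= 0 or elapsed >= duration:
        return target
    return start + (target - start) * (elapsed / duration)
```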
Then we have the notion of Behaviors, where each type of object has a defined set of them.
For example, one of the Circle’s Behaviors is called “Clock Direction”. It can be activated by any other object on the scene and can be set to “Flip-Flop”: each time the behavior is triggered, the Circle it is associated with switches its playback direction from clockwise to anticlockwise, and vice versa. See the explanation of the ‘Granular Synthesis’ scene (fig.3) for another example of Behaviors.
5) Is Beatsurfing built for a certain style of music?
Not at all; we think Beatsurfing is a tool that can complement any setup. We hope to see musicians using Beatsurfing in any style of music, from jazz to rock to techno...
In fact, even though it’s built with music production in mind, we believe Beatsurfing will also be used for things other than music; maybe for controlling lights, or visuals.
6) How's beta testing going? What are some of the improvements being made before launch? And when is the launch exactly? Where will you be taking the app?
We are currently very busy preparing the launch of the private beta release in about a week; subscriptions are still open here. All features are implemented by now, so our main focus will be on making the app as stable as possible while gathering as much feedback as possible from our beta testers. If interesting enhancements are highlighted, we will try to squeeze them into the 1.0 release.
We can’t provide a precise release date, but we are committed to having it on the App Store by the end of spring.
We have many ideas and many possible directions. We currently have a roadmap of a few dot releases with features we could not deliver in time for the first release. We embrace lean-startup principles and drive a culture of fast innovation and pleasing our users. Our next step, after the release, will be to watch how people actually use Beatsurfing and collaborate with them to develop the roadmap.
7) Where did this idea come from?
Herrmutt Lobby have been developing and using modded yet powerful hardware controllers (which were already using human movement in a creative way) for years, along with programming acclaimed Live-MAX/MSP patches and bits of other software. But none of the available technologies enabled them to use software to create controllers from the ground up, with total freedom of movement and acceptable latency and response. That changed only recently with the arrival of the iPad and other touchscreen tablets, which opened totally new grounds for thinking about how you could create your controller as if you were drawing a map and letting your fingers explore it.
As a result, Beatsurfing combines many technologies the Lobby has developed over the years, plus a number of entirely new ones inspired by the tablet’s touchscreen capabilities.