Rubato
A shared-screen visualisation platform to accompany live music.
Above, a video of Rubato in action at one of Richard’s concerts.
Rubato was made in collaboration with musician and technologist Richard Birkin. It choreographs text and images in time with live performance, across as many screens as the performer would like - typically, the individual smartphones of an entire audience - and puts the performer first, synchronising the visuals and words to their music, rather than the other way around.
Richard had previously explored choreographing text and imagery in time with music on Night Sun, which synchronised poetry and photography against a musical soundtrack, playing out through the browser.
Richard’s Songs for Spoken Words project developed from a series of pieces originally improvised around live poetry performed by Michael Frearson. After Michael stopped performing this poetry, Richard arranged the music for string quartet and guitar, and used a platform similar to Night Sun to choreograph the poems’ text alongside the music. The music became a soundtrack for the choreographed text of the poems. This was released online and as an app.
Richard wanted to be able to take this experience on tour, so that audiences could see the poetry that inspired the music they were hearing.
We worked on building a platform that would deliver that experience to a live audience. Richard noted that so many audience members at gigs look at their phones anyway - so why not actively encourage them to do so? We’d display the poetry across all the individual devices in the room, co-ordinated with the music.
As Richard pointed out at the time, if this were delivered on a single large screen, it begins to resemble a silent movie: text punctuating action. The moment it’s placed onto a tiny screen in your hand, it becomes very intimate: much more direct and personal, enhancing the feeling that the music is the soundtrack to the words you’re seeing.
I suggested that, to be true to the brief, it was vital the performer wasn’t doing the equivalent of pressing ‘play’ on a video: they needed control over the playback of the text-choreography. What if they wanted to extemporise, or take a piece a little slower? The textual choreography should be subservient to that; the visualisation should be part of the performance, not in control of it. Hence the name we arrived at for the project, derived from tempo rubato, the expressive manipulation of tempo by a musician for effect.
Rubato let Richard break the visualisations into scenes which he could then advance through by pressing a footswitch - much like a piece of presentation software. The software would wait at the end of a scene until Richard gave it the next cue to proceed - or, if he was coming up on a cue early, he could set it to carry on immediately, all whilst playing.
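To make that model concrete, a song can be thought of as a list of scenes, each a sequence of timed steps, with playback holding at a scene boundary until the next cue arrives. The sketch below is purely illustrative - the field names and structure are mine, not Rubato’s actual data format:

```javascript
// Hypothetical song description - an illustrative sketch of the
// scene/cue model, not Rubato's real schema.
const song = {
  title: "An example song",
  scenes: [
    {
      // Each step reveals a fragment of text at an offset (in ms)
      // from the start of the scene.
      steps: [
        { at: 0,    text: "The first line of the poem" },
        { at: 2400, text: "…and the line that follows it" },
      ],
      // By default, playback holds at the end of a scene until the
      // performer cues the next one; a flag like this could let an
      // early cue carry straight on instead.
      autoAdvance: false,
    },
    // …more scenes…
  ],
};
```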
Richard had already built the front-end HTML for the songs in his stand-alone app version of Songs For Spoken Words, and all the timings for the choreography were already prepared. I took that as a starting point for Rubato’s textual timing, adjusted the format to separate the code from the data (making it easier to add new songs in future), and built a small engine in node.js and socket.io to serve as a backend. It took advance cues from the musician and updated the display on audience members’ screens in real time. The whole project ran as a web application: the audience could visit our URL on any reasonably modern smartphone and start watching. A second, private URL gave Richard a ‘presenter remote’ on his stage laptop. There was no app to download, and thus no concerns about compatibility; it also meant code updates were as simple as pushing out another deployment to our server.
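The heart of a backend like this is just cue-relaying. Here is a minimal sketch in that spirit, using Express and socket.io - the event names, routes, and state shape are illustrative assumptions, not the production code:

```javascript
// Minimal cue-relay server sketch: the presenter page emits cues,
// and every connected audience device receives the updated position.
const express = require("express");
const http = require("http");

const app = express();
const server = http.createServer(app);
const io = require("socket.io")(server);

// Serves the audience page, the (private) presenter page, and assets.
app.use(express.static("public"));

// Current position in the set: which song, which scene.
let state = { song: 0, scene: 0 };

io.on("connection", (socket) => {
  // Late joiners get the current position straight away.
  socket.emit("state", state);

  // Only the presenter remote emits these cues.
  socket.on("cue:nextScene", () => {
    state.scene += 1;
    io.emit("state", state); // broadcast to every connected screen
  });

  socket.on("cue:nextSong", () => {
    state = { song: state.song + 1, scene: 0 };
    io.emit("state", state);
  });
});

server.listen(3000);
```

Because the clients simply render whatever state the server broadcasts, the performer stays in charge of pacing: nothing advances until a cue says so.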
I also built Richard a footpedal to perform with. The footpedal is incredibly simple: a pair of footswitches wired into a Teensy pretending to be a HID keyboard. Each pedal triggered a keyboard shortcut in the performer’s remote control: one to move to the next scene, and the other to move to the next song in the set. I housed this in a sturdy aluminium enclosure with a tough, braided USB cable to withstand the stage environment.
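Because the Teensy presents itself as an ordinary keyboard, the presenter remote only has to translate keystrokes into cues. A browser-side sketch, matching the event names in the server sketch above - the specific key bindings here are assumptions:

```javascript
// Presenter remote sketch: map pedal keystrokes to cue events.
// Assumes the socket.io client script is loaded on the page, and
// that the pedals send "PageDown" (next scene) and "n" (next song).
const socket = io();

document.addEventListener("keydown", (event) => {
  if (event.key === "PageDown") {
    socket.emit("cue:nextScene");
  } else if (event.key === "n") {
    socket.emit("cue:nextSong");
  }
});
```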
The audience at the Bethnal Green gig.
Richard played two gigs with the Iskra string quartet and Rubato - at St John on Bethnal Green and at the Regent Street Apple Store in London. (The video at the start of this page was filmed live at the St John concert.)
We had excellent feedback from both gigs. Rather than being a distraction, Rubato really seemed to engage the audience and have the effect we hoped it would - heightening the experience of the music. One striking piece of feedback came from a deaf audience member, who enthusiastically explained that the visual and textual component gave a great deal of meaning to their experience of the gig: a use-case we hadn’t even considered, but one that was very rewarding to hear about.