
It featured sounds generated live from the brainwaves of a blind singer, rhythm from the heartbeats of audience members, music grabbed from the air by a cyborg using her microchip implants, drumming (and more) by computer games producers, complex coding and datastream management by experts from Finland, Iran, Portugal and beyond, stunning pixel-mapped visuals, and groundbreaking registration of complex creative intellectual property rights, live and in real time, on the blockchain.

Vahakn from Human Instruments came up with the idea of a device that would enable classically trained singer Riikka Hänninen, who has been blind from birth, to digitally manipulate her voice live on stage in response to sibilance, plosives and other vocalisations.
Using code created and contributed by the Bela community, the #MTFLabs team built the device, which took the incoming vocal signal, analysed it in real time for pitch information, and sent that information to a synthesizer. The vocalised triggers were then used to reshape the synthesized tones, creating a kind of glitchy, playable vocoder in which the glitches are controllable by the voice alone.
What made this live, performance-based responsive processing possible was Bela’s incredibly low-latency audio processing. Huge thanks to our partners at Bela for their contribution to the success of the #MTFLabs in Helsinki.