“Really happy to be joining the awesome #MusicBricks project. The whole concept of making tools freely available, providing space for makers and hackers and then supporting what’s made with them is just brilliant. This is a new cooperative Win strategy. Looking forward to see what comes out of our #MusicBricks offerings. Onwards!”

Matt Black, Coldcut / Ninja Tune



What is #MusicBricks?

#MusicBricks is a set of interoperable tools for hackers and creative developers. Built upon high-level research coming out of Europe’s finest institutions, the technologies are bundled and packaged into simple-to-deploy APIs, GUIs and TUIs that enable creative and technical minds to create new music projects, products and performances that might otherwise have been unimaginable.

New creative projects and startups

Since its development as a toolkit in early 2015 as part of a European-funded project, #MusicBricks has expanded from a set of 8 technologies to 15 with the addition of new tools from commercial partners. These organisations have opened up their IP to allow creative developers to integrate their technologies with others and build them into new inventions. The tools have been tested in hackathons and creative laboratories at events right across Europe, and winning projects that use #MusicBricks have been supported and incubated to commercial prototype.

Powerful tools for music innovation

As #MusicBricks enter the market, music hackers, professional developers and hobbyists will have a unique and ever-expanding set of incredibly powerful tools that include melody extraction, zero-configuration mobile tempo sync, pitch detection, mood and musicological analysis, and even a postage-stamp-sized microboard for connected, gestural interaction with musical performance.

Have a look below at some of the incredible projects that have been developed using the #MusicBricks toolkit.



“I have managed to code the machine learning to recognise 3 different gestures with accuracy close to 100% and I’m so excited about it. I used your suggestions… and now, the machine recognizes the static data up to 100% accurate… It is so cool. I am loving this Machine Learning algorithms. Thank you for your help…”

Rojan Gharibpour, Incubatee



#MusicBricks Results

The success of the #MusicBricks project has been demonstrated not only in the brilliant results created by teams of creative developers at hackathon events and creative test beds throughout Europe, but also in the incredible level of interest that has been generated. Requests for the toolkit at events have come from around the world, from New York to New Zealand. The impact on social media during the first 18 months of testing, as a European Innovation Action project designed to bring academic research to the marketplace, has been astonishing.

Innovation using #MusicBricks

11 seed prototypes were incubated using the #MusicBricks toolkit - ranging from data-enhanced products and hardware to SaaS.

Three were chosen to be incubated to prototype at #MTFCentral on 18-20 September 2015 by judges who included some of music's great minds: Graham Massey (808 State), Stockhausen collaborator and composer Rolf Gehlhaar, Matt Black (Coldcut), producer and inventor Håkan Lidbo and British composer, recording artist and multi-instrumentalist Nitin Sawhney.

At the Sonar +D Music Hack Day in Barcelona, a special #MusicBricks workshop was held on the day before the MHD event. From the high number of projects that used #MusicBricks, four winners were chosen to develop their ideas further.

#MusicBricks tools made their premiere at #MTFScandi on 28-30 May 2015. Four great seed ideas using #MusicBricks were chosen for further development by our judges, who included Warner Music’s Head of Technical and Creative (Digital) Josh Saunders, Coldcut’s Matt Black, vocal sculptor and recording artist Jason Singh, Deerlily Music CEO Paul Sonkamble and BBC Click presenter, musician and music hacker LJ Rich.

Read more about each of the incubated #MusicBricks projects below.

#MusicBricks in social media

#MusicBricks were developed as a set of tools created by wrapping the results coming out of top research institutions into easily usable APIs, GUIs and TUIs for hackers and creative developers in hackathons. The first creative testbed for these tools came in month 5 of the project, when the online buzz began.

In month 9 of the project alone, the #MusicBricks hashtag registered over half a million impressions on Twitter, and in month 18 that number was over five and a half million. On Facebook, the #MusicBricks community has grown to over 1200 members.


#MusicBricks awards

Perhaps the most resounding commendation for the #MusicBricks idea has come in the form of a European Innovation Luminary award for the project's Innovation Coordinator, Michela Magas.

#FindingSomethingBondingSound, one of the projects created using #MusicBricks and incubated during the testbed phase, received the Ars Electronica STARTS Prize Honourable Mention. Another incubated project, Dolphin, has filed a patent in Sweden for the innovative process created using the #MusicBricks toolkit.

Read more about the results of the #MusicBricks project.

Incubated projects



“#MusicBricks implicitly initiates a large network of people across many disciplines, with backgrounds as researchers, creators, artists, entrepreneurs, etc. This network provokes exchange of ideas far beyond the traditionally technology centred aspects that a technical university usually focuses on and therefore opens up inter-disciplinary research aspects that were not considered before.”

TU Wien, Project Partner




#MusicBricks are a compendium of physical and virtual interfaces and APIs that give creators, developers and digital content makers easy access to core building blocks of music.

Developed by some of the world’s best music tech research centres, #MusicBricks are available exclusively to creative developers, researchers and artists attending select Music Tech Fest and Sonar+D Music Hack Day events.

#MusicBricks Tools

Gesture Sensors for Music Performance
The R-IoT sensor module embeds a 9-axis, 16-bit sensor with 3 accelerometers, 3 gyroscopes and 3 magnetometers. It provides 3D acceleration, 3-axis angular velocity and absolute orientation at a frame rate of 200 Hz over WiFi. The core of the board is a Texas Instruments WiFi module with a 32-bit ARM Cortex processor that executes the program and handles the network stack. It is compatible with TI’s Code Composer and with Energia, a port of the Arduino environment for TI processors. The sensor module is completed with a series of analysis MaxMSP modules, based on the MuBu & Co Max library, that facilitate its use. This collection of analysis tools allows for filtering and analysing sensor data, computing scalar intensity from the accelerometer or gyroscope, kick detection, and detection of motion patterns such as freefall, spinning, shaking and slow motion. Further motion recognition tools are available in the MuBu & Co library.
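The scalar-intensity and kick-detection steps described above can be sketched in a few lines of numpy. This is an illustrative reimplementation, not the actual MuBu modules; the smoothing factor and threshold are made-up parameters.

```python
import numpy as np

def scalar_intensity(accel, alpha=0.9):
    """Scalar intensity: magnitude of frame-to-frame acceleration change,
    smoothed with a leaky integrator (alpha is an illustrative factor)."""
    diff = np.diff(accel, axis=0)          # (N-1, 3) per-axis deltas
    mag = np.linalg.norm(diff, axis=1)     # one scalar per frame
    out = np.empty_like(mag)
    acc = 0.0
    for i, m in enumerate(mag):
        acc = alpha * acc + (1.0 - alpha) * m
        out[i] = acc
    return out

def detect_kicks(intensity, threshold=0.5):
    """Frame indices where the intensity crosses the threshold upwards."""
    above = intensity > threshold
    return np.flatnonzero(above[1:] & ~above[:-1]) + 1
```

Feeding 200 Hz accelerometer frames from the board into `scalar_intensity` and thresholding the result is enough for a crude kick detector; the real modules add proper filtering and pattern recognition on top.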

Melody Extraction
This module includes a number of pitch tracking and melody transcription algorithms implemented in the Essentia library. Applications include visualisation of the predominant melody, pitch tracking, tuning rating and source separation.

Real-time Onset Description
This module detects onsets in real time and provides a number of audio descriptors. It is part of essentiaRT~, a real-time subset of Essentia (MTG’s open-source C++ library for audio analysis and audio-based music information retrieval) implemented as an external for Pd and Max/MSP. As such, the current version does not yet include all of Essentia’s algorithms, but offers a number of features to slice audio and provide on-the-fly descriptors for real-time classification. A number of extractors analyse instantaneous features such as the onset strength, the spectral centroid and the MFCCs over a fixed-size window of 2048 points after an onset is reported. Furthermore, essentiaRT~ can perform estimations on larger time frames of user-defined length, and report finer descriptions in terms of noisiness, f0, temporal centroid and loudness.

Sonarflow
A slick UI for browsing music by zooming into a colourful world of bubbles representing genres, artists or moods, letting users discover new music online from various sources. It is available for iOS and Android, with APIs connecting to 7digital, last.fm, YouTube, Spotify and more. A demo app is in the Google Play Store. See sonarflow.com and the source code on github.com/spectralmind.

Synaesthesia
A colour detection tool for triggering and controlling audio. Based on OpenCV, it includes a colour detection engine and an object tracking algorithm to control music and sound using plain colours or coloured objects. Using the camera, it can trigger musical events. It includes an example app with a dedicated GUI and is available for OSX and iOS. Find it on GitHub: https://github.com/stromatolite/Synaesthesia
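The core "plain colour triggers an event" idea can be sketched without OpenCV, using only Python's standard-library colorsys module. The hue tolerance and coverage threshold below are illustrative values, not those used by Synaesthesia.

```python
import colorsys

def hue_fraction(pixels_rgb, target_hue, tol=0.05):
    """Fraction of pixels whose hue lies within `tol` of target_hue
    (hues in [0, 1), wrapping around). Grey/dark pixels are ignored."""
    hits = 0
    for r, g, b in pixels_rgb:
        h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        d = min(abs(h - target_hue), 1.0 - abs(h - target_hue))  # hue wraps
        if d <= tol and s > 0.3 and v > 0.2:
            hits += 1
    return hits / len(pixels_rgb)

def colour_trigger(pixels_rgb, target_hue, coverage=0.5):
    """True when the target colour covers enough of the frame to fire an event."""
    return hue_fraction(pixels_rgb, target_hue) >= coverage
```

In the actual tool, the per-pixel loop is replaced by OpenCV's vectorised operations on whole camera frames, plus object tracking.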

Freesound API
Freesound (www.freesound.org) is a state-of-the-art online collaborative audio database containing over 200K Creative Commons licensed sound samples. All these sounds are annotated with user-provided free-form tags and textual descriptions that enable text-based retrieval. Content-based audio features are also extracted from the sound samples to provide sound similarity search. Users can browse, search and retrieve information about the sounds, find sounds similar to a given target (based on content analysis), retrieve automatically extracted features from audio files, and perform advanced queries combining content analysis features and other metadata (tags, etc.). The Freesound API is a RESTful API with clients in Python, JavaScript and Objective-C.
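A text search against the API can be sketched with the standard library alone. The endpoint layout follows Freesound's public APIv2 conventions, but check the official API documentation before relying on it; `token` stands for your own API key.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API_ROOT = "https://freesound.org/apiv2"   # Freesound APIv2 root

def text_search_url(query, token, fields="id,name,tags"):
    """Build a text-search request URL (token-based authentication)."""
    params = urlencode({"query": query, "fields": fields, "token": token})
    return f"{API_ROOT}/search/text/?{params}"

def text_search(query, token):
    """Run the search and return the decoded JSON (requires network access)."""
    with urlopen(text_search_url(query, token)) as resp:
        return json.load(resp)
```

The official Python client wraps calls like this, plus similarity search and feature retrieval, behind a friendlier interface.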

Rhythm and Timbre Analysis
This is a library that takes audio data as input and analyses its spectral, rhythmic and timbral information to describe its acoustic content. It captures rhythmic and timbral features which can be stored or processed directly to compute acoustic similarity between two audio segments, find similar-sounding songs (or song segments), create playlists of music of a certain style, detect the genre of a song, make music recommendations and much more. Depending on the needs, a range of audio features is available: Rhythm Patterns, Rhythm Histograms (i.e. a rough BPM peak histogram), Spectrum Descriptors and more. The library is available for Python, Matlab and Java.
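Once feature vectors such as Rhythm Histograms have been extracted, acoustic similarity typically reduces to a vector distance. A minimal sketch using cosine similarity, which is one common choice, not necessarily the library's own metric:

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two feature vectors: 1.0 means identical
    direction, 0.0 means orthogonal (completely dissimilar)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def most_similar(query_vec, library):
    """Rank a dict of {song_id: feature_vector} by similarity to the query."""
    ranked = sorted(library.items(),
                    key=lambda kv: cosine_similarity(query_vec, kv[1]),
                    reverse=True)
    return [song_id for song_id, _ in ranked]
```

Playlist generation and genre detection build on exactly this kind of ranking, with larger feature vectors and a trained classifier on top.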

Melody & Bass Transcription + Beat, Key & Tempo Estimation
The MusicBricksTranscriber provided by Fraunhofer IDMT is an executable that transcribes the main melody and bass line of a given audio file. The beat times, the key and the average tempo are also estimated. The results can be provided as MIDI, MusicXML or plain XML files. In addition, a Python wrapper is included to further process the analysis results.
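Because the transcriber ships as an executable, driving it from Python comes down to assembling a command line. The executable name and the `--format`/`--output` flags below are hypothetical placeholders; the bundled Python wrapper defines the real interface.

```python
import subprocess

def build_transcriber_cmd(audio_path, out_path, fmt="midi",
                          exe="MusicBricksTranscriber"):
    """Assemble a command line for the transcriber.
    NOTE: executable name and flags are hypothetical; consult the
    tool's own Python wrapper for the actual interface."""
    assert fmt in ("midi", "musicxml", "xml")   # the three formats listed above
    return [exe, audio_path, "--format", fmt, "--output", out_path]

def transcribe(audio_path, out_path, fmt="midi"):
    """Run the transcriber, raising if it fails (requires the executable)."""
    subprocess.run(build_transcriber_cmd(audio_path, out_path, fmt), check=True)
```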

POF = Pd + OpenFrameworks: openFrameworks externals for Pure Data, providing OpenGL multithreaded rendering and advanced multitouch event management. A recent addition to #MusicBricks by Antoine Rousseau, in collaboration with Matt Black of Ninja Tune, it makes building cross-platform Pd music apps much easier. It also feeds the Mobile Orchestra through SyncJams. Find it on GitHub.

Musimap
Musimap’s algorithm applies fifty-five weighted variables to each music unit (e.g. tracks, genres, labels) so as to model the world’s discography as a multi-layered system of cross-matched influences, based on a musicological, lexicological and socio-psychological approach. The granular, proprietary database includes over 3B data points and 2B relations, with 50M songs soon to be counted. Its neural music network is the result of a unique combination of in-depth human curation and the latest AI technologies.

Real-time Pitch Detection
The real-time pitch detection estimates the predominant melody notes (monophonic) or multiple notes (polyphonic) from consecutive blocks of audio samples. This makes it possible to transcribe the note pitches currently being played or sung in a recorded instrumental or vocal performance. The monophonic version also estimates the exact fundamental frequency values. Typical applications are music games and music learning applications. Fraunhofer IDMT provides a C++ library as well as sample projects that show how to include the functionality.
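A minimal monophonic f0 estimator can be sketched with autocorrelation. This illustrates the idea only and is far simpler than Fraunhofer's C++ library; the search range limits are illustrative.

```python
import numpy as np

def estimate_f0(block, sample_rate, fmin=60.0, fmax=1000.0):
    """Monophonic f0 estimate for one audio block: the lag of the
    strongest autocorrelation peak inside [1/fmax, 1/fmin] seconds
    is taken as the period of the signal."""
    block = block - np.mean(block)                 # remove DC offset
    ac = np.correlate(block, block, mode="full")[len(block) - 1:]
    lo = int(sample_rate / fmax)                   # shortest allowed period
    hi = int(sample_rate / fmin)                   # longest allowed period
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sample_rate / lag
```

Calling this on successive 2048-sample blocks of a vocal recording yields a rough pitch track; production systems add interpolation, voicing detection and octave-error correction.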

Search by Sound Music Similarity
The Search by Sound online system is based on the Rhythm and Timbre Analysis (see above) and provides a system which can be used via a REST Web API (called SMINT API) to upload, find and match acoustically similar songs in terms of rhythm and timbre – without the need to install any prerequisites or run the analysis yourself. It can be used with your own custom music dataset or with the readily available content from freemusicarchive.org, already pre-analysed for rhythm and timbre, to find music matching a particular rhythm or timbre from that archive.

Goatify
The Goatify tool provided by Fraunhofer IDMT is an executable that automatically replaces the main melody in a song with a given sample. To do this, the main melody is extracted and removed from the song. The sample is then placed and pitched according to the melody notes in the song. For proper pitching, the pitch of the sample itself is extracted beforehand. The tool is delivered with free sound samples (goat, etc.) from www.freesound.org for direct use.
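The "pitch the sample" step can be approximated by plain resampling, tape-style. This changes the sample's length as well as its pitch, so it is only a sketch of the idea, not Goatify's actual algorithm.

```python
import numpy as np

def pitch_sample(sample, semitones):
    """Repitch a sample by linear-interpolation resampling: shifting up
    shortens it, shifting down lengthens it (like speeding up a tape)."""
    factor = 2.0 ** (semitones / 12.0)            # frequency ratio per semitone
    n_out = int(round(len(sample) / factor))
    src = np.linspace(0.0, len(sample) - 1.0, n_out)
    return np.interp(src, np.arange(len(sample)), sample)
```

Given a transcribed melody note at, say, +7 semitones from the sample's own pitch, `pitch_sample(goat, 7)` produces the repitched version to place at that note's onset time.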

SyncJams
A recent addition to #MusicBricks, SyncJams is an open source standard for wireless synchronisation between music apps and communication of key/scale between players in a ‘mobile orchestra’, authored by Chris McCormick in collaboration with Matt Black of Ninja Tune. It is also defined as: “Zero-configuration network-synchronised metronome and state dictionary for music application”. Currently Pure Data and Python are supported. Find it on GitHub.
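The network-synchronised metronome idea boils down to every peer computing tick times from a shared epoch and tempo, so no peer needs to be the clock master. A sketch of that computation (not the SyncJams wire protocol):

```python
import time

def next_tick(epoch, bpm, now=None):
    """Timestamp of the next metronome tick, given a shared start time
    (epoch, in seconds) and a tempo. Every peer that agrees on the epoch
    and bpm computes identical tick times - the core of a
    zero-configuration network-synchronised metronome."""
    if now is None:
        now = time.time()
    beat = 60.0 / bpm                  # seconds per beat
    ticks_done = int((now - epoch) // beat)
    return epoch + (ticks_done + 1) * beat
```

The real protocol additionally gossips the epoch, tempo and a shared state dictionary (e.g. key/scale) over the local network so late-joining apps fall into step automatically.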

Real-time Pitch-Shifting and Time-Stretching
The real-time pitch-shifting library allows users to change the pitch of audio material while keeping the tempo. It also enables changing the tempo without changing the pitch. Typical applications are music games and music learning applications, as well as real-time performances. Fraunhofer IDMT provides a C++ library as well as sample projects that show how to include the functionality.
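Time-stretching without pitch change is commonly done by relocating overlapping analysis frames. A naive overlap-add sketch follows; a real implementation such as a phase vocoder also corrects phase between frames, and the frame/hop sizes here are illustrative.

```python
import numpy as np

def time_stretch(signal, stretch, frame=1024, hop=256):
    """Naive overlap-add time-stretch: frames taken every `hop` samples
    are written out every `hop * stretch` samples with a Hann crossfade,
    so stretch > 1 lengthens the audio without resampling it."""
    win = np.hanning(frame)
    out_hop = int(hop * stretch)
    n_frames = max(1, (len(signal) - frame) // hop)
    out = np.zeros(n_frames * out_hop + frame)
    norm = np.zeros_like(out)
    for i in range(n_frames):
        seg = signal[i * hop : i * hop + frame] * win
        out[i * out_hop : i * out_hop + frame] += seg
        norm[i * out_hop : i * out_hop + frame] += win
    norm[norm < 1e-8] = 1.0            # avoid divide-by-zero at the edges
    return out / norm
```

Pitch-shifting while keeping tempo is then the combination of the two sketches above: time-stretch by a factor, then resample by the inverse factor.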

Routes to market

“In some cases it’s “just” entertainment, in some cases this revolutionises how music is performed, in others it produces a new world-wide business”…

#MusicBricks are designed to be the building blocks of future entertainment, performance and products. Any ideas built with #MusicBricks that are identified as having great potential will be supported to find routes to market. In the spring of 2016 we are descending on Berlin with all our ideas to test the market's reactions.

#MusicBricks Blog

#MusicBricks incubated team gets European funding


I expect you know someone who has some programming skills. Perhaps someone with a bit of experience with EEG brain-computer interfaces. I imagine that person would probably enjoy going to live for a while in Portugal and enjoy some neurotechnology work in the beautiful, sunny city of Porto with some brilliant members of the Music Tech Fest family.

SoundCloud teams up with #MusicBricks


#MusicBricks is exactly the sort of open and collaborative music creation SoundCloud is about. It opens up new options for makers and musicians to be creative and we’re excited to support it and see what amazing creations are born.

#MTFBerlin: Announcing…


Ben Heck, host of the popular YouTube channel The Ben Heck Show, joins us with his crew at #MTFBerlin to hack, invent, share a bit of maker knowledge and film a special episode of the show, surrounded by our 50 hackers in the element14 Hack Camp.

#MusicBricks is a collaboration between Stromatolite, Sigma-Orionis, Ircam-Centre Pompidou, Music Technology Group, Vienna University of Technology, and Fraunhofer IDMT

For more information, join our newsletter


#MusicBricks has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement 644871