Overview

At Never Before Heard Sounds, our goal is to make cutting-edge research available to musicians and music lovers. With our new API, anyone can access our transformation technology. Behind that API is a scalable, GPU-backed inference service that can complete transformations at 25x realtime. All you have to do is request a transformation of an audio URL with any of our models, and you'll get back a URL containing the transformed audio. That simplicity lets you concentrate on creating amazing Never Before Heard music, without having to buy and set up a GPU-powered, machine-learning-optimized computer.
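
As a rough sketch of that URL-in, URL-out flow (the endpoint, request fields, and response shape below are illustrative assumptions, not the actual API):

// Hypothetical sketch of the request/response flow described above.
// The endpoint URL, field names, and response shape are assumptions.
const response = await fetch("https://api.example.com/transform", {
	method: "POST",
	headers: { "Content-Type": "application/json" },
	body: JSON.stringify({
		audioUrl: "https://example.com/input.mp3", // audio to transform
		model: "YOUR_MODEL_NAME",                  // which model to apply
	}),
});

const { outputUrl } = await response.json(); // URL of the transformed audio
console.log(outputUrl);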

How It Works

The technology that powers the transformations is an audio style transfer built on speech-synthesis algorithms whose parameters we carefully adjusted to work better for musical audio than for speech. This technique is sometimes called “timbre transfer,” and it's what drives our real-time device and gan.style. We've been building out this technology for the past year and are excited to share it.

At the core is a custom machine learning model: a neural net trained on vocal and instrumental recordings that we carefully sourced from permissively-licensed datasets or recorded ourselves. We don't use MIDI for training or processing; instead, the model uses chroma spectrograms (spectrograms aligned to musical notes), which makes it work well for pitched audio.
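
To make the idea concrete, here is a toy sketch (not our actual pipeline) of how the magnitudes of one FFT frame could be folded into the 12 pitch classes of a chroma vector:

// Toy illustration of a chroma computation: fold FFT bin magnitudes
// into 12 pitch classes. A sketch only, not our actual pipeline.
function chromaFrame(magnitudes, sampleRate, fftSize) {
	const chroma = new Array(12).fill(0);
	for (let bin = 1; bin < magnitudes.length; bin++) {
		const freq = (bin * sampleRate) / fftSize;                // bin center frequency in Hz
		const semitones = Math.round(12 * Math.log2(freq / 440)); // semitones relative to A4
		const pitchClass = ((semitones % 12) + 12 + 9) % 12;      // A4 is pitch class 9, so C = 0
		chroma[pitchClass] += magnitudes[bin];                    // accumulate energy per note
	}
	return chroma;
}

A sequence of these 12-bin vectors over time tells the model which notes are sounding in each frame, with no need for MIDI.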

The model resynthesizes any input audio using the textures and timbres learned from a dataset of instrumental or vocal recordings, all while retaining the pitches and rhythms of the original. The output is imperfect and idiosyncratic, and produces unexpected (and sometimes bizarre) combinations of the model’s training dataset and the input performance.

Beta Users

Transforming a file with the API takes a single call:

// Request a style transfer; resolves when the transformed audio is ready.
const transformedAudio = await SoundStyleTransfer({
	inputAudio: "/path/to/input.mp3", // the audio file to transform
	progress: (p) => console.log(`loading ${p * 100}%`), // progress reported as a 0–1 fraction
	model: "YOUR_MODEL_NAME", // the model whose timbre to apply
});

Currently we are using the Inference API in a number of our own projects, and we're opening it up to private beta users. If you have a need for something like this, please click the button below to fill out the form and we'll get back to you soon.

Request Beta Access