Google debuts touchscreen, AI-assisted hardware synth

Google's position at the forefront of technological innovation finds new expression in the touchscreen hardware synth recently unveiled by the company. An alternative to synths that traditionally combine waveforms to generate sound, the touchscreen, AI-assisted synth uses NSynth machine learning technology to “interpret” a range of inputs and generate new sounds.
The NSynth technology enables Google’s synth to register sounds as numbers and to mathematically produce a novel series of numbers after analyzing the original set of inputs. The synth then converts this newly conceptualized string of numbers back into sound, producing audio that is both new and nonpareil. Sounds that exemplify the synth’s uncanny ability to create unique audio include a car’s engine combined with a sitar and a bass sound paired with that of thunder, among various others.
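Conceptually, that pipeline is an encode–blend–decode loop: each source sound becomes a series of numbers, those numbers are blended, and the blend is rendered back into audio. The sketch below illustrates the idea in plain Python with NumPy; `encode` and `decode` here are hypothetical stand-ins for NSynth's actual neural encoder and decoder, which are far more sophisticated.

```python
import numpy as np

def encode(audio: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for NSynth's neural encoder:
    turns a waveform into a compact series of numbers."""
    # Toy version: average the signal over fixed-size frames.
    frames = audio[: len(audio) // 256 * 256].reshape(-1, 256)
    return frames.mean(axis=1)

def decode(embedding: np.ndarray, length: int) -> np.ndarray:
    """Hypothetical stand-in for NSynth's neural decoder:
    turns a series of numbers back into a waveform."""
    # Toy version: stretch the embedding back out to audio length.
    return np.interp(np.linspace(0, len(embedding) - 1, length),
                     np.arange(len(embedding)), embedding)

# Two toy "recordings" -- in the synth's case, think car engine and sitar.
sample_rate = 16000
t = np.linspace(0, 1, sample_rate, endpoint=False)
sound_a = np.sin(2 * np.pi * 110 * t)        # low hum
sound_b = np.sin(2 * np.pi * 440 * t) * 0.5  # brighter tone

# Encode both sounds, blend the numbers, then decode the blend.
z_a, z_b = encode(sound_a), encode(sound_b)
blend = 0.5 * z_a + 0.5 * z_b                # halfway between the two
new_sound = decode(blend, len(sound_a))
print(new_sound.shape)
```

In the real NSynth model the encoder and decoder are learned neural networks, which is what lets the blended numbers come back out as a genuinely new timbre rather than a simple mix of the inputs.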

Those interested can experiment with the NSynth technology, and fully experience the synth’s anomalous kind of machine learning, on Google’s site for the synth.
The synth’s hardware allows its users to transition between four sound sources on its X/Y pad, and to play and sequence sounds via MIDI, while “morphing between the sound sources in real time.”
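One simple way to picture that morphing is as a weighted blend driven by the touch position: each corner of the pad holds a sound, and the (x, y) coordinate sets how much of each corner you hear. The sketch below is a minimal illustration of that idea as a plain audio crossfade in Python; the function name and weighting scheme are illustrative assumptions, not Google’s implementation, which works on the NSynth-generated material itself.

```python
import numpy as np

def xy_pad_mix(sources: list, x: float, y: float) -> np.ndarray:
    """Blend four equal-length sounds placed at the corners of an X/Y pad.

    (x, y) is the touch position in [0, 1] x [0, 1]; the weights are the
    standard bilinear ones, so dragging a finger across the pad morphs
    smoothly between the four corner sources.
    """
    top_left, top_right, bottom_left, bottom_right = sources
    w_tl = (1 - x) * y
    w_tr = x * y
    w_bl = (1 - x) * (1 - y)
    w_br = x * (1 - y)
    return (w_tl * top_left + w_tr * top_right +
            w_bl * bottom_left + w_br * bottom_right)

# Four placeholder tones; in the synth these would be NSynth-generated sounds.
t = np.linspace(0, 1, 16000, endpoint=False)
corners = [np.sin(2 * np.pi * f * t) for f in (110, 220, 330, 440)]

# Touch near the lower-left corner: the mix is dominated by the 110 Hz source.
mix = xy_pad_mix(corners, x=0.1, y=0.1)
print(mix.shape)
```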
Although Google will not commercially market its AI synth, it will release the technology as an open-source GitHub download. After downloading, users will have the ability to add their own features to the technology.

H/T: