What about deep neural learning and procedural generation of music with templates? Some tools, whether plugins or A.I. composition tools, are getting very interesting (semi-random) results. If you can control what is generated, and if you can learn patterns or have an A.I. trained on musical patterns, then you can get very musical results. It's even possible to train an A.I. on just a few datasets.
Tue, 2021-06-22 - 16:41
We spent about a year experimenting with deep learning (DL).
"If you can control what is generated"
That's basically the challenge. It is very difficult to gradually change the output based on user input in a meaningful way. You would need to (re)train a new net for every project in order to get fundamentally different styles and results.
This is not to say DL couldn't be useful in some way. It just did not do what we expected it to do and there was no obvious path to make it happen anytime soon.
Procedural (rules-based) approaches led to far better results and they are easy to control. So that's what 2.0 Factories will be using.
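To make the contrast concrete, here is a minimal sketch of what a rules-based generator can look like (this is not the actual Factory implementation; the scale, leap limit, and rest probability are purely illustrative): a scale template plus a couple of explicit rules, every one of which can be changed directly, with no retraining step.

import random

# One octave of C major as MIDI note numbers (the "template").
C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]

def generate_melody(length=8, max_leap=2, rest_chance=0.1, seed=None):
    """Walk through the scale step by step, limiting leaps and inserting rests."""
    rng = random.Random(seed)
    degree = rng.randrange(len(C_MAJOR))
    melody = []
    for _ in range(length):
        if rng.random() < rest_chance:
            melody.append(None)  # rest
            continue
        # Rule: move at most `max_leap` scale degrees per step.
        degree = degree + rng.randint(-max_leap, max_leap)
        degree = min(max(degree, 0), len(C_MAJOR) - 1)
        melody.append(C_MAJOR[degree])
    return melody

print(generate_melody(length=16, max_leap=3, seed=42))

Changing the style here means editing a list or a number, which is exactly the kind of direct, per-project control that the deep learning experiments described above did not offer.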