
Oh my.
Is there any pitch bend functionality in Synfire? I can't seem to find it at all.
It is likely I just have my head on wrong.
Thu, 2009-08-20 - 11:30 Permalink
Channel pressure, aftertouch and pitch wheel are not yet accessible through the user interface, although these are all implemented already.
There is a debate going on here how it would make sense to display them, because they are not controllers. This information rather is attached to individual notes. IMO this has more in common with articulations than with controllers and the MIDI spec supports this view. Andre says we should make them controllers to keep it simple.
Any opinions?
Thu, 2009-08-20 - 12:15 Permalink
I'm obviously in favor of a change on the editing side in general. The things that come to mind graphically are along that line of thinking. That's all in the workflow study post.
I like the idea of functions being accessed at the individual symbols and segments with the interface I described earlier. In the context of pitch, I imagine a small graph editor popping up under the note, when toggled, where you could quickly alter the pitch graph of the note, and which would hide away again when the hotkey is released. This could be a semi-transparent panel.
That would make it quick to make changes in an editing context without having to mouse around or switch to a pitch editing mode. It also gives each note its own pitch parameter that travels with the note.
From this I could see settings addressing how the pitch is interpolated as the different notes play. Possibly a setting in interpretation.
Another consideration would be whether a bend that is intersected by another note's bend, and stretches beyond the end of that second note's bend, continues... ooh my... maybe a little ASCII diagram.
[a:__/--_ [b:_-*-_] --\__] <- is the --\__] cut?
As far as the global copy and paste functionality in Synfire:
To copy the pitch and then paste to other parameters, Synfire could take a "snapshot" of the current pitch bend, based on the current placement of notes in the figure.
Alternately, pasting another parameter (say "breath") onto the pitch parameter could load all of the figures with pitch bend parameters extending from the beginning of each note to the beginning of the next, à la legato.
Fri, 2009-08-28 - 18:02 Permalink
Thank you for your suggestions. Your creativity is much appreciated.
The big issue with pitch bend is that it is not portable across sounds. What sounds fine with one sound creates horrid cacophony with another.
I vaguely remember there was a midi message that could be sent to tell a sound generator to adjust its bend range to a semitone, octave, whatnot. If that message doesn't exist, we're in trouble.
The good thing with controller parameters A, B, C and D is that they are ignored by sounds that do not support them (Synfire takes care of that). Pitch bend will not be.
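The portability problem described above can be made concrete with a little arithmetic: a raw 14-bit MIDI pitch bend value carries no pitch meaning by itself; the resulting offset depends entirely on each receiver's configured bend range. A minimal sketch (the function name is my own, not part of Synfire or any MIDI library):

```python
# Why raw pitch bend is not portable across sounds: the same 14-bit
# bend value (0..16383, center 8192) yields different pitch offsets
# depending on each synth's configured bend range.

def bend_to_semitones(bend_value, bend_range_semitones):
    """Convert a raw 14-bit MIDI pitch bend value to a pitch offset
    in semitones, given the receiver's bend range (+/- semitones)."""
    return (bend_value - 8192) / 8192 * bend_range_semitones

raw = 12288  # halfway between center and maximum
print(bend_to_semitones(raw, 2))   # synth with +/-2 range:  +1.0 semitone
print(bend_to_semitones(raw, 12))  # synth with +/-12 range: +6.0 semitones
```

The same recorded bend curve thus sounds like a subtle inflection on one instrument and a wild sweep on another, which is exactly why the range would need to be normalized somehow.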
Attaching small vectors of data to a note symbol is quite a challenge. It is possible, but the requirements for a popup editor are considerable. That's bleeding edge GUI art. Even more so as it has to run on both Windows and Mac. We will follow this idea and see how it could be done most efficiently.
Fri, 2009-08-28 - 18:57 Permalink
I believe pitchbend range is set with an RPN message. There's some information below the table on this site: http://users.sa.chariot.net.au/~gmarts/midimsg.htm
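For reference, the message in question is Registered Parameter Number (RPN) 0,0 "Pitch Bend Sensitivity" from the MIDI 1.0 spec: select the RPN with CC 101/100, send the range via Data Entry (CC 6/38), then null the RPN. A sketch of the byte sequence (the helper function is hypothetical; whether a given synth honors the message is, as noted above, a separate question):

```python
# Builds the MIDI CC sequence for RPN 0,0 (Pitch Bend Sensitivity),
# which asks a receiver to set its pitch bend range.

def pitch_bend_range_messages(channel, semitones, cents=0):
    """Return (status, controller, value) tuples that set pitch bend
    sensitivity on the given MIDI channel (0-15)."""
    status = 0xB0 | channel          # Control Change status byte
    return [
        (status, 101, 0),            # RPN MSB = 0
        (status, 100, 0),            # RPN LSB = 0 -> Pitch Bend Sensitivity
        (status, 6, semitones),      # Data Entry MSB: range in semitones
        (status, 38, cents),         # Data Entry LSB: additional cents
        (status, 101, 127),          # RPN null (MSB), closes the edit
        (status, 100, 127),          # RPN null (LSB)
    ]

for msg in pitch_bend_range_messages(0, 2):  # +/-2 semitones on channel 1
    print(msg)
```

Nulling the RPN at the end is good practice so that a stray Data Entry message later doesn't silently re-adjust the bend range.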
Fri, 2009-08-28 - 19:46 Permalink
Hi Andre
My general feeling is that such a message will likely not work on many modern sound generators. MIDI Standards like that seem to be rarely upheld.
My setup is using your "rack" methodology, which further complicates the issue, as the actual sound generators are likely to change without Synfire being aware of it, or knowing exactly what they are.
Finally, synths that feature non-linear bending, or that automatically un-bend the second note (to allow for smooth glissandos), could not be transferred without some degree of cleanup even with a correct range.
It would be great to have the pitch attached to the notes with a very ergonomic GUI, but if that's not in the cards, then just a plain old pitch parameter would be a boon.
Pitch bend has been hard to do without.
I am far less concerned by whatever speed bumps we may encounter while transferring phrases to other sounds.
Adding the pitch as an afterthought in another sequencer is impractical from both a compositional standpoint and, for me personally, a production schedule standpoint.
Likewise, the idea of dealing with these things by "polishing" in a regular sequencer as part of the prototyping pipeline is flawed. This approach totally prohibits any revisions within Synfire without totally blowing away any changes made in the sequencer.
I would think that the basic parameter scaling in Synfire would be enough to quickly adjust the pitch down to the correct range in the cases where that is necessary. Perhaps it could gain a modifier-key function to scale symmetrically from the top and the bottom (as it is, you would have to grab the top and pull it down ~25% and then pull the bottom up ~25%).
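The symmetric-scaling idea above amounts to scaling every value of the pitch curve toward a center line in one operation, rather than dragging the top and bottom edges separately. A hypothetical sketch, assuming pitch values normalized around a center of 0 (the function name and representation are my own, not Synfire's):

```python
# Symmetric scaling of a pitch parameter curve: one operation scales
# all values toward (or away from) a center line, instead of adjusting
# the top and bottom boundaries independently.

def scale_symmetric(values, factor, center=0.0):
    """Scale pitch values toward/away from a center line.
    factor=0.5 halves the bend depth; factor=2.0 doubles it."""
    return [center + (v - center) * factor for v in values]

curve = [0.0, 0.5, 1.0, 0.5, -1.0]
print(scale_symmetric(curve, 0.5))  # [0.0, 0.25, 0.5, 0.25, -0.5]
```

Pulling both halves toward the center by the same factor preserves the shape of the curve while reducing its depth, which is what the ~25%-from-each-side maneuver achieves by hand today.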
Wed, 2011-12-14 - 06:31 Permalink
My two cents, FWIW, is that pitch bends and also "pressure" (aftertouch, monophonic or polyphonic) are articulation properties of sounds. For example, the behavior of pitch bends with guitars and violins is quite different from that of, say, a skin drum or a purely synthetic sound.
The advantage of treating pitch bend as an articulation is that you can develop an ontology of behavioral models based on an ontology of archetypical articulation schemata. Even better, in this way you can have portamento as an articulation parameter derived from pitch bending, or vice versa.
I would also place a number of other parameters in between articulation and "controllers". To stay true to the motto that "Notes can Think", the smallest quantum of musical significance is then the note event, and to that event several others may be attached. However, when is a filter sweep not an articulation? I think at this point it might be useful to think in terms of "articulation" when doing instrument design in concert with composition, and "controllers" when doing synthesis of timbres --- and here I am distinguishing "timbre synthesis" from "instrument design" because the instrument is what gets played and the timbre is what is articulated by playing.
I hope this helps the thinking.
Wed, 2011-12-14 - 13:04 Permalink
Yes, thanks for your thoughts. A redesign of the articulations is already on the agenda. Current articulations are on/off only. Pitch bend and others will require a vector attached to a note, which is a new design that also requires a new and rather complex user interface element. Also these vector-based articulations need to be normalized to ensure portability.
Doing this right is quite a challenge and establishing a standard articulation taxonomy in the industry (and possibly an XML format for device descriptions) is a monster task. Cognitone will possibly brew its own based on what Steinberg and others have begun and publish it as a proposition in the hope other developers will eventually pick it up.
This is a chicken-and-egg problem. Software that makes use of this information is still rare, and sample library developers ask why they should put so much effort into it when everybody is "hacking" their way through it manually anyway.
Wed, 2011-12-14 - 13:26 Permalink
Or to put it in an economic context: This tiny niche industry has not yet gotten sufficient human resources and money together to improve on its technical "standards" during the past 20 years. Every effort to replace or improve MIDI has failed. The VST and AU specs are still unclear, insufficiently documented, and leave much to be desired. Working with hardware or software synths still involves manual "patching", and every vendor is brewing their own soup.