
CC out to designate if each note is chordal, scalar or chromatic


It would be very handy for creative uses to have an optional CC output for each note, telling whether that note is a chord tone, diatonic but not a chord tone, or non-diatonic.

 

This would not output the symbol type -- if a horizontal scale tone happened to be playing a chord tone, the CC would output the number corresponding to a chord tone.

 

I think monophonic behavior would be fine: if multiple note-ons happen simultaneously, the CC sends a chord message if any one note is a chord tone or, if there are no chord tones, a diatonic message, and so on.
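For illustration, here is a rough sketch of that priority rule in Python. All names and the CC value encoding are invented for the example; nothing like this exists in Synfire today.

```python
# Hypothetical encoding: one CC value per note class (values are arbitrary).
CHORD, DIATONIC, CHROMATIC = 0, 64, 127

def onset_cc_value(note_classes):
    """Pick one CC value for all notes starting at the same instant:
    chord tones win over diatonic tones, diatonic wins over chromatic."""
    if "chord" in note_classes:
        return CHORD
    if "diatonic" in note_classes:
        return DIATONIC
    return CHROMATIC

# A three-note onset containing one chord tone sends the chord value.
assert onset_cc_value(["diatonic", "chord", "chromatic"]) == CHORD
```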

 

Synfire has access to this information; it would be nice to use it for creative purposes.


Sun, 2011-07-24 - 03:53 Permalink

Just an embellishment on this. The chord CC as above is my main interest, but it would be great to have access to a whole range of things that Synfire is aware of.

 

This includes:

 

all of the parameters that are colorized in the palette, which could be sent at chord transitions

 

first and last elements of a segment, as well as the anchor and the rest of the note parameters

 

abstract controllers like offset, length, and variation

 

A little further on, but: an index of what folder or bank a segment came from when dragged from the library. This would allow for a kind of tag referencing between Synfire and a DAW.

Sat, 2011-09-03 - 20:58 Permalink

There is certainly a lot of meta information available that could be exported along with the MIDI output, but I don't yet understand what exactly you want to use it for. Sorting out arbitrary raw CC data manually in a DAW MIDI editor sounds like a tedious procedure to me.

Using CCs is also not optimal. CCs cannot handle text or any information other than numbers. I would rather use MIDI meta events as defined by the MIDI standard. The advantage of this is that meta events get transferred by the new drag & drop export function (drop a track onto the DAW directly), and the DAW can interpret them according to the standard (tempo maps, for instance). Text events, cue markers, etc. also get nicely displayed on screen.
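For context, this is what such meta events look like in a standard MIDI file, sketched here with the Python mido library (the event contents are invented for the example):

```python
import mido

# Build a one-track MIDI file carrying text and marker meta events
# alongside a note, the way an exporter could embed meta information.
mid = mido.MidiFile()
track = mido.MidiTrack()
mid.tracks.append(track)

track.append(mido.MetaMessage('text', text='chord: Cmaj7', time=0))
track.append(mido.MetaMessage('marker', text='verse 1', time=0))
track.append(mido.Message('note_on', note=60, velocity=64, time=0))
track.append(mido.Message('note_off', note=60, velocity=64, time=480))

mid.save('export.mid')  # a DAW displays the text/marker events on import
```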

 

Sat, 2011-09-03 - 21:26 Permalink

The CCs can be used in Reaktor, Max4Live, Pd, Jesusonic, SuperCollider, Xenakis, etc. to interpret the parameters and control synths. Any of the parameters I mentioned can be referenced like an index through a CC number.

Meta events aren't handled by any of those and are completely useless.

I understand why you would prefer meta events from a dev perspective. Please try to view it from a user perspective.

Mon, 2011-09-05 - 12:18 Permalink

Using meta events doesn't stem from a developer perspective. The opposite is true. Standard MIDI messages are the only data that most (if not all) DAWs can interpret and make use of without requiring any user interaction or configuration. Putting as much export information as possible into that format is the most user-friendly solution. The DAW being the main export target, this is my first concern. If there are additional targets (e.g. the tools you mentioned), that's a different feature to talk about: sort of a special device description that controls external tools and effects.

Apart from the meta CCs being a potentially interesting option for experimenting, I don't yet quite understand how you want to use this exactly. How can it help you compose better/different/more music? What's the desired effect? If there is a generally useful application behind it, it might make more sense to integrate it into Synfire's renderer rather than relying on external tools.

Mon, 2011-09-05 - 16:36 Permalink

Standard MIDI messages are the only data that most (if not all) DAWs can interpret and make use of without requiring any user interaction or configuration. Putting as much export information as possible into that format is the most user-friendly solution. The DAW being the main export target, this is my first concern. If there are additional targets (e.g. the tools you mentioned), that's a different feature to talk about: sort of a special device description that controls external tools and effects.

 

MIDI CCs are standard MIDI DATA.

Two users are asking you specifically to export the data as CCs to the DAW. From the DAW, those CCs could be sent to the VSTs I mentioned.

DAWs cannot send META events to VST plugins, so their ONLY use is as a reference display within the DAW. That use is extremely limited. They do not:

 help you compose better/different/more music?

For meta events there is not

 a generally useful application behind it

and it does not

make more sense to integrate it into Synfire's renderer rather than relying on external tools.

You have sat on the idea of scripting for years and done nothing at all in that regard. I've listed many commonly used scripting plugins that would let every Synfire user greatly expand their use of your software with mature and well-maintained tools. The tools I mentioned are diverse in their complexity and aims, catering to many different types of users with varying levels of technical ability: from simple modular-synth-style patching in Reaktor, to patch-based programming in Max4Live, to pipeline scripting in the Kontakt Script Processor.

 

These are not 

Sort of a special device description that controls external tools and effects.

MIDI CCs are NOT special device descriptions. What on earth are you talking about?

With the data as CCs, the plugins I mentioned, and also the Kontakt Script Processor, can be set up to receive the data and interpret it to control functions in the plugin. This could be switching the articulation of a Kontakt instrument, or adjusting the filter of a synth based on the long list of parameters I provided. This is something that every VST plugin does.
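To make this concrete, here is a rough Python sketch of the receiving side. The CC number, the value encoding, and the keyswitch notes are all invented for the example; in practice this would be a few lines of KSP or a Max4Live patch.

```python
# Assumption: CC 20 carries the note class, encoded as 0/64/127.
META_CC = 20

# Map each note class to an articulation keyswitch (note numbers invented).
KEYSWITCH = {
    0:   24,  # chord tone     -> sustain
    64:  25,  # diatonic tone  -> staccato
    127: 26,  # chromatic tone -> sul tasto
}

def on_controller(cc, value, send_note_on):
    """Called for every incoming CC; fires a keyswitch for meta CCs."""
    if cc == META_CC and value in KEYSWITCH:
        send_note_on(KEYSWITCH[value], 1)  # velocity 1: inaudible keyswitch
```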

There are thousands of uses for the data we requested. To your users those uses are obvious.

There have been several threads of users asking for scripting control. After many years of no significant updates to this software, this would allow your users to begin scripting with mature tools, with minimal work on your part. It allows us to tie the abstract parameters to tangible sound parameters, either directly or through more creative indirect methods... It is entirely up to the composer how that is done. It is a creative process for quickly binding audio properties to musical states.

Thousands of composers use MIDI CCs to control the tools I mentioned, just as I have described. This is not uncommon; it is the MOST common approach.

Meta events are uncommon. Find me any synth or patcher or script processor that can handle them in any way. Find me any composer that uses them in a creative way. Find me the request on this forum that was asking for them.

It is the dev's perspective because no composer here was asking for meta events and no composer uses them. If you want to include them as well as CCs, fine, but to suggest that CCs are "experimental" or "special" is patronizing and rather blind.

Requesting features for this software, more than for any other software I have encountered, is like pulling teeth. You seem to have no faith that we know what we need -- Cognitone knows best, but Cognitone gets it wrong a whole lot.

 

Mon, 2011-09-05 - 22:48 Permalink

I'm afraid you completely got me wrong. You said

it would be great to have access to a whole range of things that Synfire is aware of

which caused me to reason about ways to enhance Synfire's export capabilities, hence the meta events. Referring to your CC suggestion, I asked for examples of how you want to use this information in detail. Then I got bashed for being reluctant about feature suggestions. That went pretty much wrong.

DAWs cannot send META events to VST plugins

That's not what they are intended for. The DAW converts them immediately on reception. I mentioned this as a means to enhance export, making use of the "whole range of things that Synfire is aware of" without requiring user interaction or setup.

MIDI CCs are NOT special device descriptions. What on earth are you talking about?

The feature you are suggesting requires significant configuration and setup on a per-project or per-external-tool basis. The latter is what I referred to when using the term "sort of a device description".

And some sort of this would be needed. Built-in code that just pumps out all available information would produce more CC data than actual music. And I bet that all these CCs would run the risk of messing with one or another DAW and/or plugin out there.

Therefore there is no quick solution without adding a new user interface for setting up the desired CCs, enabling/disabling individual meta information, finding a way to organize this configuration as part of a project or "device kind of thing", then documenting and explaining all this to the user, etc. Considering this cost and time (and the long list of other pending items on our list), I am not asking for too much if I want to learn more about how exactly the meta information is supposed to be used.

There are thousands of uses for the data we requested.

I don't need a thousand. Tell me one or two examples of how you would use a chord index CC, a palette color CC, a library index CC, or a segment start/stop CC.

These particular CCs, by the way, also involve serious challenges:

Segment start/stop: How to deal with overlapping segments?

Chord index: How useful is that with shifted segments and transitions? How to encode the chord as a CC? Polyphony?

Palette color: How to encode the color, the key, the scale?

Library index: How to define a global indexing scheme for all libraries?

One important thing that you might overlook is that the parameters and figures you see do not relate 1:1 to the rendered output. Meta information needs to be transformed and carried along through all stages of rendering. The strength of Synfire lies in the fact that it works with structured, non-linear information. Linearizing this information is a complex task.

Please don't get me wrong again. I'm not questioning the usefulness and fun of CCs, scripting, and automation. I have worked for years with patching environments like Plogue Bidule and Symbolic Composer, and yes, I know how CCs are used with plugins, but that's not the point. The point is to find a justification for interrupting current product development to implement such a complex new feature. Please point me to some facts that make this more than nice-to-have and I will seriously consider it:

How can the special meta information that Synfire has to offer be converted into an effect that cannot be achieved through other means and that goes beyond a rather random sound effect?

--

P.S.: You may feel that feature requests don't receive the attention they deserve (partly because you have suggested so many features already -- thanks for that), but that is simply not true. I'm having a hard time balancing the very limited resources we have against the long list of development ahead. That's why I am looking at the cost of new features. It is my responsibility to first address those topics that are most urgent and pressing for the majority of users (80% had issues with the MIDI setup and DAW sync, which is why 1.5 is around the corner now), and then look at suggestions and enhancements. I don't say we will never do this or that. I just say we can't do it now, unless it is a "must have".

Tue, 2011-09-06 - 01:55 Permalink

First, I did not ask you to interrupt development to implement this feature.

 

I requested a feature with no reference to time or priority.

 

You misinterpreted my request in both implementation and purpose, suggesting meta events should be used instead.

 

I then followed up a second time, explaining that they would be of no use. I gave reasons why meta events won't work and why CCs are needed. Since I know you know the software I mentioned, I assumed the application would be implicit.

 

You followed up with "Putting as much export information as possible into that format is the most user friendly solution", while I had just explained that that solution is not in any way user-friendly or useful. I am now fighting to get my original idea implemented in a way that would be even remotely useful.

 

I don't mind if you don't implement this feature, in the short term or ever.  I do mind that you would take time out of the development of other more important features to develop this feature in a way that would render it useless.  I explained this twice, and now a third time.

 

As far as concrete examples where CCs could be used outside of what you describe as "a rather random sound effect":

 

----- Using the chordal, scalar, or chromatic property of a note with KSP to set the filter cutoff of that note for its duration. Here even a polyphonic chord with different types of elements can be handled, with an event stream like:

cc-chord

note1

cc-scalar

note2

cc-chromatic

note3

 

If the events are sent in that order, even if the sample-block offset of the notes is the same, Kontakt will be able to assign a specific value for the duration of each voice, one that doesn't change for that voice as new CCs arrive.
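A rough Python sketch of that latching behavior (names invented; this is roughly what a KSP script would do internally):

```python
class VoiceLatch:
    """Latch the most recent meta-CC value onto each voice at note-on."""
    def __init__(self):
        self.current = 0  # most recently received meta-CC value
        self.held = {}    # note number -> value held for that voice

    def on_cc(self, value):
        self.current = value

    def on_note_on(self, note):
        # Sample the CC at note-on; later CCs won't touch this voice.
        self.held[note] = self.current

    def on_note_off(self, note):
        self.held.pop(note, None)

# The interleaved stream from above: cc-chord, note1, cc-scalar, note2, ...
latch = VoiceLatch()
for kind, value in [("cc", 0), ("note", 60), ("cc", 64), ("note", 64),
                    ("cc", 127), ("note", 67)]:
    latch.on_cc(value) if kind == "cc" else latch.on_note_on(value)
print(latch.held)  # {60: 0, 64: 64, 67: 127} -- one stable value per voice
```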

 

For a twenty-minute composition with tens of thousands of notes, I could create complexity that would be impractical to implement by hand.

 

----- Using the chord fragility coloring, with no scripting whatsoever, in a modulation matrix of a VST like Kontakt, Battery, or a LinPlug synth, to color notes consistently with different kinds of synthesis modulations... LFO speeds, cutoffs, wave shape, etc. There is nothing random about the fragility parameter's representation between two chords. It would be highly useful.

 

----- Outputting an index representing the source folder in the library to select from a range of Kontakt instrument banks, to apply and mix instrumentation intelligently, with additional scripting in the KSP linking the context of the music material's source to the output sound.

 

Like I said, this is a more complex example, but since I mentioned it, I figured it should be clear why it would be useful; a rough sketch of the mapping follows.
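A minimal sketch, assuming a hypothetical CC 21 carries the folder index and each index selects a Kontakt bank and program (the numbering scheme is invented):

```python
FOLDER_CC = 21  # assumed CC number carrying the library folder index

def on_folder_index(cc, value, send_program_change):
    """Route a segment's source-folder index to a bank/program choice."""
    if cc == FOLDER_CC:
        bank, program = divmod(value, 8)  # e.g. 8 programs per bank
        send_program_change(bank, program)
```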

 

My process involves making these decisions based on the context of the particular piece. These should give you an idea of some specific uses, but the point is that having the data would open up new creative possibilities for inventing new contexts based on the musical and artistic needs of a specific song or project.

 

Yes, you would need an additional window where only the specific CCs that are required could be selected for output and routed to specific CC numbers. I write code like this in the various script processors I mentioned in about 10 minutes. Relatively speaking, routing is not a complex problem to solve in terms of code. I really don't see where extracting something like the fragility parameter would be complex to code either, as you already have it implemented in the palette. Given the obtuse history of making feature requests here, I suspect you will claim that these are immensely difficult problems requiring years to code. Fine, I can live with that.

 

As per your P.S., my contention is not that feature requests don't receive attention. It is that every feature request becomes an uphill battle over why it should be done in a manner other than the one requested, even after several follow-ups and explanations as to why it is necessary to implement the feature in the way it was requested. Almost every request thread is met with constant second-guessing and ill-conceived alternatives to a rather direct request. I don't have time to argue through fifteen posts every time I have an idea that will improve the workflow with Synfire. When I make a request on the Reaper message board, the developers never insist that the idea is not useful, random, or should be done in some other way. Feature requests here are met with argument without exception, and it is more exhausting than with any other software I work with.

 

You want absolute specificity in my examples. I don't have time. You rarely ever implement new features; it would be insane to continue writing in absolute detail, as I have done that in the past to absolutely no point or avail.

 

I was not hounding you about this feature. I requested it, someone else agreed it would be useful, and then it was left to sit with no response for two months. It was only after seeing the whole point of the idea mutilated into something useless that I took issue. Apparently you can't see how that would be frustrating.

Thu, 2011-09-08 - 00:21 Permalink

Thanks for your reply and the examples.

You misinterpreted my request in both implementation and purpose, suggesting meta events should be used instead.

I understood quite well what your suggestion was about, but because there is currently not enough room in the short term for the implementation as you envisioned it, I reasoned about an alternative feature that could also leverage the meta information you were talking about (enhanced export to the DAW). We were talking about two different things.

I don't have time.

Me neither. I worked on this reply for more than an hour, trying to explain and offer more insight into our current schedule and priorities, etc. But then I deleted almost everything, frustrated about why I was spending so much time on a fruitless distraction, especially considering your sometimes insulting tone and lack of understanding for what we are doing.

Feature requests here are met with argument without exception

That's the normal way all proposed changes are treated in every software development team. The only difference is that most developers are smarter than me in that they do not publicly share their instant thoughts and instead follow a strict communication policy that is optimized to free them from pressure. I should probably just continue to note down ideas and discuss them internally when we get around to it.

I have done that in the past to absolutely no point or avail.

Absolutely wrong. Your suggestions have been entered into our logs, reviewed, considered, and have found their way into the road map in one way or another. They're just not yet released.

I write code like this in the various script processors I mentioned in about 10 minutes.

Sure. That's easy when the environment for it is already complete. You overlook the effort it took the developers to engineer that scripting system and programmable user interface in the first place. Your past estimates, comparisons, and unrealistic expectations suggest that you don't know especially much about the economic and technical facts of software engineering. I've been in the software business for more than 25 years. I'm not saying that something requires a lot of work when it doesn't. The contrary is true: more often than not, I underestimate a side project and then regret it.

We have tons of algorithmic, scripting, modular patching, and synthesis features in the drawer, but now is just not yet the time for that. Audio and DAW sync come first (the 1.5 release), then the pending user interface improvements and usability issues, and then comes the next layer of features on top.

 

Thu, 2011-09-08 - 08:45 Permalink

We have tons of algorithmic, scripting, modular patching, and synthesis features in the drawer, [… snip …] then the pending user interface improvements and usability issues, and then comes the next layer of features on top.

 

I will rejoice in that! I'm already looking forward to it!

 

Minkepatt

 

 

Thu, 2011-09-08 - 12:28 Permalink

I will rejoice in that! I'm already looking forward to it!

Minkepatt

+1
btw: The CC-out issue is no reason to argue; it is a sort of "nice to have" feature. But I agree with tokyorose that it could be useful, for example to assign different sounds to different chord types: a dark sound for minor chords and a bright sound for major chords.
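For instance, assuming a hypothetical CC 22 that sends 0 for major and 1 for minor chords, a sketch of the patch switch could look like this:

```python
CHORD_TYPE_CC = 22  # hypothetical CC number carrying the chord type

PATCH = {0: "bright_pad",  # major
         1: "dark_pad"}    # minor

def on_chord_type(cc, value, select_patch):
    """Switch to a bright or dark sound depending on the chord type."""
    if cc == CHORD_TYPE_CC and value in PATCH:
        select_patch(PATCH[value])
```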

Thu, 2011-09-08 - 17:01 Permalink

We have tons of algorithmic, scripting, modular patching and synthesis features in the drawer....

Andre

Please advise which of the above features are likely to appear in a 1.x release, rather than a 2.x release.

Thu, 2011-09-08 - 19:27 Permalink

It is not yet clear if and how many of these ideas will make it into the product at all, so don't hold your breath. I just mentioned this to show that we are not ignorant of ideas like that. If you are specifically looking for a modular patching and generative algorithmic environment like Symbolic Composer, SuperCollider, Max/MSP, Bidule, CSound, Noatikl, and the like, then go with that software. Each of those is great at what it does.

The data-driven model of the above tools cannot easily be ported to the structured and symbolic information Synfire uses. Integers, floats, and audio streams are way simpler to handle and combine than harmonic contexts, figures, and polyphonic vectors, each of which has different semantics.

While it would be possible to come up with a couple of cool features in the relatively short term (quick-and-dirty generative features are easy to implement), they tend to be interesting for only one or two projects. Then the generated music is quickly perceived as "all the same", because human hearing is extremely intelligent and quickly recognizes the recipe and scheme behind it. It is economic madness to invest hard work in a tool that is used once or twice and then abandoned. That said, either we find a system that is flexible and style-agnostic enough to work in the context of Synfire, or we leave it. We might even spin off a separate product. It is too early to say.

 

Fri, 2011-09-09 - 03:05 Permalink

Andre and the Cognitone team are among the most transparent developers, and among the best at refining their products.

Maybe it is time for you, Andre, to spend more time developing than answering every request from us. Some people out there might want you to have a solution for every obstacle or shortcoming they face.

 

Many of us know the quality and the effort to improve that come from you and your team.

"Go as far as you can see. Once you get there, you will see even farther." -- J.P. Morgan