
A few rhythmic suggestions


Since I'm creating R&B, feel, tempo, and rhythm are extremely important to me. I'm requesting that the following be considered. Whatever you can implement as soon as 2.7 would be extremely helpful.

Rhythm - Please explain what happens to note-on and note-off events and velocity in between the green bars. Please also improve editing for this parameter. I try to use it because Step is sometimes too dramatic, especially when dealing with polyphonic figures.

Step: Add an interpolation option so we have choices about what happens to events in between steps. I believe this would give better flexibility when assigning steps to existing figures from our library.

Strum macro - Please do not apply this to the anchor. When it is applied to the anchor and the anchor moves, the entire figure renders completely differently. If the figure falls on the first beat, sometimes it's not rendered at all because Strum pushes the anchor behind that first beat.

Shift - Same thing as with Strum. Can we have the option to not apply this to the anchor, only shifting what was rendered from the anchor? This would just be an option for us, providing more flexibility (a rough sketch of what I mean follows this list).
- Provide an option to quantize note ends. If the shift is large with long notes, notes can even continue on into another container. An option to end the notes or cut them short would be nice (I don't know how it would be implemented).
- Also, very important: can we have some type of shift (another parameter, maybe) based on milliseconds instead of note values? That way, the micro-shift always stays the same amount of time, regardless of whether the tempo is 60 BPM or 120.
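
To make the anchor request concrete, here's a rough sketch of the behaviour I'm asking for (plain Python pseudocode, nothing to do with how Synfire is built internally; the note structure is made up):

```python
def shift_except_anchor(notes, shift_beats, anchor_index=0):
    """Shift every note onset except the anchor's.

    Each note is a dict with an 'onset' key in beats; names and structure
    here are invented purely for illustration.
    """
    shifted = []
    for i, note in enumerate(notes):
        moved = dict(note)
        if i != anchor_index:                      # the anchor stays put
            moved["onset"] = note["onset"] + shift_beats
        shifted.append(moved)
    return shifted

# Example: push everything a 16th late except the downbeat anchor.
print(shift_except_anchor(
    [{"onset": 0.0}, {"onset": 0.5}, {"onset": 1.0}], shift_beats=0.25))
# -> [{'onset': 0.0}, {'onset': 0.75}, {'onset': 1.25}]
```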

 


Thu, 2025-10-02 - 17:43 Permalink

Thanks for your suggestions (since you mention R&B, we could use some music examples for that).

Rhythm is a soft quantization that gradually pulls notes towards the green bars (note onsets only, length is not changed). Notes between two bars are pulled towards the nearest bar.
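
For illustration, the pull boils down to something like this (a much-simplified sketch, not the actual implementation):

```python
def soft_quantize(onsets, groove_points, strength=0.5):
    """Pull each onset part of the way towards its nearest groove point.

    strength 0.0 leaves onsets alone, 1.0 snaps them fully onto the point.
    Note lengths are not touched, as described above.
    """
    quantized = []
    for onset in onsets:
        nearest = min(groove_points, key=lambda p: abs(p - onset))
        quantized.append(onset + strength * (nearest - onset))
    return quantized

# A slightly loose phrase pulled halfway towards straight eighths:
result = soft_quantize([0.0, 0.54, 1.02, 1.46], [0.0, 0.5, 1.0, 1.5], strength=0.5)
print([round(x, 3) for x in result])   # -> [0.0, 0.52, 1.01, 1.48]
```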

Rhythm is best created by dropping other figures on the outlet (rhythm transfer). Thickness and height of the bar indicate how much it modulates velocity. Color indicates the strength of the quantization pull. It's worth having a look at how this can be made more useful and intuitive. I haven't manually edited a Rhythm pattern for ages.

Step is rigid on purpose. Its notation-inspired role is probably not ideal for groove design. Shift and Flow together can do wonders.

I agree that strumming should keep the anchor in place, that's a very good point. Will try it.

Shift is meant for stolen-time rubato effects and humanization. Whether it's a good idea to exclude the anchor, I don't know. That could look extreme when all the notes shift in one direction and the anchor stays put. What kind of figure segment do you have in mind?

Quantization already quantizes note onsets and lengths. With extreme Shift values, long notes may reach outside a container, yes. There is only so much an algorithmic transformation can do without turning it into a mess of settings and options for all sorts of edge cases. 
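
For what it's worth, the "cut them short" option you describe would amount to something like this in isolation (a rough sketch, not a commitment); the real question is how many such options the parameter can carry before it becomes that mess:

```python
def clip_to_container(notes, container_end):
    """Shorten any note whose end would reach past the container end.

    Notes are dicts with 'onset' and 'length' in beats (made-up structure,
    for illustration only).
    """
    clipped = []
    for note in notes:
        trimmed = dict(note)
        if note["onset"] + note["length"] > container_end:
            trimmed["length"] = max(0.0, container_end - note["onset"])
        clipped.append(trimmed)
    return clipped
```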

When I have synthesized a rhythm that works for me, I save the parameters to Figure and edit the notes that don't fit. Then I add that phrase to a library for future use.

Shifting notes based on milliseconds is latency. You can set one for every device and/or sound, e.g. to compensate for a slow attack time.
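
Sketched in code, latency is just a fixed offset applied at playback (simplified illustration only):

```python
def send_time_ms(onset_ms, device_latency_ms):
    """Trigger the note early by the device's latency so it sounds on time."""
    return onset_ms - device_latency_ms

# A pad with a slow 30 ms attack, written at 1000 ms, is triggered at 970 ms:
print(send_time_ms(1000.0, 30.0))   # -> 970.0
```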

Thu, 2025-10-02 - 18:30 Permalink

Rhythm Parameter: Could you elaborate on what defines "strength" in the quantization pull? I understand the color indicates strength, but I'm curious about the practical range - if a bar is very green, what would make the quantization even stronger? I'd love to explore using Rhythm as my go-to parameter instead of Step for most applications, since Step can be too rigid for the smooth R&B grooves I'm creating.

Shift and Anchors: This would simply be an optional checkbox on the right panel under Shift - "Apply to anchors: Yes/No" - so it wouldn't change your current workflow at all. I'm thinking specifically about polyphonic chord progressions where I want the chord or polyphonic voicings to shift but keep the harmonic rhythm locked to the anchor points. Having that flexibility would be incredibly useful.

Millisecond-Based Timing: I think there may have been a misunderstanding - I'm not referring to device latency compensation. What I'm envisioning is timing extraction and application. For example, if I have a grace note that sounds perfect at 30 ms before the target note, I'd love to capture that exact timing and apply it to other figures regardless of tempo. Currently, if I create a great grace-note feel at 120 BPM using beat divisions, it won't translate the same way at 80 BPM or 160 BPM because the millisecond relationship changes. Being able to preserve those micro-timing relationships would be huge for maintaining a consistent groove feel across different tempos. Not to mention the benefits for percussive material.
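
To put numbers on it (plain arithmetic, nothing Synfire-specific), the same 30 ms offset becomes a different fraction of a beat at every tempo:

```python
def ms_to_beats(ms, bpm):
    """How much of a quarter-note beat a millisecond offset covers at a given tempo."""
    return ms / (60000.0 / bpm)

for bpm in (80, 120, 160):
    print(bpm, round(ms_to_beats(30, bpm), 3))
# 80 BPM  -> 0.04 beats
# 120 BPM -> 0.06 beats
# 160 BPM -> 0.08 beats
```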

The R&B examples are a great idea - I want to do this myself, but I have been so busy with work lately.

Please confirm what updates we can expect regarding this thread, specifically in the upcoming versions.

Fri, 2025-10-03 - 15:12 Permalink

Rhythm is primarily a parameter for transferring "groove" from one phrase to another (it's hard to draw one by hand, although you encouraged me to improve the input). It is not a humanizer. It doesn't create random variations.

If you want to create a Rhythm parameter with meaningful soft quantization effects, you need to extract it from unquantized raw material (recorded performances are best). There need to be deviations from a recognized pattern in order to modulate the strength of attraction to the groove points. Quantized example material creates only velocity modulation.
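
Very roughly, and leaving out a lot, the extraction idea is this (a simplified sketch, not the real implementation; the "recognized pattern" is reduced to a straight grid here):

```python
def extract_groove(onsets, velocities, grid=0.25):
    """Derive groove points from a raw performance (simplified).

    Each groove point sits at a played onset, takes its velocity from the
    performance, and gets a strength that grows with the deviation from the
    straight grid. Perfectly quantized input has zero deviation, so only the
    velocity information survives.
    """
    points = []
    for onset, velocity in zip(onsets, velocities):
        nearest_grid = round(onset / grid) * grid
        deviation = abs(onset - nearest_grid)
        strength = min(1.0, deviation / (grid / 2.0))
        points.append({"position": onset, "strength": strength, "velocity": velocity})
    return points
```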

Strength works this way: the stronger a groove point, the more a note near it will be pulled towards the point. 100% strength moves it all the way to the point.

The visualization should be improved. In an experiment I changed it to bar length = velocity modulation and brightness/thickness = quantization strength, which is more intuitive. Using the shape tool to draw the velocity modulation can also be improved. This will be included with the next version.

I understand you want to keep the anchor unshifted to ensure a chord segment is not accidentally rendered with a different harmony (good point btw). That should not require a manual setting though. Synfire should handle this intelligently. This is more involved.

For now, you can use the "Look Ahead" setting of Interpretation. That preserves an area to the left of a harmony change. Also check the "Hold" setting for segments, which is useful for broken-up chords.

Time-based distances between individual notes would be very fiddly to set up and maintain. For this, the note symbol would be set at the same position with a negative delay attached to it. I like the idea, but the Shift parameter is not the right tool for that. There are unforeseen implications when notes are set at the same time but rendered at different times. This may break things like harmonic analysis, etc.
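
To sketch what I mean by a negative delay (a hypothetical, simplified structure, not Synfire's internal format):

```python
from dataclasses import dataclass

@dataclass
class NoteSymbol:
    """Hypothetical note record: written position plus a render-time delay."""
    position_beats: float   # written position, used for harmonic analysis etc.
    length_beats: float
    delay_ms: float = 0.0   # negative = rendered early (e.g. a grace note)

def render_onset_ms(note: NoteSymbol, bpm: float) -> float:
    """Rendered onset: the written position converted to ms, plus the delay."""
    return note.position_beats * 60000.0 / bpm + note.delay_ms

grace = NoteSymbol(position_beats=1.0, length_beats=0.25, delay_ms=-30.0)
print(render_onset_ms(grace, bpm=120))   # written on beat 1, sounds at 470.0 ms
```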

Fri, 2025-10-03 - 17:57 Permalink

This was extremely helpful regarding how to use the Rhythm parameter. Knowing this, I will be using it more often now. As for Shift, thanks. I run into this all the time, even with monophonic figures. If a monophonic note falls on the first beat, Shift will sometimes push it behind the beat and it won't render.

I understand about the time-based parameter. Another option is to have it as a macro, the same as with the Strum option. So on the clipboard, a user could highlight some notes and adjust the actual written figures by time? I'm not sure how that would work.

Fri, 2025-10-03 - 18:32 Permalink

You can drag Rhythm from the library sidebar and drop it on any phrase in the arrangement to play around with the rhythm transfer feature.