Ableton Live, The Machinedrum and The Monomachine (Part 2): Minimizing Latency

Posted on June 6, 2010 | 4 Comments

In Part One of this series, I posted tips for getting the Monomachine and Machinedrum synced and recording properly in your Live sessions. The other half of the equation is knowing which operations introduce latency and timing errors, so you can avoid them during your sessions.

Ableton Prints Recordings Where It Thinks You Heard Them

I guess this design must be intuitive for many users, but it confused me for a while. If your setup has anything but a minuscule audio buffer, and you’re monitoring through a virtual instrument with a few latency-inducing plugins in the monitoring chain, you will hear a fair amount of monitoring latency when you play a note. The same goes for recording audio.

When recording a MIDI clip, I expected Live to place the MIDI events where I actually played them, but it doesn’t. It shifts the MIDI notes later in time to match when you actually heard the output sound, trying to account for your audio buffer delay, the latency of your virtual instrument, and any audio processing delay from plugins in the downstream signal path. There’s one exception: it doesn’t worry about delays you might hear through any “Sends” your track is using.

So your MIDI notes (and CCs) are recorded with a “baked-in” delay the size of your monitoring chain latency. I’m going to call this baked latency.

To be fair, Live’s manual goes into the reasoning behind this in some detail, and offers some suggestions for working with optimal MIDI timing. I wanted to find out what this entails.

I’m guessing most artists working with MIDI will quantize the crap out of these notes anyway, effectively gridding away the baked latency. However, if you actually want Live to play the notes back when you hit them (say, if you are trying to record some precise rhythmic passages by hand), you have to minimize or eliminate the record latency compensation that Live is doing for you.
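Live won’t do this shift for you after the fact, but un-baking a known, constant latency from recorded notes is simple arithmetic: convert delta times to absolute times, pull every event earlier by the offset, and convert back. Here’s a rough pure-Python sketch — the `(delta_ticks, note)` event format and the helper name are my own for illustration, not anything from Live or Max For Live:

```python
def shift_events_earlier(events, offset_ticks):
    """events: list of (delta_ticks, note) pairs in recorded order.
    Moves every note earlier by offset_ticks (clamped at zero) to
    undo a constant baked-in recording delay, and returns the
    result re-expressed as delta times."""
    # Delta times -> absolute times, shifted early.
    absolute = []
    now = 0
    for delta, note in events:
        now += delta
        absolute.append((max(0, now - offset_ticks), note))
    absolute.sort(key=lambda pair: pair[0])
    # Absolute times -> delta times.
    shifted = []
    prev = 0
    for t, note in absolute:
        shifted.append((t - prev, note))
        prev = t
    return shifted
```

For scale: at 96 ticks per quarter note and 120 BPM, one tick is about 5.2 ms, so a 17 ms baked latency works out to roughly a 3-tick shift.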

Whether the Options menu “Delay Compensation” feature is enabled or not, any devices that are monitored through Live — MIDI tracks, Audio Tracks, or Audio Instrument MIDI tracks — have a delay the size of your audio input buffer baked in.

In addition, if “Delay Compensation” is enabled while recording, the additional latency due to any audio devices in your monitoring signal path will be baked in.
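To put numbers on it: the baked latency is roughly your audio input buffer plus, when Delay Compensation is on while recording, the reported latency of each plugin in the monitoring path. A quick sketch of that sum (the function name and the example figures are mine, purely illustrative):

```python
def baked_latency_ms(buffer_samples, sample_rate, plugin_delay_samples=()):
    """Approximate delay baked into a recorded clip: the audio
    buffer, plus (with Delay Compensation on while recording) the
    reported latency of each plugin in the monitoring signal path."""
    total_samples = buffer_samples + sum(plugin_delay_samples)
    return 1000.0 * total_samples / sample_rate

# e.g. a 256-sample buffer plus a 512-sample lookahead limiter at 44.1 kHz:
print(round(baked_latency_ms(256, 44100, (512,)), 1))  # about 17.4 ms
```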

Thwarting Ableton’s Kind Efforts To Bake Your Latency

I’ve found a few general methods for this.

1) Don’t Software Monitor

In my experience, this is always the best option, if it’s possible.  All of the following work:

  • Record an outboard MIDI synth to a normal MIDI track. Monitor mode should be set to “Off” (yes, Live will still bake latency into MIDI tracks with monitoring enabled!). Enable Local Control on your MIDI device, and monitor it directly through something like a mixer or a soundcard monitoring feature. Edit and tweak the clips to your liking. Then record your synth’s outputs via normal audio tracks with monitoring disabled.
  • Record a hardware-monitored outboard MIDI synth to an External Instrument track with Live’s Monitor mode set to “Off”.  Freezing the resulting clip(s), and option-dragging them to an audio track will give you a nice tight recording with well-chosen loop position.
  • Record audio tracks with Monitor set to “off”.  Use direct monitoring facilities on your audio interface, or a mixer.
2) Temporarily Turn Off Delay Compensation

If you have to record using software monitoring (e.g. your soundcard doesn’t support direct monitoring, you’re using a virtual instrument, and/or you need to monitor through some plugins), you can minimize latency by reducing your audio buffer size. If you have latency-inducing devices in your monitoring chain, turning off Delay Compensation under the Options menu will help keep your recorded clips roughly where Ableton actually received them, not shifted late for you. They will still be delayed by your audio buffer size in an effort to minimize jitter (as explained in the manual’s chapter “MIDI Fact Sheet”), but at least the “device” Delay Compensation shift won’t be added.

This method may not work if you have a complicated signal chain on other parts of your track that will now sound cacophonous with latency compensation disabled during your tracking session.

3) Bite The Bullet, But Minimize Latency During Recording

If you have to have delay compensation enabled, and you have to software monitor, then the general rule of thumb is to minimize latency.  Keep your audio buffer size tiny.  Don’t put any plugins on the master.  Don’t put any plugins on tracks you’re recording on.  Having plugins or audio devices disabled often isn’t enough — you have to actually delete them for Live to completely take them out of the latency compensation chain. Live will still bake in your audio input buffer latency, but at least it will be minimal.

Ableton Doesn’t Latency-Compensate MIDI Clock Out

No matter what Live is doing to your recordings, it doesn’t bother to delay-compensate MIDI clock signals at all. So if you have much latency in your system, you’re monitoring an outboard sequencer that’s slaved to Live, and you’re monitoring that sequencer’s audio through a bunch of plugins, you have to put in a large negative adjustment for Live’s MIDI Clock Delay or things won’t line up.

And if you change your signal path at all, you’ll most likely have to hand-tune that delay parameter again — for all outputs that care about MIDI Clock.
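A reasonable starting point for that hand-tuning is the total latency of the path you monitor the slaved sequencer through, negated so the clock goes out early. Another sketch of the arithmetic (the function name is mine; the sample counts are illustrative):

```python
def midi_clock_delay_ms(path_latency_samples, sample_rate):
    """Rough starting value for a MIDI Clock Delay adjustment:
    the latency of the path you monitor the slaved sequencer
    through, negated so the clock is sent early."""
    return -1000.0 * path_latency_samples / sample_rate

# e.g. a 128-sample output buffer plus a 512-sample plugin lookahead
# at 44.1 kHz:
print(midi_clock_delay_ms(128 + 512, 44100))  # about -14.5 ms
```

You’d still fine-tune by ear from there, since this ignores MIDI transmission time and anything Live adds internally.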

What Ableton’s Doing Behind The Scenes On Playback

This behavior all kind of makes sense in Ableton’s world-view of latency compensation: on playback, shift pre-recorded clips early to anticipate playback delays. I’ll call this early playback. It has the greatest effect when “Delay Compensation” is on, adjusting for high-latency plugins in the signal path.

External Instrument MIDI, internally-routed MIDI, and audio track clips are all shifted early on playback. MIDI clips on tracks routed to an external MIDI interface aren’t played back early, probably because it’s not obvious how they’re being monitored.

MIDI clock output is kind of an unknown, external thing, and it’s not clear how much you’d want to offset it by. As I mentioned above, it’s not shifted early.

Strangely, the metronome in Live seems to be added post-Master fx, so whether device delay compensation is active or not, it will always be playing “on time”.

Try this — create a set with a single beat loop in an audio track.  Put a ton of latency-inducing plugins on your master channel.  Turn on the metronome.  Make sure delay compensation is active.  Your beat and the metronome should sound in-time.  Now disable delay compensation.  Your beat should sound late, but the metronome is still “on-time”.  With delay compensation active, your beat’s audio was being shifted early to account for the processing delay from the plugs.

This device delay compensation on playback is usually your friend — it can keep a set with tons of realtime audio processing in near-perfect sync.

So What About The Machinedrum or Monomachine?

When recording the Machinedrum and the Monomachine, I’ve currently opted to software monitor since I like recording a bunch of discrete outputs from them, and my audio interface doesn’t support mixing multiple inputs for direct monitoring.

However, I have a template set that basically has no plugins in those devices’ monitoring path, and I start composing songs with my audio buffer set to 32 samples.  The latency is constant and known, so I’ve compensated for what little there is using the MIDI Clock Delay parameter.  And I don’t mess with it unless I have to change my audio buffer size. Basically, I’m using Option 3 from the section above.

I don’t put plugins on the Master until well after I’ve recorded the MD and MNM to audio, and I don’t monitor the MD or MNM through plugins. If I’m recording their MIDI output, I don’t record-monitor the input track (as suggested in the Live manual).

If I’m going to use the MNM as a synth responding to MIDI notes from Live, I either use the audio + midi track setup, or an External Instrument track.  

With these precautions, I think I’ve come pretty close to an optimal setup for working with these machines in Live.


4 Responses to “Ableton Live, The Machinedrum and The Monomachine (Part 2): Minimizing Latency”

  1. balache
    June 28th, 2010 @ 7:21 pm

    Ableton’s problem with latency and jitter is really serious; it’s surprising they haven’t already solved it. Is there a MIDI plugin or something that can automatically “move” recorded MIDI events by a defined offset?

  2. jdn
    July 13th, 2010 @ 4:11 am

    Balache, I could imagine using Max For Live to shift notes back by a user-defined offset, but for now I’m just trying to keep my latency low, and understand what’s actually happening in the system. That way, when I play, things are tight, and the recorded performance is pretty darn close to what I want in the first place. Cheers, -j

  4. J J T
    September 6th, 2014 @ 12:30 pm

    To get the Machinedrum, the Monomachine, and all my synthesizers with internal sequencers to sync together, I just stop monitoring from the DAW (in Ableton, set Monitor to Off) and use my audio interface’s monitoring. The MOTU interfaces give zero latency on the MIDI.

    Now the DAW and the external sequencers are all synced, but I can use the DAW only as a recorder.
