Calculating latency values in signal path
Posted: Tue Sep 30, 2008 12:53 pm
Hi all. I was hoping you could help me solve a latency problem I'm having. Here's a simplified diagram of my studio to help describe what I need to find out. There's a main DAW, with Ableton Live as audio/MIDI sequencer, connected via ADAT to a second DAW, where Scope resides. Although there is a monitoring path from DAW1 to the speakers, I usually mix and monitor everything in Scope, as you can see in the picture. This way, all external synths and modules, all SFP synths, and all VST tracks (audio or instruments) are summed in the Scope mixer (in this case, SpaceF's FP106).
This works pretty well for playing and composing. SFP's synths behave like external modules, and things like having DAW1 play a drum loop while triggering a bass in SFP are easy, and the sound "locks up" pretty well: the ASIO latency is compensated inside Ableton for the audio and MIDI parts, and the SFP latency introduced is the same for the ADAT inputs and the synths' output.
The problem is that DAW1 is where I do all my recording, so when I want to record the bass sound back into Ableton, I use the record out on the FP106 (orange square in the pic) to send it via ADAT into an audio track in Live and record the output - but the delay introduced in the signal is quite significant!
In the image you can see this graphically: yellow is the normal path - MIDI is triggered from Ableton into a Scope synth, then goes to the mixer, is blended with the other audio, and then sent to the outputs. I want to record that path and replace it with the green path, while maintaining the timing.
What I need to find out is whether this delay is consistent and, if it is, how I can calculate it. I would expect it to be:
(ASIO latency for MIDI) + (SFP latency for synth) + (SFP latency for output) + (VST latency for recording) = total delay, which in this case means 12 + 7 + 7 + 12 = 38 ms. But entering that in the "track delay" doesn't solve the problem, and pushing the start point forward by 38 ms isn't the trick either, so something must be eluding me.
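For anyone who wants to play with the numbers, here's a quick sketch of that additive model. The per-stage values are just the ones quoted above, and the 44.1 kHz sample rate is an assumption on my part (swap in your own clock rate):

```python
SAMPLE_RATE = 44100  # Hz - assumed; use your actual project sample rate

def ms_to_samples(ms, sr=SAMPLE_RATE):
    """Convert a delay in milliseconds to the nearest whole sample."""
    return round(ms * sr / 1000)

# Per-stage latencies in milliseconds (the figures from the post)
stages = {
    "ASIO (MIDI out)":  12,
    "SFP synth":         7,
    "SFP output":        7,
    "VST recording":    12,
}

total_ms = sum(stages.values())
print(f"total delay: {total_ms} ms = {ms_to_samples(total_ms)} samples")
# 38 ms comes out to roughly 1676 samples at 44.1 kHz
```

Of course, as noted above, plugging this naive sum into the track delay doesn't fix the timing, which suggests at least one stage in the chain isn't captured by these four numbers.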
Ableton is not so good for this kind of precision measurement, or at least I can't think of a way to do it there. I could do it in Cubase, but would that then translate back to Ableton?
Any help appreciated!
Thanks in advance,
T
EDIT: OK, so I figured out a way to measure this. I make the recording into Ableton, then activate both the yellow and green paths (playing the MIDI sequence and the recorded audio) at the same time. Using the direct outs, I can tap both signals in the FP106 and send them via ADAT to separate inputs on DAW1, record them with the RME's built-in wave recorder, and compare the two files in Wavelab. This should give me an accurate measurement of the difference between the two signals and, after checking different synths for consistency, it will enable faster bouncing of synthesized tracks to audio.
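If anyone would rather automate the file comparison than eyeball it in Wavelab, the offset between the two recordings can be estimated from the peak of their cross-correlation. A minimal sketch with NumPy, using a synthetic click in place of real recordings (the 1676-sample shift stands in for the ~38 ms I measured above):

```python
import numpy as np

def estimate_offset(ref, delayed):
    """Return how many samples `delayed` lags behind `ref`,
    taken from the peak of their full cross-correlation."""
    corr = np.correlate(delayed, ref, mode="full")
    return int(np.argmax(corr)) - (len(ref) - 1)

# Synthetic demo: a single click, then the same click 1676 samples later
sr = 44100
ref = np.zeros(sr // 10)
ref[100] = 1.0                  # impulse in the "live synth" signal
delayed = np.roll(ref, 1676)    # the "recorded" copy, shifted later

print(estimate_offset(ref, delayed))  # 1676
```

With real material you would load the two taps as arrays (e.g. via the `wave` module or any audio library) and feed them in the same way; for noisy or tonal sources the correlation peak is sharpest if you use a transient-rich passage.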
Anyway, my Scope card is dead right now, so I can't try it out. In fact, I am in very deep trouble if I can't get it working soon.