Thanks MC.
For context, I play keyboard live: keyboard USB direct to a MacBook running MainStage, then USB out to the audio interface, and from there to headphones and line-out speakers. So I only care about output latency.
So continuing that chapter:
"Mac OS X: The buffer size is defined within the application. A setting of 64 samples at 44.1 kHz causes a latency of 1.5 ms, for record and playback each": Fine.
"When performing a digital loopback test no latency/offset can be detected":
Loopback = "A playback and re-record of the same signal via DA and AD (loopback)"
I don't think loopback (or offsets) is relevant to me; I just need DA (not DA + AD).
"Under Windows the Fireface UCX II uses a fixed additional buffer of 32 samples, under Mac 24 samples, which is added to the current buffer size": Ok that's useful.
* Is it the RME driver itself that adds that?
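Either way, taken on their own those fixed driver buffers work out as follows (assuming, per the manual, they simply add to the application buffer):

```python
# Fixed additional driver buffer per the UCX II manual, at 44.1 kHz.
RATE_HZ = 44100
for platform, extra_samples in [("Windows", 32), ("Mac", 24)]:
    extra_ms = extra_samples / RATE_HZ * 1000
    print(f"{platform}: {extra_samples} samples = {extra_ms:.3f} ms")
# Windows: 32 samples = 0.726 ms
# Mac: 24 samples = 0.544 ms
```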
MainStage I/O Safety Buffer
https://support.apple.com/en-nz/101938
"When you turn on the I/O Safety Buffer, MainStage will add an additional output buffer to protect against overloads due to unexpected CPU spikes. Its size is equal to the I/O Buffer Size setting, but it only affects the output buffer."
In the YouTube example, he must have the safety buffer off, because turning it on would yield (64 + 64 + 24) samples × 0.0227 ms = 3.45 ms.
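Spelling that out (a sketch assuming, per the Apple note, that the safety buffer is exactly one extra output buffer of the same size):

```python
RATE_HZ = 44100
io_buffer = 64       # MainStage I/O Buffer Size
safety_buffer = 64   # equal to the I/O Buffer Size when enabled (per Apple)
driver_buffer = 24   # RME fixed additional buffer on Mac

total = io_buffer + safety_buffer + driver_buffer
print(f"{total} samples = {total / RATE_HZ * 1000:.2f} ms")  # 152 samples = 3.45 ms
```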
So maybe my total output latency (with the safety buffer off) is 64 (MainStage) + 24 (RME driver on Mac) + 6 (RME UCX II output):
94 samples × 0.0227 ms = 2.13 ms
Is that right? And what about the remaining 0.27 ms (2.4 - 2.13)?
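To make the question concrete, here is the same arithmetic plus the gap expressed in samples (assuming 2.4 ms is the displayed figure, and using my guess of ~6 samples for the UCX II DA stage):

```python
RATE_HZ = 44100
io_buffer = 64       # MainStage I/O Buffer Size (safety buffer off)
driver_buffer = 24   # RME fixed additional buffer on Mac
da_samples = 6       # my assumed UCX II output (DA) latency

total = io_buffer + driver_buffer + da_samples
total_ms = total / RATE_HZ * 1000
print(f"{total} samples = {total_ms:.2f} ms")  # 94 samples = 2.13 ms

reported_ms = 2.4    # the figure shown in the video
gap_ms = reported_ms - total_ms
print(f"gap: {gap_ms:.2f} ms = {gap_ms * RATE_HZ / 1000:.1f} samples")
# gap: 0.27 ms = 11.8 samples -- the unexplained remainder
```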
If we got the fastest MacBook on the planet and applied the same 44.1 kHz / 64-sample settings, would the DAW display exactly the same output latency result?