Topic: Optimal Converter Input Line Level

If I use the consumer -10 dBV setting on my Multiface inputs, I significantly reduce the amount of gain I need to achieve with my mic preamps. But most people I see on the forums use the professional +4 dBu setting... why?

Less gain in an amplifier generally means lower THD and higher SNR, correct? But is there also a tradeoff in conversion distortion when using the -10 dBV setting?

How do I figure out which is the optimal setting for my case?

Thanks,

Dave

Re: Optimal Converter Input Line Level

The lower nominal input level on your Multiface will raise the apparent noise from the preamp's output stages. The preamp's output-stage noise floor will be LOUDER (closer to full scale) with the Multiface set to -10 dBV, and QUIETER if set to the less sensitive +4 dBu or Lo Gain range (but then you'll need more preamp gain). You'll have to experiment and see which setting works best with your equipment, its noise floor, and your gain structure. Some gear also just sounds better when run in a certain range; that in itself is probably a bigger concern than the raw noise floor (I'd trade a better/smoother sound for slightly more noise, depending on content, of course).
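To put rough numbers on that tradeoff, here's a back-of-envelope sketch in Python. The full-scale input levels, preamp EIN, mic level, and output-stage noise are all assumed figures for illustration (ballpark RME-style clip points), not measurements from your gear:

```python
import math

def db_sum(*levels_db):
    """Power-sum several uncorrelated noise levels given in dB."""
    return 10 * math.log10(sum(10 ** (l / 10) for l in levels_db))

# Assumed 0 dBFS input levels for each converter setting, in dBu
# (ballpark figures -- check your manual):
settings = {"Lo Gain": 19.0, "+4 dBu": 13.0, "-10 dBV": 4.2}

mic_level_dbu    = -40.0   # hypothetical mic signal at the preamp input
ein_dbu          = -128.0  # hypothetical preamp equivalent input noise
output_stage_dbu = -95.0   # hypothetical fixed noise of the preamp output stage
target_peak_dbfs = -12.0   # desired recorded peak level

for name, fullscale_dbu in settings.items():
    # Preamp gain needed so the mic signal hits the target recorded level
    gain_db = fullscale_dbu + target_peak_dbfs - mic_level_dbu
    # Total preamp output noise: amplified input noise plus fixed output-stage noise
    noise_out_dbu = db_sum(ein_dbu + gain_db, output_stage_dbu)
    # That noise expressed relative to converter full scale
    noise_dbfs = noise_out_dbu - fullscale_dbu
    print(f"{name:8s}: preamp gain {gain_db:5.1f} dB, "
          f"preamp noise floor {noise_dbfs:6.1f} dBFS")
```

With these made-up numbers, the gain-dependent part of the preamp noise lands in the same place relative to the signal regardless of setting, but the fixed output-stage hiss sits a few dB lower at the less sensitive settings. Plug in your own specs to see which effect dominates.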

If a unit is designed to operate at +4 dBu output levels (say +27 dBu peak), then it makes sense for the downstream gear to anticipate and operate in that same range; otherwise you can run into gain-staging problems (and the associated noise-floor penalties).
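As a rough illustration of why matching matters (same assumed clip points as above), a +4 dBu nominal source leaves very different amounts of headroom depending on the converter setting:

```python
def dbu_to_volts(dbu):
    """0 dBu = 0.7746 V RMS."""
    return 0.7746 * 10 ** (dbu / 20)

nominal_out_dbu = 4.0    # the unit's +4 dBu nominal output level
peak_out_dbu    = 27.0   # its stated clipping point

# Assumed converter clip points (same ballpark figures as before):
for name, clip_dbu in {"Lo Gain": 19.0, "+4 dBu": 13.0, "-10 dBV": 4.2}.items():
    headroom_db = clip_dbu - nominal_out_dbu   # room above nominal before the converter clips
    lost_db     = peak_out_dbu - clip_dbu      # analog headroom the converter can't capture
    print(f"{name:8s}: converter clips at {dbu_to_volts(clip_dbu):5.2f} V RMS, "
          f"{headroom_db:4.1f} dB headroom over +4 dBu, "
          f"{lost_db:4.1f} dB of the unit's peak capability unusable")
```

With the -10 dBV setting, a +4 dBu nominal signal is already sitting essentially at the converter's clip point, so real-world peaks will distort long before the analog gear runs out of headroom.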

Do what sounds best. A cliché, but a very true one!

cool

MADIface-XT+ARC / 3x HDSP MADI / ADI648
2x SSL Alphalink MADI AX
2x Multiface / 2x Digiface / 2x ADI8