According to the manual (ADI-2 Pro manual, page 94), the ADI has three bit-perfect check circuits (16, 24 and 32-bit), constantly running, i.e. waiting for their Right Pattern. They are programmed to react to these and only these bit patterns from RME's bittest.wav test file. The bit pattern is 400 samples long, a very specific sequence of 0s and 1s. Like a 6400, 9600 or 12800-character password.
The check circuits won't trigger if the bit pattern fed in deviates even by one bit from what they are expecting. Which would be the case if the feeding streamer distorts, i.e. corrupts, its output bitstream even by one bit. "Wrong password", i.e. no pass, in such a case.
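The principle can be sketched in a few lines of Python. This is purely illustrative: the real RME pattern and the circuit implementation are not public, so the pattern below is a made-up stand-in.

```python
import random

PATTERN_LEN = 400  # samples, per the manual

# Stand-in "secret" pattern of 24-bit samples (the real one is RME's).
random.seed(42)
pattern = [random.getrandbits(24) for _ in range(PATTERN_LEN)]

def bitperfect_detected(stream, pattern):
    """Return True if the exact pattern appears anywhere in the stream."""
    n = len(pattern)
    return any(stream[i:i + n] == pattern for i in range(len(stream) - n + 1))

# A bit-perfect chain delivers the pattern untouched -> detected.
clean = [0] * 50 + pattern + [0] * 50
assert bitperfect_detected(clean, pattern)

# Flip a single bit of a single sample -> "wrong password", no detection.
corrupted = list(clean)
corrupted[50 + 123] ^= 1  # one LSB flipped inside the pattern
assert not bitperfect_detected(corrupted, pattern)
```

The point is the all-or-nothing behaviour: one flipped bit out of 400 samples and the detector stays silent.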
onlyoneme wrote:Is it possible to pass bitperfect test despite of very high level of 3rd harmonic distortions for -1.1 dBFS and above?
Some users say that streamer X passes RME bitperfect test. Some others report very high signal distortions with their devices.
No.
If this distortion really originates in the digital signal, some PROCESSING in this Streamer X must be doing the alteration, i.e. adding that distortion there. Some bug in the handling of sample rates would be my guess. Wouldn't be a miracle, given how bug-ridden some of the streamers on the market are.
Anyway, in that case, when bittest.wav is played back and fed to the ADI, its bit pattern would get severely modified by that buggy processing in Streamer X: the ADI's check circuits wouldn't trigger, i.e. wouldn't detect the pattern in the incoming bitstream.
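Any processing in the path breaks bit-perfection, even something nominally "transparent". A minimal, hypothetical illustration (this is not Streamer X's actual code, just arbitrary 24-bit PCM values put through a near-unity gain):

```python
# Even a gain of 0.9999 rewrites the bits of many samples.
samples = [1000, -20000, 7, 8388607]           # 24-bit PCM values
processed = [round(s * 0.9999) for s in samples]
print(processed)                                # [1000, -19998, 7, 8387768]
assert processed != samples                     # bits differ -> test fails
```

One changed sample is already enough to miss the 400-sample "password", so the bit-test reports no pass.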
Yes... there are many kinds of Digital Audio Einsteins and other untechnical morons playing expert-influencers on the internet these days... in reality some of them are barely qualified to use a screwdriver.