A better way to measure the performance of DACs

For a long time, I’ve wondered how useful the ‘standard’ battery of tests (THD+N/SINAD, IMD, jitter, etc.) really is for measuring a DAC’s performance. They all rely on the use of single or multiple tones, which don’t reflect all the elements that make up music and are certainly not what we listen to. I can understand why humans might not be used to test new pharmaceuticals, but I don’t understand why music isn’t used to test audio gear.

It seems to me that the gold standard would be to compare what comes out of the DAC with what goes in… using real music. And that’s exactly what a null test attempts to do.

Null testing is fraught with difficulty, which is probably the main reason why it isn’t used much. The main problem is aligning the two signals (what comes out with what goes in). They need to be matched in time and in amplitude. The former is best done by syncing the DAC and ADC clocks and then aligning at the sample level. The latter requires that the RMS values of the two signals be matched as closely as possible (to better than 0.001dB).
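To make the time and amplitude alignment concrete, here’s a minimal sketch of the idea (my own toy code, not DeltaWave’s actual algorithm): integer-sample alignment via cross-correlation, RMS gain matching, and a null-depth figure in dB. The test signals and the -77dB ‘DAC error’ noise are invented purely for illustration.

```python
import numpy as np

def null_depth_db(source: np.ndarray, capture: np.ndarray) -> float:
    """Align capture to source in time and amplitude, return null depth in dB."""
    n = min(len(source), len(capture))
    src, cap = source[:n], capture[:n]
    # Time alignment: pick the integer lag that maximises cross-correlation.
    lag = int(np.argmax(np.correlate(cap, src, mode="full"))) - (n - 1)
    if lag > 0:
        src, cap = src[: n - lag], cap[lag:]
    elif lag < 0:
        src, cap = src[-lag:], cap[: n + lag]
    # Amplitude alignment: scale the capture so the RMS levels match exactly.
    cap = cap * np.sqrt(np.mean(src**2) / np.mean(cap**2))
    # Null depth: RMS of the residual relative to the RMS of the source.
    residual = cap - src
    return 20 * np.log10(np.sqrt(np.mean(residual**2)) / np.sqrt(np.mean(src**2)))

# Demo: a 'capture' delayed by 3 samples, attenuated by 1 dB, with a small
# amount of simulated converter noise added.
rng = np.random.default_rng(0)
t = np.arange(4800) / 48_000
music = np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 1237 * t)
capture = np.concatenate([np.zeros(3), music])[:4800] * 10 ** (-1 / 20)
capture = capture + 1e-4 * rng.standard_normal(4800)
print(f"{null_depth_db(music, capture):.1f} dB")
```

With the delay and level offset corrected, only the added noise remains, so the reported null depth sits in the high -70s of dB.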

There’s a massive thread on Gearspace that attempts to use null testing to rank DAC/ADC combos, with very misleading results, unfortunately. The method is severely broken.

There is a piece of software called DeltaWave that can be downloaded for free and is by far the best null-testing tool I’ve come across.

I’ve just compared 3 DACs using DeltaWave:

  1. dCS Scarlatti
  2. RME ADI-2 Pro fs R
  3. Okto dac8 PRO

The RME and Okto DACs are modern DACs with state-of-the-art performance according to the standard battery of tests, and would handily beat my Scarlatti in this regard, I suspect.

But can null testing offer any further insight? Well, here are the results of my null tests (using the same classical track as used in the Gearspace test):
[image: DeltaWave null-test results for the three DACs]

In absolute terms, the RME and Okto are indeed superior to the Scarlatti - they have deeper nulls (RMS differences against the source of -73.80dB and -71.02dB respectively). However…

The RMS difference does not take into account the effects of the anti-imaging filters >20kHz, which skews the results against slow filters. The A-weighted difference does account for them, and here the Scarlatti is superior.
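A toy example of why full-band RMS penalises slow filters (my own illustration, not DeltaWave’s method): a capture whose only error is an ultrasonic image nulls poorly full-band, but very deeply once the residual is band-limited to 20kHz. A-weighting goes further still, weighting each frequency by its audibility.

```python
import numpy as np

FS = 96_000  # high sample rate, so ultrasonic filter residue can exist

def rms_db(x: np.ndarray, ref: np.ndarray) -> float:
    return 20 * np.log10(np.sqrt(np.mean(x**2)) / np.sqrt(np.mean(ref**2)))

def lowpass_20k(x: np.ndarray, fs: int = FS) -> np.ndarray:
    # Crude brick-wall low-pass at 20 kHz via FFT bin zeroing (demo only).
    spec = np.fft.rfft(x)
    spec[np.fft.rfftfreq(len(x), 1 / fs) > 20_000] = 0
    return np.fft.irfft(spec, len(x))

rng = np.random.default_rng(1)
t = np.arange(FS) / FS
source = np.sin(2 * np.pi * 1000 * t)
# Simulated slow-filter capture: essentially perfect in the audio band, but
# with a -40 dB image left above 20 kHz, plus a tiny converter noise floor.
capture = source + 0.01 * np.sin(2 * np.pi * 30_000 * t) \
                 + 1e-9 * rng.standard_normal(FS)
residual = capture - source

print(f"full-band null:    {rms_db(residual, source):.1f} dB")
print(f"band-limited null: {rms_db(lowpass_20k(residual), source):.1f} dB")
```

The full-band null is stuck around -40dB, entirely because of inaudible ultrasonic content; band-limited, the same capture nulls far more deeply.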

The ‘PK Metric’ is a measure that the developer of the software has created, which he believes gives a better indication of the DAC’s ‘perceptual accuracy’. And again, the Scarlatti is superior.

Subjectively, the Scarlatti sounds ‘fuller’ and more laid back than the modern DACs. Null testing suggests it’s also the most perceptually accurate.

Just thought I’d share…

Mani.


Now all I need is for dCS to send me a Vivaldi APEX to test (preferably on long-term loan) :wink:.

Mani.


Spot on Mani :slight_smile:

In fact, not only are DeltaWave and, from years ago, Audio DiffMaker excellent tools for measuring the performance of a DAC, they’re also great at objectively debunking nonsense “audiophile” products (“audiophile” Ethernet switches and “audiophile” Ethernet cables, for example).

Ironically, the dCS Vivaldi stack is the perfect platform to use in conjunction with these tools, as its functions are disaggregated into separate elements and each interconnect point can be examined independently. The Vivaldi Upsampler, for example, is especially useful when paired with my TASCAM DA3000 synced to the Vivaldi Clock to capture PCM streams (instead of your AD124), or with Prism Sound’s DSA-1 AES/EBU analyser, which I also own.


Hi Anup, lovely setup you seem to have there :slightly_smiling_face:.

I tried Audio DiffMaker back in the day and found it pernickety and inconsistent. But it showed the potential of null testing. DeltaWave is on another level. It has the ability to match RMS levels perfectly, to correct for clock drift (not required if all clocks are synced of course), to deal with subsample offsets, and to align the files at the sample level, all of which are essential requirements for achieving accurate nulls. Its non-linear corrections go deeper still, but need to be used with caution - they’re not required for comparing audio DACs and may skew the results.
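To illustrate the clock-drift point (a sketch only; DeltaWave’s interpolators are far more sophisticated than the linear interpolation used here): a 50ppm clock-rate error between the two capture clocks ruins the null, and resampling the capture back onto the source timebase restores it.

```python
import numpy as np

def null_db(a: np.ndarray, b: np.ndarray) -> float:
    return 20 * np.log10(np.sqrt(np.mean((a - b) ** 2)) / np.sqrt(np.mean(b**2)))

fs = 48_000
t = np.arange(fs) / fs  # one second of samples
source = np.sin(2 * np.pi * 997 * t)

# A capture whose ADC clock runs 50 ppm fast: its timebase is stretched.
drift = 1 + 50e-6
capture = np.sin(2 * np.pi * 997 * t * drift)

# Drift correction: re-sample the capture onto the source timebase, treating
# the captured samples as lying at the stretched instants t * drift.
corrected = np.interp(t, t * drift, capture)

print(f"before correction: {null_db(capture, source):.1f} dB")
print(f"after correction:  {null_db(corrected, source):.1f} dB")
```

Even this tiny 50ppm error limits the null to around -15dB over one second; after correction the null drops by roughly 40dB, limited here only by the crude interpolator.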

Null testing is helping me to better correlate what I’m hearing with what the DAC is doing. And I agree that having a stack is helpful - being able to separate the elements to determine what each is contributing, or not. I’ve got more testing to do in this regard.

Haha… With what do you think I capture the digital output of the AD124? With my Tascam DA3000, of course (also synced to the Scarlatti clock, along with the DAC and ADC). The Tascam’s a lovely unit, but not up to the job of doing ADC duties here. I haven’t tried connecting the Scarlatti Upsampler directly to the DA3000 yet, because it’s easier to just upsample in Roon and capture the output stream directly in Audacity.

Mani.


Yeah, for one, the captured clip and baseline source clip each have to be under 30 seconds or the software chokes and dies (albeit 30s is more than enough to demonstrate complete transparency). And then there’s limited adjustability for offsets, so the samples have to be closely aligned to begin with, etc. :laughing:

I have to agree, DeltaWave is just brilliant, and Paul’s written a bunch of other excellent tools as well.

The interesting feature useful for digital analysis is the Vivaldi Upsampler’s “Clone” mode; it makes the Upsampler an ideal Ethernet-stream to AES/SPDIF (PCM) converter:

Stream a track, capture the Upsampler’s PCM output, and compare to the source track PCM - you can immediately tell if anything anywhere along the chain, from the music server all the way to the output of the Vivaldi Upsampler, has had any impact whatsoever.

A similar thing can be achieved with a Raspberry Pi and an S/PDIF HAT, of course, but nowhere near as elegantly. Likewise with Roon: I’ve captured the output of Roon Bridge into Adobe Audition at the other end of the Server-Cable-Ethernet Switch-Cable chain. Roon does have a built-in transparency indicator, but it’s not as fun :grin:.
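For the stream-capture comparison described above, a simple bit-identity check is all that’s needed. Here’s a hypothetical sketch for mono 16-bit WAV files (trimming leading and trailing digital silence so capture padding doesn’t cause a false ‘altered’ verdict); the file names are placeholders.

```python
import wave
import numpy as np

def pcm16(path: str) -> np.ndarray:
    # Read a WAV file's samples; this sketch assumes 16-bit little-endian PCM.
    with wave.open(path, "rb") as w:
        assert w.getsampwidth() == 2, "sketch handles 16-bit PCM only"
        return np.frombuffer(w.readframes(w.getnframes()), dtype="<i2")

def trim_silence(x: np.ndarray) -> np.ndarray:
    # Drop leading/trailing all-zero samples added by the capture device.
    nz = np.flatnonzero(x)
    return x[nz[0] : nz[-1] + 1] if nz.size else x[:0]

def transparent(source_path: str, capture_path: str) -> bool:
    # Transparent chain <=> the trimmed PCM streams are bit-identical.
    return np.array_equal(trim_silence(pcm16(source_path)),
                          trim_silence(pcm16(capture_path)))

# Usage (illustrative paths):
# print(transparent("track.wav", "upsampler_capture.wav"))
```

If every element in the chain is bit-transparent this returns True; a single altered sample anywhere flags the chain.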
