I have become more interested in clocking and have learned a lot from others who have posted on this fascinating topic. Per @ChrisK’s thoughtful suggestion I have moved a prior comment to a new post. I found this statement from MSB quite interesting as it seems in contrast to dCS. I’d appreciate the thoughts of others on this topic:
dCS (from Vivaldi Clock description):
"In a dCS system, the DAC can act as the system master clock, but listening tests have shown that there is no substitute for a dedicated high quality master clock. dCS pioneered the use of external clocks in digital audio systems and this clocking technology has been continually refined so that our latest multi-stage Phase-Locked-Loop (PLL) system sets world-beating standards for accuracy and control of troublesome jitter from the incoming audio stream.
MSB:
"Why are external clocks sub-optimal for digital audio?
A clock signal is a fast moving precision electrical signal and is extremely sensitive to added noise or distortion. Each time it’s buffered or transmitted, a portion of its precision is lost. Even if a small amount of noise couples into the clocks, jitter will increase dramatically. It might still be an accurate clock, but accuracy is of little performance benefit to digital audio. A clock sent over an ultra-high quality cable will still increase its jitter considerably. The best solution is to create the lowest jitter clock as close to the DAC as possible."
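As a toy aside on the mechanism MSB describes (my own simplification, with made-up numbers, not anything from MSB or dCS): noise added to a clock signal shifts the instant each edge crosses the receiver's decision threshold, and that shift is timing jitter. A minimal Python sketch, assuming a hypothetical 10 MHz clock of unit amplitude:

```python
# Toy illustration only: additive noise moves the zero-crossings of a clock edge.
import numpy as np

rng = np.random.default_rng(0)
f_clk = 10e6                      # hypothetical 10 MHz clock
fs = 1e9                          # 1 GS/s simulation rate
t = np.arange(0, 200e-6, 1 / fs)  # 200 microseconds of signal

def rms_edge_jitter(noise_rms):
    """RMS timing error of rising zero-crossings for a unit-amplitude sine plus noise."""
    noisy = np.sin(2 * np.pi * f_clk * t) + rng.normal(0.0, noise_rms, t.size)
    idx = np.flatnonzero((noisy[:-1] < 0) & (noisy[1:] >= 0))   # rising crossings
    frac = -noisy[idx] / (noisy[idx + 1] - noisy[idx])          # sub-sample interpolation
    edges = (idx + frac) / fs                                   # crossing times in seconds
    ideal = np.round(edges * f_clk) / f_clk                     # nearest ideal edge
    return np.std(edges - ideal)

for noise in (0.001, 0.01, 0.05):  # noise relative to the clock's amplitude
    print(f"noise {noise:.3f} of amplitude -> ~{rms_edge_jitter(noise) * 1e12:8.1f} ps RMS jitter")
```

The timing error scales roughly with the noise divided by the edge's slew rate, which is why clock distribution over cables and connectors is so sensitive to noise pickup, and why both companies care so much about where the final clock lives.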
This is how a dCS external clock works, as explained by @James:
[…] it is worth noting that the use of a Master Clock in a dCS system does not replace the internal clock inside of the DAC. It simply acts as a stable reference for the DAC to lock itself to, and allows for DAC and source to be properly synchronised without issues such as intersymbol interference causing jitter within the audio data. The DAC’s internal clock still dictates when samples are converted, it simply adjusts its frequency over time to match that of the Master Clock. This means the DAC still benefits from having a high-quality clock close to the DAC circuitry. The clock directly controlling the audio is still part of a tightly controlled environment, while also being in sync with the rest of the system.
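To make that concrete, here is a deliberately crude software analogy of a disciplined oscillator (my own sketch with invented gains and offsets, not dCS's actual design): the local oscillator keeps generating the conversion timing, while a slow control loop nudges its frequency toward the external reference.

```python
# Toy model of a local oscillator disciplined by an external reference (invented numbers).
ref_freq = 44_100.0 * 256              # hypothetical external reference (256fs)
nominal = ref_freq * (1 + 2e-6)        # local oscillator free-runs 2 ppm fast
local_freq = nominal

kp, ki = 0.5, 0.05                     # low gains = narrow loop bandwidth (gentle steering)
integrator = 0.0
ref_phase = local_phase = 0.0
dt = 1e-3                              # control-loop update interval (seconds)

for step in range(60_001):
    ref_phase += ref_freq * dt
    local_phase += local_freq * dt
    err = ref_phase - local_phase                 # phase error, in clock cycles
    integrator += ki * err * dt
    local_freq = nominal + kp * err + integrator  # steer the local oscillator
    if step % 10_000 == 0:
        ppm = (local_freq / ref_freq - 1) * 1e6
        print(f"t={step * dt:5.1f}s  frequency offset {ppm:+7.3f} ppm  phase error {err:+10.3f} cycles")
```

The loop gains set the bandwidth: the lower they are, the more slowly the local clock follows the reference and the less of any short-term disturbance on the reference (or the cable carrying it) gets through, while the local oscillator still defines the short-term edge placement.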
So, in a dCS system, there is synergy between the internal and external clocks.
To me, this is “apples, meet oranges.” Two very different approaches, just like the core DACs themselves. I have difficulty with any suggestion that one is inherently superior to the other, in the absence of empirical evidence of some sort and qualified listening.
For two reasons (I believe these were previously covered in other Clock threads):
1. An external Clock is required to keep multiple units, as in the case of the Vivaldi stack, fully synchronised.
2. More importantly, an external master Clock, with its own dedicated power supply, isolation chassis, and control circuitry, will be technically superior to one that is built in and has to share resources with the DAC.
In the case of MSB, since they don’t have an external master clock, the introduction of their Digital Director - effectively similar to the Vivaldi Upsampler - makes things very complicated. They’re likely using some proprietary way to keep the units properly synchronised; not quite sure how exactly, probably exporting the DAC’s Clock over the Control Link Module. Greg @PaleRider, do you know?
Well… it is still the MSB strategy… upgrades inside one box… but for how long?
Reading what they say about their “Director”, I found this: “Fully synchronizes all clocks prior to transmission to the DAC.”
It is the same idea championed by dCS… MSB will soon say it is even better when this synchronisation is isolated from the streamer, and they will market a new product like a “time protector”.
I spoke with someone at MSB long ago, and the way I understood him was that MSB is one of the few companies whose DACs re-clock the digital data using their own high-quality clock, no matter which input the data arrives on. Yes, they re-clock the SPDIF/AES inputs too. In general, most DACs, including dCS’s, are by design always slaved to the source’s clock. That is why it is important to have cleanly clocked, high-quality data at the DAC’s input.
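For what it’s worth, the generic version of “reclock the input with the local clock” is a FIFO buffer: samples are written in at the source’s rate and read out on the DAC’s own clock. The catch is that two free-running clocks always drift apart, so the buffer eventually under- or overflows unless the output clock is steered towards the source’s average rate (or the audio is resampled). A back-of-the-envelope sketch with invented numbers, making no claim about how MSB actually does it:

```python
# Generic FIFO-reclocking arithmetic (illustrative only, not any manufacturer's design):
# with a fixed rate mismatch, the buffer occupancy drifts until it under- or overflows.
def seconds_until_fifo_fails(depth_samples, fs=44_100.0, mismatch_ppm=5.0):
    """How long a half-full FIFO lasts before failing, for a fixed ppm clock mismatch."""
    drift_per_second = fs * mismatch_ppm * 1e-6   # net samples gained or lost per second
    headroom = depth_samples / 2                  # start half full
    return headroom / drift_per_second

for depth in (64, 1_024, 65_536):
    t = seconds_until_fifo_fails(depth)
    print(f"{depth:6d}-sample FIFO, 5 ppm mismatch: fails after ~{t:9.1f} s ({t / 3600:5.2f} h)")
```

So even a “reclocking” DAC still has to track the source’s long-term rate in some way (or add a lot of buffering and latency, or resample); the interesting differences between designs are in how tightly and by what mechanism they do that, which is what the PLL discussion below gets into.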
A Phase-Locked Loop, or PLL, is a circuit that works to match the frequency of an outgoing signal with that of an incoming signal. PLLs are often used to synchronise a DAC’s internal clock to that of an incoming signal, such as SPDIF from a CD transport. A ‘phase detector’ in the PLL attempts to match the phase of the incoming SPDIF signal with that of the DAC’s internal clock. Its aim is to get the phase error as low as possible, ensuring that, over time, the two clocks run at the same average rate and the DAC’s buffer never under- or overflows.
The most common place to see a PLL in an audio product is within an ‘off-the-shelf’ SPDIF receiver chip. Such a chip is used on the SPDIF input of a product, typically combining an SPDIF-to-I2S block together with a PLL. Using a third-party solution like this can give rise to some issues. With such a chip, it can be very difficult to separate out the functions of signal conversion and clock-domain matching, which becomes problematic when attempting to use a Word Clock signal as the clock master for the DAC. What’s more, if the performance of the chip isn’t up to scratch, it is impossible to change it. AES clock extraction is a good example: it is actually quite difficult to do well, because of the structure of illegal codes within the signal, and it is easy to induce jitter from the channel block marker that occurs every 192 samples. (The structure of SPDIF/AES is beyond the scope of this post, but in essence the signal deliberately breaks the ‘rules’ by having runs of three 0s or 1s in a row, for various reasons, including giving the PLL something to lock to.)
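(A quick back-of-the-envelope aside from me, not part of the explanation above: if clock recovery is disturbed a little each time that block marker passes, the disturbance repeats at the block rate, i.e. the sample rate divided by 192, which lands squarely in the audio band as a low-frequency jitter component.)

```python
# Where a block-marker-related disturbance would sit, for common sample rates.
for fs in (44_100, 48_000, 96_000, 192_000):
    print(f"fs = {fs:7d} Hz  ->  block rate = fs / 192 = {fs / 192:7.1f} Hz")
```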
At dCS, we’ve taken a different approach. dCS DACs still use a PLL, but it is a hybrid design, developed entirely in-house. Part of the PLL is digital, by way of DSP inside the product’s FPGA, and part of it is analogue. This lends an enormous amount of flexibility and a much higher level of performance, and it is completely independent of the input source. We are also able to do things like dramatically alter the bandwidth of the PLL: a wide bandwidth allows the DAC to lock very quickly to a source, and the bandwidth can then be tightened over time to reduce jitter.
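A crude numerical caricature of that “lock fast with a wide bandwidth, then tighten it to reject jitter” idea (my own toy with invented gains, not dCS’s FPGA design): the same first-order tracking loop is run with a high and a low loop gain, comparing how quickly it pulls in an initial phase offset and how much of the reference’s jitter it lets through.

```python
# Toy first-order phase tracker: local += g * (reference - local). Invented numbers only.
import math
import random

random.seed(1)

def pull_in_updates(g, start_error=10.0, target=0.1):
    """Updates needed to shrink an initial phase offset below the target."""
    err, n = start_error, 0
    while abs(err) > target:
        err *= (1.0 - g)          # each update removes a fraction g of the remaining error
        n += 1
    return n

def tracked_jitter(g, updates=200_000):
    """RMS deviation of the tracked clock when the reference has unit white phase jitter."""
    local, acc, acc2 = 0.0, 0.0, 0.0
    for _ in range(updates):
        ref = random.gauss(0.0, 1.0)             # ideal phase is 0; the reference wanders
        local += g * (ref - local)
        acc += local
        acc2 += local * local
    return math.sqrt(acc2 / updates - (acc / updates) ** 2)

for g, label in ((0.5, "wide bandwidth"), (0.01, "narrow bandwidth")):
    print(f"{label:16s} (g={g:4.2f}): pulls in over {pull_in_updates(g):4d} updates, "
          f"passes {tracked_jitter(g):.3f} of the reference jitter")
```

That is the trade-off being managed dynamically here: start wide so lock is quick, then narrow the bandwidth so less of the incoming jitter reaches the clock that actually times the conversion.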
This approach ensures that, within a dCS product, the clock and data paths remain independent. There is a part of the product’s FPGA which works solely to extract the clock embedded in, for example, the incoming AES signal (again, this is done using a bespoke design, rather than an off-the-shelf chip); another part which works to retrieve the audio, another for routing it, then processing it, and so on.
This gives us a tremendous amount of flexibility in terms of how we handle, for example, Dual AES: we can run the signal, have a separate Master Clock input, have the DAC act as the Master Clock for the whole audio system, tolerate different lengths of cable in Dual AES, and deal with phase offset between clock and audio. All of this can be done without adding latency to the audio, meaning it can still properly integrate with video. We are also able to hide commands in the non-audio bits of AES, which allows us to have, say, the Vivaldi DAC (a non-network-equipped product) controlled by the dCS Mosaic Control app.
I’m not so sure how similar the Digital Director is to the Vivaldi Upsampler. For one, it performs no upsampling. None of MSB’s current DACs perform upsampling to my knowledge.
On point 2, I am not so sure, if the following statement is true:
MSB: “A clock [signal] sent over an ultra-high quality cable will still increase its jitter considerably.”
If there is no transport, and no separate upsampling device, it would seem that all clocking should happen inside the DAC.
Surely the proof is in the listening, or perhaps I should say the choice is for the listener. If you like what an external clock does, then use it. If not, don’t.
I demoed a clock at the dealer and noticed a small but material improvement with his very expensive demo gear. So I bought a clock for my Rossini Player, and despite extensive listening tests with all manner of cables I could hear no difference whatsoever. So I took it back to the dealer. Who knows why; maybe my gear was incapable of resolving the difference, maybe my Rossini Player by chance already had the best internal clock ever made. It matters not; I saved the money and have the same quality music, in my opinion.
Yes, but the point I was trying to make is that as long as multiple digital units are involved, there must be only a single Clock source to keep all units synchronised. Which is where an external Clock becomes necessary.
(I did some digging, apparently MSB sends Clock synchronisation signals over their proprietary ProISL module/interface. So, unlike dCS, there’s no way for MSB to support a non-MSB Transport or Streamer if you want to keep them properly synchronised.)
MSB is not wrong that sending a Clock signal over an external interface/cable will increase the jitter of the Clock signal (which obviously applies equally to them in their multi-box setups!), which is why dCS has one of the most sophisticated PLLs on the market. You might want to read carefully James’ posts that Erno highlighted above.
Also, as Simon says above, the proof is in the listening, but also in the objective measurements of jitter. dCS systems have one of the lowest jitter profiles on the market, which proves that such external-clock-based systems can perform just as well as, if not better than, systems with built-in “femto” clocks.
And they do! As pointed out by Erno above, all dCS devices have internal clocks built in! They just cannot be as accurate as a dedicated external Clock.
I’m just not sure about this statement: “They just cannot be as accurate as a dedicated external Clock.”
Perhaps when multiple separates are involved; otherwise, I see no reason why this would be so.
Taking another approach to explore the point: I wonder if dCS itself would state that the clock inside the Vivaldi One is inferior to the separate Vivaldi clock.
I agree @Ermos. Below is the source of the “simplified quote read somewhere” (!) ; )
Again, there appear to be quite different design philosophies between the two companies: upsampling versus no upsampling; external clock versus internal clock; multiple integrated power supplies versus one external, separate power supply.
I find this quite interesting and curious since both manufacturers are highly esteemed. Hence my interest in others’ views on this matter.
Hi @Anupc, continuing to push on this thread/point:
The logical conclusion of that position is that the Vivaldi One provides inferior clocking to the Vivaldi stack with an external master clock. Do you agree?