I got four clocking cables, two of 1 m and two of 1.5 m in length (not sure why, quite possibly down to stock, Chris Sommovigo’s Black Cat being a one-man operation). I’m using one length for the 44kS/s group and the other for the 48kS/s group, even though I’m quite sure the minor difference in length is irrelevant. Even so, I’d be curious to know whether it indeed doesn’t matter. Thanks in advance!
There is the consideration of signal reflection caused by the change in impedance at the cable’s termination. You will find that 1.5 m is often cited as a minimum cable length in digital audio, the idea being that the cable itself attenuates that reflection to a reasonable degree before it returns.
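To put rough numbers on the timing side of that argument (purely illustrative; the velocity factor below is an assumption, not a measurement): a reflection has to travel to the far end of the cable and back before it can overlay the original edge, so the longer the cable, the later, and because coax is lossy, the weaker, the returning reflection.

```python
C = 3.0e8               # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.66  # assumed; solid-dielectric coax is ~0.66, foam ~0.8

def round_trip_ns(length_m):
    """Round-trip time for a reflection to reach the far end and return, in ns."""
    return 2.0 * length_m / (C * VELOCITY_FACTOR) * 1e9

for length_m in (0.5, 1.0, 1.5, 3.0):
    print(f"{length_m} m cable: reflection returns ~{round_trip_ns(length_m):.1f} ns after the edge")
```

On these assumed numbers a 1.5 m run puts the reflection roughly 15 ns behind the edge; whether that clears the signal’s rise time depends on the actual driver, which is exactly why the 1.5 m figure is only a rule of thumb.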
Thanks! I must not have been very clear in my original post: I meant clocking (synchronizing) several units from the same master clock using cables of varying lengths. I’m assuming that if one cable were a few hundred meters long and the other short, this might affect clocking, but with differences as small as half a meter?
Obviously, in a Vivaldi stack without Transport, i.e. Upsampler, DAC and Clock, I need four clocking cables. It seemed to me to make most sense to use the 1.5 m cables for the 44kS/s group and the 1 m ones for the 48kS/s group, simply because my lizard brain tells me not to mix cables of varying lengths. Even so, I’m wondering whether it matters?
Hey Hari, just thought I’d provide my 2 cents on this:
The engineering behind Steve Nugent’s PFO article is real, though a little skewed, I feel.
In a typical home S/PDIF interconnection, the reflection coefficient would be rather inconsequential (S/PDIF is also spec’ed at 75 ohm, with a wide margin). That probably explains why one can use even the cheapest RCA cables and things still work well enough, subjective qualitative debates aside.
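For anyone who wants to see why: the standard transmission-line reflection coefficient is Gamma = (ZL − Z0)/(ZL + Z0). A quick sketch with made-up near-spec load impedances (the exact figures are illustrative assumptions on my part):

```python
Z0 = 75.0  # characteristic impedance of the cable, ohms

def reflection_coefficient(z_load, z0=Z0):
    """Gamma = (ZL - Z0) / (ZL + Z0): fraction of the incident wave reflected."""
    return (z_load - z0) / (z_load + z0)

for z_load in (70.0, 80.0, 90.0):  # assumed receiver input impedances near spec
    g = reflection_coefficient(z_load)
    print(f"ZL = {z_load:.0f} ohm -> |Gamma| = {abs(g):.3f} "
          f"({abs(g) * 100:.1f}% of the incident voltage reflected)")
```

A few ohms either side of 75 reflects only a few percent of the wave, which is why near-spec gear is so forgiving.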
Ironically, the article isn’t accompanied by any empirical demonstration of the issue, which would be simple enough to show with any basic digital oscilloscope. It’s not just him, though; I don’t see any cable manufacturer proving their case for this “minimum digital cable length” argument in a real-world setup.
The dCS clock ports are engineered for 75 ohm matching. If you’re worried about 0.5 m, just try one; it won’t cost you much more than a cup of Starbucks coffee for a properly spec’ed 0.5 m 75 ohm BNC cable that can easily handle even gigabit-rate digital transmissions, let alone a 44.1/48 kHz clock signal.
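And on the original 0.5 m question itself, a back-of-the-envelope skew estimate (the velocity factor is an assumption on my part, typical for coax):

```python
C = 3.0e8               # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.66  # assumed coax velocity factor

extra_delay_s = 0.5 / (C * VELOCITY_FACTOR)  # delay added by 0.5 m more cable
period_s = 1 / 44_100                        # one period of a 44.1 kHz word clock

print(f"Extra delay from 0.5 m of cable: {extra_delay_s * 1e9:.2f} ns")
print(f"As a fraction of one 44.1 kHz clock period: {extra_delay_s / period_s:.1e}")
```

That works out to roughly 2.5 ns of skew against a clock period of about 22.7 µs, i.e. on the order of one part in ten thousand.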
Nominally. Maintaining a 75 ohm transmission line from transmitter to receiver can be defeated in practice at several points. Nugent lists some of them in the second paragraph of his piece. Just one example from me: those 75 ohm BNC connectors are not specified and tested for 75 ohm impedance in the kHz band but, as they were intended as video connectors, in the GHz band. Does a typical BNC connector preserve 75 ohm impedance at 48 kHz? I don’t know, and I suspect neither does anyone else.
This is not even to consider how the chassis connector is connected to the PCB traces (simple hook-up wire is common). Are the PCB traces 75 ohm? Nugent states he has never seen impedance-controlled PCB traces.
These days I am of the opinion that what the conceptual design envisages is unlikely to be fully realised in practical engineering (solder joints, anyone?), and that the short length of the transmission line in a typical home audio system makes its relevance arguable.
Pete, only if the impedance mismatch at the S/PDIF input is an order of magnitude or more (like 1 kOhm or above) will we see [significant enough] signal reflections. A platform would have to be extraordinarily poorly designed to have that kind of S/PDIF input impedance.
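Plugging those figures into the same Gamma = (ZL − Z0)/(ZL + Z0) formula makes the point (the specific values are just illustrations):

```python
Z0 = 75.0  # nominal S/PDIF impedance, ohms

for z_in in (80.0, 1_000.0):  # a slightly-off input vs. the ~1 kOhm worst case
    g = (z_in - Z0) / (z_in + Z0)
    print(f"input impedance {z_in:.0f} ohm -> {abs(g) * 100:.0f}% of the wave reflected")
```

A near-spec input barely reflects anything; it takes a kilohm-scale mismatch before most of the wave bounces back.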
In other words, that PFO article grossly exaggerates the issue.
Thanks Anup. So, given that what you say is correct, it would seem that goodsource need have no difficulty using short clock cables, as the role of cable length in attenuating reflections is trumped by the fact that any reflection is itself insignificant in the first place.