Dear dCS owners
Whilst jogging this morning, I was musing on the vaguely inexplicable observation that digital music playback systems seem - in the real world - to be far more unreliable and generally tweaky than analogue. This runs directly counter to what we had all simplistically assumed, and what we are usually told.
I’m NOT saying we have been lied to. But! Every high resolution digital playback system I have used in the last 20 years has displayed unacceptably high levels of unreliability and unpredictable behaviour, clearly worse than analogue vinyl players. And the more modern digital technologies – including otherwise impeccable devices like Auralic and dCS DACs and servers - are actually worse than most. And this is not just my experience – many relatives and acquaintances seem to have suffered in this way, as well as some folk I happen to respect (yes, there are a few…) in the higher-end music mags and companies. But they don’t always go public. So, what is going on here?
I believe a possible explanation may lie in deep physics – namely, the emergent properties of complex systems. OK, I can hear your head bouncing off the table right now, but bear with me… a classic exemplar of emergent properties is the human brain. Many neurobiologists suspect that the complexity of this system – notably its tens of billions of neurons and their intricate connection topologies – leads to such otherwise inexplicable phenomena as personality or consciousness.
Now think about other complex technologies – like, for instance, cars or PCs. Yep, we all know these can be pesky, unreliable and generally have a mind of their own! Now compare digital and analogue music playback in conceptual, entropic and system terms. Digital systems are far more complex – especially when you include ancillary technologies such as WiFi, the Internet and PCs. By comparison, analogue has a low parts count – subtle, perhaps, but basically far simpler, with fewer things to break. And analogue doesn’t depend on software!! (See below.)
That’s why digital playback goes wrong more often and typically in unpredictable ways. Analogue may suffer from misalignments, mechanical wear or physical damage, but you can very nearly always identify a simple reason or explanation as to why this has happened. By contrast, digital breakdowns or problems tend to be inexplicable, semi-random or just downright bizarre. And digital devices are clearly capable of just sulking if the mood takes them.
In case you are already mustering counter-arguments, let me address two obvious ones:
Televisions are equally complicated, but tend to be far more reliable. OK, but TVs are also mass-produced, with larger production runs giving more opportunity to chase out the bugs. Moreover, QA/QC is better at the good companies – critical things like panels or chipsets are rigorously tested, and suspect or substandard components are weeded out. Bespoke, low production-run digital tech simply doesn’t have these advantages.
NASA moon shots. The reason these cost so many billions of dollars, at least in part, was the fanatical QA/QC of every single component, system and subsystem, together with massive and exhaustive analysis of possible contingencies, and heavily enforced backup capabilities. Even so, shit happened quite regularly – shuttle losses, cabin fires, etc.
Addendum: my brother, also a physicist by training and career, has a slightly different spin. He has likewise had his nose repeatedly rubbed in digital flakiness and unreliability. As someone who wrote shedloads of scientific software in his time, he thinks that the complexity of most serious modern software is primarily to blame. Combined, that is, with 1) the use of languages like C that don’t enforce proper memory handling and 2) the near impossibility of considering in advance all the possible contingencies and ‘corner cases’ – especially if they involve error handling.
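To make his point concrete, here is a minimal hypothetical sketch in C – nothing to do with any real dCS or Auralic firmware – of the kind of unchecked corner case he means. A playback buffer refill routine ignores a read failure (say, a dropped stream), so the system carries on "playing" stale data: exactly the silent, bizarre misbehaviour described above. The names and structure are invented purely for illustration.

```c
#include <stddef.h>

/* A toy playback buffer: 64 samples plus a count of how many are real. */
typedef struct {
    short samples[64];
    size_t valid;
} RingBuffer;

/* Simulated audio source: succeeds twice, then fails - like a stream
   that drops mid-track. Returns the sample count, or -1 on error. */
static int read_source(short *dst, size_t n, int *calls) {
    if (++(*calls) > 2) return -1;          /* the corner case */
    for (size_t i = 0; i < n; i++) dst[i] = (short)i;
    return (int)n;
}

/* Buggy refill: the error path was never considered, so the return
   value is silently dropped and the buffer claims data it never got. */
static void refill_buggy(RingBuffer *rb, int *calls) {
    read_source(rb->samples, 64, calls);    /* failure ignored */
    rb->valid = 64;                         /* lies if the read failed */
}

/* Careful refill: the corner case is handled explicitly. */
static int refill_checked(RingBuffer *rb, int *calls) {
    int got = read_source(rb->samples, 64, calls);
    if (got < 0) { rb->valid = 0; return -1; }
    rb->valid = (size_t)got;
    return 0;
}
```

After three refills, the buggy version still reports 64 valid samples while the checked version correctly reports zero and an error – and nothing in the C language forces you to write the second version rather than the first.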
Same observation, different hypotheses – you choose!
But digital reliability still sucks. Sorry.