Wrong Frequency information in Mosaic

Hello,
I noticed another problem. It is not critical, but it is annoying. When I play 24-bit / 88.2kHz files, Mosaic tells me that they are 88.1kHz files. I am using Mosaic and MinimServer. The files are FLAC, and upsampling is set to DXD. The selected filter does not matter.
Regards Robert

Robert, I found a similar issue a couple of years ago relating to the bitrate displayed by Mosaic when playing BBC Radio 3 HD. Officially this is AAC at 320kb/s, but Mosaic shows it as AAC at 317kb/s. dCS responded that Mosaic displays the actual input that it analyses, not a metadata description. So the BBC stream is actually 317kb/s; 320kb/s is, of course, only the BBC’s assertion.
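As an aside, for anyone curious how a player could arrive at a figure like 317kb/s rather than the advertised 320kb/s: one way is simply to average the bytes actually delivered over a window instead of trusting the declared rate. A rough, hypothetical sketch in Python (I am not claiming this is how Mosaic does it; `read_chunk` stands in for whatever function fetches the next block of stream data):

```python
import time

def measure_bitrate_kbps(read_chunk, seconds=10.0):
    """Rough illustration: average the bytes actually delivered by a
    stream over a window, rather than trusting the advertised bitrate."""
    start = time.monotonic()
    total_bytes = 0
    while time.monotonic() - start < seconds:
        chunk = read_chunk()          # next block of stream data
        if not chunk:
            break
        total_bytes += len(chunk)
    elapsed = time.monotonic() - start
    return (total_bytes * 8) / (elapsed * 1000)   # kilobits per second
```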

So my bet is that those files really are 88.1kHz and not 88.2kHz. That is, in a way, unsurprising given all of the tiny errors that can creep in when the file or files in question have a complex or technically deficient production history.

I will keep watching, although I have the impression that all of the 88.2kHz files I have show a frequency of 88.1kHz in Mosaic. Surely they can’t all be damaged?

Robert, would you like to give me a couple of examples? Of course, that might not be comparing like with like, as I would probably need to find them on Qobuz. I assume these are purchased downloads, since you mention MinimServer?

Is this why the signal from my television via optical is displayed as 96kHz @ 22 bits? I found it a bit odd, since I’ve never heard of 22-bit being a format.

It could be, as the audio coding used for video material is not always identical to that used in sound-only recording and includes formats unfamiliar to our world; Nellymoser, Speex, Opus anyone?

Further, in TV broadcasting it may be that part of the available bit depth is used for other data or control purposes, similar to the old analogue days when, although you had a 625-line TV, you never actually received a 625-line picture because a couple of lines carried data such as teletext. Perhaps it’s a bit (pun intended) like MQA, where the least significant bits carry the “folded” data. Remember that a 24-bit signal can provide a dynamic range of roughly 144dB, which is way beyond what you would want to exploit in real life.
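For anyone who wants to check that figure, the 144dB falls straight out of the usual rule of thumb that each bit adds roughly 6dB of dynamic range. A quick sketch in Python:

```python
import math

def pcm_dynamic_range_db(bits):
    # Theoretical dynamic range of an n-bit PCM signal: 20 * log10(2**n)
    return 20 * math.log10(2 ** bits)

print(round(pcm_dynamic_range_db(16), 1))  # ~96.3 dB
print(round(pcm_dynamic_range_db(22), 1))  # ~132.5 dB
print(round(pcm_dynamic_range_db(24), 1))  # ~144.5 dB
```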

What dCS told me when I asked about the digital output from my satellite receiver (which is listed as 17-bit / 48kHz) is that they report the number of bits actually seen in the samples.


As a non-engineer, I don’t understand why your satellite receiver would be designed to output at 17 bits and my television at 22 bits. Wouldn’t that require additional engineering vs the standard 20 & 24 bit?

The 17 I don’t get.

The 22 I could see somehow being a cost-saving measure to avoid having a full 24-bit data path.

I have now found that digital TV broadcast sound uses a codec which, for N. America, is Dolby AC-3 (the ATSC standard). This is decoded by your TV or receiver, not by Mosaic. Your TV/receiver is sending 17/48 to Mosaic, not 16/48. So it seems there may be a small quantisation error somewhere outside of the dCS domain, as Mosaic displays what it receives.
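To make the “outside the dCS domain” point concrete: any processing the TV or set-top box applies after decoding (a volume scale, sample-rate conversion, dither) can light up bits below the original 16. A toy Python illustration; the 0.83 gain is arbitrary and chosen purely for the example:

```python
def process_to_24bit(samples_16bit, gain=0.83):
    """Apply a hypothetical volume scale and pack the result into 24-bit
    words, roughly as a set-top box might before its S/PDIF output.
    Any non power-of-two gain makes bits below the original 16 toggle."""
    words = []
    for s in samples_16bit:
        scaled = round(s * gain * 256)      # shift 16-bit audio into a 24-bit container
        words.append(scaled & 0xFFFFFF)     # two's-complement 24-bit word
    return words

words = process_to_24bit([-12345, 0, 2047, 32767])
# Counting the toggling bits in these words (see the sketch later in this
# thread) would report more than 16 active bits, even though the source
# material was only 16-bit.
```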

@Katzky’s TV should also be 16/48 (for broadcast TV), but I don’t know what source he was using when he referred to 22 bits being displayed.

Yes… and no.

HD content on DISH network is AC-3, but SD programming and their music channels are two channel PCM.

When that PCM is sent to my Rossini, it sees it as 17/48 PCM.

The original explanation was here:

SPDIF reads 17/48?

The important part being:

The likelihood is that the network dish box is actually outputting 17 bits. This is possible as SPDIF audio always transmits 24 bits for the audio signal. Sending a lower bit-depth signal with SPDIF means some of these bits simply won’t be used. The box is likely indicating 16-bit audio as the standard indicators for SPDIF are 16/20/24 bit.

Lots of products rely on the embedded information in SPDIF to determine sample rate and bit depth. A dCS product does this in a different way. It determines the bit depth by counting the active bits in the audio signal.

We do not look at the status bits to tell us what bit-depth the audio is, as this is often incorrect. Instead, we count the bits in the audio signal which actively toggle. Therefore, if the network dish box is outputting an SPDIF signal which has 17 audio bits toggling, it will report a 17-bit signal.

So it’s likely the TV is only sending 22 bits that toggle.
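For those who like to see the idea spelled out, here is a rough sketch in Python of counting which bit positions ever toggle and reporting the depth from the MSB down to the lowest one that moves. It is only an illustration of what dCS describe above, not their actual firmware:

```python
def effective_bit_depth(samples, container_bits=24):
    """Count the bit positions that actively toggle between samples, then
    report the depth from the MSB down to the lowest toggling bit.
    `samples` are the raw unsigned words as they sit in the S/PDIF slot."""
    mask = (1 << container_bits) - 1
    toggled = 0
    for a, b in zip(samples, samples[1:]):
        toggled |= (a ^ b) & mask          # mark every bit position that changed
    if toggled == 0:
        return 0                           # digital silence: nothing toggles
    lowest = (toggled & -toggled).bit_length() - 1   # index of the lowest toggling bit
    return container_bits - lowest

# 16-bit audio padded into the 24-bit slot leaves the bottom 8 bits fixed at
# zero, so only the upper 16 positions ever toggle -> reported as 16 bits.
words = [((s & 0xFFFF) << 8) for s in (-12345, 0, 2047, 32767)]
print(effective_bit_depth(words))   # 16
```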


Thanks Bill. I think that effectively answers Robert’s original post.

So… my TV is sending 24 bits, but the last two aren’t significant because they never change? That makes more sense than it only sending 22 bits.

Basically, yes.

The S/PDIF spec has room for 24 bits of audio data to be sent within the data block, and the receiving device has to look at the data frame to determine the format. (This is why there is a “buffer” option and why, without it, you may get a burst of white noise before the DAC realizes the data being sent is DSD or another format, not PCM.)
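As an aside on that format-detection point: DSD usually travels over these links as DoP, which flags itself with a marker byte (0x05/0xFA alternating) in the top 8 bits of each 24-bit word, and a receiver has to watch for that pattern before switching modes. A toy heuristic in Python, not anything dCS-specific:

```python
DOP_MARKERS = (0x05, 0xFA)   # DoP alternating marker bytes

def looks_like_dop(words, min_run=32):
    """Toy heuristic: treat one channel's stream of 24-bit words as DoP if
    the top 8 bits alternate between the two marker values for a while."""
    run, expected = 0, None
    for w in words:
        marker = (w >> 16) & 0xFF
        if marker in DOP_MARKERS and (expected is None or marker == expected):
            run += 1
            expected = DOP_MARKERS[1] if marker == DOP_MARKERS[0] else DOP_MARKERS[0]
            if run >= min_run:
                return True              # enough consecutive markers seen
        else:
            run, expected = 0, None      # pattern broken, start over
    return False
```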

While most vendors just assume 16 or 24 bit, dCS looks at how many bits are changing and reports that.

[Image: S/PDIF data frame graphic]