Only if those measurements are the output of the DAC, not the oscillator.
Out of interest, is anyone aware of any experiments where an audio circuit using a high-performance oscillator was doctored so that one could arbitrarily adjust the jitter level of the clock, from performing at its best to absolutely terrible, and listening tests were run to see when the jitter was "detectable", using music recordings as the signal?
It's very easy to reduce the jitter performance of a clock. All you need to do is increase noise on the power supply. As clearly demonstrated in this proper scientific test.

But, if the noise of the supply is increased, how can one be certain that this increased noise is not affecting the overall performance via another, interference, route? To do the experiment properly, to my mind, everything would need to be fixed except for some circuit addition which 'artificially' varies the jitter - and then listen.
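The "artificially varied jitter" experiment described above can be sketched in simulation: evaluate a test tone at sample instants perturbed by an adjustable amount of random clock jitter, and measure the resulting error. This is a minimal illustrative sketch, not anyone's actual test rig; the sample rate, tone frequency, and 1 ns RMS jitter figure are my own assumed values.

```python
import numpy as np

# Hedged sketch: emulate a DAC whose sample clock has an adjustable
# amount of random jitter, by evaluating a test tone at perturbed
# sample instants. All values below are illustrative assumptions.
fs = 48_000          # sample rate, Hz
f0 = 10_000          # test-tone frequency, Hz (near-worst case at band edge)
n = 2 ** 16
rms_jitter_s = 1e-9  # the "knob": 1 ns RMS of clock jitter

rng = np.random.default_rng(0)
t_ideal = np.arange(n) / fs
t_jittered = t_ideal + rng.normal(0.0, rms_jitter_s, n)

clean = np.sin(2 * np.pi * f0 * t_ideal)
jittered = np.sin(2 * np.pi * f0 * t_jittered)

# Jitter-induced error power relative to the signal, in dB. For small
# jitter this approaches 20*log10(2*pi*f0*rms_jitter), about -84 dB
# for 1 ns RMS on a 10 kHz tone.
err_db = 10 * np.log10(np.mean((jittered - clean) ** 2) / np.mean(clean ** 2))
print(f"jitter-induced error: {err_db:.1f} dB relative to signal")
```

Turning `rms_jitter_s` up or down is the software analogue of the proposed circuit knob; a real listening test would of course need the jittered clock in actual hardware, with everything else held fixed, as the post above argues.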
There are a couple. This Dolby paper is one of them: https://secure.aes.org/forum/pubs/conventions/?elib=8354
"Theoretical and Audible Effects of Jitter on Digital Audio Quality"
Here are a few graphs I have saved up from it:
View attachment 6625
View attachment 6626
The audible thresholds were considerably above what we typically see in DACs.
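That gap between audible thresholds and typical DAC jitter can be put in numbers with the standard small-angle approximation for sinusoidal jitter: jitter of peak amplitude dt on a tone at f0 produces sidebands roughly beta/2 below the carrier, where beta = 2*pi*f0*dt. The tone frequency and the jitter amounts below are my own illustrative picks, not figures from the Dolby paper.

```python
import math

def jitter_sideband_dbc(f0_hz: float, peak_jitter_s: float) -> float:
    """Level of each sideband, in dB relative to the carrier, produced by
    sinusoidal jitter of the given peak amplitude on a tone at f0_hz.
    Small-angle approximation: sideband ~ beta / 2, beta = 2*pi*f0*dt."""
    beta = 2 * math.pi * f0_hz * peak_jitter_s
    return 20 * math.log10(beta / 2)

# Illustrative comparison on a 10 kHz tone: tens of nanoseconds (the
# regime where audibility thresholds tend to sit) versus tens of
# picoseconds (what a good modern DAC clock achieves).
for dt in (20e-9, 1e-9, 50e-12):
    print(f"{dt * 1e12:9.0f} ps peak -> {jitter_sideband_dbc(10_000, dt):7.1f} dBc")
```

With jitter in the tens of picoseconds the sidebands land far below -100 dBc, which is consistent with the point that measured DAC jitter sits well under the audible thresholds in the graphs above.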
Why does Mark Levinson use premium clocks then? What about clean power supplies? Just a marketing gimmick?

It is called engineering excellence. A requirement in my book for high-end audio.

That paper is way outdated. They have learned lots about jitter and audio over the last 19 years.

Listening tests don't have expiration dates on them. There can be criticism against the paper and study, as with anything. But its overall message unfortunately is not friendly to creating ultra-low-jitter products.
Thanks for that, Amir. Unfortunately, music was not used for the test - I was specifically curious if there was any testing using such.
1000's of tests have been done with music by audio gear manufacturers. This is why they use low-jitter clocks. It's not a practice limited to the snake-oil brands of the industry.
Informally, yes. But it would be nice to have a "proper paper" to point to, that ticked all the boxes for doing the test in a manner that audiophiles could relate to.
It was. It just wasn't in that graph that I showed. Here are the threshold results in nanoseconds for music:
If there's no audible effect from using more expensive low-phase-noise clocks, how is using them engineering excellence? Sounds like a great way to increase the BOM with no benefit to the end user.
My car runs no faster or better due to better stitching of the leather on the dash. It is there because it is expected that when cost factor is taken out, better quality goes in.
Can it make a difference? In extensive testing I did of much lower quality AVRs with high jitter HDMI input, I could not show any of them having jitter distortions above masking thresholds. With high-performance DACs with far lower jitter, I am confident even trained listeners would fail such tests.
Thanks. There is some range in the results, showing variability of sensitivity to the resultant distortion - a good start to better understanding.