Thanks for the responses!
I am confused by the role of a DAC in testing ADC performance. I think my issue is an overall lack of understanding of what gets baked into the PCM (or even DSD) data, specifically jitter. If I understand correctly, the only way to correct that jitter would be for the ADC to attach a (very high-resolution) timestamp to every sample, which is obviously not practical. I think I once read that the MQA people claimed they could "correct" this if they knew which ADC was used at the studio or for a digital master transfer. That sounded absolutely insane to me.
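To check that I have the mechanics right, here's a minimal sketch of what I mean by jitter being "baked in" (just a sine tone plus made-up Gaussian jitter values, nothing to do with any real ADC): the PCM stream records s(nT + δn) but labels it as sample n, so the timing error has already become an amplitude error by the time it's data.

```python
import numpy as np

fs = 48_000          # sample rate, Hz
f0 = 10_000          # test tone, Hz
sigma_j = 1e-9       # 1 ns RMS sampling jitter (placeholder value for illustration)
n = np.arange(2**16)

# Ideal sampling instants vs. jittered ones: the ADC actually samples
# s(nT + delta_n), but the PCM stream only says "sample n".
t_ideal = n / fs
t_jittered = t_ideal + np.random.normal(0.0, sigma_j, n.size)

ideal = np.sin(2 * np.pi * f0 * t_ideal)
captured = np.sin(2 * np.pi * f0 * t_jittered)

# The timing error is now an amplitude error that no DAC clock can undo.
err = captured - ideal
snr_db = 10 * np.log10(np.mean(ideal**2) / np.mean(err**2))
print(f"jitter-limited SNR ~ {snr_db:.1f} dB")
```

With those made-up numbers I get roughly 84 dB, so am I right that no amount of clock accuracy downstream recovers what was lost here?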
I have a PCB coming for the Cirrus CS5361 and was mainly curious how the device would respond to different oscillators (mainly Abracon, some NDK, and others far less expensive). Cirrus's dev board for the pin-compatible CS5381 even has a crystal socket (intended to let you change sample rates, etc.) that I thought would be fun to experiment with. However, judging by @amirm's ADC tests, the winners of the ADC world are clearly using more than a simple XO. And yet the Cirrus Logic engineers apparently didn't find that necessary to hit the SINAD figures in the datasheet, or at least were unscrupulous enough to omit it, haha.
By the time an (arguably) unstable clock has baked its poorly divided slices into PCM data, even the most accurately clocked DAC is just going to faithfully distribute those slices and rewrite history, right? There is no contract between ADC and DAC. Beyond that, I still cannot figure out how DSD offers any advantage on the ADC side, other than possibly introducing less jitter in the first place because it's slightly less demanding on processing.
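Related back-of-envelope math, in case it helps anyone check my reasoning: the standard estimate for jitter-limited SNR on a full-scale sine is -20·log10(2πfσ). The picosecond values below are ones I picked for illustration, not numbers from any datasheet:

```python
import math

def jitter_snr_db(f_in_hz: float, rms_jitter_s: float) -> float:
    """Jitter-limited SNR for a full-scale sine: -20*log10(2*pi*f*sigma)."""
    return -20 * math.log10(2 * math.pi * f_in_hz * rms_jitter_s)

# How clean must the clock be before jitter stops limiting a 110+ dB class converter?
for sigma in (1e-9, 100e-12, 10e-12):
    print(f"{sigma*1e12:6.0f} ps RMS @ 10 kHz -> {jitter_snr_db(10_000, sigma):.0f} dB")
```

If I've done that right, 1 ns gives ~84 dB, 100 ps gives ~104 dB, and you need roughly 10 ps class clocking before the clock stops being the bottleneck, which would explain the fancier clocking in the top-measuring ADCs.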
I have just recently seen "time smear" used on audiophile sites. Is that the marketing term for this history rewriting? I'm not asking whether it's audible, just trying to figure out what they're referring to.
I'm going to download REW now. Thanks again!