Hi all, I have signed up only very recently, I hope I'm not posting on the wrong forum/sub-forum, please redirect me to the right one in case, thanks.
I'm a happy owner of a DX7 Pro DAC, it is an awesome device and I couldn't be happier.
In my system I stream audio to the DAC from a mini-itx computer built with components from mini-itx.com:
- Q1900DC-ITX fanless mobo, with 19V DC connector
- M350 enclosure
- Samsung 860 EVO 250GB SSD
- 2 x 2GB DDR3 Crucial SODIMM 1600 (I had them already)
- external 90W power supply (taken from a video surveillance system, gift from a friend, no brand on it)
I run Debian Linux on it, optimized for low latency and low power at the same time, using ALSA drivers through asynchronous USB.
The OS allows me to do fine tuning on ALSA, specifically on:
- sample frequency → always 44.1 kHz for me, I only have CDs
- size of the output buffer (in terms of number of samples) → my value is '16' samples
- periods per buffer (how many interrupts occur to fill the buffer once) → my value is '3'
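Just to put numbers on it, here is a quick sketch of the latency those settings imply, assuming (my interpretation, correct me if wrong) that the 16-sample figure is the total buffer size, divided into 3 periods:

```python
# Rough latency implied by the ALSA settings above.
# Assumption: 16 samples is the total buffer, split into 3 periods.
sample_rate = 44_100   # Hz, CD audio
buffer_size = 16       # samples
periods = 3

buffer_latency_ms = buffer_size / sample_rate * 1000
period_ms = buffer_latency_ms / periods

print(f"buffer latency:  {buffer_latency_ms:.3f} ms")  # ~0.363 ms
print(f"period interval: {period_ms:.3f} ms")          # ~0.121 ms
```

So we are talking about well under half a millisecond of buffering, which is why I'm surprised that raising it audibly changes anything.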
Here comes the point: with that low latency value I get great sound quality, but if I change the parameters so that the latency goes up, the sound is noticeably worse.
I don't mean stuttering, "pops", or any kind of noise; I mean less beautiful: less detailed and defined, less impact, a smaller "space", etc. "Worse" in HiFi terms.
I really cannot understand this behavior: to my knowledge, latency should have nothing to do with sound quality; it should only add a small delay to the transmission... am I wrong?
Is the DX7 Pro somehow sensitive to latency in the USB data stream? Is there maybe a relation between audio data latency and jitter?
I'll be very grateful if you can explain this mystery to me.
Thanks in advance.