Now... I have a real, actual question: could ultrasonic frequencies ever have an impact on the perception of audible frequencies? Even at the extremes?
At the levels in this context? No.
Sorry, if a (say) 50 kHz signal at 40 dB SPL is added, it's just not going to be heard.
Want to talk 130-140 dB SPL ultrasonics? Sure, that can have an effect. But those levels are stunningly unrealistic unless you have something purpose-built to achieve them.
Other questions would be ...
How do they shape the sound?
The question is incomplete, because you did not specify whether the ultrasonics are related to the audible content or unrelated.
Also, you did not state what you consider extreme.
You can easily answer that question yourself if you want to know your own hearing abilities; there are many tests around for this.
All you need is a DAC capable of 24-bit/192 kHz playback, properly made test files, and A/B test software to compare them.
A possible snag here is how well the transducers and amplifiers used deal with ultrasonics.
When that is OK, it is perfectly fine to use your ears... as long as levels are unchanged and you do not know which file is being reproduced.
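For anyone who wants to roll their own comparison files, here is a minimal sketch (assuming Python with the numpy and soundfile packages; the filenames, duration and levels are just illustrative) that writes a 24-bit/192 kHz reference tone and a copy with ultrasonic content added, for level-matched blind comparison:

import numpy as np
import soundfile as sf

fs = 192_000                                        # sample rate high enough for ultrasonics
t = np.arange(5 * fs) / fs                          # 5 seconds

audible = 0.5 * np.sin(2 * np.pi * 1_000 * t)       # 1 kHz reference tone, -6 dBFS
ultrasonic = 0.05 * np.sin(2 * np.pi * 30_000 * t)  # 30 kHz tone, -26 dBFS

# Identical audible content; the only difference is the added ultrasonic tone.
sf.write("reference.wav", audible, fs, subtype="PCM_24")
sf.write("with_ultrasonic.wav", audible + ultrasonic, fs, subtype="PCM_24")

Load both into your A/B test software of choice and compare blind.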
As someone whose training was done many years ago, my standard load for line outputs is 600 ohms. If an op-amp output, balanced or unbalanced, does better than 0.02-0.03% THD into 600 ohms, then I'm happy the device is working properly. In practice, it'll most likely be used into a bridging impedance of around 10k, so the distortion will likely be lower, but as long as the distortion is well below audibility, that's quite good enough for Government work.
Occasionally, a device will have a minimum load specified, in which case I test at that load; otherwise 600 ohms is a good number to use.
S.
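For anyone curious how a THD figure like that is computed, here is a rough FFT-based sketch (assuming Python with numpy; the captured sine would come from whatever ADC and software you use to record the device driving its 600 ohm load):

import numpy as np

def thd_percent(x, fs, f0, n_harmonics=5):
    # Estimate THD (%) of a captured sine x (sample rate fs, fundamental f0 Hz)
    # from the ratio of harmonic magnitudes to the fundamental.
    win = np.hanning(len(x))
    spec = np.abs(np.fft.rfft(x * win))
    freqs = np.fft.rfftfreq(len(x), 1 / fs)

    def peak(f):
        # magnitude of the FFT bin nearest frequency f
        return spec[np.argmin(np.abs(freqs - f))]

    fund = peak(f0)
    harmonics = [peak(k * f0) for k in range(2, n_harmonics + 2)]
    return 100 * np.sqrt(sum(h * h for h in harmonics)) / fund

A real analyser does this with proper windowing and bin interpolation, but the idea is the same.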
Thanks. I know you could come at it from an A/B/X perspective, but if the answer is that it would take a 130-140 dB SPL signal, that's not something I'm going to try at home, even if I could educate myself enough to set it up and had transducers that could reliably reproduce those frequencies at those levels. But your reply does give me a little more insight.
At least in theory, ultrasonic content that is inaudible could cause a transducer to create intermodulation distortion that is audible. So, for example, if a transducer is reproducing a 30 kHz tone and a 32 kHz tone, this may result in an intermodulation product at (32 - 30 =) 2 kHz that, if loud enough and if not masked by signal content, would be audible.
I'm not aware of any tests of common HF transducers that have investigated whether this is actually of any audible concern, however. Given that most studies into the audibility of high-res vs. Redbook have returned negatives (i.e. no audible difference), it seems safe to presume that in practice there is probably nothing to worry about.
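To illustrate the mechanism, here is a toy sketch in Python/numpy (the small squared and cubed terms are an arbitrary stand-in for driver nonlinearity, not a model of any actual transducer) showing the (32 - 30 =) 2 kHz difference product appear:

import numpy as np

fs = 192_000
t = np.arange(fs) / fs                                 # 1 second
x = np.sin(2 * np.pi * 30_000 * t) + np.sin(2 * np.pi * 32_000 * t)

# Weakly nonlinear 'transducer': the x**2 term creates sum and difference
# products, including one at 32 kHz - 30 kHz = 2 kHz.
y = x + 0.01 * x**2 + 0.001 * x**3

spec = np.abs(np.fft.rfft(y * np.hanning(len(y))))
freqs = np.fft.rfftfreq(len(y), 1 / fs)
bin_2k = np.argmin(np.abs(freqs - 2_000))
print(f"2 kHz product relative to strongest tone: {20 * np.log10(spec[bin_2k] / spec.max()):.1f} dB")

With these made-up coefficients the 2 kHz product lands roughly 40 dB below the ultrasonic tones; whether a real tweeter behaves anything like this is exactly the open question.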
How did you arrive at 600 Ohm as the figure to test at? Is this kind of an absolutely worst-case scenario terribly-performing amp input stage figure? Or some other reason?
It goes back to telephony in the late part of the 19th century. (No, I don't remember it personally.) Open wires spaced a certain distance apart have a characteristic impedance of 600 ohms, so terminating equipment had to present that impedance to avoid reflections. It has no real bearing on 'normal' cable lengths, but becomes relevant on cable runs of hundreds of metres or kilometres, such as telephone cables. By custom and practice, it became the standard sending and receiving impedance in studios. Now, with equipment having an essentially zero-ohm output impedance and a high (>10k ohm) input impedance, 600 ohms is of historic interest; but nevertheless, any decent piece of modern equipment should drive 600 ohms, so it's still used as a 'standard' load on specifications.
You do not know very much about op-amps... First of all, we want to draw more current from the op-amp's output stage to force it out of class A and into class B or AB operation. There are a few standard loads, like 2.2 kohm and 600 ohm.
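To put rough numbers on that (illustrative values, not from any datasheet), a quick back-of-envelope calculation in Python shows why 600 ohms is the demanding case:

import math

v_rms = 2.0                      # typical-ish line output level
for load_ohms in (600, 10_000):
    i_peak_ma = v_rms * math.sqrt(2) / load_ohms * 1000
    print(f"{load_ohms:>6} ohm load: {i_peak_ma:.2f} mA peak")

# 600 ohm:  ~4.71 mA peak
# 10 kohm:  ~0.28 mA peak

A few mA of peak load current is comparable to, or larger than, the quiescent current of many op-amp output stages, which is what pushes the stage out of class A; the 10 kohm bridging load barely tickles it.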
Very nice resource, thanks from me too.
I think you may have misunderstood my question. I wasn't asking why an op-amp would be likely to perform worse into a lower impedance load, which is clear, but rather why the figure of 600 Ohm was arrived at as a standard, given that no decent real-world device presents such a load to a DAC or preamp (well, I have come across one power amp that does present such a load, but I'm not sure it could be described as decent).
Anyway I think the answer is clear now.
Very nice resource, thanks
EDIT: there are some more op-amp measurements here, FWIW, although the load impedance is not stated as far as I can tell.
Regarding the suggestion above to A/B test this yourself: it's a wrong method.