I've recently been in the market for an amp for the first time in years, and the input sensitivity on those is very straightforward: I need 2 V (or whatever) to reach peak output. My DAC puts out 2.5 V, so I'm good.
But I wanted to compare that to my active monitors, the KH 120 and the IN-8, and I see their input sensitivities are rated as 0 to -15 dB (adjustable) and 94 dB @ 1 m respectively.
Why is there such a large disparity between these two ratings, and why have the manufacturers expressed them this way? Can I convert them to volts? This is mostly for my own knowledge; every DAC I own drives these beyond comfortable levels. I just don't understand the purpose of such an ostensibly large variance.