I understand that if a CD player has an unusually high output voltage it can cause distortion and some attenuation may be needed, but 10dB seems excessive.
OK, I read it here: http://www.rothwellaudioproducts.co.uk/
And what kind of logic is this?
"If you operate the system with the volume control turned down around the nine o'clock position most of the time, the signal to noise ratio will be much worse than the manufacturer's spec. Operating the volume control further up its range will improve the signal/noise ratio, but the actual volume of the music may be just too loud. However, if the signal is reduced by 10dB with a pair of attenuators at the inputs to the power amp, then the volume of the music will be reduced, but the volume of any noise generated by the pre-amp's gain stage will be reduced too. This allows you to operate the pre-amp's volume control farther round its range and the signal to noise ratio will be improved by 10dB"
This implies that the amplifier has a poorer signal-to-noise ratio when the input is higher: it supposedly amplifies relatively more pre-amp noise when the volume control is turned down and less when it is turned up, so we are meant to feed it a smaller signal.
But as soon as I turn the volume up to get back to the original loudness, that "pre-amp noise" will be amplified just as much as before, not less. If anything I think it will be worse, because the signal is lower now and the unchanged noise floor (hiss and hum) is being amplified along with it.
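To pin down where my reading and theirs differ, I sketched a toy gain-staging model. None of the numbers come from the Rothwell page (the -30dB pot position, 20dB stage gain and -90dB noise floor are figures I made up); the only thing it varies is whether the hiss enters the chain before the volume pot or after it, inside the gain stage:

# Toy gain-staging model. All levels in dB; gains add, attenuation subtracts.
# Every number here is made up by me, not taken from the Rothwell page.

def chain(source_db, noise_db, pot_db, stage_gain_db, pad_db, noise_after_pot=True):
    """Return (signal, noise) level at the power-amp input.

    noise_after_pot=True : the hiss is born in the pre-amp's gain stage,
                           downstream of the volume pot (what the ad copy
                           seems to assume).
    noise_after_pot=False: the hiss already rides on the signal ahead of
                           the pot (closer to how I was reading it).
    """
    signal = source_db + pot_db + stage_gain_db - pad_db
    if noise_after_pot:
        noise = noise_db + stage_gain_db - pad_db            # pot never touches it
    else:
        noise = noise_db + pot_db + stage_gain_db - pad_db   # goes through the pot too
    return signal, noise

# Baseline: pot at "nine o'clock" (call it -30dB), no attenuator.
# With pads: 10dB attenuator at the power-amp input, pot opened up 10dB
# so the music comes out at the same level (-10dB in both cases).
for after_pot in (True, False):
    s0, n0 = chain(0, -90, -30, 20, 0, noise_after_pot=after_pot)
    s1, n1 = chain(0, -90, -20, 20, 10, noise_after_pot=after_pot)
    print("noise after pot:" if after_pot else "noise before pot:",
          s0 - n0, "->", s1 - n1, "dB SNR")

If I've modelled it right, the hiss-after-the-pot case gains 10dB of SNR exactly as the advert claims, while the hiss-before-the-pot case gains nothing, which is why I'm not sure whose picture of the pre-amp is the right one.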
Did I read this wrong, or is it nonsense?