Originally Posted by clockworks999:
“I'd agree that the difference has to be measurable, but are the measuring instruments used good enough to measure the differences?
The human ear/brain is pretty good, probably better than any microphone and oscilloscope or spectrum analyser. It's also easily fooled.”
The Mk 1 Human Earhole is indeed a fine instrument and an essential part of any decent sound engineer's toolbox.

But it is in no way perfect.
And you have obviously never used professional test gear if you question whether it is good enough. I use test gear that can measure levels to an accuracy of under 0.1dB and distortion down to fractions of a percent, and noise virtually down to the theoretical limit of an electronic circuit.
Believe me, a level change of 0.1dB is all but inaudible. So I would have to say that if you can hear it, you can measure it. Whether, or how, these measurements relate directly to the quality of the listening experience is another matter.
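To put that 0.1dB figure in perspective, the standard conversion from a level change in dB to a linear amplitude ratio is 10^(dB/20). A minimal Python sketch (the function name is mine, just for illustration):

```python
def db_to_amplitude_ratio(db):
    """Convert a level change in decibels to a linear amplitude ratio."""
    return 10 ** (db / 20)

# A 0.1 dB level change corresponds to roughly a 1.16% change in amplitude,
# which is why it sits right at the edge of audibility.
ratio = db_to_amplitude_ratio(0.1)
print(round((ratio - 1) * 100, 2))  # ~1.16 (percent)
```

Good bench gear resolves well below that, which is the point: the instruments out-resolve the ear on raw level.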
Don't forget it is relatively easy to fool the ear/brain. It is, after all, the basis of the perceptual coding techniques used to encode an mp3 audio file, for instance.
If you play two tones at closely spaced frequencies, you hear only the louder of the two. The closer they are in frequency, the smaller the gap in loudness needs to be for the quieter one to become inaudible.
And if you think about it for a second: a CD track has a bitrate of 1,411,200 bits per second, while an mp3 file may be just 128,000 bits per second. So in the encoding process some 1,283,200 bits per second are being discarded, very nearly 91% of the data.
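Those figures follow directly from the CD-DA format (44,100 samples per second, 16 bits per sample, two channels). A quick Python check of the arithmetic above:

```python
# CD audio (Red Book) parameters
sample_rate = 44_100   # samples per second
bit_depth = 16         # bits per sample
channels = 2           # stereo

cd_bitrate = sample_rate * bit_depth * channels  # 1,411,200 bits/s
mp3_bitrate = 128_000                            # a common mp3 bitrate

discarded = cd_bitrate - mp3_bitrate             # 1,283,200 bits/s
fraction = discarded / cd_bitrate                # very nearly 91%

print(cd_bitrate, discarded, round(fraction * 100, 1))
```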

But an mp3 doesn't sound as bad as the raw figures would lead you to believe (still pretty crap though, which is why I don't own an mp3 player!)