Taperssection.com
Gear / Technical Help => Ask The Tapers => Topic started by: yates7592 on March 31, 2013, 03:23:12 PM
-
Can anybody explain how to measure the input clipping level on either the line or mic input of a recorder? I know this will vary with the input level (gain) setting on the recorder. I also know that units like dBu and dBV are referenced to a standard voltage, but I'm not sure what equipment I would need to measure this for any given recorder's preamps. Any help is much appreciated.
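For reference, dBu and dBV are referenced to 0.7746 V RMS and 1.0 V RMS respectively. Here is a quick Python sketch of those conversions; the +4 dBu line-level figure at the end is just an illustrative example:

import math

DBU_REF_VOLTS = 0.7746  # 0 dBu = 0.7746 V RMS (1 mW into 600 ohms)
DBV_REF_VOLTS = 1.0     # 0 dBV = 1.0 V RMS

def dbu_to_volts(dbu):
    """Convert a level in dBu to RMS volts."""
    return DBU_REF_VOLTS * 10 ** (dbu / 20)

def volts_to_dbu(volts):
    """Convert RMS volts to dBu."""
    return 20 * math.log10(volts / DBU_REF_VOLTS)

def volts_to_dbv(volts):
    """Convert RMS volts to dBV."""
    return 20 * math.log10(volts / DBV_REF_VOLTS)

# Example: "+4 dBu" pro line level works out to about 1.23 V RMS
print("+4 dBu = %.3f V RMS" % dbu_to_volts(4))
print("1.228 V RMS = %+.2f dBV" % volts_to_dbv(1.228))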
-
As Jon says, you need an audio signal generator with a variable-level output. Or, if you have a meter that can measure the AC voltage of analog audio signals, you can generate test tones on your computer with almost any audio editing software. Keep the test tones in the range of, say, 600 Hz to 1 kHz, so that the third and fifth harmonics fall in the part of the audio range where the ear is most sensitive to low-level signal components. The object is to find the lowest signal level that produces audible distortion in your recorder WITHOUT reaching 0 dB on the meters. You may have to turn the recording level knob down pretty far to do this.
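If it helps, here is a minimal sketch, using only Python's standard library, that writes a 1 kHz sine test tone to a WAV file for playback; the frequency, levels, and file names are just example values you can adjust:

import math
import struct
import wave

def write_test_tone(path, freq_hz=1000.0, level_dbfs=-6.0,
                    duration_s=5.0, sample_rate=48000):
    """Write a mono 16-bit WAV file containing a sine test tone.

    level_dbfs is the peak level relative to digital full scale,
    so -6 dBFS gives a sine peaking at half of full scale.
    """
    amplitude = 10 ** (level_dbfs / 20) * 32767
    n_samples = int(duration_s * sample_rate)
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)          # 16-bit samples
        wav.setframerate(sample_rate)
        frames = bytearray()
        for n in range(n_samples):
            sample = int(amplitude * math.sin(2 * math.pi * freq_hz * n / sample_rate))
            frames += struct.pack("<h", sample)
        wav.writeframes(frames)

# Example: a set of tones at stepped levels for hunting the clipping point
for dbfs in (0, -3, -6, -12):
    write_test_tone("tone_1kHz_%ddBFS.wav" % dbfs, level_dbfs=dbfs)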
On pure (sinusoidal) midrange tones, clipping distortion typically isn't audible until it reaches roughly the 3% THD level. Given that THD can be reliably measured down to tiny fractions of 1%, anyone who says that "the human ear is more sensitive than any measuring device" is talking sheer nonsense.
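As an illustration of how that kind of measurement works, here is a rough sketch (assuming NumPy is available and the recording is a mono sine tone) that estimates THD from the harmonic peaks of an FFT; the clipped-tone example at the end is purely synthetic:

import numpy as np

def estimate_thd(signal, sample_rate, fundamental_hz, n_harmonics=5):
    """Estimate THD (%) of a recorded sine tone.

    Ratio of the RMS sum of harmonics 2..n to the fundamental,
    read off the peak magnitudes of a windowed FFT.
    """
    windowed = signal * np.hanning(len(signal))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / sample_rate)

    def peak_magnitude(target_hz, search_hz=20.0):
        band = (freqs > target_hz - search_hz) & (freqs < target_hz + search_hz)
        return spectrum[band].max()

    fundamental = peak_magnitude(fundamental_hz)
    harmonics = [peak_magnitude(fundamental_hz * k) for k in range(2, n_harmonics + 1)]
    return 100.0 * np.sqrt(sum(h ** 2 for h in harmonics)) / fundamental

# Synthetic example: a clean 1 kHz sine vs. the same tone with its peaks clipped off
sr = 48000
t = np.arange(sr) / sr
clean = np.sin(2 * np.pi * 1000 * t)
clipped = np.clip(1.05 * clean, -1.0, 1.0)
print("clean:   %.3f%% THD" % estimate_thd(clean, sr, 1000))
print("clipped: %.3f%% THD" % estimate_thd(clipped, sr, 1000))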
--best regards