
Author Topic: 24 bit v 16 bit


DSatz
Re: 24 bit v 16 bit
« Reply #30 on: February 01, 2019, 12:38:48 PM »
> The -18dBFS is the EBU standard and the -20dBFS the SMPTE standard to set for 0VU as I understand it - which is why I quoted those figures.

John, I know that you know this, but for the benefit of those who don't:

VU meters are a very particular thing. Not every analog meter is a VU meter--not even every analog meter that has the letters "VU" on its face. True VU meters are/were highly standardized "volume indicators" (VU = "volume units"). They were designed in the early 1940s to give an idea of perceived loudness, mainly for speech transmission.

The motion of a VU meter's needle has "syllabic response"--using an integration time (ca. 200 milliseconds as I recall) that the Bell System engineers considered optimal for telephone systems and AM broadcasting. True VU meters are dead-on accurate when fed continuous tones, but with live, uncompressed program material and modern microphones (e.g. condensers), signal peaks are typically about 8 dB higher than what you see on the meter. (John, you will probably recall that PPMs were calibrated so that their zero point was effectively equal to +8 VU for this reason.) With some program material a VU meter would "under-read" signal peaks by a significantly greater amount--especially strong signal components that rise and fall quickly, such as percussion picked up at close range with condenser microphones.
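If it helps to see that in numbers, here's a toy Python sketch (my own illustration, not the actual VU ballistics standard): it compares a true peak reading against a slow ~200 ms averaging reading on a fast-decaying 1 kHz burst, the kind of signal a VU meter under-reads.

import numpy as np

# Toy example only: true peak vs. a ~200 ms averaged reading
# on a fast-decaying 1 kHz burst (percussion-like signal).
fs = 48000                                   # assumed sample rate
t = np.arange(0, 0.5, 1.0 / fs)
burst = np.sin(2 * np.pi * 1000 * t) * np.exp(-t / 0.02)

peak_db = 20 * np.log10(np.max(np.abs(burst)))

window = int(0.2 * fs)                       # ~200 ms rectangular average
averaged = np.convolve(np.abs(burst), np.ones(window) / window, mode="same")
avg_db = 20 * np.log10(np.max(averaged))

print(f"peak reading:      {peak_db:6.1f} dB")
print(f"averaged reading:  {avg_db:6.1f} dB")   # many dB below the peak

On a signal like that, the averaged number lands 20-odd dB below the peak--which is why close-miked percussion can slam a peak meter while the VU needle barely moves.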

Now, the scale on a VU meter doesn't stop at 0 dB; it continues to +3. The zone between 0 VU and +3 is marked in red, and with analog tape there was higher distortion above 0 VU than below it, sometimes audibly so. But even +3 VU wasn't a "brick wall" limit. The most conservative, purist classical approach to recording still involved "going into the red" sometimes--just not hanging out there for any length of time. Rock music, on the other hand, was often recorded with the needle well into the red a lot of the time--intentionally using tape saturation as a kind of compressor. I've seen some engineers "push" tape so hard that the VU meters were continuously "pinned", i.e. off the scale and all the way to the right, and they were proud of it.

In summary: For live recording, if you set -18 dBFS = 0 VU on continuous tone, your typical (often-recurring) peaks on the digital side will tend to be around -10 to -8 dBFS, with occasional peaks going maybe to around -5 or in truly extreme cases, a touch higher. Those would be very good levels for 24-bit recording in my opinion.
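Or, doing that arithmetic explicitly (the 8 and 13 dB crest figures are just the typical ones mentioned above, not any standard):

alignment_dbfs = -18        # 0 VU alignment tone at -18 dBFS (EBU)
print(alignment_dbfs + 8)   # -10 dBFS: typical recurring peaks
print(alignment_dbfs + 13)  # -5 dBFS: occasional extremes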

So the standards you mentioned make excellent sense IF the fundamental difference between VU meters and peak-reading meters is well understood. For better or worse, though, fewer and fewer people nowadays have ever seen real VU meters in action, let alone used them for live recording. So some people might infer (quite wrongly) that according to these standards, their digital recording levels should be set so that the peaks occur at -18 or -20 dBFS.

But that's not at all what those standards mean, and setting your levels so low is just asking for extra noise--if not from the recording channel itself, then from everything that comes before and after it.
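And since this thread is about 24 bit versus 16 bit: the usual back-of-envelope figure for an ideal converter (real converters, preamps and microphones are all noisier than this, so treat it as an upper bound) shows how much room remains below -10 dBFS peaks at either word length:

# Theoretical quantization dynamic range of an ideal converter:
# roughly 6.02 dB per bit plus 1.76 dB.
for bits in (16, 24):
    dyn_range = 6.02 * bits + 1.76
    print(f"{bits}-bit: ~{dyn_range:.0f} dB range, "
          f"~{dyn_range - 10:.0f} dB between -10 dBFS peaks and the floor")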

--best regards
« Last Edit: February 05, 2019, 10:16:59 AM by DSatz »
music > microphones > a recorder of some sort

 
