freelunch, hello and many thanks for your reply. With equipment that's poorly matched or poorly designed, of course you may have to make compromises just to get a usable recording. But the resulting low recorded levels aren't a good thing in themselves; they're the price you've paid for survival in a given situation. Me, I'd really try to avoid that situation the next time around.
Like, in the early 1990s when I started working with opera singers, some of whom can exceed 120 dB SPL at times, I realized that several of my up-to-then favorite portable preamps just weren't going to be usable any more. Their input circuits couldn't take the voltages that my microphones were putting out, and I wasn't about to change microphones on that account. Until I could get better preamps, the short-term solution was to use resistive pads at the preamp inputs. But then you turn up the gain on the preamp, feed the A/D at normal levels, and recording-level-wise, everything's all right again.
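Just to put rough numbers on the pad-and-make-up-gain trick, here's a quick sketch; the sensitivity and clipping figures in it are made up for illustration, not measurements of any particular gear:

```python
import math

# Made-up illustrative figures -- a fairly hot condenser mic (40 mV/Pa) and a
# portable preamp whose input stage clips somewhere around 0.25 V RMS.
MIC_SENSITIVITY_V_PER_PA = 0.040
PREAMP_CLIP_V = 0.25

def mic_output_volts(spl_db):
    """Mic output voltage (RMS) for a given sound pressure level; 94 dB SPL = 1 Pa."""
    pascals = 10 ** ((spl_db - 94) / 20)
    return MIC_SENSITIVITY_V_PER_PA * pascals

v_out = mic_output_volts(120)                            # singer at 120 dB SPL
pad_db = max(0.0, 20 * math.log10(v_out / PREAMP_CLIP_V))
print(f"mic output ~{v_out:.2f} V RMS -> pad of ~{pad_db:.0f} dB keeps the input stage clean")
# Whatever you knock off with the pad, you add back with the preamp's gain
# control, so the A/D still sees normal recording levels.
```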
With pro equipment feeding pro equipment, or consumer equipment feeding consumer equipment, there's rarely a need to record at low digital levels on account of a preamp that overloads too easily, since generally the overload occurs in the input circuit of the preamp. If on the other hand you have a preamp designed to feed -10 dBV consumer equipment, and you connect it to pro gear that expects +4 dBu and doesn't have any input gain controls, then yeah--I can see the problem. But that's a highly inappropriate combination of gear. If you end up with peak levels down around -15 dBFS, that's a huge honking symptom of a problem--not a solution.
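If you want to see where those low levels come from, the arithmetic is just the two reference voltages:

```python
import math

V_REF_DBV = 1.0      # 0 dBV = 1.000 V RMS
V_REF_DBU = 0.7746   # 0 dBu = 0.7746 V RMS (1 mW into 600 ohms)

def dbv_to_volts(dbv): return V_REF_DBV * 10 ** (dbv / 20)
def dbu_to_volts(dbu): return V_REF_DBU * 10 ** (dbu / 20)

consumer_out = dbv_to_volts(-10)   # nominal -10 dBV consumer output
pro_in       = dbu_to_volts(4)     # nominal +4 dBu pro operating level

shortfall_db = 20 * math.log10(pro_in / consumer_out)
print(f"{consumer_out:.3f} V vs {pro_in:.3f} V -> about {shortfall_db:.1f} dB short")
# Roughly 11.8 dB: with no input gain control on the pro side to make that up,
# everything you record lands about that far below where the gear expects it.
```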
If you have a recorder that "doesn't sound good" in the top (however many) dB of its range, assuming that its input circuits aren't overloading (e.g. if the top "however many" dB sounds bad even when the recording level controls are set 3/4 of the way up), then that equipment is sadly broken, and you should shout it from the rooftops so that others here don't buy that same type of gear. Equipment of that kind isn't something to dance around and develop rituals so that you can still use it. Instead, please do us all the honor of letting us join you in hammering the manufacturer to clean up his act.
> As you know, 0 VU is still -20 dBFS.
"As I know"? There's no fixed relationship between 0 VU and any particular digital recording level. Levels expressed in dBFS are instantaneous peak amplitudes, while volume units represent average power across a time span of ca. 1/4 second ("syllabic" response). Since two different things are being measured, no simple conversion can bridge the units involved.
For systems that used VU meters--originally telephone lines, and later, sound systems--0 VU was never the maximum signal level permitted. On the contrary, it was assumed that occasional peaks would reach +6 VU, even though those levels would be off the meter's scale if they were sustained long enough for the needle to get there. But the operator's rule of thumb was that the needle could "go into the red" as long as it came right back out again without hitting the pin hard. (Rock-n-rollers liked to break that rule and "squash" their signals on the tape, using it like a kind of compressor--analog tape couldn't let the peaks exceed a certain level, and oddly, would actually give back a little less if you pushed it past a certain point.)
In any event -20 dBFS is considerably lower than the typical calibration points I remember. The studio gear that I worked with all had markers at -15 dBFS which were used for lining up to the 0 VU tones on analog tapes. High-output analog tapes (Scotch 250 or Ampex 456 a/k/a "Grand Master") could record some 12 dB above Dolby level in the midrange. For 7-1/2 ips or 15 ips recordings on more conventional tape stock, the lineup tone could even be set a little higher (e.g. -12 dBFS) because the peaks on the analog tape were correspondingly lower.
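The lineup arithmetic itself is trivial; spelled out (with a stand-in headroom figure for the conventional stock, since I don't have the exact number at hand):

```python
# Analog peaks land at (lineup tone in dBFS) + (tape headroom above the tone).
def analog_peak_dbfs(lineup_tone_dbfs, tape_headroom_db):
    return lineup_tone_dbfs + tape_headroom_db

print(analog_peak_dbfs(-15, 12))   # high-output stock: peaks around -3 dBFS
print(analog_peak_dbfs(-12, 8))    # conventional stock (8 dB is a stand-in): about -4 dBFS
```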
You see, the problem in the early days wasn't that high levels sounded bad. The main digital recording processors back then (the Sony PCM-1600 and its improved successor, the PCM-1610) were undithered, at least in their default settings, so they sounded gross if you didn't use all (or nearly all) of the available range near the top. In addition, people back then were already conscious of the marketing value of a "loud" recording. So the absolute imperative was to get the peak levels as close to 0 as possible without "going over."
(Please pardon the somewhat verbose history lesson ...)
--best regards