I said earlier that I have a hard time believing that a recording with very low peak levels can be relied upon to be as quiet as one made with more conventional levels. It's not impossible--but it depends on a whole slew of particulars, none of which I can ascertain in anyone else's recording equipment. I totally get it about setting conservative levels, but at some point there's a price to pay in signal-to-noise performance, and I want to know where that point actually is for my microphones and my preamp and/or recorder rather than guessing.
From what I've seen in numerous messages on this board, I think some people's intuitions about gain and noise don't match how those things actually work. Some people seem to think that the noise added by a gain stage is essentially proportional to the amount of gain being added, so that lower gain settings will translate more or less directly into less noise. I've seen people say, for example, that for noise reasons they'd rather connect a (relatively high-output) microphone to a recorder's line input than its mike input, even if they have to turn the record level control all the way up and their recorded peaks don't get anywhere close to 0. They count on 24-bit or 32-bit recording to make up the difference. I'm pretty sure I remember seeing this w/r/t the Sony M10 recorder, for example. Unfortunately, as far as I could tell when I tested one (not in a very elegant way), its 24-bit setting was hardly any quieter than its 16-bit setting--less than a 2 dB difference, as I recall. Now, I wasn't completely confident of my measurement in that case. But if it was right or nearly so, then the option that the person chose, based on their concept of how it should work, was probably a bunch noisier than the alternative that they rejected.
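To make that point concrete, here's a toy model of a recording chain in Python. Every figure in it (mic level, preamp EIN, recorder noise floor, full-scale input sensitivity) is an invented round number, not a measurement of any real gear; the point is only the shape of the result. Uncorrelated noise sources add as powers, so at low analog gain the recorder's own floor dominates the total, and no amount of 24-bit headroom or digital make-up gain afterward can recover the lost signal-to-noise ratio:

```python
import math

def db_to_lin(db):
    return 10 ** (db / 20)

def lin_to_db(x):
    return 20 * math.log10(x)

# Hypothetical chain -- all values are made-up round numbers for illustration.
signal_dbu = -40.0       # mic output level for the program material
preamp_ein_dbu = -120.0  # preamp equivalent input noise
adc_floor_dbfs = -100.0  # recorder's own noise floor, re: full scale
full_scale_dbu = 20.0    # analog level that reaches 0 dBFS at zero gain

def snr_after_recording(analog_gain_db):
    # The signal and the preamp's input-referred noise both pass through
    # the analog gain stage before hitting the converter.
    sig_dbfs = signal_dbu + analog_gain_db - full_scale_dbu
    preamp_noise_dbfs = preamp_ein_dbu + analog_gain_db - full_scale_dbu
    # Uncorrelated noises sum as powers: sqrt(a^2 + b^2) on amplitudes.
    total_noise_dbfs = lin_to_db(math.hypot(db_to_lin(preamp_noise_dbfs),
                                            db_to_lin(adc_floor_dbfs)))
    return sig_dbfs - total_noise_dbfs

for g in (0, 20, 40, 60):
    print(f"analog gain {g:2d} dB -> recorded SNR {snr_after_recording(g):.1f} dB")
```

With these (invented) numbers, the recorded SNR climbs steadily with analog gain until the preamp's own noise finally overtakes the recorder's floor--the opposite of the "less gain, less noise" intuition.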
It's hard to do valid comparison testing in live recording situations; that's more of a test-bench project. Some kind of tone generator and voltmeter, separate from your recorder, are essential. Preferably there should also be one less common item on hand, because when you connect a tone generator directly to the input of a mike preamp, the preamp doesn't "see" the same source (driving) impedance that your microphones present, and source impedance in turn affects the noise floor. So it really helps to have the kind of test setup, for the particular brand and type of microphones that you use, that a service technician would use if they work seriously on that kind of microphone. For condenser microphones with removable capsules, manufacturers make shielded "test heads" (a/k/a measurement adapters) that can be attached in place of a capsule, and some manufacturers will sell you one if you ask nicely. A test head places the same capacitance across the microphone amplifier's input, but you can feed signals through it, or you can short its input to measure noise (which still leaves the capacitance in place). The microphone amplifier, meanwhile, is powered in the normal way, so everything in the circuit is just as it would be in a real recording setup.
I have a test head like that for the type of microphones that I usually use (no secret--Schoeps Colette series), and I used it for my preamp noise tests. I set the preamps to 30 - 35 dB gain and the results that I got were not IN ANY WAY predictable from the spec sheet values for those preamps, which were all compiled at higher gain settings. I know of no way to predict or infer the noise performance of a preamp at a given gain level (and microphone source impedance) from knowing its noise performance at its highest gain setting with resistors across its inputs. I've never measured preamp noise at lower gain settings, but other people's measurements seem to indicate that for the best noise performance from typical preamps, you generally want a setting more in the direction of the highest gain that you can set without overloading anything--not the lowest.
So the remaining question is how that plays out in person X's recordings with recorder Y and mike preamp Z using microphones A and B, which I can't answer from here, not having their gear and the test equipment in front of me. But the assumption that a preamp at low gain will give similar (let alone better) noise performance than the same preamp at a higher (but still safe) setting isn't a safe one. It's definitely wrong in at least some cases, perhaps in many or even most, but I'd want to see a lot more measurement results before generalizing that far.
Thus my uncertainty, but thus also my sense that somewhat higher gain earlier in the "recording chain" could bring an improvement in signal-to-noise ratio without necessarily taking away safety.