Back to the question.
There really is no need to record at 24 bits, just laziness. I am one of those lazy guys, always recording at 24 bits.
To keep it simple, the number of bits sets the distance between the noise floor and full scale (where clipping begins). I do NOT like clipping in a digital signal; it sounds really bad. So I want a bit of headroom to handle those unexpected peaks. Personally I aim for peaks at about -12 dB. Generally I never change levels once set, unless I get red lights.
As one bit is worth about 6 dB, those 12 dB of headroom cost me roughly 2 bits, leaving about 14 bits if recording at 16 bit. That means about 14 x 6 dB = 84 dB of signal to noise. Quite usable, but maybe a bit marginal.
Now, if I switch to "so-called" 24-bit recording, I might get something like 20 bits of real resolution. The last few bits are down in the noise on every AD converter accessible to me, and I guess it is the same for you (an S/N of about 120 dB; very few real-world converters go above that). So if I still aim for peaks at -12 dB, I have about 18 bits of real resolution left.
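If you like to see the arithmetic laid out, here is a quick back-of-the-envelope sketch in Python. The function name and the 120 dB converter figure are just my own illustration (and 6 dB per bit is the usual rule of thumb; the exact value is about 6.02 dB):

```python
# Back-of-the-envelope dynamic-range arithmetic (illustrative numbers).

DB_PER_BIT = 6.0  # rule of thumb; exact figure is about 6.02 dB per bit

def usable_range_db(bit_depth, headroom_db, converter_sn_db):
    """Usable signal-to-noise after reserving headroom, capped by the
    real-world converter noise floor."""
    theoretical = bit_depth * DB_PER_BIT
    effective = min(theoretical, converter_sn_db)  # converter noise wins
    return effective - headroom_db

# 16-bit recording, peaks aimed at -12 dBFS
print(usable_range_db(16, 12, converter_sn_db=120))  # -> 84.0 dB (~14 bits)

# "24-bit" recording through a converter that really delivers ~120 dB S/N
print(usable_range_db(24, 12, converter_sn_db=120))  # -> 108.0 dB (~18 bits)
```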
In post-production I do a bit of level setting, maybe a bit of EQ and maybe a bit of compression. Soft songs get a bit of extra boost as well. The end result fits well into the 16-bit resolution of a CD. Had I recorded at 16 bit from the beginning, the noise would be quite noticeable.
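As a rough illustration of that last point (my own numbers, nothing exact): whatever gain a soft song gets in post lifts the recorded noise floor by the same amount.

```python
# Any make-up gain added in post raises the recorded noise floor with it.
# Numbers below are illustrative, not measurements.

def noise_floor_after_boost(recorded_floor_db, boost_db):
    """Noise floor relative to full scale after applying make-up gain."""
    return recorded_floor_db + boost_db

boost = 10  # say a soft song gets 10 dB of extra level in post

# 16-bit source: floor around -84 dBFS (after the -12 dB headroom)
print(noise_floor_after_boost(-84, boost))   # -> -74 dBFS, getting audible

# 24-bit source: floor around -108 dBFS with the same headroom
print(noise_floor_after_boost(-108, boost))  # -> -98 dBFS, still roughly at
                                             #    the ~96 dB floor of a 16-bit CD
```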
Anyway, my take on it.
Gunnar