Because in this case 24 bits is no better than 16 bits. 24 bits adds nothing to the quality, because there is nothing on the analog tape that a 16-bit system cannot catch, hold, and reproduce.
We run a Dead project called MOTB. We transfer older analog recordings and release every show in 2 versions (from the same analog transfer), one in 24-bit and one in 16-bit to burn to CDs. You can HEAR the difference. As mentioned, any post-mastering is one reason for 24-bit... but even WITHOUT any post-transfer changes, you can still hear the difference between a 24-bit transfer and a 16-bit transfer -- using the same master recording and the same A/D, but at different bit depths. Why? Frequency response detail. Well, some people can't hear it -- but that is another discussion.
I can write pages here on many topics that affect the quality of sound -- especially when taking an old master cassette to digital.
But let's look at how major studios release CDs. How many older analog commercial albums have been remastered to CD in the last 10 years? Did they go to 24-bit from the analog master... or did they just go directly to 16-bit? duh! If there is no audible difference, why would they ALL go to 24-bit...
Petrus' statement is about dynamic range, not fidelity... trying to tie dynamic range to the "quality" of a recording (frequency response) misunderstands what human ears hear. You can't change the dynamic range of an original analog cassette, but you can change how much of its frequency response you capture. The more of it you capture, the more natural the sound.
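For the numbers behind the dynamic range side of this: ideal PCM gives you roughly 6 dB of dynamic range per bit (20·log10(2^N)). A quick sketch -- the cassette figure below is a commonly quoted ballpark, not a measurement of any particular deck:

```python
import math

def pcm_dynamic_range_db(bits):
    """Theoretical dynamic range of an ideal N-bit PCM quantizer.
    (A full-scale sine adds about 1.76 dB on top; ignored here.)"""
    return 20 * math.log10(2 ** bits)

print(round(pcm_dynamic_range_db(16)))  # 96 dB
print(round(pcm_dynamic_range_db(24)))  # 144 dB
# A good analog cassette is often quoted around 50-60 dB,
# so even 16-bit already exceeds the source's dynamic range.
```

Which is exactly the point: bit depth settles the dynamic-range question for a cassette source, so dynamic range can't be what people are hearing between the two transfers.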
The frequency response of audio CD is sufficiently wide to cover the entire audible range, which roughly extends from 20 Hz to 20 kHz. Analog audio is unrestricted in its possible frequency response, but the limitations of the particular analog format will provide a cap. High-quality metal-particle cassettes may have a response extending up to 14 kHz at full (0 dB) recording level.
Why do early 16-bit digital converters sound different than today's digital converters at 16-bit? So all 16-bit digital is not the same? If they are the same bit depth, how can they sound different? How about 24-bit?
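Part of the answer is the analog front end, and part is how each converter handles dither and noise shaping. Here's a toy NumPy sketch of one piece of that -- the same 16-bit quantizer with and without TPDF dither. The signal level and dither scheme are my own illustrative choices, not any particular converter's design:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 48000
t = np.arange(fs) / fs
# A very quiet 1 kHz tone, only a few LSBs above the 16-bit floor.
signal = 2e-4 * np.sin(2 * np.pi * 1000 * t)

step = 2.0 / 2**16  # 16-bit quantization step over a -1..+1 range

def quantize_16bit(x, dither=False):
    """Round to the 16-bit grid, optionally adding TPDF dither first."""
    if dither:
        # TPDF dither: sum of two uniform noises, +/- 1 step peak-to-peak
        tpdf = rng.uniform(-0.5, 0.5, x.size) + rng.uniform(-0.5, 0.5, x.size)
        x = x + tpdf * step
    return np.round(x / step) * step

plain = quantize_16bit(signal)            # error correlates with the signal
dithered = quantize_16bit(signal, dither=True)  # error becomes benign noise
```

The undithered version has less total error, but that error is distortion locked to the music; the dithered version trades it for a slightly higher, smoother noise floor -- one of many design choices that make two "16-bit" converters sound different.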
When I record to 1-bit DSD... is that lower quality or higher "quality" than 16-bit PCM? Does the same analog transfer "sound" different going to 1-bit DSD than 24/96 PCM?
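For anyone wondering how 1 bit can compete with 16 or 24: DSD trades amplitude resolution for a very high sample rate, and a delta-sigma modulator shoves the quantization noise up out of the audio band. A minimal first-order modulator in Python shows the core idea -- real DSD runs much higher-order modulators at 2.8224 MHz, so this is only a cartoon:

```python
import numpy as np

def delta_sigma_1bit(x):
    """First-order delta-sigma modulator: turns samples in [-1, +1]
    into a +/-1 bitstream whose running average tracks the input."""
    acc = 0.0
    out = np.empty(len(x))
    for i, sample in enumerate(x):
        bit = 1.0 if acc >= 0.0 else -1.0  # 1-bit quantizer
        out[i] = bit
        acc += sample - bit  # feed back the quantization error
    return out

# A constant input of 0.25 produces a bitstream averaging ~0.25:
bits = delta_sigma_1bit(np.full(10000, 0.25))
print(round(float(np.mean(bits)), 2))  # 0.25
```

So "1-bit" versus "16-bit" isn't a quality ranking by itself -- it's a completely different way of spending the same information budget.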
Don't confuse bits with sound quality... one way to help decide what bit depth to use when converting an older analog cassette is to check the frequency response of the microphones used in the original recording. But on that same note, frequency response does not guarantee a specific fidelity either.