digifish, when digital audio was introduced to the general public in the early 1980s, there was a great deal of emphasis on making everyone into instant experts by explaining "how it works" down to a certain depth of understanding. Most journalists and other commentators relied on a certain stock set of images or metaphors to explain how it works. Part of that set was the idea that digital audio is recorded as a series of "stairstep" sample values.
It was remarkable at the time how often you would see almost the identical set of drawings in every article about digital audio--showing how a smoothly flowing analog signal would be reduced to a succession of evenly spaced stairsteps. That image has led to a number of suppositions and expectations which are not actually true of digital audio recording when it is correctly implemented and used. Just about everyone got the impression that if you hooked up an oscilloscope to the output of a CD player, you would see those stairsteps, as if that were what its analog output looked like, instead of the smoothly flowing analog signal that is actually there.
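If you want to see why the stairstep picture is wrong, it's easy to check numerically. Here's a small Python sketch (the sample rate, tone frequency, and fine-grid spacing are just values I picked for illustration) that rebuilds a sampled sine with ideal sinc interpolation--the textbook model of what a DAC's reconstruction filter is doing. The result hugs the smooth sine, with no staircase in sight:

```python
import numpy as np

fs = 48000                        # sample rate in Hz (arbitrary choice)
f0 = 997                          # test-tone frequency in Hz, well below fs/2
n = np.arange(256)                # sample indices
x = np.sin(2 * np.pi * f0 * n / fs)        # the stored samples

# Evaluate the reconstruction on a grid 16x finer than the sample grid,
# using the Whittaker-Shannon interpolation formula (normalized sinc).
t = np.arange(0, len(n), 1 / 16)           # time in units of sample periods
y = np.array([np.dot(x, np.sinc(ti - n)) for ti in t])

# Compare against the ideal continuous sine at the same instants,
# ignoring the ends where the finite window truncates the sinc sum.
ideal = np.sin(2 * np.pi * f0 * t / fs)
mid = (t > 64) & (t < 192)
print("max deviation from the smooth sine:", np.max(np.abs(y - ideal)[mid]))
```

The tiny deviation it prints comes from using a finite number of samples in the sum, not from any "steps."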
The author of the article that you referred us to is a very careful and intelligent writer, but like many other people in audio even today, he seems perhaps to be unaware of how digital audio recording actually works as a system, since he writes:
> (A) I had predicted that 16 bit quantization noise might be audible when recording in the quietest natural locations and I was wrong. It remains surprising to me that quantization noise is inaudible with the 16 bit file all the way down to levels producing -60dB peaks (or ~.1% saturation!) from the background sounds. One would think, as the bit "steps" are divided evenly across the 96dB total range, that resolution and performance would drop off faster than this.
It's to the author's great credit that he admits what his ears were telling him instead of clinging to what his belief system told him to expect. He explains the reasons for his expectation--and terms such as "bit 'steps'" and "resolution" are leftovers from the broken metaphor that I was talking about.
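For what it's worth, the arithmetic behind the numbers he quotes is simple--this is just back-of-the-envelope math, nothing specific to his recordings:

```python
import math

# -60 dB relative to full scale, expressed as a fraction of full scale:
print(10 ** (-60 / 20))              # 0.001, i.e. the "~.1% saturation" he mentions

# nominal span of 16-bit linear PCM, the "96dB total range":
print(20 * math.log10(2 ** 16))      # ~96.3 dB
```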
I have to get going now, but just to summarize--a properly dithered digital recording system has no audible quantization noise as such, no stairstep waveforms, no increasing distortion at lower recorded levels (nor increased "resolution" at higher ones), and no floor threshold beneath which there is "digital deafness." It has a soft noise floor just like any analog medium; as low-level signals approach that noise floor they can drop 10 dB or more beneath it and still be heard, just as in any analog medium. The audible character of that noise floor depends on its statistical properties and its frequency distribution, exactly as it would on tape.
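If anyone would rather see the dither point demonstrated than argued, here's a small Python sketch. The tone frequency and levels are ones I made up for the demo; the point is only to show the mechanism. A sine whose amplitude is a quarter of one 16-bit step simply vanishes if you quantize without dither, but with TPDF dither added first it comes through at its actual level (about -102 dB below full scale), riding on a benign noise floor:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 48000
n = np.arange(fs)                       # one second of samples
f0 = 1000                               # test tone, an exact DFT bin at this length
lsb = 1 / 32768                         # one quantization step at 16 bits (simplified)
x = 0.25 * lsb * np.sin(2 * np.pi * f0 * n / fs)   # tone 12 dB below one step

def quantize16(signal):
    # Round to the nearest 16-bit step (sign handling simplified for the demo).
    return np.round(signal * 32768) / 32768

# Triangular-PDF dither, 2 LSB peak to peak (sum of two uniform variables).
dither = (rng.uniform(-0.5, 0.5, fs) + rng.uniform(-0.5, 0.5, fs)) * lsb

plain = quantize16(x)                   # no dither: every sample rounds to zero
dithered = quantize16(x + dither)       # dithered: tone plus a noise floor

def tone_level_db(signal):
    # Amplitude of the 1 kHz DFT bin, relative to full scale.
    mag = np.abs(np.fft.rfft(signal))[f0] * 2 / len(signal)
    return 20 * np.log10(mag) if mag > 0 else float("-inf")

print("undithered output is all zeros:", not plain.any())
print("tone level in dithered output: %.1f dBFS" % tone_level_db(dithered))
```

The second printout lands right where the input tone was placed, which is the whole point: below the last "step" there is noise, not deafness.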
All of the above is directly observable by anyone who cares to check it out. The persistence of beliefs to the contrary on all of the above statements is a testament to ... I don't know what, but I wish it would go away.
And even with the slight rise in the noise floor that dither causes, 16-bit linear PCM (the system used on CDs) is 20+ dB quieter than the best analog tape recordings of the era in which the CD was introduced. Dolby and others had to develop advanced new noise-reduction systems such as Dolby SR before analog tape at 15 or even 30 ips could reasonably compete.
For anyone who came up in the analog era (my favorite records as a child, e.g. Elvis Presley's "Hound Dog," were still 78s), 16-bit PCM gives an astoundingly wide dynamic range. The fact that it's in consumer hands now is a very big deal.
--best regards