There is a bit meter in WaveLab (under the Tools menu?). If the signal is padded, you will only see the top 16 bits of the meter light up.
Bingo, that's what I'm looking for! Because I can record a 24-bit WAV from a 16-bit DAT, going by whatever WaveLab or SF says the file is doesn't really tell you whether the source was true 24-bit. Anyway, I'm getting there! I found out I had to disable the SBM pin as well, because that's the stage that takes the signal, converts it to 20 bits, then dithers down to 16. So even though I changed the output pin to 24-bit, I was still getting a 16-bit signal. When I disabled the SBM pin, I got a 24-bit signal. Now the signal is way, way too hot though, so I guess there is more to the equation.
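A quick way to double-check a file outside WaveLab is to scan the decoded samples and see which bit positions ever toggle; in a padded "24-bit" file the low 8 bits never leave zero. This is just a sketch under that idea, assuming the samples have already been unpacked to signed Python integers (the `active_bits` helper and the synthetic data are my own illustration, not anything from WaveLab or SF):

```python
# Sketch: report how many bits (counted from the top) are actually in
# use across a batch of signed samples. A 16-bit source padded into a
# 24-bit container reads as 16 here, since its low byte never toggles.

def active_bits(samples, depth=24):
    # OR together the two's-complement bit patterns; bit positions
    # that are never set in any sample stay clear in `combined`.
    combined = 0
    for s in samples:
        combined |= s & ((1 << depth) - 1)
    if combined == 0:
        return 0
    # Count trailing zero bits: those positions never carried signal.
    trailing = (combined & -combined).bit_length() - 1
    return depth - trailing

# A "24-bit" file padded up from a 16-bit source: low 8 bits all zero.
padded = [s << 8 for s in (-12000, 500, 0, 31000, -1)]
print(active_bits(padded))  # 16
```

The same scan on a genuinely 24-bit recording of analog noise would return 24, because dither and the noise floor keep even the bottom bit toggling.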
Todd,
(I make this follow-up public as it might have general interest and/or could catch the eye of people sitting on relevant information.)
One thing to keep in mind is that the ADC within the SBM-1 is the CXD8493M (per the service manual printed in 1995). I assume from the part numbering that it is made by Sony. The chip itself must then be of '94 vintage at best. (Interestingly enough, though, the analog stage within this ADC runs on ±4.8 volts, twice what is standard today.)
I just made a few online attempts at locating a datasheet for this IC but could not. I'd be *very* interested in a copy if anyone has access to this document.
What does this information indicate? Well, I'd be surprised if this ADC has an S/N ratio better than that equivalent to a bit depth of 17 or 18 (about 108 dB). What I do not know is how many bits this IC sends to the DSP. Is it really more than 18? Say for the sake of argument it is 20. If you disable the Super Bit Mapping and word-length reduction within the DSP, then the DSP must pad with four zero bits to get to 24. So I am in two bits(!) of doubt about how to interpret your now having "24" live bits out of the SBM-1.
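For reference, the usual rule of thumb behind that 108 dB figure is the ideal quantization-noise formula, SNR ≈ 6.02·N + 1.76 dB for N bits. A quick sketch (the `effective_bits` helper is just my own illustration of the arithmetic):

```python
# Rough rule of thumb: each bit buys about 6.02 dB of dynamic range
# (ideal quantization SNR = 6.02*N + 1.76 dB). Inverting it gives the
# "honest" bit depth implied by a converter's measured S/N ratio.

def effective_bits(snr_db):
    return (snr_db - 1.76) / 6.02

print(round(effective_bits(108), 1))  # -> 17.6
```

So a converter with roughly 108 dB S/N resolves on the order of 17 to 18 honest bits, whatever word length it ships to the DSP.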
What I expected was for you to possibly/hopefully see A) the noise floor (with the record volume at zero; shorted inputs on the ADC itself might be needed) drop below -90 dB (and not just because of negative DC offset), and B) possibly as many as 20 active bits.
24 just seems plain unlikely with no DSP processing!
If you amplify the noise-floor signal by 6 dB and do this 15 times (basically you chop bits off the top and rescale the remaining signal), while zooming in on the timescale so as to see individual samples, then an ideal 16-bit noise signal will after 15 iterations sit at -1, 0 and +1 [or -0.5 and +0.5] (overlooking DC offset). That is, you have shaved off bits and arrived at the lowest-lying bit and its two fluctuations: up one and down one. Any other levels present in the remaining data must be due to lower-lying (17th etc.) bits. (Having said that, one must be open to the possibility that the noise floor is simply higher than that.)
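The gain-and-chop procedure above can be sketched in code. This is a hypothetical model, not WaveLab itself: samples are taken as 24-bit values expressed in 16-bit units (so a zero-padded file holds only whole numbers, and a live 17th bit shows up as 0.5), each 6 dB step doubles the sample, and "chopping off the top" is modelled as editor-style clipping:

```python
# Sketch of the amplify-and-clip test: 15 iterations of x2 gain with
# clipping at full scale. Samples that were whole 16-bit steps end up
# pinned at 0 or full scale; any live bits below the 16th survive as
# intermediate levels.

def amplify_and_clip(samples, iterations=15, lo=-32768, hi=32767):
    for _ in range(iterations):
        # one 6 dB gain step: double, then clip like an editor would
        samples = [min(hi, max(lo, 2 * s)) for s in samples]
    return samples

padded = [-1, 0, 1, 0, -1]      # 16-bit noise: low byte of 24 all zero
live = [-1, -0.5, 0, 0.5, 1]    # noise with a live 17th bit (0.5 LSB)

print(sorted(set(amplify_and_clip(padded))))  # only -32768, 0, 32767
print(sorted(set(amplify_and_clip(live))))    # extra levels appear
```

The padded case collapses onto three levels (the lowest bit's two fluctuations around zero, seen at full scale), while the live 17th bit leaves extra levels at ±16384, exactly the "other levels" mentioned above.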
Now, the problems you have with both the levels and the left/right channels being messed up might indicate that a problem remains here that invalidates the current assumption of more than 16 bits being present in the output. I suspect for now that this is either an issue of the DSP not being made for "non-SBM" operation (even if it has the pin to disable it), and/or a problem in the transfer of data from the DSP to the S/PDIF transmitter.
Time will show :-)
Regards
Jon