I have been wondering how long I can run the cables to my DPA 4061s. They're 9-volt mics, not 48 V phantom, so cable length is a concern.
Some folks report mounting their preamp on the stand and making the long runs post-preamp. Obviously, that doesn't allow the level adjustments we'd like, and for me, hanging the MiniMe off the stand and sending USB to my laptop isn't practical. I recently recorded a show using 15' runs and was concerned that I might have lost quality as a result.
So yesterday, I decided to finally do some tests of cable length. The extreme case was 40', achieved by connecting 25' and 15' cables. My 'short' rig has the DPAs running into a sound pro bbox and then into the XLR inputs of the MiniMe via a 12" pair of cables I made. So 1' vs. 40' in the extreme case. I also collected data at 15' and 25'.
I set up a PC to send source material via USB to a UA5 and into a pair of Sennheiser HD280s. I sandwiched the DPAs between the headphones with a rubber band. The DPAs went into my MiniMe, which recorded via USB to a laptop at 16/44.1.
All cables were Canare quads with Neutrik gold XLRs.
The initial tests were done with tones I generated at 50, 3000, 12000, and 16000 Hz.
For the 3000 Hz case, the amplitude difference between the 40' and 1' Canare runs was 0.0081 dB. I consider that below what I can reliably measure.
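If anyone wants to check the math, here's roughly how a level difference like that is computed. This is a minimal sketch, not my actual analysis tool: the tone and the attenuation factor are synthetic stand-ins for the 1' and 40' captures, chosen just to land near the figure above.

```python
import numpy as np

def rms(x):
    """Root-mean-square amplitude of a signal array."""
    return np.sqrt(np.mean(x.astype(np.float64) ** 2))

def level_diff_db(a, b):
    """Level difference in dB between two captures of the same tone."""
    return 20.0 * np.log10(rms(a) / rms(b))

# One second of a synthetic 3000 Hz tone at 44.1 kHz, plus a copy
# attenuated by a tiny factor, standing in for the two cable runs.
fs = 44100
t = np.arange(fs) / fs
short_run = np.sin(2 * np.pi * 3000 * t)
long_run = 0.99907 * short_run  # about 0.0081 dB lower

print(round(level_diff_db(short_run, long_run), 4))  # prints 0.0081
```

The point of using RMS over a full second of tone is that it averages out any single-sample noise, which matters when you're chasing differences this far below a hundredth of a dB.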
I also compared samples of a recently recorded show. Again, the peaks were essentially identical between the 40' and 1' runs. In subjective listening with the headphones, I couldn't identify a difference. Of course, I couldn't compare imaging or soundstage with this test method.
I had hoped to post some snazzy eye-candy graphs of dB loss, frequency rolloff, etc., but I couldn't measure a difference or find anything to graph.
I think more testing could be done here, but I've mostly satisfied myself that there are far more important things to worry about than cable length.
Does anyone have any thoughts regarding what form the degradation should take as cable lengths increase?
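My own guess at the answer: with a low source impedance driving a balanced line, the textbook model is an RC low-pass formed by the source impedance and the total cable capacitance, so the first degradation should be high-frequency rolloff. Here's a back-of-envelope sketch; the numbers are assumptions, not measurements (I'm guessing ~200 ohms for the mic/adapter source impedance and ~50 pF per foot for star-quad cable, so treat the outputs as order-of-magnitude only).

```python
import math

# Assumed values (not measured): source impedance of the mic/adapter
# chain and capacitance per foot of star-quad cable.
SOURCE_OHMS = 200.0
CABLE_PF_PER_FOOT = 50.0

def corner_freq_hz(feet):
    """-3 dB frequency of the RC low-pass formed by the source
    impedance and the total cable capacitance: f = 1 / (2*pi*R*C)."""
    c_farads = feet * CABLE_PF_PER_FOOT * 1e-12
    return 1.0 / (2.0 * math.pi * SOURCE_OHMS * c_farads)

for feet in (1, 15, 25, 40):
    print(f"{feet:>2} ft: corner ~ {corner_freq_hz(feet) / 1000:.0f} kHz")
```

Even the 40' case comes out somewhere around 400 kHz with these assumptions, a couple of decades above the audio band, which would square with my not being able to measure any rolloff at 16 kHz.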