Here are a couple of thoughts...
Brian's comments are spot on. There is no way NOT to run the signal through the preamp of the V3 and only use the A/D (i.e. if you went V3 > analog > ACM V3, even if you used the internal attenuators and got the levels right, you'd still be passing the signal through the second V3's preamp).
That said, it seems like the best way is as Brian described, using some sort of splitter. As for level matching between the two units, other threads have described the ACM V3 as being more sensitive, i.e. a lower input level reaches 0dBFS relative to a stock V3. Based on that, you should get everything set up beforehand and set the gain on the two V3s identically, for example 20dB with all the trim knobs at 0dB. Then take both recorded files to a computer and look at the overall levels. The difference in level between the two files equals the difference in sensitivity between the units. Once you know that difference, you can adjust accordingly in the field: if you find a 10dB difference in those test recordings, then at the show you can set the stock V3 exactly 10dB higher. With this method you should be able to get the final levels very close, and thus have a better comparison...
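If you want to measure that offset precisely rather than eyeballing meters in your editor, here's a minimal Python sketch of the comparison step. The function names are my own, and I'm assuming you've already decoded each test file into a list of 16-bit PCM sample values (with whatever WAV reader you like); it just computes the RMS level of each recording in dBFS and reports how many dB hotter the ACM file is:

```python
import math

def rms_dbfs(samples, full_scale=32768.0):
    """RMS level of 16-bit PCM samples, in dB relative to full scale."""
    if not samples:
        raise ValueError("no samples")
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20.0 * math.log10(rms / full_scale)

def gain_offset_db(stock_samples, acm_samples):
    """How many dB hotter the ACM V3 recording is than the stock one.

    A positive result is the amount to raise the stock V3's gain
    in the field to match the ACM unit.
    """
    return rms_dbfs(acm_samples) - rms_dbfs(stock_samples)
```

So if `gain_offset_db` comes back around 10, you'd bump the stock V3 up 10dB at the show, per the procedure above.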