Pre-emphasis in DAT (and the earlier "PCM-F1" formats as well as the Compact Disc) is a considerable (max. 10 dB above 4 kHz as I recall) treble boost that was applied to the analog signal before it was digitized and recorded. The point of this was to improve the signal-to-noise ratio of the medium, since affordable converters back then weren't as good as they are today. Most program material has far less energy in the high frequencies than in the midrange or below, so this boost didn't usually cause problems with recording headroom, though it could in some special cases.
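For the curious, the standardized curve is a first-order shelf defined by 50 µs and 15 µs time constants, which tops out at 20·log10(50/15) ≈ 10.5 dB — consistent with the "max. 10 dB" figure above. A small sketch of the boost it applies at a few frequencies (pure math, no audio I/O; the helper name is mine):

```python
import math

T1 = 50e-6  # zero time constant, the standard 50 us
T2 = 15e-6  # pole time constant, the standard 15 us

def emphasis_gain_db(f_hz: float) -> float:
    """Magnitude in dB of the pre-emphasis shelf
    H(s) = (1 + s*T1) / (1 + s*T2) at frequency f_hz."""
    w = 2.0 * math.pi * f_hz
    return 10.0 * math.log10((1.0 + (w * T1) ** 2) / (1.0 + (w * T2) ** 2))

for f in (1000, 4000, 10000, 20000):
    print(f"{f:>5} Hz: +{emphasis_gain_db(f):.1f} dB")
# ->  1000 Hz: +0.4 dB
#     4000 Hz: +3.5 dB
#    10000 Hz: +7.6 dB
#    20000 Hz: +9.5 dB
```

Note the boost is already a few dB by 4 kHz and approaches its ~10.5 dB ceiling only well above the audio band, which is why emphasized material sounds bright rather than merely "louder on top".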
To get back the audio signal that you put in, the playback equipment had to detect the presence of pre-emphasis via a "flag" bit in the data stream and, if it was present, reduce the treble by the corresponding amount. The curve was standardized (only one curve was ever used), and the "flag" bit in the various signal and recording formats was clearly defined, so the equipment manufacturers could handle it perfectly well, at least in theory. But most consumer playback equipment does NOT indicate in any way when pre-emphasis is detected; it simply engages the corresponding de-emphasis, and all should be well (one hopes).
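If you want to apply that de-emphasis yourself in software, a minimal sketch is a first-order IIR derived from the inverse shelf, H(s) = (1 + s·15µs)/(1 + s·50µs), via a plain bilinear transform. (Real players and mastering tools may use better-matched designs; the bilinear version warps the response slightly toward Nyquist. The function names are mine.)

```python
def deemphasis_coeffs(fs: float = 44100.0):
    """Coefficients (b0, b1, a1) for the difference equation
    y[n] = b0*x[n] + b1*x[n-1] - a1*y[n-1], approximating the
    standard 50/15 us de-emphasis at sample rate fs."""
    T1, T2 = 50e-6, 15e-6
    k = 2.0 * fs                      # bilinear transform: s -> k*(1-z^-1)/(1+z^-1)
    a0 = 1.0 + T1 * k                 # normalize so the a0 coefficient is 1
    return ((1.0 + T2 * k) / a0,
            (1.0 - T2 * k) / a0,
            (1.0 - T1 * k) / a0)

def deemphasize(samples, fs: float = 44100.0):
    """Run the de-emphasis filter over a sequence of float samples."""
    b0, b1, a1 = deemphasis_coeffs(fs)
    x_prev = y_prev = 0.0
    out = []
    for x in samples:
        y = b0 * x + b1 * x_prev - a1 * y_prev
        out.append(y)
        x_prev, y_prev = x, y
    return out
```

By construction the filter has unity gain at DC and exactly 15/50 = 0.3 gain (about -10.5 dB) at Nyquist, mirroring the recording-side boost.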
As a result, when extracting digital data from older audio recordings, you need to be aware of which recording equipment was used and whether it generally used pre-emphasis or not. I mentioned the old Sony "PCM-F1"--all unmodified F1s applied pre-emphasis 100% of the time. With portable DAT recorders, I frankly don't remember, but I'm sure that information is available somewhere. With CDs it really is a problem. A substantial number of CDs in the early years were originally recorded with pre-emphasis (some were recorded on F1s and "bumped up" to the 1600/1610/1630 format), and you could need special professional equipment to detect the flag bit in the S/P-DIF stream coming out of the player. I don't recall ever seeing a CD player that had a visible indicator of whether emphasis was detected on the disc or not.
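One place the flag can survive a rip today: some CD extraction tools record it in the cue sheet, marking affected tracks with a "FLAGS PRE" line. Assuming that layout (TRACK lines followed by an optional FLAGS line), a quick check might look like this (the function name is mine):

```python
def tracks_with_preemphasis(cue_text: str):
    """Return the track numbers whose cue-sheet entry carries the PRE flag.
    Assumes the common 'TRACK nn AUDIO' / 'FLAGS ... PRE ...' layout."""
    tracks, current = [], None
    for line in cue_text.splitlines():
        parts = line.split()
        if len(parts) >= 2 and parts[0] == "TRACK":
            current = int(parts[1])          # remember which track we are inside
        elif parts and parts[0] == "FLAGS" and "PRE" in parts[1:] and current is not None:
            tracks.append(current)           # this track was flagged pre-emphasized
    return tracks
```

Of course this only helps if the ripping software read and preserved the subcode flag in the first place; an old transfer with no cue sheet leaves you back at listening tests.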
With classical music (which is what I've mainly recorded) it's relatively reliable just to listen to a playback without de-emphasis. If it sounds artificially bright and that's not your style of recording, then try applying the standard de-emphasis and listen again--things may "click into place". But with pop and rock music I don't know--I can pretty well imagine a dull-sounding recording that would sound better with the pre-emphasis left in! It is, after all, a kind of EQ, and EQ can sometimes make things sound better. But if I had a dull recording, I might want to choose a different form of treble boost rather than the particular, prescribed pre-emphasis curve for making it sound better, is all.
This all sounds appallingly loose and poorly implemented, but keep in mind that originally, the consumer digital formats (the PCM-F1, DAT, and the Compact Disc) were intended to be closed systems in which the consumer would never have access to the digital data stream. It was years before the first CD players were available with digital outputs of any kind--while the DAT format was designed so that you couldn't record the output of a CD player (consumer DAT recorders with digital inputs accepted only 48 kHz digital signals, not 44.1 kHz). Eventually the manufacturers' control broke down, and we are now completely used to recording onto multi-gigabyte memory cards and handling WAV audio files just like all other files on a computer. But for example the PCM-F1 was introduced at around the same time as IBM's first PCs, which didn't even have hard drives yet, let alone hard drives capable of holding a CD's worth of material (I can remember when a 70 MB hard drive cost $600, where that $600 itself would be the equivalent of maybe $2000 today).
--best regards