The direct answer to your question is ca. 11.8 dB.
-10 dBV is ca. 316 mV (the dBV reference is 1 V), while +4 dBu is ca. 1.228 V (the dBu reference is 0.7746 V, the voltage that dissipates 1 mW in 600 ohms). The ratio between them is ca. 3.886. The log (to the base 10) of that ratio is ca. 0.59; 20 times that is ca. 11.79; voilà.
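If you want to check the arithmetic yourself, here's a minimal Python sketch of the same steps. The function and constant names are my own; the reference voltages (1 V for dBV, 0.7746 V for dBu) are the standard ones.

```python
import math

DBV_REF = 1.0     # volts: 0 dBV is defined as 1 V RMS
DBU_REF = 0.7746  # volts: 0 dBu = sqrt(0.001 W * 600 ohms)

def dbv_to_volts(dbv):
    """Convert a level in dBV to an RMS voltage."""
    return DBV_REF * 10 ** (dbv / 20)

def dbu_to_volts(dbu):
    """Convert a level in dBu to an RMS voltage."""
    return DBU_REF * 10 ** (dbu / 20)

v_consumer = dbv_to_volts(-10)  # ca. 0.316 V
v_pro = dbu_to_volts(4)         # ca. 1.228 V
diff_db = 20 * math.log10(v_pro / v_consumer)  # ca. 11.79 dB
print(v_consumer, v_pro, diff_db)
```

Running it reproduces the numbers above: roughly 0.316 V, 1.228 V, and an 11.8 dB difference.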
Decibels are a unit of the relative power between two signals, or between one signal and a standard reference level. In the first half of the 20th century or so, the standard reference was 1 mW dissipated in a 600-ohm transmission line--thus the "m" in dBm.
But audio technology isn't dominated by the telephone company any more, and we no longer use transmission lines of any kind for audio frequency signals (despite what you may see in the ads for high-priced cables--or excuse me, "interconnects"--in audiophile magazines). As a result, the normal load impedance for nearly all analog audio connections is simply "high" (or "high enough to avoid loading down the source significantly," whatever "significantly" might mean to you), and most audio level measurements have become voltage measurements.
The decibel was such a useful unit that it got carried forward and transplanted into the brave new voltage-dominated world, by assuming that whenever you compare the levels of any two signals, the load impedance on the two of them will be the same. Computationally this allows you the shortcut of leaving current out of the equation; you simply take the logarithm of the voltage ratio without computing the power levels explicitly.
But then you have to multiply by 20 rather than 10, because decibels are still a unit of relative power and power is proportional to the square of voltage: P = V^2/R, so 10 * log10(P2/P1) = 20 * log10(V2/V1) when the impedances match. (The factor of 10 is the "deci" in decibels.) By assuming that the load impedances were some constant "high enough" value, we implicitly assumed the currents would always be proportional to the voltages. It's a step into virtual reality in a way, but it's been the standard practice for decades now.
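To see the impedance cancel out numerically, here's a small sketch (the value of R is an arbitrary assumption; any identical load gives the same result):

```python
import math

R = 10_000.0  # ohms: any load impedance, as long as both signals see the same one
v1, v2 = 0.316, 1.228  # the two voltages being compared

# Explicit power computation: P = V^2 / R for each signal into the same load
p1 = v1 ** 2 / R
p2 = v2 ** 2 / R

power_db = 10 * math.log10(p2 / p1)    # true power ratio in dB
voltage_db = 20 * math.log10(v2 / v1)  # voltage shortcut in dB

print(power_db, voltage_db)  # identical: R divides out of the power ratio
```

Change R to any other value and the two results stay equal, which is exactly why the voltage shortcut works as long as the equal-impedance assumption holds.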
--best regards