I understand that a "5%" per-unit impedance is not the same as voltage regulation, and that voltage regulation is a function of load. But if you are running a 120V, "5%" transformer at half the full load, shouldn't you expect to see a 2.5% x 120V drop at the secondary (a 3V drop)?
Some numbers to maybe illustrate my previous point now that I have a little more time.
Take your 1kVA single phase transformer.
Consider it as an ideal voltage source with an inductive reactance and resistance in series with it.
The rated output current of the 1kVA transformer at 120V is 8.3A.
Assume the transformer has a resistive load of 30 ohms and its
terminal voltage is 120V with this load. That gives 4A - about half load. Again, sticking with simple numbers.
The reactance is about 0.7 ohms (5% of the 14.4 ohm base impedance, 120^2/1000), which is how we both derived about 1.9mH. A current of 4A through that would result in 2.8V across it - not too far from your 3V.
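The per-unit-to-ohms arithmetic above can be sketched in a few lines. This is just a numeric check of the example's figures, assuming a 60Hz supply (which is what the ~1.9mH value implies; 50Hz would give about 2.3mH) and treating the 5% impedance as purely reactive:

```python
import math

# Assumed ratings from the example: 1 kVA, 120 V, 60 Hz, 5% per-unit impedance.
S = 1000.0       # transformer rating, VA
V = 120.0        # secondary voltage, V
f = 60.0         # supply frequency, Hz (assumption)
z_pu = 0.05      # nameplate "5%" impedance

Z_base = V**2 / S             # base impedance: 14.4 ohms
X = z_pu * Z_base             # ~0.72 ohms, treating the 5% as all reactance
L = X / (2 * math.pi * f)     # ~1.9 mH

I = 4.0                       # half-load current from the 30 ohm example
V_X = I * X                   # ~2.9 V across the leakage reactance

print(f"Z_base = {Z_base:.2f} ohm, X = {X:.2f} ohm, "
      f"L = {L*1e3:.2f} mH, V_X = {V_X:.2f} V")
```

Note this reproduces the 2.8-2.9V reactive drop, close to the 3V figure from the naive "2.5% of 120V" reasoning.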
The resistance from my simple ratio is about 0.18 ohms. Add that to the resistive load and, at 4A, you get 120.7V across the resistive part and 2.8V across the inductive part.
The voltages are 90deg out of phase, so the total source voltage is sqrt(120.7^2 + 2.8^2), or about 120.75V.
That's a drop of less than 1V at the terminals.
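The phasor sum falls out naturally with complex arithmetic. A minimal check, using the example's figures (0.18 ohm resistance, 0.7 ohm reactance, 4A resistive load):

```python
# Phasor sum of the resistive and reactive drops using complex numbers.
V_load = 120.0 + 0j        # terminal voltage, taken as the reference phasor
I = 4.0 + 0j               # resistive load, so current is in phase with V_load
Z_series = 0.18 + 0.7j     # series resistance + leakage reactance, ohms

E = V_load + I * Z_series  # internal (no-load) source voltage: 120.72 + 2.8j
drop = abs(E) - abs(V_load)

print(f"|E| = {abs(E):.2f} V, terminal drop = {drop:.2f} V")
```

Because the 2.8V reactive drop is at right angles to the 120.7V resistive component, it barely moves the magnitude, which is why the terminal drop comes out well under 1V rather than the 3V the simple percentage would suggest.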