No, the transformer will _not_ maintain constant output voltage.
As the output load current increases, the output voltage decreases. This decrease in output voltage is what limits the short circuit current of a given transformer; under short circuit conditions enough current flows to reduce the output voltage to near zero.
Transformers have an 'impedance rating'; you have probably seen this number used to calculate the available short circuit current. As I understand it, the impedance rating is measured by short circuiting the secondary and then adjusting the primary voltage until normal full load current flows. The ratio of this test primary voltage to the normal primary voltage, expressed as a percentage, is the impedance rating.
For example: a 2.4kV to 480/277Y, 75kVA transformer with 3% impedance. This transformer has a nominal full load secondary current of about 90A (75kVA / (√3 × 480V)). If the secondary is short circuited and 72V (3% of 2.4kV) is applied to the primary, then 90A will flow in the secondary.
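For anyone who wants to check that arithmetic, here is a minimal sketch in Python. It assumes a three-phase transformer (so full load current comes from S = √3 × V × I); the variable names are mine, not from any datasheet.

```python
import math

# Nameplate values from the example above (assumed three-phase, 480 V line-to-line)
s_rating = 75_000.0    # transformer rating, VA
v_secondary = 480.0    # secondary line-to-line voltage, V
v_primary = 2_400.0    # primary voltage, V
z_pu = 0.03            # impedance rating as a per-unit value (3%)

# Full load secondary current from S = sqrt(3) * V_LL * I
i_full_load = s_rating / (math.sqrt(3) * v_secondary)
print(f"Full load secondary current: {i_full_load:.0f} A")  # ~90 A

# Impedance test: primary voltage that drives full load current into a shorted secondary
v_test = z_pu * v_primary
print(f"Impedance test primary voltage: {v_test:.0f} V")    # 72 V
```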
Under the assumption that the transformer is a linear device, this information can be used to estimate the available fault current: with the full 2.4kV on the primary, in the event of a short circuit, the secondary current will be approximately 90A / 0.03 = 3000A.
Under the same assumption, we expect the transformer output voltage to drop by about 3% as the output load goes from 0% to 100%.
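Those two estimates in code, restated so the snippet runs on its own (the same caveats apply):

```python
# Values restated from the example so this snippet is self-contained
i_full_load = 90.0      # A, nominal full load secondary current
v_secondary = 480.0     # V, secondary line-to-line voltage
z_pu = 0.03             # 3% impedance, per unit

# Linear assumption: available fault current = full load current / per-unit impedance
i_fault = i_full_load / z_pu
print(f"Available short circuit current: {i_fault:.0f} A")    # 3000 A

# The same assumption predicts roughly a 3% sag from no load to full load
v_drop = z_pu * v_secondary
print(f"Approximate full load voltage drop: {v_drop:.1f} V")  # ~14.4 V on a 480 V base
```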
Now, transformers are not really linear devices, and there are transformers that will _regulate_ their output voltage. Large power distribution transformers include 'tap changers', which adjust the turns ratio of the transformer as the load changes so that the output voltage remains constant (a toy sketch of tap selection follows below). Finally, if the load is relatively constant, you can probably use fixed taps to adjust the output voltage to the desired value.
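To illustrate what a tap changer is doing, here is a toy sketch. The ±2×2.5% tap steps and every name in it are hypothetical, and a real on-load tap changer involves far more than this arithmetic; this only shows the turns-ratio idea.

```python
# Hypothetical taps on the primary winding, as fractional changes to the turns ratio.
# Removing primary turns (a negative fraction) raises the secondary voltage
# by roughly the same fraction.
NOMINAL_V = 480.0
TAPS = [-0.05, -0.025, 0.0, 0.025, 0.05]

def best_tap(measured_v: float) -> float:
    """Pick the tap that brings the measured output voltage closest to nominal."""
    return min(TAPS, key=lambda t: abs(measured_v / (1 + t) - NOMINAL_V))

# Example: output sagging to 466 V under load -> the -2.5% tap is selected,
# boosting the output back to about 478 V.
sag_v = 466.0
tap = best_tap(sag_v)
print(f"Selected tap: {tap:+.1%}, corrected output ~ {sag_v / (1 + tap):.0f} V")
```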
(Note: working from motor theory here, not installation experience!)
-Jon