What makes you think the phase difference in the waveforms is not real? Of course it is real: the math is just modeling the real world. If the phase difference were not real, we could not make practical use of it in the physical world.
No, the phase difference is not real. When you move your reference point, you are changing the polarity (i.e., a minus sign), but instead of just writing a minus sign, you have inserted a 180° phase shift (a half-period time shift) to achieve the "effect" of a minus sign.
It is neither physically correct nor mathematically general, but it does work out as a tool, simply because the input signal is assumed to be a perfect sine wave with its expected symmetry about π. If you doubt this, then simply feed a non-symmetrical waveform into your transformer and look at the resulting waveforms on your scope. You will see that they are not time-shifted on your scope; they are simply inverted in magnitude.
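Here is a minimal numeric sketch of that claim (NumPy assumed; the waveforms and the 0.5-unit harmonic are illustrative choices, not anything from a specific circuit). For a pure sine, inverting polarity and shifting by half a period are numerically indistinguishable; add an even harmonic that breaks the symmetry about π and the equivalence fails:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 1000, endpoint=False)  # one period, f = 1

# Pure sine: polarity inversion equals a T/2 ("180 deg") time shift.
sine = np.sin(2 * np.pi * t)
shifted_sine = np.sin(2 * np.pi * (t - 0.5))     # shift by half a period
print(np.allclose(-sine, shifted_sine))          # True: sin(x - pi) = -sin(x)

# Break the half-wave symmetry with a 2nd harmonic: equivalence fails,
# because the 2nd harmonic comes back unchanged after a T/2 shift.
asym = np.sin(2 * np.pi * t) + 0.5 * np.sin(4 * np.pi * t)
shifted_asym = (np.sin(2 * np.pi * (t - 0.5))
                + 0.5 * np.sin(4 * np.pi * (t - 0.5)))
print(np.allclose(-asym, shifted_asym))          # False
```

The second comparison fails by exactly the second-harmonic term, which is the whole point: only waveforms with half-wave symmetry let you disguise a minus sign as a time shift.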
No, I didn't overlook your earlier "signals and systems" comment. It was wrong in this application and was ignored for being inapplicable and off-topic. As it applies to this discussion, a phase shift is a time shift. If this isn't clear to you, then redo the mathematics with a non-symmetrical waveform, just like the example above.
A simple example of this is to put a diode (half-wave rectifier) on your primary side. This destroys the source waveform's symmetry about π. If you truly had a 180° phase-shifted output, then you would see the two phases no longer mirrored, but shifted: half-wave rectification from Neutral-to-A, half-wave from Neutral-to-B, yet A-to-B would remain full-wave, un-rectified. If you can design a transformer that duplicates that scenario, then I will personally pay for your patent application and submit your design for the Nobel Prize.
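The diode thought experiment can also be sketched numerically (NumPy assumed; an ideal rectifier and unit amplitudes are my simplifications). A real secondary pair gives B as the inversion of A, so A-to-B is just twice the rectified hump, still half-wave. A hypothetical "true 180° time shift" secondary would make A-to-B reconstruct the full un-rectified sine, which is exactly the impossible transformer described above:

```python
import numpy as np

t = np.linspace(0.0, 2.0, 2000, endpoint=False)   # two periods
rect = np.maximum(np.sin(2 * np.pi * t), 0.0)     # half-wave rectified primary

# Real transformer: Neutral-to-B is the inverted waveform, so
# A-to-B is 2*rect -- still a half-wave (rectified) shape.
a_to_b_real = rect - (-rect)

# Hypothetical "true T/2 time shift" secondary: the shifted rectified
# wave fills in the missing half-cycles, reconstructing the full sine.
rect_shifted = np.maximum(np.sin(2 * np.pi * (t - 0.5)), 0.0)
a_to_b_shifted = rect - rect_shifted

print(np.allclose(a_to_b_real, 2 * rect))                   # True
print(np.allclose(a_to_b_shifted, np.sin(2 * np.pi * t)))   # True
```

The second result is the "Nobel Prize" scenario: a device that turns a half-wave rectified input back into a full sine purely by time-shifting one leg. No passive transformer does this, which is the evidence that the scope traces show inversion, not a time shift.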