Therefore, if a solar inverter is going to cause a load to draw power from the inverter instead of the grid, it has to raise the voltage at the node where all three come together above what it would be if the load drew power from the grid. Right?
If we fix a voltage reference, such as ground for a grounded supply system, that node (point) is going to have a single voltage (for a given steady state).
Let's compare case (a), where the load wants 10A, the grid provides 10A, and the inverter provides 0A, to case (b), where the load wants 20A, the grid provides 10A, and the inverter provides 10A. So in case (b) none of the inverter power goes back to the grid; it all goes to the load. And let's assume there are no other loads or sources in the system.
With that comparison, the voltage at the node is the same in either case. If the grid transformer is a fixed voltage source, the voltage at the node is determined by the grid current and the impedance of the conductors between that fixed voltage source and this node (the service drop and feeder conductors; probably not branch circuit conductors, as this node would be at a panelboard). The transformer terminal voltage, less the voltage drop from the grid current, is the node voltage. And the grid current is 10A in either case.
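A back-of-the-envelope sketch of that arithmetic (the 240 V source, the 0.05 ohm, and the purely resistive impedance are made-up numbers for illustration, not values from any actual system):

```python
# Made-up numbers: a fixed 240 V transformer and 0.05 ohm of
# service drop + feeder resistance (treated as purely resistive
# for simplicity).
V_XFMR = 240.0   # transformer terminal voltage, volts
Z_GRID = 0.05    # grid-side conductor resistance, ohms

for label, i_grid in [("case (a)", 10.0), ("case (b)", 10.0)]:
    v_node = V_XFMR - i_grid * Z_GRID
    print(f"{label}: grid current {i_grid} A -> node voltage {v_node} V")

# Both cases print 239.5 V, because the grid current is 10 A either way.
```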
Then in case (b) the PV inverter has to match that node voltage. Because the conductors between the inverter and the node have some impedance of their own, there will be some voltage drop between the PV inverter and the node. So the PV inverter has to put out a slightly higher voltage at its terminals, such that the terminal voltage, less the PV-side voltage drop, matches the node voltage from the previous paragraph.
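Same sort of sketch for the PV side (again a made-up number: 0.03 ohm between the inverter terminals and the node):

```python
# Continuing the made-up numbers above, with 0.03 ohm of conductor
# resistance between the inverter terminals and the node.
V_NODE = 239.5   # node voltage from the previous sketch, volts
Z_PV = 0.03      # inverter-to-node conductor resistance, ohms
I_PV = 10.0      # inverter output current in case (b), amps

# The inverter terminal voltage must exceed the node voltage by
# exactly its own conductor drop.
v_inv = V_NODE + I_PV * Z_PV
print(f"required inverter terminal voltage: {v_inv} V")  # 239.8 V

# Check: terminal voltage less the PV-side drop lands on the node voltage.
assert abs((v_inv - I_PV * Z_PV) - V_NODE) < 1e-9
```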
Cheers, Wayne