The concept of using a transformer on the output of an LV drive to run an MV motor is old, tried and true; it's done that way QUITE a lot in the oilfield industry for ESPs (Electric Submersible Pumps) and in the deep-well water pumping industry. They will use 4000V motors because of the long distance from drive to motor, but they don't have 4160V power available, nor does anyone want to spend $250k for a 4160V VFD when they can use a $100k 480V drive and a $50k transformer. In addition, if they don't have anyone qualified to work on 4160V drives, that can be another factor in sticking with 480V.
Yes, there CAN be issues with the transformer operating at variable frequencies, even if the V/Hz ratio is correct. That's mostly due to eddy current heating of the transformer core and harmonic heating in general (remember, this is the OUTPUT side of the VFD, where harmonics are not normally mitigated). In the past I used to have custom transformers made that were designed to minimize eddy current heating (better-quality grain-oriented steel, formed windings, etc.), but after observing several competitors using standard off-the-shelf transformers and consulting with some of my internal engineers, I discovered that what most people do is just oversize the transformer to compensate, using a high "K-Factor" transformer designed specifically for handling higher harmonics. In addition, it's best to operate the VFD output at as high a carrier frequency as possible; that helps with the heating issues. But that can ALSO cause added heating in the VFD output transistors (more switching = more switching losses), so you have to de-rate the VFD in some cases.
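To illustrate the constant V/Hz point: below rated frequency the drive scales voltage with frequency, so the transformer core sees roughly rated flux the whole time, and the extra heating is a harmonics problem rather than an overfluxing problem. A rough sketch (the 480V/60Hz ratings here are just example numbers, not from any particular drive):

```python
# Rough illustration of constant V/Hz operation. All numbers are
# hypothetical examples, not from any specific drive or transformer.

RATED_V = 480.0           # VFD output voltage at rated frequency (V)
RATED_HZ = 60.0           # rated frequency (Hz)
VHZ = RATED_V / RATED_HZ  # constant volts-per-hertz ratio (8 V/Hz here)

def output_voltage(freq_hz: float) -> float:
    """Drive output voltage at a given frequency under constant V/Hz,
    capped at rated voltage (the drive can't exceed its bus voltage)."""
    return min(freq_hz * VHZ, RATED_V)

for f in (15.0, 30.0, 48.0, 60.0):
    v = output_voltage(f)
    # Core flux is proportional to V/f, so it stays near rated below 60 Hz;
    # the added transformer heating comes from harmonics, not overfluxing.
    print(f"{f:5.1f} Hz -> {v:6.1f} V  (V/Hz = {v / f:.2f})")
```

The V/Hz column stays at 8.00 across the whole speed range, which is why a correctly sized transformer doesn't saturate at low frequency; above rated frequency the voltage caps out and flux actually drops (field weakening).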
Still, if I have my druthers, I prefer to use a 4160V drive for a 4160V motor, but that's just me. What most people leave out of this equation is the issue of protection for that transformer. Yes, the VFD can be programmed to current limit, and if set at the VFD output rating, that can be indefinite. But current limiting indefinitely into a shorted transformer winding is the sort of thing that causes fires. However, if you put fuses on that transformer, the act of a fuse clearing on the output of a VFD is something that can take out the transistors. So yes, this scheme saves initial money in most cases, but at the risk of catastrophic failure if anything happens.

Factor in the added operating losses through that downstream transformer, which are PERMANENT losses, and the overall long-term cost of ownership doesn't look so good. In the oil-well business that's often irrelevant, because the well may not produce long enough to make it matter. In water wells, it's a real economic consideration. In this case, 350HP is roughly 260kW, but running at 48Hz, let's call it 170kW. If the transformer adds just 2% additional losses, that's 3.4kW of continuous loss, or 3.4kWh every hour. Assuming 24hr/day, 365 days/year running, that's almost 30,000 kWh per year; at $0.11/kWh, the long-term ADDED cost of doing this is over $3k per year. It adds up...
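The loss arithmetic above can be sketched in a few lines. The 170kW running load and 2% added transformer loss are the rough estimates from the paragraph, and $0.11/kWh is likewise just the example rate used there:

```python
# Sketch of the added-loss arithmetic for the downstream transformer.
# The 170 kW load, 2% loss figure, and $0.11/kWh rate are the rough
# example estimates from the discussion, not measured values.

running_kw = 170.0           # estimated running power at 48 Hz
added_loss_fraction = 0.02   # extra losses through the step-up transformer
rate_per_kwh = 0.11          # energy cost in $/kWh

loss_kw = running_kw * added_loss_fraction    # 3.4 kW of continuous loss
hours_per_year = 24 * 365                     # 8760 h of 24/7 running
kwh_per_year = loss_kw * hours_per_year       # annual wasted energy
annual_cost = kwh_per_year * rate_per_kwh     # annual added cost

print(f"{loss_kw:.1f} kW -> {kwh_per_year:,.0f} kWh/yr -> ${annual_cost:,.0f}/yr")
```

Running it gives about 29,800 kWh and roughly $3,300 per year, matching the "over $3k" figure; note the result scales linearly with the assumed loss percentage and the energy rate, so a 3% transformer at $0.15/kWh would roughly double it.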