> Is there a good reference that has a history of where these oddball voltages started and why?
These are not oddball voltages; they are just old.
First, remember that we have two sets of voltages:
nominal supply voltages, which today are 120V, 208V, 240V, 480V, and 600V;
and nominal utilization voltages (what the equipment wants), which today are 115V, 200V, 230V, 460V, and 575V.
Note that each utilization voltage is a fixed percentage (roughly 96%) of its corresponding nominal supply voltage.
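To make that relationship concrete, here is a minimal sketch (plain Python, using only the values listed above) showing that each utilization voltage sits about 4% below its supply voltage:

```python
# Nominal supply voltages paired with their utilization voltages (from the list above).
pairs = [(120, 115), (208, 200), (240, 230), (480, 460), (600, 575)]

for supply, utilization in pairs:
    ratio = utilization / supply
    print(f"{utilization}V / {supply}V = {ratio:.1%}")

# Output:
# 115V / 120V = 95.8%
# 200V / 208V = 96.2%
# 230V / 240V = 95.8%
# 460V / 480V = 95.8%
# 575V / 600V = 95.8%
```

That gap is generally there to allow for voltage drop in the wiring between the service entrance and the equipment.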
There are also tolerances, the most commonly quoted being +/- 10% of the rated voltage.
Then we have maximum voltage ratings; the most common seem to be 125V, 250V, and 600V.
In the early days of commercial electricity, 110V was very common, but line losses were very problematic, so it became common practice to connect two 110V sources in series, creating 110/220V systems. This concept of connecting sources in series led to the voltages 110V, 220V, and 440V. These voltages were prevalent through the early days of electrification, so they became part of our general conversational language. As the US industrial complex grew, line losses continued to play havoc, so the nominal supply voltages were increased to 115V, 230V, and 460V. By the 1960s the use of electricity had grown so much that the standard US voltages were raised again, to the values we have today.
Because there is no single set of utility regulations for the entire US, many smaller companies were slow to upgrade their 'delivery' equipment, so it is not uncommon to find nominal voltages that went 'obsolete' more than 50 years ago.
So, to the OP's question: without knowing the actual tolerances of your equipment, it is difficult to say whether it can be used at today's nominal voltages. However, most non-motor equipment is not very fussy about the voltage it gets, as long as its maximum rating is not exceeded. Inductive loads (e.g. motors) are not so forgiving; how much over-voltage they tolerate is somewhat load dependent, so it is best to supply them less than 10% above their nameplate rating. But I have seen 440V motors purring merrily along on nominal 480V systems.
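If you want a quick sanity check, here is a minimal sketch assuming the commonly quoted +/- 10% tolerance from above (the `within_tolerance` helper is hypothetical; always defer to the nameplate and the manufacturer's documentation):

```python
def within_tolerance(nameplate_v: float, supply_v: float, tol: float = 0.10) -> bool:
    """True if supply_v is within +/- tol of nameplate_v.

    tol=0.10 reflects the commonly quoted +/-10% figure; real equipment
    tolerances vary, so check the actual documentation.
    """
    return abs(supply_v - nameplate_v) <= tol * nameplate_v

# A 440V-nameplate motor on a modern nominal 480V system:
print(within_tolerance(440, 480))  # True: 480V is ~9.1% above nameplate

# The same motor on a 380V supply would be outside the band:
print(within_tolerance(440, 380))  # False: ~13.6% below nameplate
```

This matches the anecdote above: 480V is just inside the +10% band of a 440V nameplate, which is why those old motors get away with it.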