In general this factor gets ignored.
What matters is _conductor_ temperature. So in order to adjust for temperature, you need some way of calculating conductor temperature; the conductor temperature changes the resistance value used in the voltage drop calculation.
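As a minimal sketch of that resistance adjustment for copper (using the standard relation R2 = R1 * (234.5 + T2) / (234.5 + T1); the reference value is the 75C figure for 3 AWG uncoated copper from NEC Chapter 9 Table 8, and the function name is just illustrative):

```python
# Sketch: scale a copper conductor's resistance to its operating temperature.
# Copper resistance varies roughly linearly with (234.5 + T), where 234.5 C
# is copper's inferred zero-resistance temperature. Published table values
# (e.g., NEC Chapter 9 Table 8) are typically given at 75 C.

def copper_resistance_at(r_ref_ohms, t_ref_c, t_actual_c):
    """Scale a reference resistance to another conductor temperature."""
    return r_ref_ohms * (234.5 + t_actual_c) / (234.5 + t_ref_c)

# 3 AWG uncoated copper is about 0.245 ohm/kft at 75 C. Near 30 C it is:
print(copper_resistance_at(0.245, 75.0, 30.0))  # ~0.209 ohm/kft
```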
For conductors loaded at their ampacity, the assumption is that the conductor temperature will be the temperature used to calculate the ampacity. (Note that this is rather conservative; conductors are usually much cooler than this.) So if you are using 75C conductors at their ampacity, you would assume that the conductor temperature is 75C. Ambient temperature enters into this because it changes the rated ampacity via the ambient correction factor.
(For example, with 100A running in 3 AWG copper conductors under the standard 30C ambient, you would assume a conductor temperature of 75C. With the same conductor in a 40C ambient, you would assume it reaches 75C at 88A of current.)
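That 88A figure falls out of the usual ambient correction formula, CF = sqrt((Tc - Ta_new) / (Tc - Ta_table)), with the Table 310.16 ampacities referenced to a 30C ambient. A quick sketch, with an illustrative function name:

```python
import math

# Sketch: NEC-style ambient temperature correction factor,
#   CF = sqrt((Tc - Ta_new) / (Tc - Ta_table))
# Table 310.16 ampacities are referenced to a 30 C ambient.

def ambient_correction(t_rating_c, t_ambient_c, t_table_ambient_c=30.0):
    return math.sqrt((t_rating_c - t_ambient_c) /
                     (t_rating_c - t_table_ambient_c))

# 3 AWG copper at 75 C is rated 100 A; in a 40 C ambient:
print(round(100 * ambient_correction(75.0, 40.0)))  # -> 88 A
```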
For lightly loaded conductors the heating is quite small (it scales with the square of the current), and conductor temperature is much closer to ambient. If you have conductors well oversized to deal with voltage drop, a reasonable approximation is that they are at ambient temperature.
If for some reason you need something more accurate than these bounding assumptions, then you need to run engineering calcs to estimate the actual expected conductor temperature.
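One common first-pass estimate, sketched below, assumes the temperature rise above ambient scales with the square of the load current (since heating is roughly I^2 * R). Treat it as a rough approximation, not a substitute for a full thermal model such as Neher-McGrath:

```python
# Sketch: first-pass estimate of operating conductor temperature,
#   T_c ~ T_ambient + (T_rated - T_table_ambient) * (I_load / I_rated)^2
# assuming heat generation scales with I^2 and dissipation is roughly
# linear in the temperature rise above ambient.

def estimated_conductor_temp(i_load, i_rated, t_rating_c=75.0,
                             t_ambient_c=30.0, t_table_ambient_c=30.0):
    rise_at_rated = t_rating_c - t_table_ambient_c
    return t_ambient_c + rise_at_rated * (i_load / i_rated) ** 2

# 3 AWG at half its 100 A rating, in a 30 C ambient:
print(estimated_conductor_temp(50, 100))  # ~41 C -- much cooler than 75 C
```

You can then feed that estimated temperature back into the resistance adjustment above for the voltage drop calc.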
-Jon