A Google search turns up
https://ashrae-meteo.info/v2.0/places.php?continent=North America but I have no idea whether that site is official or reliable.
It says that for the 2021 ASHRAE data for Atlanta, GA airport, for example, the 0.4% dry bulb (what you want) temperature is 34.3 C, while the 1% dry bulb is 33.1 C. It also says the mean extreme annual dry bulb temperature is 35.8 C, with a standard deviation of 2.0 C.
So those numbers mean that if, for example, the weather station logs temperature once a minute, that's 1440 numbers per day, or about 525,600 numbers per year. Of those data points, in the average year 1% are at least 33.1 C; 0.4% are at least 34.3 C; and the average largest number seen is 35.8 C. Using the 35.8 C mean extreme temperature and the 2.0 C standard deviation, the site also computes the 5, 10, 20, and 50 year expected extremes, meaning for each time period the temperature for which (if I understand correctly) you'd have a 50% chance of seeing it at least once over that time period.
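If you want to reproduce those n-year numbers yourself, here's a sketch of the usual approach. I'm assuming the annual extremes follow a Gumbel (extreme-value) distribution, which is the standard model for annual maxima; the mean and standard deviation pin down the distribution, and from there you can ask for the temperature with a 50% chance of being exceeded at least once in n years:

```python
import math

def gumbel_params(mean, std):
    # Convert the mean / std dev of the annual extreme temperature
    # into Gumbel location and scale parameters.
    scale = std * math.sqrt(6) / math.pi
    loc = mean - 0.5772 * scale  # 0.5772 = Euler-Mascheroni constant
    return loc, scale

def n_year_extreme(mean, std, n):
    # Temperature with a 50% chance of being exceeded at least once
    # over n years (one reading of "n-year expected extreme").
    loc, scale = gumbel_params(mean, std)
    p = 1 - 0.5 ** (1 / n)  # per-year exceedance probability
    return loc - scale * math.log(-math.log(1 - p))

# Atlanta numbers from above: mean extreme 35.8 C, std dev 2.0 C
for n in (5, 10, 20, 50):
    print(n, round(n_year_extreme(35.8, 2.0, n), 1))
```

With the Atlanta inputs this gives roughly 38.0, 39.1, 40.1, and 41.6 C for the 5, 10, 20, and 50 year extremes. I can't vouch that this is exactly the formula the site uses (ASHRAE's published method may define the return period slightly differently), but it's the same family of calculation.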
Having said all that, I have no idea which of those temperatures it would be appropriate to select. It presumably depends on what you need the value for and what the consequences are if you are off. If your equipment dies when the temperature is exceeded, you want the equipment to last 20 years, and you're willing to accept a 1% chance of equipment death due to temperature over those 20 years, then you'd want the 20 year 1% likelihood extreme value (which is not any of the numbers I listed above, but is computable from the extreme annual mean and standard deviation).
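That 20 year 1% number can be computed the same way. Again this is a sketch under the assumption that annual extremes are Gumbel distributed; the function name and structure are my own, and extrapolating that far into the tail of the distribution is inherently shaky, so take the result with a grain of salt:

```python
import math

def extreme_for_risk(mean, std, years, risk):
    # Temperature T such that the probability of exceeding T at
    # least once over `years` years is `risk`, assuming annual
    # maxima are Gumbel distributed with the given mean / std dev.
    scale = std * math.sqrt(6) / math.pi
    loc = mean - 0.5772 * scale  # Euler-Mascheroni constant
    p = 1 - (1 - risk) ** (1 / years)  # per-year exceedance probability
    return loc - scale * math.log(-math.log(1 - p))

# Atlanta numbers from above: mean extreme 35.8 C, std dev 2.0 C,
# 1% chance of ever exceeding T over a 20 year equipment life
print(round(extreme_for_risk(35.8, 2.0, 20, 0.01), 1))
```

Under those assumptions the answer comes out around 46.7 C, which is a good 5 C above the 20 year 50% extreme; demanding only a 1% failure chance over 20 years pushes you well out into the tail.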
Whereas if you're just doing conductor ampacity temperature correction, maybe the 0.4% annual number is fine. I'm not sure whether there's any engineering consensus on which number to use for that purpose.
Cheers, Wayne