You don't need to wait for AI; you just need to build a spreadsheet.
Start with run length and required ampacity.
Next, determine the expected current and the required voltage drop. (Expected as opposed to NEC-calculated: use your understanding of what the real load will look like. The required voltage drop comes from energy codes or equipment requirements.)
This gives you the parameters for the minimally acceptable cables. You could use Al or Cu, parallel runs, 75°C or 90°C terminations...just list a bunch of the possibilities. Keep in mind that some will add termination costs, some will add conduit costs, and some will be cheaper to pull...
Shop these configurations with your supplier to get a wire cost for each.
Estimate the installation cost for each approach.
Finally, for each configuration, calculate the power lost in the wire (just current squared times resistance) at the expected current, then multiply by the number of hours in a year and the cost of electricity to get the annual running cost of the wire.
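The loss calculation above is one cell formula per configuration. A minimal sketch in Python, where the resistance, run length, load current, and electricity price are all illustrative assumptions (not values from this post):

```python
# Annual I^2*R running cost for one feeder configuration.
# All inputs here are assumed example values -- plug in your own.

def annual_wire_cost(current_a, r_ohm_per_kft, one_way_ft,
                     conductors=3, hours=8760, usd_per_kwh=0.12):
    """Yearly cost of resistive loss in the feeder conductors."""
    # Resistance of one conductor over the run: ohms/kft * length in kft
    r_total = r_ohm_per_kft * (one_way_ft / 1000.0)
    # I^2*R loss per current-carrying conductor, times conductor count
    watts = conductors * current_a**2 * r_total
    kwh_per_year = watts * hours / 1000.0
    return kwh_per_year * usd_per_kwh

# Example: 200 A expected load, 300 ft run of 350 kcmil Cu
# (~0.037 ohm/kft is an approximate handbook resistance)
print(round(annual_wire_cost(200, 0.037, 300), 2))  # ~1400 USD/yr
```

Run this for every candidate cable and the installation-vs-running-cost trade falls out directly.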
Not an easy spreadsheet to create, but doable in a few hours for a simple feeder.
I expect that the additional cost of terminating 90°C wire negates any savings in the wire itself. Al wire will be cheaper for the same ampacity and have very similar losses.
IMHO, the place where you will most likely see an installation-vs-running-cost trade is in the number of parallel runs. Parallel installations give you more amps per circular mil, which means more loss per amp and higher running costs.