If you really want efficiency, then you don't want to use a resistor to drive the LEDs; rather, you want a switching power supply with a constant-current output. However, I am not really recommending this approach, because you are operating at such low power levels (7 LEDs, 3 V, 7 mA, about 150 mW total).
The general rule is that you subtract your LED operating voltage from your supply voltage, giving the voltage difference that the resistor has to drop. You use Ohm's law, that voltage difference, and your desired operating current to calculate the resistor value.
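As a minimal sketch of that rule (the 5 V supply here is an assumed example value; the 3 V LED and 7 mA current are the figures mentioned above):

```python
# Ohm's-law resistor sizing for a single LED.
V_SUPPLY = 5.0   # supply voltage (V) -- assumed for illustration
V_LED = 3.0      # LED forward voltage (V)
I_LED = 0.007    # desired LED current (A)

v_drop = V_SUPPLY - V_LED     # voltage the resistor has to drop
r = v_drop / I_LED            # Ohm's law: R = V / I
p_wasted = v_drop * I_LED     # power wasted heating the resistor

print(f"R = {r:.0f} ohm, resistor dissipation = {p_wasted * 1000:.0f} mW")
# -> R = 286 ohm, resistor dissipation = 14 mW
```

You would then pick the nearest standard resistor value (270 or 330 ohm here), depending on whether you would rather run slightly bright or slightly dim.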
The smaller the LED voltage relative to the supply voltage, the more voltage the resistor has to drop and the more power is wasted heating the resistor. However, supply and LED voltages both vary, which means that the closer you try to match the supply and LED voltages, the greater the variability in LED current and brightness.
You can trade off the above effects by putting more LEDs in series and using a smaller resistor.
Consider an example: a nominal 13.8V power supply and 3V LEDs, but the supply is actually 14.4V and the LEDs drop 2.9V. With a single LED, the nominal efficiency is 22%, and those voltage shifts leave your current 6% high. Now use the same components, but put 4 LEDs in series with a correspondingly smaller resistor. Your nominal efficiency rises to 87%, but the same voltage shifts now push the drive current roughly 56% high.
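To make the arithmetic behind those figures explicit, here is a short Python sketch that reproduces them (the 10 mA design current is an arbitrary choice and cancels out of the percentages):

```python
def led_current(v_supply, v_led, n_leds, r):
    """Current through a string of n_leds LEDs in series with resistor r."""
    return (v_supply - n_leds * v_led) / r

I_NOM = 0.010  # design current (A); any value works, the ratios are what matter

for n in (1, 4):
    r = (13.8 - n * 3.0) / I_NOM              # resistor sized at nominal voltages
    i_actual = led_current(14.4, 2.9, n, r)   # supply runs high, LEDs run low
    eff = (n * 3.0) / 13.8                    # fraction of supply power reaching the LEDs
    print(f"{n} LED(s): R = {r:.0f} ohm, efficiency = {eff:.0%}, "
          f"current error = {i_actual / I_NOM - 1:+.0%}")

# 1 LED(s): R = 1080 ohm, efficiency = 22%, current error = +6%
# 4 LED(s): R = 180 ohm, efficiency = 87%, current error = +56%
```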
There is a simple and useful trick: you can use an adjustable three-terminal linear voltage regulator with a single resistor to create a linear current regulator. See page 10 of the datasheet for the circuit.
That part, the LM317L ($0.35 at Digikey: https://www.digikey.com/en/products/detail/texas-instruments/LM317LCLPRE3/1510140 ), combined with a 125 ohm resistor will provide a solid 10 mA as long as the input voltage is at least about 4 V higher than the output voltage.
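The programming resistor comes straight from the LM317's nominal 1.25 V reference between its OUT and ADJ pins: I = 1.25 V / R. A quick sketch of that arithmetic:

```python
V_REF = 1.25  # nominal LM317/LM317L reference voltage (V)

def program_resistor(i_target):
    """Resistor between OUT and ADJ that sets the regulated current."""
    return V_REF / i_target

def regulated_current(r):
    """Current the regulator holds through the load for a given resistor."""
    return V_REF / r

print(f"R for 10 mA: {program_resistor(0.010):.0f} ohm")           # -> 125 ohm
print(f"I with 125 ohm: {regulated_current(125) * 1000:.1f} mA")   # -> 10.0 mA
```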
What I would recommend is to use the linear current regulator described above with a 30V power supply and all 7 LEDs in series. This will give you the same current through every LED, let you change the brightness by changing a single resistor, and keep efficiency good at about 70%.
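A quick sanity check of that recommendation in Python (the 10 mA target current is my assumption, matching the 125 ohm example above):

```python
N_LEDS, V_LED = 7, 3.0   # 7 LEDs at ~3 V each
V_SUPPLY = 30.0          # recommended supply voltage (V)
I_TARGET = 0.010         # assumed drive current (A)

v_string = N_LEDS * V_LED            # voltage across the LED string
v_overhead = V_SUPPLY - v_string     # headroom left for the regulator + resistor
efficiency = v_string / V_SUPPLY     # power in the LEDs / power drawn from the supply

print(f"LED string: {v_string:.0f} V, regulator overhead: {v_overhead:.0f} V, "
      f"efficiency: {efficiency:.0%}, total power: {V_SUPPLY * I_TARGET * 1000:.0f} mW")
# -> LED string: 21 V, regulator overhead: 9 V, efficiency: 70%, total power: 300 mW
```

The 9 V of headroom comfortably covers the roughly 4 V the regulator needs, so the LED current stays constant even as the supply or LED voltages drift.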
-Jon