Using the assumptions that resistance for metallic conductors is practically linear over the normal operating range (for this case there is probably some non-linearity), that the normal operating temperature of the filament is 2200 deg C, that the bulb was at room temperature (20 deg C) when the 39 ohms was measured, and that the bulb was rated for 230 V rather than 240 V, adjusting for temperature I come out with 43 ohms.
That's roughly a 10% error between the calculated and the actually measured value, which I can live with given how many assumptions had to be made.
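As a rough sketch of that adjustment (not my exact working), the snippet below back-computes a cold resistance from the rated hot resistance using a linear temperature coefficient. The coefficient and the rated power are assumptions for illustration only, so the output will not necessarily land on 43 ohms:

```python
# Sketch of the linear temperature-coefficient adjustment described above.
# All numeric values are assumptions except the measured 39-ohm cold
# resistance and the assumed temperatures mentioned in the text.

ALPHA_TUNGSTEN = 0.0045   # per deg C, approximate linear coefficient (assumed)
T_HOT = 2200.0            # deg C, assumed filament operating temperature
T_COLD = 20.0             # deg C, assumed room temperature at measurement
V_RATED = 230.0           # volts, assumed rating
P_RATED = 100.0           # watts, hypothetical rating for illustration only

R_MEASURED_COLD = 39.0    # ohms, measured at room temperature

# Hot resistance implied by the rating (treating the lamp as purely resistive).
r_hot = V_RATED ** 2 / P_RATED

# Back out the cold resistance with a linear resistance-vs-temperature model:
#   R_hot = R_cold * (1 + alpha * (T_hot - T_cold))
r_cold_calculated = r_hot / (1.0 + ALPHA_TUNGSTEN * (T_HOT - T_COLD))

error_pct = 100.0 * (r_cold_calculated - R_MEASURED_COLD) / R_MEASURED_COLD
print(f"hot resistance from rating : {r_hot:.1f} ohm")
print(f"calculated cold resistance : {r_cold_calculated:.1f} ohm")
print(f"measured cold resistance   : {R_MEASURED_COLD:.1f} ohm")
print(f"difference                 : {error_pct:.1f} %")
```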
To know for sure, use a third-order polynomial curve fit for the V-I curve, where V = A*I + B*I^3.
Using two nonzero experimental data points (the half-rated and full-rated currents would be good choices), you can solve for good approximations of A and B. Then, because resistance R = V/I, and assuming the lamp can be modeled as a purely static device whose power-factor contribution can be neglected:
R = A + B*I^2, for any given amount of current passing through the lamp.
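For illustration, here is a minimal sketch of that two-point fit; the currents and voltages below are hypothetical stand-ins for the half-rated and full-rated measurements, not data from the question:

```python
# Minimal sketch of the two-point fit described above: solve
#   V = A*I + B*I^3
# for A and B from two nonzero (current, voltage) measurements, then
# evaluate R(I) = V/I = A + B*I^2. The data points below are hypothetical.

def fit_vi_curve(i1, v1, i2, v2):
    """Solve the 2x2 linear system  v = A*i + B*i**3  for A and B."""
    # [ i1  i1**3 ] [A]   [v1]
    # [ i2  i2**3 ] [B] = [v2]
    det = i1 * i2**3 - i2 * i1**3
    a = (v1 * i2**3 - v2 * i1**3) / det
    b = (i1 * v2 - i2 * v1) / det
    return a, b

def resistance(a, b, i):
    """Effective resistance R = V/I = A + B*I**2 at a given current."""
    return a + b * i**2

# Hypothetical readings at roughly half-rated and full-rated current.
i_half, v_half = 0.2, 95.0    # amps, volts (illustrative only)
i_full, v_full = 0.4, 230.0   # amps, volts (illustrative only)

A, B = fit_vi_curve(i_half, v_half, i_full, v_full)
print(f"A = {A:.2f} ohm, B = {B:.2f} ohm/A^2")
print(f"R at full current = {resistance(A, B, i_full):.1f} ohm")
print(f"R at half current = {resistance(A, B, i_half):.1f} ohm")
```

The two measurements give a 2x2 linear system in A and B, which the sketch solves by Cramer's rule; once A and B are known, R = A + B*I^2 can be evaluated at whatever current you care about.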