Gar's explanation is great. You have to be aware that "calibration" has different meanings for different applications. What gar describes is setting the span and zero for a typical field instrument.
I'm of the opinion that "spanning" and "calibrating" are two different things, which may happen to be identical for linear systems. Calibrating, to me, is matching the output curve (often a straight line) to the (usually nonlinear) input variable. In this case, the thermocouple's electronics "linearize" the device. Many thermocouples may be used over a range in excess of 1000C. If I want to monitor room temperature, I may only need 50-86F (10-30C). In the old days, before microprocessors, there were multiple stages, each working over part of the "transfer function," to make the whole range as linear as practical. As stated in gar's post above, if a J thermocouple is AMPLIFIED (with an appropriate cold junction) without curve correction and used between 0 and 100C, the error versus a straight line is a maximum of about 1.8C. If I use 2 line segments between 0 and 100C, I believe I can get that down to less than 0.2C, which is less than the variability due to wire characteristics.
BUT, if I only need 10 to 30C (50-86F), the straight-line fit error will again be smaller than the wire effects.
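The segmented-linearization idea above can be sketched in code. This is a minimal illustration, not a real linearizer: the breakpoint table uses approximate J-type millivolt values (reference junction at 0C) that should be checked against an actual thermocouple table, and a real implementation would use more segments.

```python
import bisect

# (voltage_mV, temperature_C) breakpoints -- approximate J-type values,
# assumed for illustration only; verify against a real thermocouple table.
CAL_POINTS = [(0.0, 0.0), (2.585, 50.0), (5.269, 100.0)]

def mv_to_temp(mv):
    """Piecewise-linear interpolation between calibration breakpoints."""
    volts = [v for v, _ in CAL_POINTS]
    # Find the segment containing mv, clamping to the end segments.
    i = bisect.bisect_right(volts, mv) - 1
    i = max(0, min(i, len(CAL_POINTS) - 2))
    (v0, t0), (v1, t1) = CAL_POINTS[i], CAL_POINTS[i + 1]
    return t0 + (mv - v0) * (t1 - t0) / (v1 - v0)
```

With two segments, the maximum deviation from the true curve occurs inside each segment, which is where the improvement over a single straight line comes from.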
Then, if I convert that to a (say 10 bit, 1024 steps) digital signal, my analog-to-digital converted value will have much higher resolution (~50 counts/degree C over the 20C span) than is real. If I use the full 100C range, same 10 bit conversion, my resolution is ~10 counts/degree C ... maybe repeatable, but the resolution still exceeds the accuracy. If I take a 1000C range, the same 10 bit conversion gives ~1 count/degree C ... resolution and accuracy are PROBABLY in the same range.
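The counts-per-degree figures above are just the 1024 ADC steps divided by the measurement span; a quick sketch makes the arithmetic explicit (the 20C, 100C, and 1000C spans are the three cases from the text):

```python
# Resolution of a 10-bit ADC spread over different temperature spans.
ADC_COUNTS = 2 ** 10  # 1024 steps

for span_c in (20, 100, 1000):  # 10-30C span, 0-100C, 0-1000C
    counts_per_deg = ADC_COUNTS / span_c
    deg_per_count = span_c / ADC_COUNTS
    print(f"{span_c:>5} C span: {counts_per_deg:6.1f} counts/C "
          f"({deg_per_count:.3f} C/count)")
```

This reproduces the ~50, ~10, and ~1 counts/degree figures; nothing here says the ADC reading is *accurate* to that resolution, which is the point of the paragraph.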
It's words, only words.
Think about your voltmeter. 1 volt resolution on a 460V system is pretty good. 1 volt resolution on an automobile system is not very good.