Starrett Micrometer Calibration
Is it possible to calibrate a Starrett 1-2 inch micrometer? If so, how? Thanks.
You use a micrometer standard, either 1" or 2". They're usually straight rods with plastic insulation over the middle (so hand heat doesn't change their length), but round disc standards exist too. Anyway, a 1" standard should measure exactly 1". If it doesn't, use the spanner wrench supplied with the micrometer to turn the barrel until it does.
Stuart de Haro
I have calibration blocks. I was wondering if you could adjust the micrometer itself. If I understand correctly, the barrel rotates to make the correction. I'm about .010" off.
.010 off ????
You might have to read the mic upside-down after adjusting the barrel.
Purchased on eBay. I guess you get what you pay for, but it may work out OK.
Got it. Only .001" out, not .010".
You can reset the starting point to one inch, but that does not calibrate the micrometer. To make a calibration, you would need to measure and tabulate several different gage blocks to develop a working curve. A micrometer is only as accurate as its basic moving element, the screw.
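If you do want a working curve, the bookkeeping is just nominal size versus reading at several points. A minimal sketch of that in Python follows; the block sizes and readings in it are made-up placeholders, not real data:

    # Sketch of a micrometer "working curve": error tabulated at several
    # gage block sizes. All numbers below are hypothetical examples.

    # nominal gage block stack (inches) -> micrometer reading (inches)
    readings = {
        1.0000: 1.0000,   # zero point, after adjusting the barrel
        1.1950: 1.1952,
        1.3900: 1.3899,
        1.5850: 1.5851,
        1.7800: 1.7803,
        2.0000: 2.0001,
    }

    print(f"{'nominal':<11}{'reading':<11}error")
    for nominal, measured in sorted(readings.items()):
        print(f"{nominal:<11.4f}{measured:<11.4f}{measured - nominal:+.4f}")

Once you have the table, you apply the error nearest the size you're measuring as a correction to the reading.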
There are two ways to adjust the common Starrett mics.
For small increments, use the provided spanner to rotate the barrel of the mic.
For large increments, take off the back screw or ratchet, press the thimble off, put it back where it belongs, hold the screw still (use the lock if it has one), and reassemble.
While you're at it, check the adjustment of the nut inside the thimble; you want the screw free-turning but without slop.
The biggest remaining problem you may have is lack of parallelism in the faces. This can be checked with optical flats, and it can also be lapped out.
Per J.R. Williams' point, you can adjust the mic to read exactly right at your most common measurement (e.g. 1.5") and cancel out a bit of screw error if that has developed.
I was just reading about this in "The Fundamentals of Dimensional Metrology". The author (Ted Busch) cautions against checking at every 1/4" in an attempt at being thorough. That checks the pitch variable but not the lead variable.
One recommended series of gage block intervals given is:
zero, 0.195, 0.390, 0.585, 0.780, 1.000
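Here's a quick way to see why those odd intervals matter, sketched in Python. It assumes the usual 40 TPI micrometer screw, i.e. 0.025" per revolution: quarter-inch steps are always a whole number of revolutions, so the thimble stops at the same angular position every time and error within a single turn of the screw stays invisible, while the 0.195" steps land somewhere new each time.

    # Assumes a standard 40 TPI micrometer screw: 0.025" per revolution.
    PITCH = 0.025

    def thimble_position(length):
        # Fraction of a turn (0.0-1.0) where the thimble stops.
        turns = length / PITCH
        return turns - int(turns)

    busch_series  = [0.000, 0.195, 0.390, 0.585, 0.780, 1.000]
    quarter_steps = [0.000, 0.250, 0.500, 0.750, 1.000]

    for label, series in (("0.195\" series:", busch_series),
                          ("1/4\" steps:   ", quarter_steps)):
        print(label, [f"{thimble_position(x):.2f}" for x in series])

The 0.195" series samples five different thimble positions (0.0, 0.8, 0.6, 0.4, 0.2 of a turn); the quarter-inch series samples only one.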
Frankly, all you SHOULD have to do is check zero and check for slop in the threads.
If you have to make up a correction table for the mic, then either:
1) you are using the wrong mic for the job, or
2) you need to trash the piece of junk and get a real one.
Any mic should read within a fraction of its smallest graduation over its whole range. If it does not (and you should measure some odd dimension every so often as a check), it is by definition unreliable.
If you are getting into interpolation, you are almost certainly going beyond the justifiable accuracy assumptions for the instrument. (It may be accurate there, but you shouldn't care.)
And if you need a table because your instrument is off by one or several marks in places, stop diddling with the POS as soon as you can.