Depth Micrometer Calibration
The purpose of this procedure is to provide general instructions for the calibration
of rod type depth micrometers.
This general procedure shall be used in the absence of any specific calibration
procedure for rod type depth micrometers.
Preliminary Instructions and Notes
- Read this entire procedure before beginning the calibration.
- Calibration shall be performed in an environment that conforms to Manufacturer Specifications.
- The depth micrometer will hereafter be referred to as the Instrument Under Test (IUT).
- Verify that the IUT is clean.
- Visually examine the IUT for any condition that could cause errors in the calibration.
- When adjusting the IUT, do not retract the spindle without first carefully stoning the
rod O.D. (if needed) to remove burrs or nicks that could damage the spindle bushing.
- Whenever it is necessary to disassemble the IUT for adjustment, work carefully and
cleanly to avoid damaging the threads.
- If any of the requirements cannot be met, refer to the applicable manufacturer manual.
- If a malfunction occurs or a defect is observed while calibration is in progress,
the calibration shall be discontinued and the necessary corrective action taken; if
the corrective action affects a measurement function that was previously calibrated,
that function shall be recalibrated before the remainder of the procedure is performed.
Applicable Manufacturer's Manuals or Brochures
The specifications of the IUT are determined by the applicable manufacturer's documentation.
If the manufacturer's documentation is not available, then the specifications identified
in this procedure are used.
- Micrometers with .0001" graduations shall be within .0001".
- Micrometers with .001" graduations shall be within .001".
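The two default tolerances above amount to a simple rule: the allowed error equals the graduation. A minimal sketch of that pass/fail check (function names and sample values are illustrative, not part of this procedure):

```python
# Default acceptance tolerance when no manufacturer documentation exists:
# the allowed error equals the graduation of the IUT.
def default_tolerance(graduation_in: float) -> float:
    """Return the acceptance tolerance (inches) for the given graduation."""
    if graduation_in not in (0.0001, 0.001):
        raise ValueError('this procedure covers .0001" and .001" graduations only')
    return graduation_in

def within_tolerance(nominal_in: float, reading_in: float, graduation_in: float) -> bool:
    """True if the IUT reading is within the default tolerance of nominal."""
    return abs(reading_in - nominal_in) <= default_tolerance(graduation_in)

print(within_tolerance(0.5000, 0.5001, 0.0001))  # deviation .0001" -> passes
print(within_tolerance(0.5000, 0.5002, 0.0001))  # deviation .0002" -> fails
```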
The Standards listed below should be selected on the basis of their higher accuracy
level when compared to the unit under test. Equivalent Standards must be equal to
or better than the Minimum-Use-Specification. The Minimum-Use-Specification for the
Standards listed is 1/4 the accuracy required of the IUT.
- Cleaning solution
- Hard Arkansas stone
- Lint free cloth
- Gage oil
- Gage block set
- Surface plate
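The 1/4 (4:1) requirement for the Standards can be sketched as a quick check; the function names are illustrative and the values below assume the .0001" default tolerance from this procedure:

```python
def minimum_use_spec(iut_tolerance_in: float) -> float:
    """Minimum-Use-Specification: 1/4 the accuracy required of the IUT."""
    return iut_tolerance_in / 4.0

def standard_acceptable(standard_accuracy_in: float, iut_tolerance_in: float) -> bool:
    """An equivalent Standard must be equal to or better than the MUS."""
    return standard_accuracy_in <= minimum_use_spec(iut_tolerance_in)

# A .0001"-graduation micrometer needs Standards accurate to .000025" or better;
# gage blocks at 0.00001" comfortably qualify.
print(standard_acceptable(0.00001, 0.0001))  # True
```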
- Clean contact faces with lint free cloth, dampened with cleaning solution.
- Clean exterior surfaces.
- Remove spindle assembly.
- Clean and oil spindle and measuring screw.
- Clean and oil fixed nut in barrel.
- Reassemble IUT.
- Check the measuring screw for wear by pushing the thimble to and fro in the direction
of the measuring screw axis. There should be no reciprocating movement. If necessary,
adjust for wear by tightening the fixed nut on the barrel to a smooth, tight (no shake)
fit for the full thread length. A smooth, tight fit must be achieved to pass this check.
- Place the IUT on a surface plate to verify the zero setting. Carefully rotate the ratchet
or friction stop to obtain a reading. Record this value on the equipment calibration record.
- Check accuracy with gage blocks having an accuracy not less than 0.00001 inch. The standards
chosen must test the micrometer not only at complete turns of the thimble scale
but also at intermediate positions. This is required as a check on the accuracy
of the scale around the thimble as well as the measuring screw.
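One way to choose nominal check points that exercise both the screw and the thimble scale is sketched below. It assumes the 0.025-inch-per-revolution screw common to inch micrometers, and the 0.011" intermediate offset is an arbitrary illustration — neither value is mandated by this procedure:

```python
PITCH_IN = 0.025  # one full thimble revolution on a typical inch micrometer (assumption)

def check_points(range_start_in: float, n_turns: int = 3) -> list[float]:
    """Nominal gage block stacks: complete turns plus intermediate thimble positions."""
    points = []
    for turn in range(1, n_turns + 1):
        points.append(range_start_in + turn * PITCH_IN)          # complete turn
        points.append(range_start_in + turn * PITCH_IN + 0.011)  # intermediate position
    return [round(p, 4) for p in points]

print(check_points(0.0))  # [0.025, 0.036, 0.05, 0.061, 0.075, 0.086]
```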
- Arrange two separate gage block combinations on the surface plate so that the surface
plate and the height of the blocks form a reference plane. Place one block combination
on each side of the spindle. Use block combinations to check the accuracy of the instrument
in at least three positions of the total thimble range (zero, midway, and full length).
Repeat for each rod size.
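The resulting check matrix — three positions per rod — can be tabulated ahead of time. A minimal sketch; the 1-inch rod ranges below are an assumed example, not a requirement of this procedure:

```python
def rod_test_matrix(rod_ranges_in: list[tuple[float, float]]) -> list[tuple[str, float]]:
    """Nominal heights at zero, midway, and full travel for each measuring rod."""
    matrix = []
    for lo, hi in rod_ranges_in:
        span = hi - lo
        for label, frac in (("zero", 0.0), ("midway", 0.5), ("full", 1.0)):
            matrix.append((f"{lo}-{hi} in rod, {label}", lo + frac * span))
    return matrix

# A typical set of interchangeable rods in 1" steps (assumption)
for entry in rod_test_matrix([(0.0, 1.0), (1.0, 2.0)]):
    print(entry)
```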
Any gage exceeding the specified tolerances at any time during calibration shall be
repaired and recalibrated, returned as-is for restricted use, or scrapped.
All repair work shall be performed by a qualified individual.
Reason for Reissue
Revision A - First Release