How to calibrate micrometers

Written by Greyson Ferguson

A micrometer is a device most often used by engineers to precisely measure machine parts and other industrial components. Although the micrometer originated as an analogue instrument, where you read its engraved markings to determine a measurement, many micrometers are now digital and can be calibrated with the aid of a computer and the software supplied with the device.

Things you need

  • Computer
  • Micrometer

  1. Open the micrometer program on your computer. Many different programs are in use, so the exact name varies. The program will be either on the desktop or accessible through "Start," then "All Programs."

  2. Select the "Adjust" option, followed by "Micrometer" and finally "Calibrate." A small calibration window opens in the middle of the screen, with an image of the micrometer scale beside it.

  3. Click the left-most measurement line on the micrometer scale, then click the right-most measurement line. The measured distance between the two clicks appears in the micrometer calibration window (a sketch of the calculation behind this step follows the list).

  4. Click "OK" and the line measurement is calibrated. This calibration becomes the default measurement standard every time you use the micrometer, until you recalibrate the device.

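Steps 3 and 4 amount to a two-point scale calibration: the software records the on-screen distance between your two clicks, maps it to the known span of the physical scale, and reuses that scale factor for every later measurement until you recalibrate. The Python sketch below illustrates the idea only; the names (Calibration, calibrate_two_point) are hypothetical and do not correspond to any particular micrometer program.

```python
from dataclasses import dataclass


@dataclass
class Calibration:
    """Maps an on-screen pixel distance to real-world micrometer units."""
    units_per_pixel: float

    def to_units(self, pixel_distance: float) -> float:
        """Convert a measured pixel distance into calibrated units (e.g. mm)."""
        return pixel_distance * self.units_per_pixel


def calibrate_two_point(left_click_x: float, right_click_x: float,
                        known_span_units: float) -> Calibration:
    """Derive a scale factor from two clicks on the left-most and right-most
    measurement lines of the on-screen scale, given the known physical
    distance between those lines."""
    pixel_span = abs(right_click_x - left_click_x)
    if pixel_span == 0:
        raise ValueError("The two calibration clicks must be on different lines.")
    return Calibration(units_per_pixel=known_span_units / pixel_span)


if __name__ == "__main__":
    # Example values: the two clicks are 500 pixels apart and the scale
    # is assumed to span 25 mm.
    cal = calibrate_two_point(left_click_x=120.0, right_click_x=620.0,
                              known_span_units=25.0)
    print(f"Scale factor: {cal.units_per_pixel:.4f} mm per pixel")
    print(f"A 240-pixel span corresponds to {cal.to_units(240):.2f} mm")
```

Recalibrating simply replaces the stored scale factor, which is why the value set in step 4 remains the default until the calibration routine is run again.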