Vernier Micrometer

There are other specialized instruments based on the thimble, such as depth micrometers and tubing wall thickness micrometers. I made my own tubing wall thickness micrometer from a conventional micrometer for use in turning rifle cartridge necks to uniform thickness. It looks crude, but it works fine. The vernier feature is simply a way of accurately estimating the distance between two graduations using a secondary scale, and there are indeed both vernier micrometers and vernier calipers. Other instruments use vernier scales as well, such as the aforementioned surveyor's transit, where they are used to measure angles precisely, and marine sextants (not many of either are in use these days).
 
Tom S. said:
The way we calibrated or zeroed a mic: you clean the faces, close it, and if the zero marks don't line up, you loosen the barrel (aka thimble) and turn it so they do. Are you saying you can calibrate a mic outside of zeroing it?
The only correct answer to this is, it depends.

First we need to nail down some terminology. Unfortunately, the term "calibrate" is used differently by different people. In the USAF we used "calibrate" to mean: verify manufacturer's specifications by comparing against a standard. In most civilian parlance, calibrate adds this to the above statement: ...and adjust to nominal.

There are several aspects of a micrometer that need to be checked/adjusted to say that it was calibrated:
  • Flatness
  • Parallelism
  • Linearity
  • Repeatability
  • Accuracy
Not necessarily in that order.

Because of how a micrometer works, if the faces of the anvil and spindle are not flat and parallel, it won't be linear. If it's not repeatable, it won't be accurate.

I have seen some micrometers that are adjustable for zero and full scale, but they are rare. It's usually not worth the time to make the adjustment. So, to "calibrate" a micrometer, it is checked against a standard and if it doesn't pass, toss it.

If you have to adjust the zero of the micrometer after the initial adjustment, I would suspect the whole thing of being mishandled.

These are the tools needed to calibrate a micrometer:

Optical Flat


Gauge Blocks


I prefer the ceramic gauge blocks because they can be handled without gloves. But you should use gloves anyway to combat thermal expansion. No normal person is going to have a set of gauge blocks handy because they cost about $4K for a calibrated set.
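
For what it's worth, here is a minimal sketch of the pass/fail logic in Python, using made-up numbers rather than any actual spec: check the mic against gauge blocks at several points across its range and compare each error to the stated accuracy.

# Accuracy/linearity check against gauge blocks (illustrative numbers only)
tolerance = 0.0001  # assumed accuracy spec for a 0-1" micrometer, in inches

# (gauge block size, micrometer reading) pairs from a hypothetical check
checks = [
    (0.1000, 0.10005),
    (0.5000, 0.50008),
    (0.9000, 0.89996),
]

errors = [reading - block for block, reading in checks]
if all(abs(e) <= tolerance for e in errors):
    print("Passes: every point within tolerance")
else:
    print("Fails: out of tolerance, time to toss it")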
 
Now let's talk about scales.

This is a vernier scale:
[image: reading a metric vernier caliper]

It is read by first seeing where the zero of the small scale lines up relative to the large scale. In this case it falls between 2.8 and 2.9, which is easy to see. If no more precision is necessary, stop here. However, if you want to be more precise, the bottom scale is used. To do this, find the line on the bottom scale that most closely lines up with a line on the top scale. This is not so easy. In this drawing the red arrow is pointing to the 6.2 line, but we could argue about whether that's really the measurement. No one will argue that the closest major division is 6, and we could then call the measurement 2.86. If we needed more precision we could say it's 2.862, if we believe the red arrow.
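
If it helps to see the arithmetic spelled out, here is a minimal sketch in Python using the values from the drawing above (the 0.01-per-division vernier resolution is an assumption about this particular scale):

# Combining the two scales of a vernier caliper
main_scale = 2.8          # last main-scale graduation to the left of the vernier zero
vernier_divisions = 6.2   # vernier line that best aligns with a main-scale line
per_division = 0.01       # value of one vernier division, in the same units as the main scale

reading = main_scale + vernier_divisions * per_division
print(reading)            # 2.862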

Vernier scales are not easy to read when the highest level of precision is necessary. Therefore, I don't recommend them if you can avoid it.

On a micrometer it looks like this:
[image: vernier micrometer scales]


This is read by first reading the scale on the sleeve or barrel. It is read by noting where the bottom of the thimble is. If this is a 1" micrometer, the thimble is sitting between .275 and .3. Then the number on the thimble is added like this: .275+.023=.298. But the thimble scale isn't perfectly on a line either, so we have to add the next number, which is indicated by the lines around the barrel and where they most closely line up with a line on the thimble. This is a drawing and doesn't line up properly, but let's say the first line past zero is the one that lines up. That would make this reading: .275+.023+.0001, or .2981".
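
Again, for those who like to see the addition written out, here is a minimal sketch in Python with the numbers from the example above:

# Adding up a vernier micrometer reading
sleeve = 0.275        # largest sleeve/barrel graduation uncovered by the thimble (0.025" steps)
thimble = 0.023       # thimble graduation at the sleeve reference line (0.001" steps)
vernier = 1 * 0.0001  # vernier line on the sleeve that aligns with a thimble line (0.0001" steps)

reading = sleeve + thimble + vernier
print(f'{reading:.4f}"')  # 0.2981"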

Micrometers like this are a pain to use. First you have to be very gentle and consistent with how tightly you close it. Then you have to turn the thing around to make the reading. If you have to remove it from the device to make the reading, you risk moving the thimble and that ruins the reading.

Digital is better if only for saving your eyes.


A micrometer like this does not have a vernier scale:
[image: micrometer without a vernier scale]


This is a direct reading scale. You still have to add the thimble to the barrel reading. It has less precision, but is much easier to use. Digital is better.
 
There are vernier calipers, but none that are accurate to a tenth of a thousandth. And there are micrometers that are accurate to a tenth of a thousandth, if they are calibrated. But there is no such thing as a vernier micrometer.

Vernier refers to a specific type of marking on the scale of the instrument that makes reading an additional digit relatively easy.

Most are familiar with them on calipers, but they are common on micrometers and many other types of instruments with a mechanical scale. I have mics both with and without a vernier scale, and the ones with a vernier scale are definitely the easier ones to use.

I have a two-pan analytical balance as part of my instrument/glassware collection (I'm a chemist) with a vernier scale on the gold chain. It makes reading the last decimal place (.0001 g is a typical precision for this type of balance) very easy.
 

Out of curiosity, as I said I was taught "3 clicks on the clutch" for using a mic. It at least gives me repeatable results. I'm just curious, however, if this is the correct way to use one.

I'd appreciate an answer on this from someone who definitely seems to know what they're talking about.
 
If you insist on buying a used micrometer, look very closely at the spindle and anvil faces. If they look like this:
[photos: micrometer spindle and anvil faces with visible damage]

Find another micrometer.

The damage you can see on this micrometer is enough to make any measurement at the .0001" resolution useless. It might be good enough to use to .001", but for that the digital caliper is much easier to use.

Also, if you pick up a used micrometer and see that it was stored with the spindle touching the anvil, reject it. Storing it this way will promote corrosion and could mean they over-tightened it which will damage the faces.
 
Out of curiosity, as I said I was taught "3 clicks on the clutch" for using a mic.
Unfortunately, there is no standard for this. The addition of the clutch was brilliant. It allows for a very consistent amount of pressure to be used.

When I was calibrating these, I used one click. I would bring the spindle close and then gently move it until it touched. Then I would give it one click. This gave me the most repeatable readings.

I did do some experimentation along these lines though. I found that if I brought the anvil and spindle together gently, one click was enough and two or more, gentle, clicks didn't change the reading. However, if I brought the spindle and anvil together too quickly or with too much force, the reading would change based on how much extra pressure was exerted by the rate of closure.

What's more important is doing it the same way every time. By having a standard of 3 clicks, you've made for very consistent measurements, and that's a good thing.
 

Thanks, I'm glad to know that I'm at least not doing any harm. When I'm taking a measurement and know I'm getting close, I approach the first click with care and then slowly add the two additional clicks.

BTW, I use my bench mic wherever possible. Do my comments about thermal expansion of handheld ones have any merit?

For reference, my main use for mics is in watchmaking work, so both precise and accurate measurements (I use the word precise as a scientist, i.e. repeatable) are important.
 
BTW, I use my bench mic wherever possible. Do my comments about thermal expansion of handheld ones have any merit?
Absolutely! However, you have to take into consideration the level of precision necessary.

For most home measurements, a resolution of .001" is enough. At that level of resolution/precision, thermal expansion is not critical enough to make a noticeable difference for most people. When you increase your precision to the .0001" level, it matters more. If you hold the micrometer in your hand for 5 minutes, that's enough to change the reading by more than .0001". This is why we always use cotton gloves when handling devices like this. It reduces the incidence of corrosion from the oils on your hands and reduces the effect of thermal expansion/contraction on the measurement device and the device under test.
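
To put a rough number on it, here's a back-of-the-envelope sketch in Python; the expansion coefficient and temperature rise are assumed values for illustration only:

# Linear thermal expansion estimate: dL = alpha * L * dT
alpha_steel = 6.5e-6   # approximate expansion coefficient of steel, per degree F (assumed)
length_in = 1.0        # a 1" part or gauge block
delta_t_f = 17.0       # assumed warm-up from a 68 F room toward hand temperature

delta_length = alpha_steel * length_in * delta_t_f
print(f"{delta_length:.5f} in")  # about 0.00011 in, i.e. more than a tenth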

For watchmaking it is very important. The pieces are so small that even a little expansion from the heat of your hand can make a part not fit. If I were you, and it was possible, I'd hold the part under test with tweezers and only touch the clutch of the micrometer.

Most people don't have the luxury of working at a bench in a controlled environment though. It's not the end of the world to be outside in 90°F either. As long as all the parts are at the same temperature, the measurement should be good.
 
Rastoff: good and informative posts! I will say though that since our mics all had carbide tips, we'd get our rear end chewed out if we got them anywhere near a gauge block. We used gauge blocks to measure and set indicators with, but were told to never use one with a mic. They (our instructors) didn't mind us using verniers or the newfangled dial calipers on them though, because the surfaces weren't carbide and wouldn't mar the blocks. BTW: we called them Jo Blocks, as the manufacturer's name was Johansson (pronounced Yo-hanson).

Then there was an apprentice who blued up a gauge block and used it to set his surface grinder height by letting the wheel take off the bluing. :eek:
 
Thank you all for all the help and information! Much more than I thought I would receive! I guess then it will not hurt to use my Chinese mic as a gluing clamp! :D:D:D
 
For watchmaking it is very important. The pieces are so small that even a little expansion from the heat of your hand can make a part not fit. If I were you, and it was possible, I'd hold the part under test with tweezers and only touch the clutch of the micrometer.

Trust me when I say that "lesson 0" in watchmaking is how to use tweezers (it's not as straightforward as one would think), and once mastered, your tweezers do become an extension of your hand.

In fact, I have a pair of Dumont #2s that are significantly shorter than a new pair. They are my primary tweezers, and I dread the day when they pass the length where they can reasonably be sharpened.

It's unimaginable to me to place something in a mic with my hands. It may be with my tweezers (most often), it could be held in a piece of pithwood or Rodico (a type of watchmaker's sticky tack), or it might be a piece on the lathe.
 
I have always stopped at the first click. I've never seen anything which says how many clicks are appropriate. I have a metric micrometer without the little ratchet clutch on the spindle, so I just go until I feel a little resistance without forcing it.
 
I will say though that since our mics all had carbide tips, we'd get our rear end chewed out if we got them anywhere near a gauge block.
I understand they want to be careful with their gauge blocks, but this is unwarranted. How do you think we calibrated micrometers? Yep, even the carbide ones, which was most of what we saw, were used on the gauge blocks. Of course we spent a decent amount of time cleaning before anything touched the gauge block.

Then there was an apprentice who blued up a gauge block and used it to set his surface grinder height by letting the wheel take off the bluing. :eek:
Please tell me he was fired the same day. This guy clearly doesn't understand measurement.
 
I almost forgot to ask...
...verniers or the newfangled dial calipers...
...what do you mean when you say "verniers"?

As I stated in one of my earlier posts, how we use terms varies with industry and location. To my father-in-law this was a vernier:
[image: dial caliper]

Of course that is actually a dial caliper.

This is a vernier caliper:
[image: vernier caliper]


This is a digital caliper:
[image: digital caliper]



Hi. My name is Rastoff and I'm a metrology nerd.
Metrology is the science of weights and measures.
 
Please tell me he was fired the same day. This guy clearly doesn't understand measurement.

Fortunately for him, none of the supervisors saw him do it. His fellow apprentices ragged on him badly enough that he never thought about doing it again.

And what we referred to as verniers are in your second picture. We had them up to 6 feet long.
 
When using those digital calipers or micrometers, PLEASE keep in mind that just because the digital readout goes to 5, 6, or more decimal places, it does not mean that they are accurate to that many decimal places.

I have worked in quality assurance for over 40 years and have been in aerospace work for about 30. The digital tools are very nice and make it easy to read the measurement, BUT they do not change the basic precision possible from a given kind of tool. Unfortunately, enough people think they do and that the tool is accurate or precise to 6 or 7 decimal places because the readout says so.

A sliding caliper, whether it has a vernier scale, a dial, or a digital readout, is still about a .002 precision tool, and I prefer not to use them on tolerances less than .005.

A decent micrometer is capable of "tenths" (.0001) precision but there are even better tools when you need to measure that closely.

In quality assurance, the "Rule of ten" is used: the tool needs to be ten times more precise in its discrimination than the tolerance. For a .001 tolerance the tool must be "accurate" to .00001.

There is also a huge difference between Discrimination, Accuracy, and Precision. But that is an argument for another day
 
When using those digital calipers or micrometers, PLEASE keep in mind that just because the digital readout goes to 5, 6, or more decimal places, it does not mean that they are accurate to that many decimal places.
This is an accurate statement, but you can use them to their limit. Almost all digital calipers have an accuracy of +/-.001" and they can be used to that accuracy.

...a dial, or a digital readout, is still about a .002 precision tool, and I prefer not to use them on tolerances less than .005.
I guess you could do that, but it's a personal choice, not an industry standard.

A decent micrometer is capable of "tenths" (.0001) precision but there are even better tools when you need to measure that closely.
Just out of curiosity, what's a better tool?

For a .001 tolerance the tool must be "accurate" to .00001.
Really? This is not the standard the USAF uses. In metrology we use a 4:1 rule. That means a standard must be at least 4 times more accurate than the tool it is being used to calibrate. Your example is 100:1, but I'm sure it's a typo. 10:1 is a very difficult ratio to maintain.
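
As a quick illustration of the 4:1 idea (the numbers below are examples, not any official spec), in Python:

# 4:1 test accuracy ratio check
uut_accuracy = 0.0001         # micrometer under test, accurate to a tenth (assumed)
standard_accuracy = 0.000020  # accuracy of the gauge blocks used as the standard (assumed)

ratio = uut_accuracy / standard_accuracy
print(f"TAR = {ratio:.0f}:1 ->", "acceptable" if ratio >= 4 else "standard not good enough")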

There is also a huge difference between Discrimination, Accuracy, and Precision. But that is an argument for another day
I'm curious, what do you mean when you say discrimination?
 
I'm just going to come out and say it. Some of you guys could use a hobby.

Precision tools can be a hobby in and of themselves.

I'm always surprised to see a comment like this on a forum with a heavy emphasis on collecting. The same could be said about folks who hunt down specific S&Ws in particular configurations, or spend hours arguing over the "best" powder.
 
