Vernier Micrometer

This lit'l Starrett repeats purty good, but working in tenths ain't cheap.

If'n I needed to work in tenths or closer, it called for the 'Jo' blocks on the grade AA Starrett surface plate.

 
Precision tools can be a hobby in and of themselves.

I'm always surprised to see a comment like this on a forum with a heavy emphasis on collecting. The same could be said about folks who hunt down specific S&Ws in particular configurations, or spend hours arguing over the "best" powder.

Ben, that was called sarcasm.
 
Precision tools can be a hobby in and of themselves.

I'm always surprised to see a comment like this on a forum with a heavy emphasis on collecting. The same could be said about folks who hunt down specific S&Ws in particular configurations, or spend hours arguing over the "best" powder.

Back in the inter-war period, Mauser in Germany made precision measuring instruments. I understand they are quite valuable to such collectors if you happen to find one.
 
I guess you could do that, but it's a personal choice, not an industry standard.


Just out of curiosity, what's a better tool?


Really? This is not the standard the USAF uses. In metrology we use a 4:1 rule. That means a standard must be at least 4 times more accurate than the tool it is being used to calibrate. Your example is 100:1, but I'm sure it's a typo. 10:1 is a very difficult ratio to maintain.

I'm curious, what do you mean when you say discrimination?

A personal opinion based on many years of using, maintaining, and calibrating precision tools. I have seen plastic calipers with a digital readout capable of five decimal places, but that did not make the tool accurate.


Better tool: For one "tenth" (.0001) or less? A Mahr comparator, an air gage, or a Supermike come to mind - each has a discrimination and precision of 50 millionths (.00005) or less.


Yes, a typo - it should have been .0001. The Air Force is much more forgiving than many of our customers. The aerospace requirement flowed down from our customers is normally 10:1 as the preferred ratio, with 4:1 often being the bare minimum that can be used, and then we need to justify it. We are holding gear index, involute, and lead tolerances to 0.0004 and less. A normal tolerance for a bearing journal or bearing race is 0.0002 to 0.0005. Those are not typos; they are X number of ten-thousandths of an inch. The tools used to measure them must be capable of discriminating down to the nearest 10 to 50 millionths.
Also, we need to perform gage R&R (repeatability and reproducibility) studies to back up the use of a given gage. The results must show the combined gage and operator variation using no more than 20% of the tolerance.
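
For anyone curious, here is the arithmetic behind those two acceptance rules as a minimal Python sketch - a toy illustration with made-up numbers, not anything out of a calibration standard:

def tur(tool_accuracy, standard_accuracy):
    # Test uncertainty ratio: how many times finer the standard is than
    # the tool being calibrated (10:1 preferred, 4:1 minimum with justification).
    return tool_accuracy / standard_accuracy

def grr_percent_of_tolerance(grr_sigma, tolerance_band):
    # Combined gage + operator variation, taken as a 6-sigma spread,
    # expressed as a percent of the part tolerance (capped at 20% above).
    return 100.0 * (6.0 * grr_sigma) / tolerance_band

print(tur(0.0001, 0.00001))                        # 10.0 -> a 10:1 ratio
print(grr_percent_of_tolerance(0.000012, 0.0005))  # 14.4 -> passes the 20% rule

(The 6-sigma spread is one common convention for study variation; some shops use 5.15 sigma instead.)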

Discrimination and precision go hand in hand; they are the fineness of the scale on the tool. With digital readouts it can easily be out to 5 or more decimal places, but with an analog gage you need to be able to clearly see the lines for .00005, .0001, .0005, .001, etc. This is why, the finer the discrimination or precision, the larger the dial face or tool must be in order to clearly see the spacing between the lines.


In actually inspecting parts and making measurements: discrimination is the preciseness of the readout, precision is how close the readings are to each other, and accuracy is how close the readings are to the result wanted. I.e., precision: take 10 measurements, and the less variation between them, the greater the precision. How close the readings are to the actual size is the accuracy. You can be accurate but not precise, and precise but not accurate. The goal is to be both precise and accurate.
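
Here is that distinction as a toy numerical example in Python - ten made-up readings of a gauge block whose actual size is 0.5000", nothing from any real study:

import statistics

true_size = 0.5000
readings = [0.5003, 0.5002, 0.5003, 0.5004, 0.5003,
            0.5002, 0.5003, 0.5003, 0.5004, 0.5002]

spread = statistics.stdev(readings)           # precision: closeness to each other
bias = statistics.mean(readings) - true_size  # accuracy: closeness to the true size

print(spread)  # ~0.00007" - tight readings, so a precise gage
print(bias)    # +0.00029" - but consistently high, so not an accurate one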

Think of it as shooting at a target: a large group centered on the X-ring is accurate but not too precise, a small group in the upper corner of the target is accurate but not precise, while a small group placed exactly where you wanted it is both precise and accurate.

But all this is getting away from choosing a gage.
 
Think of it as shooting at a target: a large group centered on the X-ring is accurate but not too precise, a small group in the upper corner of the target is accurate but not precise, while a small group placed exactly where you wanted it is both precise and accurate.

This is the exact opposite of what I was taught and what I teach (I am an analytical chemist both by training and profession).

For shooting, getting all of the shots in the same place is precision. Getting them where you intended them to go is accuracy. PC textbooks use a dartboard analogy, but it's all the same.

In the "real world"-i.e. if I'm looking over a bunch of data-one of the things I look for is "repeatability", which is effectively the same as precision. In fact, if an instrument has a known but consistent error, we'll still use it and simply account for the inaccurate but precise measurement.

In graduate school, I spent a lot of time using one particular GC-MS and - being the good advisor that he was - my advisor would always ask me how much I trusted those numbers. Finally, I made up a sample of the compounds I was looking at and made 30 sequential injections into the instrument. The precision guaranteed by Agilent (the instrument manufacturer) was a relative standard deviation of 10%, and my 30 injections gave me a 4% RSD. That was enough to satisfy my advisor :) . BTW, once I'd established what I was looking for, my first run (in triplicate) of the day was always a standard of the three compounds I was interested in, both to verify retention time and mass accuracy from the mass spectrometer. All my samples were then run in triplicate, and I would run the standard again after the last sample.
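
For anyone who hasn't met the term, relative standard deviation is just the standard deviation expressed as a percent of the mean. A quick Python sketch, with made-up peak areas standing in for the 30 injections in the story above:

import statistics

areas = [1.02e6, 0.98e6, 1.01e6, 1.03e6, 0.99e6, 1.00e6]  # hypothetical values

rsd = 100.0 * statistics.stdev(areas) / statistics.mean(areas)
print(round(rsd, 1))  # ~1.9% for these toy numbers; the spec above was 10%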
 
Search Fleabay for Starrett and Brown & Sharpe micrometers and you can get excellent ones, actually USA made, for under $20. I own at least a dozen of them (all slightly different), and some are 60 - 80 years old and still new in their original wooden boxes.

Used micrometers are usually still just as accurate as when they were new, and unless someone actually went out of their way to ruin one, they are quite durable, adjustable & timeless.

Seems the younger generation prefers digital calipers to vernier micrometers, which is great for us! :D :D
 
Seems the younger generation prefers digital calipers to vernier micrometers, which is great for us! :D :D

28 here, and I much prefer analogue tools for serious work (although for a quick measurement I will grab a digital caliper).

I actually have some fairly interesting mics. One of mine is a German brand whose name I forget. In any case, though, it's an "automatic" dial bench mic that closes by spring pressure. I need to take some gauge blocks and measure its linearity, but I know it's at least repeatable.

I also have a US-made "Federal" spring mic that sits on my desk. It's a beautiful piece with a two-part enamel dial (to allow zeroing by turning the bezel) and repeats to .001".

In watchmaking, it's common that an interference-type gauge is the best tool for the job. I have an "Obama" branded (German made, so I suspect no relation to the better-known Obama family) hole gauge with a long, gently tapered, spring-loaded needle. It's simply pushed through the hole until the gauge bottoms out, and the hole size can be read directly off it. Also very useful is something that came in my Seitz jeweling kit. It's a metal plate with 40 Seitz jewels pressed into it and the corresponding ID of each jewel marked below it. When trying to correctly size a replacement jewel, you just take the part and go down the row until you find one that fits and allows the pivot to spin freely.
 
62 here, and I was always taught to rely on analog/mechanical or an actual readout for the most accurate readings. In my admittedly old-fashioned mind, digital is UNverifiable and you are 100% reliant on an LED or LCD readout and the condition of the batteries. With an actual mechanical Starrett or B&S there are no mistakes (assuming there isn't any inherent damage to the tool).

I am very happy to see a 28-year-old guy who actually has and uses mechanical instruments! Any vintage mechanical micrometer in good working order will last its owner a lifetime with simple basic maintenance and reasonable care. Digitals - - - who knows??
 
I'm just going to come out and say it. Some of you guys could use a hobby.
When you find a job doing something you enjoy, you never work a day in your life. Measurement is my hobby. It's not for everyone.

62 here, and I was always taught to rely on analog/mechanical or an actual readout for the most accurate readings.
I've heard this a lot, but it's not true. Digital is every bit as good or better than analog. It's just a different way of doing things. People trust what they're comfortable with.

In my admittedly old-fashioned mind, digital is UNverifiable and you are 100% reliant on an LED or LCD readout and the condition of the batteries. With an actual mechanical Starrett or B&S there are no mistakes (assuming there isn't any inherent damage to the tool).
This is a misconception. Digital instruments are designed so that battery condition is not a factor; if it's still lit up, it's still accurate. If the battery gets low enough to interfere with the measurement, the unit will just turn off.

The problem with any mechanical device is backlash. This is the slop seen in the gears. No matter how finely a device is made, there will always be some room for the gears to move. If there weren't, they would bind. A very high quality set of gears will have very little backlash, but it will still be there. This is where the digital device shines. They can make a very cheap digital caliper that has no backlash (we can discuss hysteresis another time). To get a really good dial caliper with very little backlash will cost a bundle.

You can look it up, but all the general calipers on the market today are +/-.001" unless you fork out some serious cash, or buy something for $5, which is just a waste of money.


 
Better tool: For one "tenth" (.0001) or less? A Mahr comparator, an air gage, or a Supermike come to mind - each has a discrimination and precision of 50 millionths (.00005) or less.
OK, sure, if you're going to compare it to a laboratory standard that weighs over 1K lbs and has to be maintained in a controlled environment, yes, there are a lot better tools. But I thought we were talking about hand-held devices.


The Air Force is much more forgiving than many of our customers.
No, but I can see we are talking about different magnitudes of measurement here. Where I work, we calibrate the tools you are talking about. I have calibrated many Mahr tools, such as height gauges, and what we called a "super mic", which was +/-20 microinches (.00002").

We also calibrate gauge blocks. This requires a set of standard gauge blocks and a comparator capable of resolving down to 0.5 microinches (.0000005"). So, yes, I'm familiar with making some very fine measurements.


Discrimination and precision go hand in hand; they are the fineness of the scale on the tool.
OK, I see how you're using discrimination. I would say resolution, but it means the same thing.

A standard digital caliper has 3 1/2 digits of resolution. That means it can read as fine as .001" but has an extra digit that only reads 5 or 0. As you've said, this doesn't negate the fact that its tolerance is still +/-.001".
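
In code terms, that half digit just means the readout quantizes to .0005" steps. A toy Python sketch with made-up raw readings:

def display(raw_inches, step=0.0005):
    # Snap a raw reading to the nearest half-thou, the way a 3 1/2 digit
    # readout does. A finer step on screen would not make the caliper any
    # more accurate than its +/-.001" tolerance.
    return round(round(raw_inches / step) * step, 4)

for raw in (0.1231, 0.1234, 0.1239):
    print(raw, "->", display(raw))  # 0.123, 0.1235, 0.124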


Here is how I describe accuracy versus precision:
[Image: precision_accuracy.png]


Note: this has almost nothing to do with discrimination/resolution.
 
"For shooting, getting all of the shots in the same place is precision. Getting them where you intended them to go is accuracy. PC textbooks use a dartboard analogy, but it's all the same."

I am always confounded by the number of those who think that firing a small group means "accuracy" when it actually means "precision." Accuracy simply means you can hit what you are aiming at, and it has little relationship to precision. Killing a deer standing still 50 yards away requires accuracy (i.e., your sights are correctly adjusted), but not much precision (a rifle capable of shooting 1 MOA is unnecessary - 10 MOA is plenty good enough). A sniper hitting his target at 1500 yards requires both accuracy AND precision. You can be accurate without being precise, and vice-versa. But what you really want is both. Every time this comes up on this forum there are those who will argue otherwise. So it ever is.
 