View Full Version : Which Digital Calipers

12-09-2003, 03:15 PM
I need to replace my 6" digital calipers. I need good accuracy without breaking the bank. There are many choices, and most of the ones under consideration claim .001 accuracy. I know many here have a strong preference for Mitutoyo, but they make several similar models with a broad range in price. Any specific recommendations?

Al Messer
12-09-2003, 03:39 PM
Joel, Digital Calipers are a lot like women and sports teams: everybody has his own favorite. Wish I could REALLY advise you, but mine are an Oriental import that cost me the princely sum of $24.95. But, they are accurate enough for what I do.

12-09-2003, 03:52 PM
I love my 8" digital caliper I got from HF for $25 on sale. They are the stainless steel ones.
I also have a nice Mitutoyo 6" dial caliper that I used for the first time in a long while, and oh man, I forgot how nice and silky smooth it is.

12-09-2003, 04:11 PM
Mitutoyo's best digital calipers have an SPC output. This is one thing you probably don't need, and can save you $30 or so. I wouldn't buy their cheapest units (some have said less accuracy) but either a conventional digital or a water-resistant digital if that's a concern for your application.

12-09-2003, 04:14 PM
Thanks, I still have a good set of the inexpensive ones. I need something good to .001 or better, and I figured many here have had good luck with some particular calipers.
J&L currently has a Mitutoyo MT5-00652E on sale for $139, a B&S Dura-Cal IP65 (599-571-10S) for $110, and a Starrett 721A-6/150 for $122. None list accuracy in the catalog.

[This message has been edited by Joel (edited 12-09-2003).]

12-09-2003, 04:22 PM
The best digital calipers I own are Mitutoyo Absolute Digimatic. Read to 0.005". Cost $5.00 & new battery.
I use my Starrett dial calipers for most work. I just have never gotten used to using a digital when machining. Same with indicators.

12-09-2003, 04:38 PM
How about micrometers? Am I at a huge disadvantage for not owning one? On the axles I need to cut, I have been using my digital calipers, and some tolerances are close to .0005.

12-09-2003, 04:40 PM

According to the technical specs on the Mitutoyo website, the accuracy of nearly all their digital calipers is +-.001. Some of the line have readouts with a resolution of .0005 but still only +- one thou accuracy. That isn't good enough for me, which is why I still use dial calipers.

12-09-2003, 05:10 PM
A digital caliper that reads to 0.005" has an accuracy of 0.001".
You cannot expect better than 0.001" from dial calipers due to the mechanics of the rack & pinion drive and the inherent loss of accuracy and repeatability in the mechanism, not to mention the effect of parallax in taking the reading. No dial calipers that I know of spec better than 0.001" accuracy.

12-09-2003, 05:37 PM
BillH, to work to tight tolerances you need to use micrometers. Even cheap ones will work well for you compared to calipers.
Evan, when I am working to close tolerance, I use my mics. Often tolerances are not required to be that tight, and digitals are a great expedient. Years ago, I almost bought a Mitutoyo until I found out it was only accurate to .005. Glad I checked.
JC, my cheapies read to .0005 but are only good to maybe .003, good for roughing only. Do you think the Absolutes are superior to the Dura-Cals?

The B&S are looking pretty good. I can get them for $95 from the mfg. The Mitutoyo Absolutes are still right there though for under $100. Starrett doesn't give an accuracy on theirs, so I am ruling them out.

12-09-2003, 05:43 PM
Starrett are accurate to .001.



12-09-2003, 05:47 PM

I presume you mean .0005, not .005. Resolution and accuracy are not the same. Accuracy depends upon the quality and linearity of the encoder mechanism; resolution depends upon the display capability. Even if the display can show steps of .0005, if the accuracy is only +-.001 then the possible error range is .002. A dial caliper may not be specified as having accuracy better than .001, but the needle position can be interpolated to better than that, and the accuracy tested with traceable standards like gauge blocks. You can't interpolate a digital display. Also, all digital displays have an inherent least significant digit inaccuracy of +- one count. In the case of the Mitutoyo, since it can display .0005 steps, that means the one count error band is +-.0005, or .001 overall. That is display accuracy (resolution), not measuring accuracy.

One more thing; I often need to make things the same size. The absolute dimension isn't important. Under those conditions I can read the dial down to maybe .00025 or so. No can do with a digital.

[This message has been edited by Evan (edited 12-09-2003).]

12-09-2003, 06:02 PM
BTW, good quality ball bearings (ABEC-SP grade) make excellent calibration standards. The mean and maximum OD is specified as +.0000/-.0001" for bearings under 18mm and +.0000/-.00015" for larger ones up to 30mm. Even the lowest grades have decent specs, no worse than +.0000/-.0003. The standard rule of thumb in calibration is that the standard should be ten times more accurate than the tool being checked.
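That ten-to-one rule can be written down as a tiny check. This is a sketch only; the tolerances are the illustrative figures quoted above, so verify against the actual ABEC tables for your bearings:

```python
# Sketch of the 10:1 calibration rule of thumb mentioned above.
# Tolerance figures are illustrative, taken from the post, not from a spec sheet.

def is_adequate_standard(standard_tolerance: float, instrument_accuracy: float) -> bool:
    """True if the standard is at least 10x more accurate than the instrument."""
    return standard_tolerance <= instrument_accuracy / 10.0

# A precision bearing OD held to -0.0001" vs. a caliper accurate to +/-0.001"
print(is_adequate_standard(0.0001, 0.001))   # True: qualifies as a standard
# A loose-grade bearing at -0.0003" vs. the same caliper
print(is_adequate_standard(0.0003, 0.001))   # False: falls short of 10:1
```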

[This message has been edited by Evan (edited 12-09-2003).]

12-09-2003, 06:05 PM
Joel, I think any of the better Mitutoyo calipers are probably pretty good; only get the features you need. The Absolute retains the zero setting when shut off, which is nice but not necessary. SPC output also adds to the cost, as mentioned. I would avoid the plastic Mitutoyos. Travers has the 6" MyCal Absolute at $84.95 right now.
Evan, yes, I lost a decimal place; I meant 0.0005". It is impossible to accurately interpolate the scale on a dial instrument unless it incorporates a mirrored scale to eliminate parallax, as I stated. I do not know of any dial calipers with this feature. Just tilting the dial on my Starrett changes the reading by at least 1/3 of a division.

12-09-2003, 06:22 PM

"Starrett are accurate to .001. " Nope. They are accurate to +-.001. That is a two thou range so the reading is somewhere within a two thou band of possible error. That is completely unusable for me.

Paul Alciatore
12-09-2003, 07:04 PM
<font face="Verdana, Arial" size="2">Originally posted by JCHannum:
..... It is impossible to accurately interpolate the scale on a dial instrument unless it incorporates a mirrored scale to eliminate parallax as I stated. I do not know of any dial calipers with this feature. Just tilting the dial on my Starrett changes the reading at least 1/3 of a division. </font>

JC, I disagree on the REQUIREMENT for a mirrored scale in order to accurately interpolate between the lines. I will agree that it can be a big help and will increase the accuracy and the assurance of such a reading, but with due care, interpolated readings can be and are accurately made every day on instruments that lack this feature. The mirror is there to assure that you are looking at the scale from directly above, to eliminate parallax. If you pay attention to your position, you can do almost as well without it. Just don't "tilt the scale". But the mirror absolutely does help. Without a mirror you can read to 1/4 or 1/5 of a division; with it, perhaps to 1/8 or 1/10, and with more confidence.

Another point here is that each of your eyes views the scale from a slightly different direction so you should use only one eye (your best one) for the absolute best reading, mirror or no mirror.

One good point about dial readouts vs. digital ones is the ability to discern small changes with a dial that would not change a digital display. In electronic setups it is often desired to adjust something for a maximum or minimum reading. This is far, far easier to do with an old-fashioned meter that uses a moving pointer than with a digital meter that may be far more accurate. You can see small movements of the needle that are way below the meter's stated accuracy and recognize their direction almost instantly. With a digital meter, you must change things by at least one count and then stop and think to determine the direction of that change. Many high quality digital meters have an old-fashioned mechanical meter built in for precisely this reason. It may not be "accurate" but it wiggles better. Cheaper ones have digital bar-graph scales that are OK, but not as useful.

On the 0.0005" steps built into most digital calipers, the reason that step is chosen is the basic 0.001" accuracy. By choosing a least count that is half the accuracy, it sort of takes the +/- one count error out of the equation. But I said "sort of" because if it reads x.xxx5 it can still be either the thousandth shown or the next one.

Back to the original question: I feel that the real problem with calipers is the fit of the movable jaw to the main body. I have an 8" dial import and the jaw can cock (rotate) enough to change the reading by 0.001", and yes, I've played with the adjustment screws. I have a 30 year old German made vernier caliper that is a lot tighter; Germans vs. the Chinese. But then it's hard to read to 0.001" without a magnifier, so I rarely use it. I think this is why calipers are only accurate to 0.001", and by avoiding this kind of error, mikes are just basically the better instrument. With a mike, the measurement is made directly on the axis of the thread, so there is no basic reason for it to rotate out of alignment.

Paul A.

12-09-2003, 08:06 PM
I believe Mitutoyo has the market on calipers (not mics, but calipers). I have used and owned a few other name brands, and I still love my Mitutoyos better than any of them...

Ask 10 different people, and get 10 different answers..


12-09-2003, 08:53 PM
I have Mitutoyo 6" digimatics and Starrett 721 full function with output. The Mitutoyo blows away the Starrett. It is absolute so you don't lose zero every time you turn them on. It has a thicker beam. Readings are stable even while driving a tank over them (well, almost). It has a better design for the enclosure and buttons and does not have a cheap stick on overlay like the Starrett.

Best part ... the Mit's are less than $100 while the Starretts were almost $270 (carbide face, full func., outputs).

I like Starrett for most things but not these.


12-09-2003, 10:59 PM
Whether it is dial, vernier, or digital, no calipers made have an accuracy better than +/-.001". If you want precision, you should use a mike.

Al Messer
12-09-2003, 11:09 PM
Precision is determined not only by how well you can see, but also how well (sensitive) you can feel.

12-09-2003, 11:32 PM
I run Mitutoyo Absolutes at work, 8" to be exact. They both read out to and hold .0002", so unless something has changed they are plenty accurate. Mine are over six years old, and other than batteries and just being ugly, they work fine.

Evan is right on the ball races being used for calibration. The two pairs I have hit dead on against my 2" Starrett mic and my hole mics.

Also, I have noticed one key advantage of the digitals: dust does not affect them. I used to go through a dial caliper a year before I got mine.

As I remember (it was years ago now), I paid $165.00 for mine from KBC.

12-10-2003, 12:01 AM
Weird, what model Mitutoyo do you have? The latest catalog I have, 2000A, shows no digital caliper with resolution finer than 0.0005" or accuracy better than +-0.001".
Interestingly enough, Starrett lists no +- accuracy range for their dial calipers. They state they read to 0.001". They offer a discrete model number with a standard letter of certification. My price list is not new enough to show how much this paper costs. There is no mention of accuracy with or without the paper. Their digital calipers are +-0.001" accuracy.
Mitutoyo includes a letter of certification and states accuracy of +-0.001" for both the dial and digital calipers.
All calipers lose accuracy as size increases; a 12" is +-0.0015".
Anybody notice the letters of certification that come with the Chinese stuff? Fancy light green print, important looking document with curlicues on the border? They basically say this instrument is certified to be a dial caliper or some such wording. Nice to know they stand behind that.

12-10-2003, 02:30 AM
I have checked my calipers and digital mics with certified ceramic gauge blocks; all of my digital Mitutoyo calipers and mics are dead nuts accurate. The digital mics that read to 50 millionths properly round up/down when the proper stack is selected.

It should be noted here that it is easy for a user to get inconsistent readings with a mic with either the friction or ratchet thimble; technique is very important for consistent results.

I was quite impressed with the accuracy of the calipers as well, and noted that excessive pressure greatly affected accuracy on a known standard (certified gauge blocks).

My other analog Mitutoyo mics, reading to .0001", are "close enough" not to worry about (within +/-.00005, given the uncertain nature of vernier readings; essentially dead on).

With dial calipers, because of the nature of the tool, readings to .001" +/-.0005" are easily achieved by noting the position of the needle between clock ticks. To expect more is pushing your luck.

What I was not able to check was anvil alignment and parallelism, as I do not have the 1/20-wave 1/4-turn optical flats (set of 4).

Technique being so important is most likely why they underrate the tools; when they are misused and abused they still come in under the expected accuracy and no one gets sued. But I could be wrong.

12-10-2003, 03:32 AM
There seems to be a basic lack of understanding of how a digital display works. A digital measuring device cannot present information beyond the limits of the display. In the case of the Mitutoyo digital calipers, and all others I have found, the best is a least significant digit of .0005. This is most likely because the available chips for making such devices have that as a limit.

The least significant digit of a digital display always has an inherent uncertainty of plus or minus one count. It is impossible for the display to give a value more accurate than this. If the least significant value that can be displayed is .0005, then it may display .0010 or .0000 or .0005 for almost the same measurement, one that varies by only .00051. If the display is on the verge of switching from .0005 to .0010, the next .00001 may be what does it. That means it indicates what is an actual change of .00001 as a change of .0005.
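The plus-or-minus one count point above can be sketched in a few lines. This is a toy model of any display with .0005" steps, not the actual caliper electronics:

```python
# Toy model of the +/- one count behavior described above; a sketch of any
# 0.0005"-step readout, not Mitutoyo's actual internals.

STEP = 0.0005  # least significant digit the display can show, in inches

def display(true_size: float) -> float:
    """Quantize a true measurement to the nearest displayable step."""
    return round(true_size / STEP) * STEP

# Two true sizes differing by only .000002" straddle a switching point,
# so the displayed values jump a full half-thou apart.
print(display(0.000249))   # quantizes to 0.0
print(display(0.000251))   # quantizes to 0.0005
```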


The link above from Chipeater shows a page that does give the accuracy of the Starrett digital calipers as +-.001, an error band of .002. That means the caliper might display .1000 when it should read .1015, for an actual distance of .10159999.

Starrett does make a Master Vernier Dial Caliper that indicates in graduations of .0001 on the dial. It has a display resolution of .0001. The accuracy is equivalent to the calibration standard used to calibrate it.

There are various sources of error in such instruments. As I said before, accuracy and resolution are two different things; they are not directly related. Accuracy is the ability of the instrument to correctly measure the absolute distance. Then the display must correctly show that. Two different jobs. That is why Mitutoyo gives two separate and different specs, one for resolution and one for accuracy.

A good example of this is a plain old ruler. Imagine a ruler with marks showing only the inches. No smaller subdivisions. However, the inch marks are placed with an accuracy of +-.001. This ruler then has an accuracy of +-.001 inch but it has a resolution of 1 inch. As you can see, the resolution and the accuracy are not directly related.

There are many sources of inaccuracy. Worse yet, many of these are non-linear, and most are periodic in nature. This means that the accuracy of a measuring device changes depending on the length being measured. Some of these changes are due to temperature, but periodic errors are not. In a purely mechanical device such as a dial caliper they are due to minute variations in the rotating components. No gear is perfectly round. No hole is in the exact center. No bearing is perfect. Keep in mind that most parts are made by rotating devices. It is the nature of matter that exactness of material is impossible; we can only come close. Periodic errors are of great concern in the devices I build. They show up as tracking inaccuracies with a definite time period. The double arm drive I built has no detectable periodic error, but it must have some; I just can't measure it. It has an accumulated inherent tracking error of less than one arc second over two hours. That is an accuracy of approx. +-.00007.

[This message has been edited by Evan (edited 12-10-2003).]

12-10-2003, 12:06 PM
I have not said that a digital caliper will be more accurate than 0.001". They read to 0.0005" resolution. A dial caliper is really no different; the best resolution to be expected is 1/2 division, or 0.0005".
Mitutoyo and Starrett both make electronic digital micrometers with a resolution of 0.00005" and stated accuracy of +-0.0001".
I did say the Starrett catalog shows digital accuracy to be +-0.001"; they make no statement regarding dial caliper +- accuracy.
In the introductory column on slide calipers in the catalog, Starrett says: "The best digital and dial slide calipers, regardless of resolution, are accurate to within 0.001" or 0.03mm, every 6" or 150mm. The best vernier calipers are accurate to 0.0005" or 0.013mm per foot or 300mm." Within 0.001" means +-0.001". I would think it safe to assume they are knowledgeable in this area.
The Special Master Dial Indicator Vernier Caliper you mention is a special order item. The dimension is set with the vernier to 0.001" and the slide is locked. Comparative +- variations to 0.0001" are then read on the dial indicator, much like a bench mic or over/under micrometer. Its range starts at 2", price on application. A calibration master accurate to 0.000050" is available at extra charge for checking the relationship between the vernier and dial indicator zero. Hardly likely any of us will have that puppy lying around in our shop.
As far as the mirror is concerned, I should have said that it is impossible to read accurately to 0.001" without a mirror, as that probably is closer to the truth, and that is why mirrored scales are used where the most accurate reading is required.
This whole thing is academic actually, as some of the finest machining done was done using spring and solid joint calipers. These guys developed the feel and techniques needed to use these tools, and in their hands amazing work was done. We have much more precise methods readily available today, but unless the proper techniques of feel and application are developed and observed, they are worthless.

Rich Carlstedt
12-10-2003, 05:50 PM
Wow, what a discussion!
Here is my 2 cents...
I have a Starrett model 722 digital caliper.
I guess it is no longer made?
It reads to .0001 (NOT .001).
I used it at work where we held tolerances of .0002 for high pressure seals (10K PSI+).
I always compared readings with a mike, and it was/is dead nuts to the mike!

Two things, however. It is not a C-clamp.
Most caliper users measure incorrectly, and no pictures ever show you how to do it right!
Holding the bar is wrong: your body heat is transferred to the scale, whether it is a dial or digital caliper, and error will result.
Hold the head with the right thumb and forefinger.
Use the left forefinger and thumb to squeeze the jaws closed on the work piece.
The thumb and finger MUST be directly in line with the measured points, not below or above, and "feel" the jaws to make sure squareness is maintained.
Only then will you come close to measuring such small tolerances accurately and repeatably.

For my money, the 722 is an outstanding digital, and it is American made!
Keeps precision men on the job!

12-10-2003, 06:03 PM

It appears it isn't made any more. Now that would be of interest to me. Unfortunately I haven't been able to find anyone with a digital caliper that measures and reads to .0001. I'm not against digital at all; the currently available models just aren't good enough.

12-11-2003, 12:00 AM
Whoops! My bad, I was thinking mic and typing caliper. Anyway, the ones I have are the Digimatic model and do measure to +/-.0005, which is good enough for 99.995% of what you're apt to run into. Any tighter tolerances than that won't have much bearing in the real world, since most projects won't spend much time in a climate controlled, vibration proof room.
I have found from experience that when a bearing mfg supplies a tolerance of +/-.0002 it means grab the Loctite! http://bbs.homeshopmachinist.net//biggrin.gif
<font face="Verdana, Arial" size="2">Originally posted by JCHannum:
Weird, what model Mitutoyo do you have? The latest catalog I have, 2000A, shows no digital caliper with resolution finer than 0.0005" or accuracy better than +-0.001".
Interestingly enough, Starrett lists no +- accuracy range for their dial calipers. They state they read to 0.001". They offer a discrete model number with a standard letter of certification. My price list is not new enough to show how much this paper costs. There is no mention of accuracy with or without the paper. Their digital calipers are +-0.001" accuracy.
Mitutoyo includes a letter of certification and states accuracy of +-0.001" for both the dial and digital calipers.
All calipers lose accuracy as size increases; a 12" is +-0.0015".
Anybody notice the letters of certification that come with the Chinese stuff? Fancy light green print, important looking document with curlicues on the border? They basically say this instrument is certified to be a dial caliper or some such wording. Nice to know they stand behind that. </font>

12-11-2003, 12:04 AM
BTW: The digitals I have agree at all three points, meaning the inside, outside and depth rod all read .0000. Very few dials will do that, as well as having the inside and outside be dead on.

No sir, they are indeed worth the money. I will never go back!

12-11-2003, 12:23 AM
My cheap digital calipers, I think, are dead nuts accurate too.
If you extend them all the way out, run them back in, back out, in and out, in and out, they always come back to .000. Now, if they were not very accurate, wouldn't they not zero back in?
If you ask me, I'm willing to bet that the Harbor Freight stainless digital calipers are made in the same exact factory as the Mitutoyos, and my cheap ones even have the data output on them.
Should get one just for kicks and compare them.

12-11-2003, 01:07 AM
The fact that it returns to zero says nothing about accuracy. It just means it doesn't skip any counts each way. The question is: do the counts occur in the right places?

As an example, suppose you have a blank rule and scribe 100 marks on it by eyeball, 10 per inch, with no other help and no ruler to compare against. If you count from one end, starting at zero, all the way to the other end at 99 and then back, you will always end up back at zero if you don't skip any. Does that mean it is perfectly accurate?
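That eyeballed-ruler example can be sketched in code. This is a toy model (the mark placement errors are invented); it only shows that a counter returning to zero proves repeatability, not accuracy:

```python
import random

# Toy model of the eyeballed ruler: a counter that never skips counts always
# returns to zero, yet the individual marks can be badly misplaced.

random.seed(1)
# 100 marks meant to be 0.1" apart, each misplaced by up to +/-0.02" (invented)
marks = [i * 0.1 + random.uniform(-0.02, 0.02) for i in range(100)]
marks[0] = 0.0  # zero is defined as the starting mark

# Count to the far end and back without skipping any marks...
count = 0
for _ in marks[1:]:
    count += 1
for _ in marks[1:]:
    count -= 1
print(count)  # always lands back on 0

# ...but the marks themselves are still inaccurate.
worst = max(abs(m - i * 0.1) for i, m in enumerate(marks))
print(worst > 0.001)  # worst placement error is nowhere near zero
```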

[This message has been edited by Evan (edited 12-11-2003).]

12-11-2003, 01:33 AM

I believe you misunderstood what I was trying to say; let me explain further. Calipers and mics all employ round up/down in the display of the numbers. This is what I meant in stating the display changes correctly with the proper stack.

If you have a .2505" stack the tool reads it; if you employ a .2506" stack it rounds to .2510", and if you change the stack to .2504" it drops to .2500", on a caliper that displays .0005" steps. This is to be expected and within the tolerance of the tool.

The tools I tested have greater accuracy than expected over their entire range. However, the larger the tool, the greater the thermal effect as well as flexing, so the accuracy of larger tools becomes less certain as distance increases.

Thermal effects are dramatic. With an electronic gauge and amplifier on a gauge stack, cupping your hands around the stack (but not touching it) will slowly increase its size by several millionths with ceramic blocks, much faster with steel. Even breathing on the stack affects it.

12-11-2003, 01:39 AM
Hmm, which raises the question: is it even plausible to work to .0001 tolerances if the room temperature will mess you up?

12-11-2003, 01:39 AM

Yes, I know what you mean. They do that by using an extra least significant digit that is not displayed. Standard practice in binary math that must be converted to decimal.

12-11-2003, 01:45 AM

It is possible as long as everything is at the same temperature. Holding one of my Starrett inside mics in my hand for 15 seconds increases the length by about .0003. It really gets interesting when checking the highest quality gauge blocks. They are actually made slightly undersize, as they are only used when wrung to another surface, and the thickness of the wringing film is included in the dimension of the block. That film is very repeatable and is around 10 nanometers. When they are tested in an interferometer they are assembled and then left to thermally equalize overnight before testing.

12-11-2003, 03:44 AM

Further to your point, it matters what size of work you are making. I referred before in another thread to working on aircraft sheet metal, which is directly related to the subject at hand. I would be laying out a drilling pattern on an eight foot long aluminum ferry tank for a helicopter, and someone would open the hangar door and let in a blast of cold winter air. The tolerance over eight feet was around .015. Change the temperature of the hangar by 25 degrees and you might as well go for lunch while it all warms back up. Start laying out on the now cold metal and you have a large piece of scrap.

However, if you are making a part in the lathe that needs to be .250 diameter, then a few degrees difference makes very little difference in size.
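The two cases above are just linear expansion, dL = alpha * L * dT. A quick sketch with typical handbook expansion coefficients (approximate values; treat them as assumptions, not spec figures):

```python
# Rough numbers behind the two cases above. Coefficients are typical
# handbook values in in/in/degF; they vary by alloy.

ALPHA_ALUMINUM = 13e-6   # aluminum, approximate
ALPHA_STEEL = 6.3e-6     # carbon steel, approximate

def growth(length_in: float, alpha: float, delta_t_f: float) -> float:
    """Linear thermal expansion: dL = alpha * L * dT."""
    return alpha * length_in * delta_t_f

# An 8 ft (96") aluminum layout cooled 25 degF moves far more than .015"
print(round(growth(96.0, ALPHA_ALUMINUM, 25.0), 4))   # about 0.031"

# A .250" steel part a few degrees off barely moves at all
print(growth(0.250, ALPHA_STEEL, 5.0) < 0.0001)       # well under a ten-thousandth
```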

12-11-2003, 08:25 AM
If you are not buying Starrett, you are just wasting your money, IMHO.

12-11-2003, 09:47 PM
<font face="Verdana, Arial" size="2">Originally posted by wierdscience:
BTW: The digitals I have agree at all three points, meaning the inside, outside and depth rod all read .0000. Very few dials will do that, as well as having the inside and outside be dead on.

No sir, they are indeed worth the money. I will never go back!</font>
What I meant was that all three points agree regardless of the decimal in question: a 1.00" measurement is 1.00" on the inside, outside and depth rod. Usually the depth rod is a few thou plus of .000", and the inside/outside aren't exact in relationship either and must be calibrated before use.

Rich Carlstedt
12-11-2003, 10:07 PM
As usual, Thrud is right on!

Temperature control is mandatory for any work when you get down to .001 or less.

When turning 60" 4140 round die forgings to such tolerances without coolant, you had better know what you are doing!
But you do NOT need a temperature controlled environment.
Two things: your mikes should be at the same temperature as the part, and you should always have a standard to verify against.
I had our 60" King Vertical Turret Lathe's chuck miked (and Pi taped) exactly and the dimension engraved on the chuck OD.
This way the machinist could mike the part and the chuck, and the chuck, being at the same temperature as the part, would confirm where he was with the part. I was also one of those bosses who told my guy to go take a coffee break while the Pi tape warmed up to the part temperature.
On horizontal lathes, we laid the mike on top of the part, covered it with a shop towel(s), and got coffee till the mike and part matched. Works great.
We engraved all the chucks in the shop, by the way.

Chief is also right... Starrett is the only way.

12-12-2003, 12:50 AM
Kinda sounds like what I said...

12-12-2003, 01:07 AM
Well, I'm just going to use my calipers, and whatever they say, I will go with. If my wheels have a loose fit on the axles, I will peen them a little bit. No sense in getting worked up about temperature differentials and metal expansion.

12-12-2003, 03:53 AM
A customer and friend of mine in town here is a live steamer. He knows a guy who spent several years building a steam loco with the utmost care and precision. It ran like a Swiss watch on compressed air. The day came when he steamed it up for the first time. The heat expanded the parts so that some seized and others about fell off. You need to pay attention to the coefficient of expansion of the different metals.

12-12-2003, 11:36 AM
Bill, if you work carefully to the plan dimensions, your calipers will probably be sufficient for most, if not all, of what you will need. You need not worry about a final dimension as a number; strive for the best fit. It does not need a number attached.
As far as manufacturer, I think Starrett and Mitutoyo are comparable in most respects. I prefer Mitutoyo calipers for a couple of reasons:
The digitals seem more solid and stoutly built. I think someone else mentioned this. It is purely tactile and probably has nothing to do with ultimate accuracy.
The Mitutoyo dial calipers seem to do a better job of covering the rack to keep chips and junk from getting in and jamming the pinion.
Too bad, but it seems Starrett has gone to China for dial and digital calipers.

Don Clement
12-12-2003, 12:12 PM
I own a Starrett 722 caliper that has a resolution of 0.0001". The accuracy, however, is only +-0.001", the same as most all calipers! Since the caliper's line of measurement is not the same as what is being measured, a caliper suffers from Abbe error. A micrometer does not have Abbe error because the measurement is taken along the same axis.
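The Abbe error point can be put into rough numbers: the error is approximately the offset between the scale and the line of measurement times the tangent of the jaw tilt. A sketch only; the 0.75" offset and 0.05 degree tilt below are invented for illustration, not Starrett specs:

```python
import math

# Back-of-envelope Abbe error: the caliper's scale is offset from the line
# of measurement, so any jaw tilt maps into a length error.
# Offset and tilt values are invented for illustration.

def abbe_error(offset_in: float, tilt_deg: float) -> float:
    """Approximate Abbe error = offset * tan(tilt)."""
    return offset_in * math.tan(math.radians(tilt_deg))

# A 0.75" offset between jaws and scale, with a 0.05 degree jaw cock
print(round(abbe_error(0.75, 0.05), 5))   # about 0.00065", most of a thou

# A micrometer measures on its own axis: offset ~0, so the error vanishes
print(abbe_error(0.0, 0.05))              # 0.0
```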
The Starrett 722 has a major flaw: it eats its two 386 batteries every few months.

Don Clement
Running Springs, California

[This message has been edited by Don Clement (edited 12-12-2003).]


Ragarsed Raglan
12-12-2003, 02:56 PM
At the end of the day the caliper is a gauging tool, nothing more, nothing less. Operator error can easily double the built-in error of a caliper. I've seen idiots use calipers like an adjustable wrench: screw down the lock wheel so tight it's almost impossible to loosen without resorting to pliers. Not to mention my favourite gripe, when others persist in re-zeroing the scale all the time. A tool is only as good as the nut that holds it (like the joke that the unsafest part of an auto is the nut that holds the steering wheel!)

A micrometer is the absolute best way to measure a component dimension accurately. But then again, don't let me catch you using the ratchet or friction drive! FEEL it. When I started in engineering as an apprentice, I was made to measure everything with a spring calliper. Reason? It teaches you the feel! Then I got to use a mike. And only after using a mike was I allowed to use a vernier calliper!! (This was to teach me how easy it is to use a mike.)

I had this discussion the other day about inspection departments' tendency these days to rely on digital height gauges. Again, I was always taught that the only sure way to measure anything accurately in a standards room was to use a dial clock comparator set against a gauge block stack. I believe this is still the way to achieve 100% (or as near as possible) accuracy.

I'm not ruling out the use of DC's - only pointing out the error of our ways (pun intended)!


12-12-2003, 09:22 PM
holy hell boys.....what're ya makin' ???

lets not lose sight of the fact that engineering's not about precision....its about being "close enough"....

take care & work safe!!

12-12-2003, 09:37 PM
<font face="Verdana, Arial" size="2">Originally posted by chkz:
holy hell boys.....what're ya makin' ???

lets not lose sight of the fact that engineering's not about precision....its about being "close enough"....

take care & work safe!!
chris </font>

Well... It does depend on what you're making. Close enough had to be very close for this one.


12-12-2003, 10:22 PM
Holy cow, 1 part per million? I bet you made it on your South Bend lathe too!

Don Clement
12-12-2003, 11:03 PM
As long as we are talking telescope drives... here is an extremely precise and rigid telescope drive I made about 10 years ago to hold a small 8" Mak-Cass scope. http://mysite.verizon.net/res0owmd/id4.html
It might be noted that the precision of this drive comes not from highly precise machined surfaces, but from the averaging of multiple cables with lower precision machining. Also, dirt contamination or surface irregularities on the machined surfaces do not affect drive performance.
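The averaging idea can be illustrated with a toy simulation. The error model below is invented (independent random rate errors per cable); it only shows the generic 1/sqrt(N) averaging effect, not the behavior of this particular drive:

```python
import random
import statistics

# Toy illustration of error averaging: several independent cables, each with
# its own small rate error, average toward the true rate. Error magnitudes
# here are arbitrary units, invented for illustration.

random.seed(42)

def averaged_error(n_cables: int) -> float:
    """Average the rate errors of n independent cables; return the residual."""
    errors = [random.gauss(0.0, 1.0) for _ in range(n_cables)]
    return statistics.fmean(errors)

# Spread of the residual error with one cable vs. six cables
one = statistics.pstdev([averaged_error(1) for _ in range(2000)])
six = statistics.pstdev([averaged_error(6) for _ in range(2000)])
print(one / six > 1.5)   # six cables shrink the spread by roughly sqrt(6)
```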

Don Clement
Running Springs, California

[This message has been edited by Don Clement (edited 12-12-2003).]

12-13-2003, 12:34 AM
wow Evan....very nice work !!


12-13-2003, 01:04 AM
I have been blessed with many Starrett and Mitutoyo measuring instruments. A machinist left them to my brother when he retired to Florida, and my brother gave them to me. Some of the tools I was not sure what they were, so I looked them up. The only thing I am missing is a good micrometer stand, and I will probably make that.

I do have a set of HF 6 inch digital calipers, got them on sale; they work, and they are reasonably accurate.


12-13-2003, 02:40 AM

That is a beautiful drive, Don. Have you ever tried to do it with stainless foil ribbons instead of the cable? The ribbon drive was used in many older hard drives as a zero backlash head positioner before linear voice coil motors were the norm.

[This message has been edited by Thrud (edited 12-13-2003).]

12-13-2003, 02:43 AM
Very nice Don,

I have considered doing something like that, but with a metal tape instead of cable. The old stepper disk drive head mechanisms used a metal tape drive. Absolutely no stretch.

Hmm... Thrud, post overlap http://bbs.homeshopmachinist.net//biggrin.gif Great minds think alike.

[This message has been edited by Evan (edited 12-13-2003).]

Don Clement
12-13-2003, 11:51 AM
I have built drives using thin metal bands. The only advantage that I see of an endless metal band over a cable drive is that the cable drive has limited travel (though not necessarily < 360°). Even with idlers, a band drive will require up to several inches of unsupported free length. The cable drive shown at http://mysite.verizon.net/res0owmd/id4.html has 0.05" of unsupported free cable length, with stiffness approaching that of a solid. The cable drive requires almost no cable tension for extreme system stiffness; typical endless band drives need greater than 800 pounds of band tension for acceptable system stiffness! Each cable is independent, allowing for multiple cables. Multiple cables allow the drive rate to be averaged, and therefore better drive performance with less precise machining.
I am presently working on a direct drive... no bands, no cables, no gears.

Don Clement
Running Springs, California