Measuring with spring calipers?

  • Measuring with spring calipers?

    I'm new to machining and still learning about how the basic tools work and measurements are taken. I'm curious about "spring calipers". I read that spring calipers were used for precise measurements before micrometers. How do you get an accurate reading from a tool that doesn't have a scale built into it?

  • #2
    You take the measurement by feel in places that aren't accessible with calipers or mikes, then quantify that setting by measuring across the spring calipers with those tools.

    Comment


    • #3
      There was a time when I was younger that I did not have the fancy gauges that I have now.
      I had to measure an inside bore for a bearing, something critical. All I had was internal spring
      divider calipers. Man it was difficult to get the same reading twice. Almost impossible.
      So I feel your pain and I understand the nature of your question.
      This "measure by feel" is not a reliable or repeatable method.
      In the real world where time and accuracy mean money, 3 finger bore mikes are the go-to.
      Even telescoping gauges are touch and go, but better than internal spring divider calipers.
      3 finger bore mikes are expensive. At work I have used them, and you can measure bearing
      bores to the tenth of a thou. Like eating cake. A dial bore gauge is nice too, but you need
      a repeatable way to set it. Micrometers work for setting it, but they're fiddly. They make
      setting fixtures, which make things a bit better.
      For stuff under 1", gauge pins are nice. I have a set from .010" to 1.000". Not practical for the
      beginner, but I guess I began a long time ago, so I have a set now.
      But "spring calipers" are a tough chicken to catch when trying to be accurate with them.
      No two ways about it.

      -Doozer
      Last edited by Doozer; 02-10-2021, 10:15 AM.
      DZER

      Comment


      • #4
        It is very difficult to get the "feel" when the natural flexibility of the calipers is trying to fool you. You can try measuring known sizes to check what your actual readings are, and the more practice you have, the better you will get. Investing in a pair of cheap digital calipers will bring more accurate results.

        Comment


        • #5
          But don't toss the spring calipers if you have them.... both ID and OD types are handy.

          The most useful are the ones with an additional short arm on them, with a lock to release one of the measuring arms. You can get a setting behind a larger portion of a part, release the lock to remove the calipers over the larger part, then re-lock to measure the setting.

          If you do not have spring calipers, don't go out of your way to get any.

          Keep eye on ball.
          Hashim Khan


          It's just a box of rain, I don't know who put it there.

          Comment


          • #6
            Ha- too true.

            I inherited a set of spring calipers...

            and they taught me humility.

            rusting in Seattle

            Comment


            • #7
               Even micrometers rely on "feel". The difference is that after "feeling" with the calipers you need to compare that to some external, marked scale. I have, and use, spring calipers, but mostly for things that are beyond the range of the micrometers that I have, and that don't require more accuracy than I can achieve by eyeballing them against a precision scale - say not much less than 1/64 in.
              "A machinist's (WHAP!) best friend (WHAP! WHAP!) is his hammer. (WHAP!)" - Fred Tanner, foreman, Lunenburg Foundry and Engineering machine shop, circa 1979

              Comment


              • #8
                Originally posted by mickeyf View Post
                Even micrometers rely on "feel". The difference is that after "feeling" with the calipers you need to compare that to some external, marked scale. I have, and use, spring calipers, but mostly for things that are beyond the range of the micrometers that I have, and that don't require more accuracy than I can achieve by eyeballing them against a precision scale - say not much less than 1/64 in.
                I understand now. I'm still at the point where I'm just learning and having fun with the tools, not getting too frustrated. Learning about this stuff just shows me how much I don't know! (Like, how were tolerances determined in the first place when you only had one prototype meter? LOL)

                I wonder if there is a way to get more precise with traditional calipers and a rule that could do better than 1/64th of an inch. Maybe something like this? https://en.wikipedia.org/wiki/Transv...trument_making) Forgive my ignorance, I'm just thinking out loud and trying to figure stuff out.

                Comment


                • #9
                   With a transversal type of caliper, you could probably measure to 1/100", but with any type of engineer's calipers (digital, dial or vernier) you could manage 0.001" easily.
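
                   Roughly speaking, both schemes work the same way: the finest reading is the smallest main-scale division split by the number of extra lines that subdivide it. A quick worked example, assuming the common inch layouts (0.025" main divisions with a 25-division vernier, and a 1/10" division crossed by 10 transversal lines - those particular figures are assumptions, not from the posts above):

                       resolution = smallest main-scale division / number of subdividing lines
                       vernier:      0.025" / 25 = 0.001"
                       transversal:  0.100" / 10 = 0.010"  (i.e. 1/100")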

                  Comment


                  • #10
                    Originally posted by Stargazer View Post
                    I'm new to machining and still learning about how the basic tools work and measurements are taken. I'm curious about "spring calipers". I read that spring calipers were used for precise measurements before micrometers. How do you get an accurate reading from a tool that doesn't have a scale built into it?
                     Micrometers have been around for a long time. It might be more a case of how the amateur made do compared to commercial practice... i.e. I'm not sure how long, or how much, precision metalworking was happening before the micrometer.

                     I think a lot of it is comparative vs quantitative measurement: inside caliper and outside caliper used together to make a shaft fit a bore. It's the clearance that mattered, not strict adherence to a nominal size. No doubt it was more time consuming and took some skill. I'd also wager it wasn't to sub-1-thou tolerances either. If lapping was allowed, I feel confident I could get a sub-1-thou clearance between bore and shaft without a mic.

                     OTOH, a great deal of watch work is done by comparison, and fits with tenths tolerances are achieved. Amazing how accurate you can get via comparison and a good loupe.
                    Last edited by Mcgyver; 02-10-2021, 01:19 PM.
                    in Toronto Ontario - where are you?

                    Comment


                    • #11
                      Originally posted by Stargazer View Post
                      I wonder if there is a way to get more precise with a traditional calipers and rule that could do better than 1/64th of an inch. Maybe something like this? https://en.wikipedia.org/wiki/Transv...trument_making) Forgive my ignorance, I'm just thinking out loud and trying to figure stuff out.
                       This is actually a very good question, because it highlights the history of these tools. For many years, people didn't work to a given dimension; instead they worked to a "fit" from an existing prototype. It wasn't necessary to know the actual size so long as parts fit correctly. The very first factories and mass production were run this way, back in the 1700s, using calipers and not much else.

                       As the Industrial Revolution gained steam (pun intended), people began to realize the value of mass production to absolute dimensions. This would give them interchangeable parts. Micrometers took favor and the calipers were set aside. Most everyone was still using fractions until Henry Ford began insisting on decimal inches in the 1930s. Basically it was better economics.

                       The old calipers can still be used, with practice. I like to use different pieces of paper to see if I can feel the different thicknesses. The old timers from the 1800s could routinely get fits to .0075 or so with just calipers - roughly 1/128th of an inch. A master mechanic with scraping experience could achieve the same fits that we have today -- it just took him a lot longer to do it. And time is money.
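
                       A quick check of that fraction, purely arithmetic:

                           1" / 128 = 0.0078125" ≈ .008"

                       so ".0075 or so" is that same 1/128" rounded down slightly.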

                      Comment


                      • #12
                         I bent a pair of these calipers recently to get a measurement in a place that could not be reached otherwise.

                        Comment


                        • #13
                          Originally posted by nickel-city-fab View Post

                           This is actually a very good question, because it highlights the history of these tools. For many years, people didn't work to a given dimension; instead they worked to a "fit" from an existing prototype. It wasn't necessary to know the actual size so long as parts fit correctly. The very first factories and mass production were run this way, back in the 1700s, using calipers and not much else.

                           As the Industrial Revolution gained steam (pun intended), people began to realize the value of mass production to absolute dimensions. This would give them interchangeable parts. Micrometers took favor and the calipers were set aside. Most everyone was still using fractions until Henry Ford began insisting on decimal inches in the 1930s. Basically it was better economics.

                           The old calipers can still be used, with practice. I like to use different pieces of paper to see if I can feel the different thicknesses. The old timers from the 1800s could routinely get fits to .0075 or so with just calipers - roughly 1/128th of an inch. A master mechanic with scraping experience could achieve the same fits that we have today -- it just took him a lot longer to do it. And time is money.
                          Nickel City - you wouldn't happen to be in Buffalo too, would you?!

                           What really has me stuck - maybe I'm overthinking this - is how "tolerances" were estimated when there was only one prototype standard. For example, when "the first" meter bar was created, they had to make a duplicate and so forth. But without any graduations, how did they "compare" the second, third, and so on, meter bars to the standard and estimate their error tolerances? That's where I'm really getting stuck. And that's what's confusing me about how they could figure out the accuracy of the transversals, Vernier scales, or even the 25 millionth tolerance of the Whitworth plate. What am I missing in how these comparisons were made and tolerances established?

                          Thanks again for all the thoughtful replies. This is a great website!

                          Comment


                          • #14
                            This is a good read:

                            The Perfectionists: How Precision Engineers Created the Modern World – by Simon Winchester

                            https://www.amazon.com/Perfectionist.../dp/0062652559
                            Location: North Central Texas

                            Comment


                            • #15
                               As already pointed out, it's not about a number. It's about comparative measurements only. We're not working with inches or mm. We're working with a part which is a unit of "1 part". And we need to match the diameter on a second part, or we need to make a hole that the part fits into with a press or running fit.

                              Watching my father in the machine shop when I was a kid I saw him do a fair amount of work with spring calipers. Later on when I wanted to make parts for my model airplane engines or some of my early tooling related to my model airplanes he taught me his method for using the calipers.

                               He had a set of micrometers and a vernier caliper, but he only brought those out for special applications. And even then, as I recall, for ID measurements if he used the mic it was to measure off the inside calipers, which were used to take the measurement first. And it was only a lot of years later that he bought a set of 0-6" mics... which I have now... In the early days all he had was a 0-1" and a 1-2". Everything else was fitted using spring calipers.

                              The trick for external measurements was just getting a slight drag. And I do mean slight. If we're getting a measurement for a press fit he would aim for a touch more drag than a running fit.

                              Then the REAL magic occurred. He would transfer the measurement from the outside to the inside calipers. Now THAT took a careful touch ! ! ! And in fact this is the biggest source of error in the whole process.

                               Geez, it's been 55 or more years since I fitted lathe-turned parts for my model airplane engines this way. As you can see it's all about the feel. And that has to be practiced; you can't teach that sort of thing. Oddly enough, though, it comes to you faster than you might think.

                               For giggles I'm going to dig out my spring calipers and turn down an arbitrary shaft with four or five steps of .002" as a gauge. Then I'll drill and bore a ring to aim for a running fit on the second step and a press fit on the third. Should be fun and humbling at the same time.
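
                               In case it helps to see that gauge laid out as numbers, here's a minimal sketch in Python. The .002" step and the running-fit-on-step-two / press-fit-on-step-three targets come from the plan above; the 0.500" starting diameter and the five-step count are assumptions picked purely for illustration:

                                   # Sketch of the practice gauge described above: a shaft turned with
                                   # several diameter steps of .002", used to train caliper "feel".
                                   START_DIA = 0.500   # assumed nominal diameter of the largest step, inches
                                   STEP = 0.002        # diameter drop per step (from the post)
                                   N_STEPS = 5         # "four or five steps"

                                   steps = [round(START_DIA - i * STEP, 4) for i in range(N_STEPS)]

                                   for n, dia in enumerate(steps, start=1):
                                       note = ""
                                       if n == 2:
                                           note = "  <- bore the ring for a running fit on this step"
                                       elif n == 3:
                                           note = "  <- aim for a press fit on this step"
                                       print(f"step {n}: {dia:.4f} in{note}")

                               Nothing fancy - it just prints the target diameter for each step, so you know what the spring calipers should be "feeling" at each one.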

                               For that matter you can get a feel for the OD measurements this way too. With a stepped shaft that has .002" diameter steps, set the caliper for a very light drag just as it passes over one step, then feel how that changes one or two steps to either side. Fair warning though: this is certainly one of those "tongue out the corner of your mouth and pinky sticking out" delicate things.

                              It's an interesting chance to re-live what every machine shop apprentice had to learn to do well.
                              Chilliwack BC, Canada

                              Comment
