Is that REALLY the size?


  • #16
    This all makes me think of a video from Dee Dee Donnie.
    He was going on about his 10EE and how much of an art it is
    to operate such an instrument, and what it takes on a lathe
    like that to achieve two-tenths tolerance work, like making
    bearing fits for spindles and quill housings. He was saying the
    machine can run 4000 rpm, and that is what you need to make
    slender parts stiff enough to hold two tenths, and how many
    little tricks there are to get a precision lathe to perform such close
    tolerance work. I get that he is proud of his experience and
    proud to own a nice 10EE.
    But I kept thinking while watching him talk about all this:
    if you need such tight tolerances, and it takes this expensive
    lathe, and it must be in good condition, and it takes all this
    50 years of machine shop experience...
    when do you invest in a cylindrical grinder?
    I have a Brown & Sharpe #13, a Covel 512, and a Heald #7 ID
    grinder. They make holding tolerances like that fairly easy.
    Really the #13 will do it all, with the swing-down ID spindle.
    In my mind, a lathe can do work to .001" with careful
    attention to detail, and to .0005" when everything
    is tuned in just right and you have good measuring tools.
    But a grinder can hit .0002" pretty easily and .0001" with good
    technique and care.
    So it makes sense to me: if you are trying to get closer than .001",
    maybe buy a cylindrical grinder. It does not take 50 years of
    experience to be reasonably good with a grinder. It just makes more
    sense to me. And a clapped-out cylindrical grinder is going to be
    way more accurate than a clapped-out lathe, just due to the bearing
    surface of the table alone.
    Anyhow, I guess I am saying: pick the best tool for the job.
    A Harbor Freight lathe and a roll of sandpaper is not going to cut it.

    -Doozer
    Last edited by Doozer; 03-28-2023, 08:22 PM.
    DZER



    • #17
      For me, not machining. I doubt I could hold my machines to that detail, and even if I could, I'm sure the number would be off the second I take the part off the machine.

      Now, measuring and matching to .0001" I do most of the time when I am building an engine for an automobile. The bearings for the crank and camshaft are usually measured to the tenth. That is a way to make sure oil pressures are within spec. JR



      • #18
        Yep the grinder is the tool for tenths. No doubt about that.

        But, you do still need to measure those tenths. With something, and in an environment, that makes it reasonable to determine which tenth you have measured to. Of course that is if you need to have NIST and Boeing etc agree that your measurement is correct.

        To make this thing fit that thing, you only need relative measurement. It could be 40C or -40C and it will not matter much for that.
        CNC machines only go through the motions.

        Ideas expressed may be mine, or from anyone else in the universe.
        Not responsible for clerical errors. Or those made by lay people either.
        Number formats and units may be chosen at random depending on what day it is.
        I reserve the right to use a number system with any integer base without prior notice.
        Generalizations are understood to be "often" true, but not true in every case.



        • #19
          I think my post about my R8 arbor was one of the most recent posts "announcing" measurements at various degrees of tenths. So I'd like to comment.

          First off, would it help if the person doing the posts showed that they were using a micrometer with the tenths vernier on the barrel? Is it the readings themselves or are you trying to say that the machines and persons cannot hit the numbers to this degree?

          As for the measurements: I checked just now, and my micrometer's barrel graduations are 0.070" apart, plus or minus a couple of thousandths, according to my headband magnifier and my calipers. That's a pretty big piece of room for the eye to work with. David, you mentioned you would be ashamed if you could not split that and measure to a half thousandth. Fair enough... so would I, with that width of graduation to work with.

          But with that in mind, why would you not feel OK going a little further and guesstimating two or three tenths when it looks like it's a quarter or a third of a graduation? That sort of spacing is typically pretty clear as well, provided you get multiple consistent readings that stop within the width of the index line. That's what I aim for when taking micrometer measurements. If I can't get readings consistent to plus or minus the width of the index line, then I'm not doing it right yet.

          Another thing that, for me at least, jumps out is the symmetry when the index sits at exactly half a division, and the pretty clear asymmetry if it is just a whisker to either side. That is why I posted in the other thread about one end of the part being at something plus 4 tenths and the other end bigger at something plus 6 tenths. Those were based on multiple consistent readings and a clear shift to each side of the middle between the two spots.

          Of course what IS lacking is any sort of NIST or other standards for checking and calibration. But in a home shop we're generally working off a sample part and only need the sort of relative readings that this provides. And we're often not working at room temperature. So it's a relative accuracy for that day and those two parts. In my case it was three commercial samples and I must have measured them easily a dozen times each to ensure that I was getting consistent readings that varied by no more than the width of the index line. And that's pretty well my confirmation. When I can take three readings that vary by the width of the index line or less I call that consistent and go with what I see.

          Chilliwack BC, Canada



          • #20
            Originally posted by BCRider View Post
            ...<snip>...

            Of course what IS lacking is any sort of NIST or other standards for checking and calibration. But in a home shop we're generally working off a sample part and only need the sort of relative readings that this provides. And we're often not working at room temperature. So it's a relative accuracy for that day and those two parts. In my case it was three commercial samples and I must have measured them easily a dozen times each to ensure that I was getting consistent readings that varied by no more than the width of the index line. And that's pretty well my confirmation. When I can take three readings that vary by the width of the index line or less I call that consistent and go with what I see.
            Maybe.

            If you are working with steel, and your mic is steel also, and everything is at the same temperature (basic for any sort of accuracy), then your measurements are still "pretty accurate". We can argue about the coefficient of thermal expansion for different alloys, but if the alloy is the same for all, same hardness, etc., then the thermal error in mm or inches is the same for all, and they measure the same dimension as at 20C.

            If it says 1.000", it should be 1.000" at any other temp as well. All things made of the same stuff at the same temp should expand or contract the same. It's still an inch, but the actual inch measured may not be the same "absolute inch" as at Boeing, even if they agree it is an inch. (Obviously same for mm or whatever).

            Obviously if you are measuring aluminum, with a steel mic, you have at least got some calculating to do.



            • #21
              J Tiers wrote about "Home shop Harry". Well, I knew a Harry very well: his name was Harry Boneham, a relative of the Bonehams of Boneham and Turner, the firm who make machine tools for other machine tool makers, and a large proportion of the world's supply of precision drill bushings. In retirement he used a $50 Rivett lathe rescued from the scrap and a home-made Dore Westbury mill. He never claimed to be able to measure in tenths, but he produced many steam and gas engine models with lapped bores and pistons which fitted as intended and lasted through much use. His parts slid smoothly, shakelessly together. I inherited his micrometers; they really are just about worn out.
              My main point with this posting was to encourage beginners: even if they cannot measure to a repeatable tenth, they CAN produce acceptable fits.
              Regards David Powell



              • #22
                The size of the part also matters. Remember (if measuring in inches) that CTE is in millionths of an inch per inch per degree: the longer the part, the more its length moves as the temperature changes. All factors come into play for more precise measurements: user error with the measuring device, calibration error, temperature differential, geometry errors on the workpiece, etc.
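To put rough numbers on that, here is a quick sketch of the arithmetic. The coefficients are ballpark handbook values (roughly 6.5 millionths per inch per °F for steel, 12.8 for aluminum); real alloys vary, so treat this as an illustration rather than reference data.

```python
# Rough thermal-growth arithmetic for the point above. Coefficients are
# approximate, in inches per inch per degree F; actual CTE varies by alloy.

STEEL_CTE = 6.5e-6      # in/in/degF, approximate
ALUMINUM_CTE = 12.8e-6  # in/in/degF, approximate

def thermal_growth(length_in, delta_deg_f, cte):
    """Change in length (inches) for a part of given length and temp change."""
    return length_in * delta_deg_f * cte

# A 4" steel part warmed 10 degF grows roughly a quarter of a thousandth:
print(f"{thermal_growth(4.0, 10.0, STEEL_CTE):.6f}")  # 0.000260
```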

                Interpolating sizes on a micrometer is not a very good way to convince others of your measurements' precision. I've heard it said that a measuring device needs graduations many times finer than the increments you actually want to measure if you want any real certainty in those measurements, and that's probably accurate. So using a .001" micrometer and guessing at the in-between readings is not going to do much for tenths-level precision or accuracy.
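The graduation-versus-increment idea above is often quoted as a 10:1 rule of thumb: the instrument should resolve roughly ten times finer than the tolerance being held. A minimal sketch, with the 10:1 ratio as an assumed rule of thumb rather than anything stated in the post:

```python
# Sketch of the gauging rule of thumb mentioned above: the instrument's
# smallest graduation should be several times (commonly 10x) finer than the
# tolerance being measured. The 10:1 ratio here is an assumption.

def adequate_resolution(graduation_in, tolerance_in, ratio=10):
    """True if the instrument resolves at least `ratio` steps per tolerance."""
    return tolerance_in / graduation_in >= ratio

print(adequate_resolution(0.0001, 0.002))  # tenths mic for a 0.002" job: True
print(adequate_resolution(0.001, 0.002))   # thou mic for the same job: False
```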



                • #23
                  AND if you give the same part to ten professional QC people, in ten different controlled environments (same temperature and same other controls, just different locations), with ten measurement devices of the same design (all with current calibrations, only differing by serial numbers), just how many sizes would you expect them to report?

                  The range of the differences may be different but there will be variances. Nature of the beast!

                  Heck, professional scientists can measure the same thing multiple times in the best labs in the world, using the same equipment and get different values. Just read any scientific paper that includes data.



                  Originally posted by J Tiers View Post

                  ...<snip>...

                  Third: People

                  It is often said that you can give ten professional machinists a part, a mic, and ask the size, with the result of ten different numbers. Never did that, but it makes sense. So your results depend on your ability to work the mic with the same force every time.
                  Last edited by Paul Alciatore; 03-28-2023, 04:38 PM.
                  Paul A.
                  Golden Triangle, SE Texas

                  And if you look REAL close at an analog signal,
                  You will find that it has discrete steps.



                  • #24
                    I'd agree with all this if the actual stopping point of the gauge or micrometer index line varies by a significant amount between measurements. But if we can't get consistent and repeatable readings within two widths of the index line on a mic or a single needle width of a gauge then I'd say we've got other problems.

                    I don't consider myself a miracle machinist by any stretch. And if I haven't used a mic in a while I do find that it takes me a half dozen or more measurement repetitions to get the touch back. I don't trust what I'm getting until I get consistent readings that match within that two line widths idea. That seems pretty basic to me.

                    But once I'm back in tune I don't understand why it would not be valid for measurements BETWEEN PARTS ON HAND during a session to split a thousandth division to something a bit finer than a half thousandth. Especially when it's pretty easy to see the difference between a quarter, third or half a division. And when the mic stops consistently on those division portions over a half dozen repeated measurements of one item.

                    I'm not offended by any of this either; I hope I'm not coming across that way. I'm more curious why the rest of you don't feel that at least some extrapolation down to finer values is valid. Especially if we're getting repeated consistency to within that two widths of the index line on a mic. And especially when working with parts/samples on hand which are at the same temperature and where the measurements are taken in reasonably short time frames. Here again we are not working to any certified calibration, just to the samples on hand.




                    Last edited by BCRider; 03-28-2023, 05:05 PM.
                    Chilliwack BC, Canada



                    • #25
                      I am educated in measuring stuff. I have a micrometer and a digital caliper. When I measure a gauge block, the micrometer and caliper give the "same" value. When I measure a freshly turned bar, the caliper most of the time measures 0.02 mm (0.001") less. I have three more calipers and micrometers, and they all have the same "problem". So when I have to make parts for somebody, I ask how he will measure the parts.
                      Personally I use the values from the micrometer, which gives better results for tight fits.
                      Yes, I know the differences between micrometer and calipers come from the roughness of the part, the width of the contact area, and the contact pressure.
                      When I take the time, on the lathe I get within 0.01 mm (0.0005") of my target value. But that doesn't mean it is that accurate.



                      • #26
                        Originally posted by Huub Buis View Post
                        ..... When I take the time, on the lathes I get within 0.01 mm (0.0005") of my target value. But that doesn't mean it is that accurate.
                        But if you're doing that to match a component on hand?

                        Good point on the mic versus caliper on a turned surface. It's that thread-like waviness left by the cutter. I'll have to try with my own calipers and see what I get with the sharp knife-like tips versus the flatter wide area just inside the knife edges, and see if it reflects the caliper versus mic difference.

                        Chilliwack BC, Canada



                        • #27
                          Originally posted by BCRider View Post
                          I'm more curious why the rest of you don't feel that at least some extrapolation down to finer values is valid. Especially if we're getting repeated consistency to within that two widths of the index line on a mic.
                          I read my micrometer to half a division (0.005 mm / 0.0002"). If you can do better and get that same value day after day, you do a better job than me. Beware that 10 mm of aluminum will grow 0.23 µm for every degree of temperature rise, so better not to hold the part (or your micrometer) in your hand when measuring.



                          • #28
                            Originally posted by BCRider View Post

                            But if you're doing that to match a component on hand?
                            Even just comparing to another part, it is difficult (unreliable) if the surface finish of that part differs from my part.

                            I use the half division mostly to compare parts that I have made the same way (small series), to make them more equal (not more accurate).



                            • #29
                              I don't know about you, but I frequently have to measure small parts to a tenth. It is easy to do if you have a good tool. I would not use a 100-year-old Starrett with .001" graduations for such jobs. I have a very good digital Mitutoyo micrometer, which reads to .00005" and is accurate and repeatable to .0001". If I really need precision, I set this micrometer against a correct-size gauge block and then use it to measure the part. This method has never failed me. You need to watch the temperatures, of course.



                              • #30
                                Originally posted by Willy View Post

                                This is exactly my plight as well while in the shop.

                                I do a lot of target shooting and go to great lengths to achieve 5 shot one hole groups at various distances down range, usually 100-150 yards.
                                I try to make each cartridge as identical as humanly possible dimensionally and each component of that cartridge will of course have the same lot number in order to remain as consistent as humanly possible.
                                I always achieve that one hole group on every outing, and then I take that second shot.............
                                Accuracy at the range, whether from the rifle, the cartridge, or the shooter, shows up at 300-500+ yards. It's not at all possible with commercial ammo, much as 0.0001" accuracy is not possible in typical home shop environs.

