I always thought it boiled down to whether you're talking to a scientist (academic) or an engineer. HP calculators have settings for FIX, SCI, and ENG which determine how decimals are displayed. The difference between SCI and ENG is that ENG makes the exponent a multiple of 3, so you see thousandths, millionths, billionths, etc. (Ok, milli-, micro-, and nano-...) The SCI setting likes to put a single digit to the left of the decimal, the rest on the right, and adjust the exponent to make it all fit.
I know it sounds like gobbledygook to type it out like this, but when speaking, it's easier to say "millionths", "thousandths", etc. than it is to say "ten-thousandths", "hundred-thousandths", etc. I guess that makes me an engineer. I'd rather say "50 millionths" than "five hundred-thousandths" or, perhaps more correctly, "five one-hundred-thousandths".
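If it helps, here's a minimal sketch of the ENG rule in Python (the function name and digit count are my own choices, not anything HP-specific): force the exponent down to a multiple of 3, then scale the mantissa to match.

```python
import math

def eng(x, digits=4):
    """Format x in engineering notation: exponent forced to a multiple of 3."""
    if x == 0:
        return f"{0:.{digits - 1}f}E0"
    exp = math.floor(math.log10(abs(x)))       # SCI-style exponent
    eng_exp = 3 * math.floor(exp / 3)          # round down to multiple of 3
    mantissa = x / 10 ** eng_exp
    return f"{mantissa:.{digits - 1}f}E{eng_exp}"

print(eng(0.00005))      # "50.000E-6" -- i.e. "50 millionths"
print(f"{0.00005:.4E}")  # SCI style: "5.0000E-05"
```

(Python's `decimal` module also has a built-in `to_eng_string()` if you want the real thing instead of a toy.)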
What did the "cell operator" say .00005 is?
The curse of having precise measuring tools is being able to actually see how imperfect everything is.