"Tessy" only the beginning


  • A.K. Boomer
    replied
    Originally posted by dp View Post
    https://www.youtube.com/watch?v=sXls4cdEv7c

    Computers will not take stupid out of the car.

    People will have to take the stupid out of the autopilot.
    https://www.youtube.com/watch?v=MrwxEX8qOxA

    Great perspective, DP - thanks for jarring everyone back into reality ---- the guy sleeping is just plain sad, the other guy just got a wake-up call.

    What are the odds of that happening on my car, with a direct rack & pinion unit and good, well-checked-out tie-rod ends? Absolutely ZERO. How would I feel about letting go of a steering wheel and letting the car drive itself with oncoming traffic just three feet to the other side? I'll tell you how I feel - not gonna happen - ever...
    Wait till "tin whiskers" start growing into those electronic components --- what a joke people have become... choke on it world, you really have both earned and deserve this...

    Leave a comment:


  • danlb
    replied
    Originally posted by Rosco-P View Post
    If the automated system is only reliable in good weather, on good roads and in "normal" driving and traffic situations, then it's useless. It couldn't avoid a crash in an all too common situation, another vehicle suddenly turning into its path.
    That presupposes that the crash was avoidable by the Tesla and/or its driver. Looking at the area, it appears there was less than 500 feet between visual contact and impact. Can you stop a car traveling at freeway speeds in less than 500 feet from the moment you realize there is an obstacle? Maybe. The Model S can go 60 to 0 in only 108 feet, which is encouraging, but reports say he was prone to speeding. And how long does it take the driver (or the AI) to determine that the truck ahead is moving slowly, and in which direction? Every second spent evaluating the situation takes him about 100 feet closer. How long does it take him to hit the brakes? Another second, another 100 feet.

    It seems to me that a couple of seconds of thinking that he could get past the truck is all it takes to change it from "avoidable" to "unavoidable". I know that's happened to me with disastrous results.

    One report mentioned that there was a 'rise' in the road that blocked the car from the truck driver's view. I checked Google Maps and found the rise was about 500 feet from the accident. You can see how bad the visibility is in that area on Google Street View; the Google camera is usually mounted on a mast that puts it at about the same height as the truck driver.
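    To put rough numbers on that reasoning, here's a back-of-the-envelope sketch. The 65 mph approach speed, the 1.5 s baseline perception-reaction time, and the hesitation values are my assumptions for illustration; only the 108 ft figure comes from the post above.

```python
# Back-of-the-envelope stopping distances for the argument above.
# Assumptions (mine): 65 mph approach speed, 1.5 s baseline
# perception-reaction time. Deceleration is inferred from the
# published 60-to-0-in-108-ft figure.

MPH_TO_FPS = 5280 / 3600  # 1 mph = 1.4667 ft/s

# a = v^2 / (2d) from the 60 mph / 108 ft figure: ~35.9 ft/s^2 (about 1.1 g)
DECEL = (60 * MPH_TO_FPS) ** 2 / (2 * 108)

def stopping_distance_ft(speed_mph, hesitation_s):
    """Reaction distance plus braking distance, in feet."""
    v = speed_mph * MPH_TO_FPS
    reaction = v * (1.5 + hesitation_s)   # ground covered before braking starts
    braking = v ** 2 / (2 * DECEL)        # ground covered while braking
    return reaction + braking

for hesitation in (0, 1, 2, 3):
    d = stopping_distance_ft(65, hesitation)
    print(f"{hesitation} s of hesitation: ~{d:.0f} ft to stop")
```

    With no hesitation the total comes out around 270 feet, comfortably inside 500. Add two or three seconds of "is that truck moving?" and it climbs past 460-560 feet, which is exactly the avoidable-to-unavoidable flip described above.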

    Leave a comment:


  • danlb
    replied
    Sometimes progress is incremental. It looks like AEB (Automatic Emergency Braking) will be standard in all cars within 6 years. It might not stop the car, but it should slow it enough to reduce the damage/carnage.
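    Even braking that starts too late helps a lot, because kinetic energy scales with the square of speed. A quick sketch; the 40 mph approach speed and 0.8 g braking level are my assumptions, not figures from any AEB spec:

```python
# Impact speed if full braking begins with only d feet left before the obstacle.
# Assumptions (mine): 40 mph initial speed, 0.8 g of deceleration.

MPH_TO_FPS = 5280 / 3600
G = 32.2  # ft/s^2

def impact_speed_mph(speed_mph, braking_ft, decel_g=0.8):
    v = speed_mph * MPH_TO_FPS
    v_sq = v ** 2 - 2 * decel_g * G * braking_ft  # v^2 = v0^2 - 2ad
    return max(v_sq, 0.0) ** 0.5 / MPH_TO_FPS

for d in (20, 40, 60, 80):
    print(f"braking starts {d} ft out: impact at ~{impact_speed_mph(40, d):.0f} mph")
```

    Forty feet of braking turns a 40 mph hit into roughly a 25 mph one, which carries about 60% less energy. That's the "reduce damage/carnage" case in a nutshell.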
    Dan

    Leave a comment:


  • Magicniner
    replied
    Originally posted by Rosco-P View Post
    If the automated system is only reliable in good weather, on good roads and in "normal" driving and traffic situations, then it's useless. It couldn't avoid a crash in an all too common situation, another vehicle suddenly turning into its path.
    You are preaching to the choir here, sunshine ;-)

    Leave a comment:


  • dp
    replied
    article: http://electrek.co/2016/05/23/tesla-model-s-driver-caught-sleeping-wheel-autopilot-video/
    source: http://i.imgur.com/E3joXpL.gifv


    Computers will not take stupid out of the car.

    People will have to take the stupid out of the autopilot.
    Last edited by dp; 07-03-2016, 12:02 PM.

    Leave a comment:


  • Danl
    replied
    I saw a report on TV this morning that by the year 2020 every single major automobile manufacturer will either have, or be working on, these AI cars.

    That is real soon! Sheesh.

    Dan

    Leave a comment:


  • RB211
    replied
    LA TIMES PAGE ONE SUNDAY: GOOGLE to make self-driving cars 'human-proof'... Developing...
    Courtesy of the "Drudge Report". Sorry, the pilot shall always have the final say!

    Leave a comment:


  • fixerdave
    replied
    Originally posted by dave_r View Post
    Tesla will probably be sued, but Google shouldn't be involved, as Tesla is using their own stuff for the driver-assist feature, and not google's self-driving code...
    Sorry, my statement was ambiguous. What I meant to say is that Google may pay the legal bills to sue Tesla, not to defend them. Google does not like Tesla's rushed approach to driver AI, as they feel it will generate negative publicity (just proven right) and set back acceptance of AI driving (remains to be seen).

    Google's approach seems to be to get the AI better and better while at the same time working towards legal approval of "self driving". Tesla's approach is to bypass legal approval by calling it "assisted" and requiring a driver... even though, as others have pointed out, said driver will often be entirely distracted and pathetically useless when the AI messes up.

    A case in point is the recent law in, I think, Ontario that requires people to have a license to NOT drive a car, said car being driven by AI. In other words, it's an incremental law that allows testing of AI-driven cars on public roads, provided they are supervised by people licensed to do so. It's this kind of incremental approach by Google that I think will eventually lead to success.

    David...

    Leave a comment:


  • Yow Ling
    replied
    The big question is: how much govt and federal funding will Tesla be able to get to defend the lawsuits?

    Leave a comment:


  • RB211
    replied
    Originally posted by garyhlucas View Post
    Some years ago Firestone got sued over supposedly 'bad' tires. Tire failures caused rollovers of SUVs. Shortly after this I got a flat and a damaged tire on my own Safari van while out of town. The only available replacement tire was a Firestone, but a different model than the problem one. The next day I was driving on the NJ Turnpike at about 70 mph and heard an unusual noise that sounded like a tire. I gently released the wheel to see if it pulled one way or the other. I have had lots of flats, and the vehicle always pulled to one side. My van tracked perfectly. A few minutes later the noise got much louder and I pulled off the road to check. The brand new tire was flat, completely destroyed and ready to burst into flames! I changed the tire and took it to the dealer the next day, where they found a large nail embedded in the tread. So I believe that tire was actually too good. It performed well even completely flat. If you were distracted, say by a half dozen passengers as some of the rollover vehicles had, it would be very easy to miss what was happening until it was too late.
    Self-driving cars will be the same: they'll work so well that failure is inconceivable, and they'll get the blame when it happens, no matter the real circumstances.
    In the cockpit, reliance on automation has been a large factor in recent air accidents. In the USA, most pilots learned to fly in small airplanes with no automation, often for thousands of hours, before getting on with an airline. In Asia, Europe, South America, and elsewhere, the same cannot be said. Air France 447 is a prime example of what happens with too much reliance on automation. So is the 777 at SFO.
    Most people are stupid and lazy. You have to beat into them the importance of being on guard, and they have to go through intense training to understand the pitfalls and how to detect trouble early, before a problem arises.
    The typical spoiled millennial buying a Tesla or other advanced-technology car already knows everything...

    Leave a comment:


  • garyhlucas
    replied
    Some years ago Firestone got sued over supposedly 'bad' tires. Tire failures caused rollovers of SUVs. Shortly after this I got a flat and a damaged tire on my own Safari van while out of town. The only available replacement tire was a Firestone, but a different model than the problem one. The next day I was driving on the NJ Turnpike at about 70 mph and heard an unusual noise that sounded like a tire. I gently released the wheel to see if it pulled one way or the other. I have had lots of flats, and the vehicle always pulled to one side. My van tracked perfectly. A few minutes later the noise got much louder and I pulled off the road to check. The brand new tire was flat, completely destroyed and ready to burst into flames! I changed the tire and took it to the dealer the next day, where they found a large nail embedded in the tread. So I believe that tire was actually too good. It performed well even completely flat. If you were distracted, say by a half dozen passengers as some of the rollover vehicles had, it would be very easy to miss what was happening until it was too late.
    Self-driving cars will be the same: they'll work so well that failure is inconceivable, and they'll get the blame when it happens, no matter the real circumstances.

    Leave a comment:


  • danlb
    replied
    Originally posted by J Tiers View Post
    Drivers cause lots of accidents, or try to. It's the nature of driving, by computer or by human.

    Most of them are avoided.

    People are STILL making excuses about this. And it is STILL exactly what driver assist is supposed to deal with... the unexpected that the driver misses.

    NO, the truck did not cause the accident. The truck was a normal hazard of everyday driving. It was not noticed, so there was an accident, or better described, a collision. It does not matter if the truck even violated laws. Other drivers deal with that several times per mile. It's no excuse.
    I think the flaw in your assertion is twofold.

    First is your implication that the automation is supposed to be able to do what the driver can't, 100% of the time. That's not the way it works. It can't keep you from hitting the deer that jumps out of the bushes without notice. It can't help you when you come to a stop and the guy behind rear-ends you because he did not see what you did. A vision-based system will always have the same limitations as a human when the view is obstructed. It will always have to obey the same laws of physics when it has to stop in a very short distance because a moving truck has become a stationary obstacle.

    Second is your assertion that "other drivers deal with that several times per mile". That implies they do it successfully all the time, which is patently false. The main cause of multiple-car accidents is that one car does something dumb and the other can't deal with it.

    Dan

    Leave a comment:


  • danlb
    replied
    Originally posted by dave_r View Post
    Google has their own testing cars, that from what I've read, are limited to lower speeds (around 20-30 mph) and areas (very well mapped out parts of a few cities).
    Google has several fleets of cars. The original ones were Priuses. I've seen them on I-680, in downtown Pleasanton, on the outskirts of Dublin, and in several places in Silicon Valley. They drive just like all the other cars, but they hold their lane position better and actually use their turn signals.

    They don't need "very well mapped" information, since they have GPS and can use maps and traffic data available via the Internet to get the general location. Then they use cameras and LIDAR (I think; it could be radar) to do the fine navigation: lane position and so on.
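    That coarse/fine split is easy to picture in code. A minimal sketch, with made-up gains and sensor readings; this is my illustration of the concept, not Google's actual pipeline:

```python
# GPS + map answer "which road, which lane" (meters of error is tolerable);
# the camera/LIDAR stack answers "where within the lane" (centimeters matter).
# All gains and sensor values below are invented for illustration.

def steering_correction(lane_offset_m, heading_error_rad,
                        k_offset=0.4, k_heading=1.2):
    """Simple proportional lane-keeping: steer back toward the lane center."""
    return -(k_offset * lane_offset_m + k_heading * heading_error_rad)

gps_error_m = 3.0          # typical consumer GPS uncertainty: fine for routing,
                           # hopeless for lane-keeping
lane_offset_m = 0.25       # perception says we're 25 cm left of lane center
heading_error_rad = 0.01   # and drifting slightly further left

cmd = steering_correction(lane_offset_m, heading_error_rad)
print(f"steering command: {cmd:+.3f} (negative = steer right, back to center)")
```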

    Google has proposed and built cars that are low-speed (35 mph or less) and only for local use. They are doing this as a stepping stone to build public confidence. I'd like to see them, or something like them, replace taxi cabs and city buses. Our taxi service is terrible (rude and bad drivers), and the bus routes require that you walk half a mile or more to the nearest bus stop. A half mile is a long way in 110-degree heat or during a downpour.

    Dan

    Leave a comment:


  • dave_r
    replied
    Originally posted by fixerdave View Post
    Will Tesla be sued over this... probably. I wouldn't be entirely surprised if Google paid the legal bill. I've read they are rather miffed at the high-speed rollout Tesla is doing... they're expecting this kind of bad publicity and would rather avoid it. If they can't avoid it, I expect they'd rather be seen on the "right" side of the debate. Google's approach has been much more methodical, while Tesla's is to legally require a driver in the seat paying attention. But, as A.K. Boomer points out... that isn't going to happen. Given the possibility of not paying attention, far too many people will find other things to do. That is inevitable. Google is right; Tesla is wrong. Driver "assist" that approaches total control, thus allowing the driver to stop actually driving, is not really driver assist but rather driver replacement. Calling it driver assist to skirt the law is asking to be sued, and they will be.
    Tesla will probably be sued, but Google shouldn't be involved, as Tesla is using their own stuff for the driver-assist feature, not Google's self-driving code.

    Google has their own test cars that, from what I've read, are limited to lower speeds (around 20-30 mph) and to very well mapped-out parts of a few cities.

    Leave a comment:


  • PStechPaul
    replied
    A few points:

    1. The truck driver said that he had waited for one car to clear before he started his turn, but once he had already committed, he saw the Tesla crest the hill at high speed. He tried to hit the gas to get out of the way, but with a full load he was unable to get far enough to avoid the collision. (A rough timing sketch of this window follows at the end of this post.)

    2. The Tesla moved from the fast lane to the slow lane, rather than swerving the other way toward the rear of the trailer, which would have been the correct response. But moving to the right matched what it had done in the video clip where it avoided a collision with a truck that was moving in the same direction and turning into its path.

    3. The truck driver had several violations, mostly not severe, and the truck's tires were badly worn, but none of this contributed to the crash. The Tesla driver had numerous speeding tickets and was known as a "speed demon", which seems to be a prime factor in the crash.

    4. "Drunk Driving" laws seem to be aimed more at the casual social drinker, with DUI levels of about 0.03% to 0.05%, and DWI at 0.07% to 0.10%, while most severe crashes usually cite driver BAL of 0.2% and as high as 0.4%, which indicate long-term alcoholism and tolerance. I'll admit to occasionally driving while having detectable amounts of alcohol in my system, but probably never over 0.05% (at least not for 20 years or more), and when I have felt "buzzed", I always consciously drove slower and more carefully (but not extremely so, which can also arouse suspicion from LEOs). I think the danger comes when someone has latent tendencies to become aggressive or angry, and alcohol strips their inhibitions.

    5. Punishment is only effective when administered properly, which means it must be immediate, consistent, and severe enough to form a deep association between the unwanted behavior and consequences. This has been proven for many years in traditional dog training, and the lack of punishment (as promoted by "positive" trainers) has been shown to be ineffective, especially in serious cases. Inhibitions against self-rewarding but dangerous behavior cannot be established by "positive reinforcement", and mild or poorly timed aversives also fail.

    See: http://clickandtreat.com/html/offleadinhibitions.HTM

    Many dog training principles can apply to humans as well. Unfortunately, punishment has become increasingly taboo in our age of PC and permissiveness. Some of Dr. Spock's guidelines have been misused and misinterpreted to produce nasty brats who grow up to be dysfunctional adults who feel entitled to "enjoy" destructive behavior with no fear of consequences. And "positive" dog trainers ignore and demonize the +P (applied punishment) quadrant of Skinner's four principles of operant conditioning, while relying almost entirely on +R (rewards for good behavior and to distract from bad behavior).
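    As promised in point 1, here is a rough timing sketch of the truck driver's window. Every number below is an assumption of mine for illustration; none comes from the investigation:

```python
# Can a loaded semi clear the oncoming lanes before a car cresting
# the rise arrives? All figures below are assumed for illustration.

MPH_TO_FPS = 5280 / 3600

sight_distance_ft = 500    # approx. crest-to-intersection distance discussed earlier
car_speed_mph = 70         # assumed; reports put the Tesla above the 65 mph limit
truck_clear_s = 6.0        # assumed time for a loaded semi to drag its trailer clear

time_available_s = sight_distance_ft / (car_speed_mph * MPH_TO_FPS)
print(f"time until the car arrives: {time_available_s:.1f} s")
print(f"time the truck still needs: {truck_clear_s:.1f} s")
print("collision window" if truck_clear_s > time_available_s else "truck clears in time")
```

    With those numbers the car arrives in under 5 seconds while the trailer still needs about 6 to clear. Once the truck was committed, flooring it was the only move left, exactly as the driver described.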

    Leave a comment:
