Opinion -- keep AI-driven vehicles away from people?
Should we keep AI-driven vehicles away from people?
|
Absolutely. I don't want to be killed by a computer.
|
LOL, you guys. Have you considered sickness, hunger, cell phones, drugs/alcohol, and all the other things that distract human drivers but have zero effect on AI? The proof is in the pudding. There is plenty of data on autopilot accidents (including aircraft, for decades), but because of the fear level around cars, some sites have popped up as global repositories covering ALL autopilot-capable cars, even separating accidents that occurred while autopilot was engaged vs. when it was not.
Here's a sample from tesladeaths.com:
|
Absolutely not.
Absolutely not.
There are already autonomous semi-trucks being tested on public highways with a safety rider in the passenger's seat, in case Deus Ex starts while he's on board. |
The problem is that people will get used to dreaming in their AI-driven cars, not watching the road, and so on. And then suddenly the AI will say, "Emergency situation detected. You must take over NOW!" What are the odds that the driver will panic?
|
My father owns a Tesla and I drive it a lot, but it is not an autonomous self-driving vehicle. Although some accidents are probably due to Autopilot, my guess is that most involve driver error or a failure to understand its limitations. You can't count the idiots who bypass the hands-on-steering-wheel detection and sit in the back seat... True self-driving vehicles like Waymo's or Uber's are something completely different.
There are many cars with adaptive cruise control, lane centering, and stop-and-go capability, and some of them lack any driver awake/alert detection feature. Some have a camera to detect whether the driver is awake, but I suppose you could bypass that somehow as well. |
There aren't any real hard numbers yet, but I'd guess the "problems" caused by AI vehicles are, percentage-wise, a fraction of the problems caused by human drivers. People don't seem to bat an eye at other numbers:
https://crashstats.nhtsa.dot.gov/Api...es%20in%202018. Alcohol was involved in 29% of vehicle deaths in 2018: 10,511 deaths, one every 50 minutes in the US. I'd guess AI would be far lower than that. People are just afraid because it's new. Years ago trains ran at the "breakneck speed" of 25 mph and people freaked. Now bullet trains easily push 250+. Same with cars. It's always the same story: the sky is falling. After a number of years, with the kinks worked out, they are exponentially safer than they've ever been. The same will happen with AI vehicles. |
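For what it's worth, the "one every 50 minutes" figure in the post above is consistent with the 10,511 annual deaths number; a quick back-of-the-envelope check (only the 10,511 count comes from the post, the rest is plain unit conversion):

```python
# Sanity-check the "one alcohol-related death every 50 minutes" claim
# using the 10,511 annual deaths figure quoted above.
deaths_per_year = 10_511
minutes_per_year = 365 * 24 * 60  # 525,600 minutes in a (non-leap) year

minutes_between_deaths = minutes_per_year / deaths_per_year
print(f"One death every {minutes_between_deaths:.1f} minutes")  # prints ~50.0
```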
The real problem is that human drivers are unpredictable, often doing dangerous or illegal maneuvers. AI drivers are not going to be successful until all drivers follow the rules. This means getting human drivers off of the roads.
Ed |
If you're going to be running "computer-controlled, automatic" vehicles, then you're going to have to build a parallel railroad to run them on.
|
I know a guy who works for TuSimple and this is cutting edge AI technology:
https://www.youtube.com/watch?v=f2aocKWrPG8 |
Quote:
|
"I'm the Human!" Suck It Up! :D
I refuse to have it inscribed upon my tombstone: "He was killed by Abort, Retry, Ignore?" |
If you look at the accidents involving driverless vehicles (including vehicles in which the purported driver was not paying attention), you will see that many of them include situations which the AI did not anticipate. Because it did not anticipate them, it did not react to them.
I must admit that I am skeptical that AI will reach a point at which it will be able to anticipate all the screwy things that drivers do. I will draw an analogy. I worked for the railroad for many years, and railroads have lots of safety rules (the first of which is that "safety is of the first importance in the discharge of duty"). One of the things I learned is that every safety rule was a reflection of an injury or death. In other words, the rules were a consequence, not an anticipation.
In building AI for self-driving vehicles, each algorithm is likewise a consequence, not an anticipation. If the programmers don't expect a bicyclist to be on the wrong side of the road, they won't build in a rule for it. If they don't expect a pedestrian to jaywalk, they won't build in a rule for it. If they don't expect a delivery truck to double-park because no parking space is available, they won't build in a rule for it. If they don't expect that a driver will be going the wrong way on a limited-access highway, they won't build a rule for it. If they don't think a skateboarder might be on the street, they won't build a rule for it.
Heck, we can't even take the prejudice out of facial recognition. It's not the computers I distrust. It's the programmers, because they are human. And flawed. Like you and me. I seriously doubt that programmers will be able to anticipate everything that can go wrong on the public highways. I seriously believe that "self-driving" vehicles on public thoroughfares are a pipe-dream of the computer-enamored, and a hazard to real live human beings driving their automotive conveyances. |