
LinuxQuestions.org (/questions/)
-   General (https://www.linuxquestions.org/questions/general-10/)
-   -   opinion -- keep a.i. driven vehicles away from people? (https://www.linuxquestions.org/questions/general-10/opinion-keep-a-i-driven-vehicles-away-from-people-4175697064/)

rico001 06-29-2021 04:57 PM

opinion -- keep a.i. driven vehicles away from people?
 
Should we keep a.i. driven vehicles away from people?

frankbell 06-29-2021 07:48 PM

Given their track record, yes.

And away from the highways.

sundialsvcs 06-29-2021 07:55 PM

Absolutely. I don't want to be killed by a computer.

enorbet 06-29-2021 11:26 PM

LOL you guys. Have you considered sickness, hunger, cell phones, drugs/alcohol, and all the other things that distract human drivers but have zero effect on AI? The proof is in the pudding. There is plenty of data on autopilot accidents (including aircraft, for decades now), and because of the fear level around cars, some sites have popped up as global repositories covering ALL autopilot-capable cars, even separating accidents that occurred while autopilot was engaged from those when it was not.

Here's a sample from tesladeaths.com

Quote:

Originally Posted by Non-Affiliated Survey - 2021
Tesla Deaths Total as of : 192 | Tesla Autopilot Deaths Count: 6

This doesn't mean I'm in favor of AI-piloted vehicles. The real danger is that a time will very likely arrive when few humans even know how to drive. Hmmm, although that might also result in more walking and bicycling.... ;)
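
Just for scale, here's a quick back-of-the-envelope on those two figures. It says nothing about how many miles were driven with Autopilot engaged versus not, so take it for what it is:

Code:

# Rough share of the quoted Tesla deaths where Autopilot was reported engaged.
# This ignores exposure (miles driven with vs. without Autopilot), so it can't
# by itself say whether Autopilot is safer or riskier.
total_tesla_deaths = 192   # figure quoted above
autopilot_deaths = 6       # figure quoted above
print(f"{autopilot_deaths / total_tesla_deaths:.1%}")   # -> 3.1%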

Trihexagonal 06-29-2021 11:40 PM

Absolutely not.

There are already autonomous semi-trucks being tested on public highways with a rider in the passenger seat.

In case Deus Ex kicks in while they're on board.

fatmac 06-30-2021 03:37 AM

Quote:

The real danger is that a time will very likely arrive when few humans even know how to drive.
From my experience, there are quite a few on the roads now!


Quote:

Hmmm, although that might also result in more walking and bicycling....
That would be better for all of us (we're getting very lazy these days), a massive boost to our health!

hazel 06-30-2021 05:18 AM

The problem is that people will get used to daydreaming in their AI-driven cars, not watching the road and so on. And then suddenly the AI will say, "Emergency situation detected. You must take over NOW!" What are the odds that the driver will panic?

michaelk 06-30-2021 08:15 AM

My father owns a Tesla and I drive it a lot, but it is not an autonomous, self-driving vehicle. Although some accidents are probably due to the Autopilot itself, my guess is that most involve some kind of driver error or a misunderstanding of its limitations. And you can't really count the idiots who bypass the hands-on-steering-wheel detection and sit in the back seat... True self-driving vehicles like Waymo or Uber are something completely different.

Many cars now have adaptive cruise control with lane centering and stop-and-go capability, and some of them lack any kind of driver awake/alert detection feature. Some have a camera to detect whether the driver is awake, but I suppose you could bypass that somehow as well.
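
As a rough sketch, the hands-on-wheel detection is basically just an escalation timer on the steering input. Something like this (thresholds made up, obviously not any manufacturer's actual code):

Code:

# Toy escalation timer for a hands-on-wheel check. Thresholds and actions are
# illustrative assumptions only.
WARN_AFTER_S = 15        # nag the driver after this long without steering input
DISENGAGE_AFTER_S = 60   # give up and slow the car after this long

def next_action(seconds_since_torque):
    if seconds_since_torque > DISENGAGE_AFTER_S:
        return "disengage_and_slow"
    if seconds_since_torque > WARN_AFTER_S:
        return "warn_driver"
    return "keep_assisting"

print(next_action(5))    # keep_assisting
print(next_action(30))   # warn_driver
print(next_action(90))   # disengage_and_slow

Which is part of why it's so easy to bypass.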

jmgibson1981 06-30-2021 12:03 PM

There aren't any real hard numbers yet, but I'd guess the "problems" caused by AI vehicles are, percentage-wise, a fraction of the problems caused by human drivers. People don't seem to bat an eye at other numbers:

https://crashstats.nhtsa.dot.gov/Api...es%20in%202018.

Alcohol was involved in 29% of vehicle deaths in 2018: 10,511 deaths, one every 50 minutes in the US.

I'd guess AI would be far lower than that. People are just afraid because it's new. Years ago trains ran at the "breakneck speed" of 25 mph and people freaked. Now bullet trains easily push 250+ mph or so. Same with cars. It's always the same story: the sky is falling. Then, after a number of years and with the kinks worked out, they're exponentially safer than they've ever been. The same will happen with AI vehicles.
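
That "one every 50 minutes" figure checks out against the yearly total, by the way:

Code:

# Sanity check on the NHTSA numbers quoted above.
deaths_2018 = 10_511               # alcohol-involved traffic deaths, 2018
minutes_per_year = 365 * 24 * 60   # 525,600
print(round(minutes_per_year / deaths_2018))   # -> 50 minutes between deaths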

EdGr 06-30-2021 04:40 PM

The real problem is that human drivers are unpredictable, often doing dangerous or illegal maneuvers. AI drivers are not going to be successful until all drivers follow the rules. This means getting human drivers off of the roads.
Ed

sundialsvcs 06-30-2021 05:11 PM

If you're going to be running "computer-controlled, automatic" vehicles, then you're going to have to build a parallel railroad to run them on.

Trihexagonal 06-30-2021 06:34 PM

I know a guy who works for TuSimple and this is cutting edge AI technology:


https://www.youtube.com/watch?v=f2aocKWrPG8

frankbell 06-30-2021 07:46 PM

Quote:

The real problem is that human drivers are unpredictable, often doing dangerous or illegal maneuvers. AI drivers are not going to be successful until all drivers follow the rules.
And that's not going to happen.

sundialsvcs 06-30-2021 08:12 PM

"I'm the Human!" Suck It Up! :D

I refuse to have it inscribed upon my tombstone: "He was killed by Abort? Retry? Ignore?"

frankbell 06-30-2021 10:03 PM

If you look at the accidents involving driverless vehicles (including vehicles in which the purported driver was not paying attention), you will see that many of them include situations which the AI did not anticipate. Because it did not anticipate them, it did not react to them.

I must admit that I am skeptical that AI will reach a point at which it will be able to anticipate all the screwy things that drivers do.

I will draw an analogy. I worked for the railroad for many years, and railroads have lots of safety rules (the first of which is that "safety is of the first importance in the discharge of duty"). One of the things I learned is that every safety rule was a reflection of an injury or death. In other words, the rules were a consequence, not an anticipation.

In building AI for self-driving vehicles, each algorithm is a consequence, not an anticipation. If the programmers don't expect a bicyclist to be on the wrong side of the road, they won't build in a rule for it. If they don't expect a pedestrian to jaywalk, they won't build in a rule for it. If they don't expect a delivery truck to double-park because no parking space is available, they won't build in a rule for it. If they don't expect that a driver will be going the wrong way on a limited access highway, they won't build a rule for it. If they don't think a skateboarder might be on the street, they won't build a rule for it.
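
To put that in code: a hand-written rule set can only react to the situations somebody already thought of, and everything else falls through to some generic default. A toy illustration, not how any real self-driving stack actually works:

Code:

# Toy illustration of "rules are a consequence, not an anticipation".
# Every branch exists because someone foresaw (or already hit) that situation;
# anything not on the list falls through to a generic default.
def plan_action(scene):
    if scene == "pedestrian_in_crosswalk":
        return "stop"
    if scene == "cyclist_in_bike_lane":
        return "slow_and_give_room"
    if scene == "double_parked_truck":
        return "wait_or_change_lane"
    # Wrong-way driver? Skateboarder? Jaywalker at night? If nobody wrote a
    # rule, the system can only do something generic.
    return "maintain_speed"

print(plan_action("skateboarder_in_lane"))   # -> "maintain_speed"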

Heck, we can't even take the prejudice out of facial recognition.

It's not the computers I distrust. It's the programmers, because they are human. And flawed. Like you and me.

I seriously doubt that programmers will be able to anticipate everything that can go wrong on the public highways. I seriously believe that "self-driving" vehicles on public thoroughfares are a pipe-dream of the computer-enamored, and a hazard to real live human beings driving their automotive conveyances.

