LinuxQuestions.org

Autonomous systems: why they might not work as planned (https://www.linuxquestions.org/questions/general-10/autonomous-systems-why-they-might-not-work-as-planned-4175595163/)

hazel 12-10-2016 01:48 AM

Autonomous systems: why they might not work as planned
 
I've just seen an item in the BBC News tech magazine programme (it's called Click) which worried me. They were talking about autonomous systems and why the current model, in which these systems carry out routine operations but hand over control to humans in an emergency, will not work in practice.

The problem is simple: humans can't cope with emergencies unless they have plenty of experience of working the machinery in non-emergency situations. Sullenberger was able to land his plane on the Hudson because he was a very experienced pilot. But automation will take away the opportunity for people to acquire this kind of experience.

Imagine it! You're cruising along in your car, a cup of coffee in one hand, a bagel in the other, reading the newspaper, while the car drives itself as it has done for months. Suddenly a voice from the console says "Emergency! Automated system shutting down. Over to manual control." You look up in horror and there's a bus coming straight at you.

What do you do? You crash, that's what.

Myk267 12-10-2016 04:09 AM

People already get crashed into all the time while they're driving and paying attention.

I think it's interesting that the predicted failure case in the given scenario is the computer handing over control to the driver during a catastrophic system failure. A few months back there was some robo-car news about some poor guy whose car auto-drove through the side of a truck, not because the computer gave up control, but because the computer didn't sense anything in the first place.

How often does a computer suffer a catastrophic failure? They're pretty reliable. How often does software crash* because something unexpected happened? All the time.

*Edit: Not always a crash. Sometimes it's using up all the CPU or RAM, or some other odd behavior, like the program ending up in an "impossible" state.
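To give a toy illustration of the kind of "impossible" state I mean (a made-up Python sketch, nothing to do with any real vehicle code):

Code:

# Toy example: the author assumes speed is never negative, so the
# "reverse" case is treated as impossible and never handled.
def describe_motion(speed_kmh):
    if speed_kmh > 0:
        return "moving forward"
    elif speed_kmh == 0:
        return "stopped"
    # A sensor glitch or unit-conversion bug can still hand us a
    # negative number; the program doesn't crash, it quietly returns
    # None and whatever uses that value starts behaving oddly.

print(describe_motion(60))   # moving forward
print(describe_motion(-3))   # None -- the "impossible" state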

273 12-10-2016 05:07 AM

To me the issue with automated systems is simply one of liability. With a car I'm driving (for example), if it crashes and kills a few people I am likely liable, but if an automated car does the same, who is liable?
Until that fundamental question is resolved, for most people it is likely safer to take control oneself since, at least, the cause and effect will be under one's control.
When the first driverless car kills somebody that will be a very important legal event.

hazel 12-10-2016 06:51 AM

Quote:

Originally Posted by Myk267 (Post 5640202)
I think it's interesting that the predicted failure case in the given scenario is the computer handing over control to the driver during a catastrophic system failure.

How often does a computer suffer a catastrophic failure? They're pretty reliable. How often does software crash because something unexpected happened? All the time.

He wasn't talking about the system failing. He was talking about the system doing what it was supposed to do. People always promise us that autonomous systems will let us take back control in potentially dangerous situations. The point is that we won't be able to cope with those situations if we're too accustomed to machines doing everything for us.

He gave the example of that French plane that crashed into the Atlantic in 2009 with the loss of all aboard. When it was finally located and the black boxes retrieved, it was found that the pilot had crashed the plane because he got confused while hand-flying at high altitude. He wasn't used to having to do that because the autopilot usually did it.

ntubski 12-10-2016 08:54 AM

Quote:

Originally Posted by hazel (Post 5640232)
People always promise us that autonomous systems will let us take back control in potentially dangerous situations.

Um, have they? I don't recall such promises. Here are some promises for the opposite:

Coming in 2021: A self-driving Ford car with no steering wheel or pedals

Google self-driving car: "We removed the steering wheel and pedals, and instead designed a prototype that lets the software and sensors handle the driving."

michaelk 12-10-2016 09:05 AM

The Airbus crash is an extreme example because it was caused by both human error and mechanical problems. It was much more complicated than just the pilot flying at high altitude without the autopilot. Pilots do practice simulated emergency conditions to help them quickly recognize the situation when it happens for real and to develop an automatic response.

In the US, after we get a driver's license we don't have to take recurrent training or do much of anything else, except maybe retake the written test (for what it is worth), to make sure we drive safely. I agree that it would take many seconds for the human mind to figure out what to do, and by that time it is too late to prevent an accident. On the other hand, if all vehicles and traffic signals communicated with each other then an accident would be less likely to happen. Most accidents are caused by human error rather than mechanical failure.

sundialsvcs 12-11-2016 07:31 PM

Well, we have already experienced human death(!) caused by "self-driving computers."

As I recall the event, a white truck turned out in front of the car ... whose human driver was not paying attention. The computer's scanner did not recognize the presence of the truck, so the car ran underneath it ... killing the human (non-)driver.

I have no objection to "autonomous systems," as long as they do not share a roadway with me.

Because, after all ... "that non-driven(!) vehicle just smashed into me!"

How lucky we are, that this collision, although fatal to the driver of the car, was not also fatal to the driver of the other vehicle ... which happened to be a great big (white) truck. What if it had instead been a white minivan filled with a family and their four kids, all of whom were "wiped out by a driverless vehicle that slammed into them," when any human driver would have just "hit the horn(!), stepped on the brakes, and changed lanes"? :jawa:

273 12-12-2016 12:45 AM

I think the fact the car killed the "driver" and nobody else probably allowed Tesla to deny all responsibility. It's when a third party is killed and the driver is facing huge costs or jail time that things will heat up or, if they don't, it will be obvious that the government is being bribed by somebody.

sundialsvcs 12-12-2016 11:28 AM

I'm not willing to be present on any public highway in which any of the drivers around me are not drivers.

I'm also not willing to share my breathing-space with flying things that are not being piloted by people.

hazel 12-12-2016 11:31 AM

Quote:

Originally Posted by sundialsvcs (Post 5641084)
I'm not willing to be present on any public highway in which any of the drivers around me are not drivers.

I'm also not willing to share my breathing-space with flying things that are not being piloted by people.

So what exactly are you proposing to do about it?

273 12-12-2016 12:19 PM

To my mind the solution could be simple -- make the human designated as "driver" responsible. It's up to them whether they self-pilot, go full-auto or choose some other option.
Now, I think I can anticipate the question "What happens if the producer of the vehicle is negligent or criminal?". Well, we know they are, so..?

sundialsvcs 12-12-2016 01:54 PM

Dunno ... daresay that "our first human example" had just a couple of seconds to realize ... "oh, sh*t!!" ... that the high-technology of which he apparently was so fond had just fatally failed him.

How, then, could we seriously have held him responsible (had he survived, of course ...), for having merely "embraced" a technological innovation(?) that he had, apparently, "not only 'embraced,' but even 'championed?'"

After all: "we proffered this high-technology to him, in the first place." We encouraged him to use it. We implicitly told him that it was 'safe,' as evidenced by the fact that he embraced it (publicly!), and actually believed it ... up to, of course, that fatal "oh, sh*t!!" moment.

"Yeah, he was a high-tech fool." But ... who, exactly, induced him(!!!) ... to become so foolish? His family should right now be engaging lawyers, and those lawyers should be filing multi-billion dollar lawsuits ... (IMHO).

- - -
"Y'know, yeah," on the one hand, "all this new-fangled technology is 'just wunnerful.'" Sure, it definitely is 'exciting' that "we now live in interesting times." But are we showing that we are actually prepared, as a human society to live (and, to survive(!)) in such times?

Frankly, I wonder . . . . .

273 12-12-2016 02:59 PM

Nobody induced him, in the same way that nobody induces one to get drunk and drive at 200MPH.
If somebody wants to commit suicide then so be it, but if they're happy to contemplate manslaughter that's a different story.
To my mind liability has to lie somewhere, and there isn't a chance in hell that the corporations in charge of our countries will let it lie anywhere near them. So, if you're happy to inflict on the world, for your own gratification, a new technology created by an entity that exists purely to make money for its shareholders, then I deem you responsible if that technology kills somebody.

Don't get me wrong, I'm sure in a few hundred years*, when every vehicle is automated and enough people have lost their lives and families, we'll be in a better situation, but for now it just looks grim.


*Make up your own timeline, I just mean "Definitely not in my lifetime and, I think, longer than it ought to take."

cynwulf 12-13-2016 08:33 AM

As with electric cars, driverless cars are controversial. There is a fear of losing control and independence. For many, the car is about much more than just four-wheeled transport.

I sit on the fence on this issue, but the status quo is far from perfect. Deaths on the roads in any given country in the world are already unacceptably high. It's an accepted problem with no solution because the gains of having and driving your own car are "worth it" and it's always others who die (and of course the global multi-billion $ automobile and oil industries profit, so it's all good).

Sharing your roads with driverless cars may seem objectionable to some, but you also share them with drunks, overworked and tired people, partially sighted octogenarians, plus the multitudes of idiots who are 'better' drivers than everyone else and can of course drive while on the phone (and faster).

On any given motorway, you are trusting all of these personality types to do the right thing - you just don't think about the risks when you're hurtling along at 70MPH because the bad things happen to others.

273 12-13-2016 01:13 PM

OK, as a slightly contentious point -- what are humans doing walking on roads?
I would need statistics, but I can make a pretty decent guess that a significant number of people hit by cars were hit because they were being idiots -- I lump myself in with the idiots here.
Making roads more like railways, though, is almost impossible to do usefully -- at least as far as I can think.

