Old 12-10-2016, 01:48 AM   #1
hazel
LQ Guru
 
Registered: Mar 2016
Location: Harrow, UK
Distribution: LFS, AntiX, Slackware
Posts: 7,494
Blog Entries: 19

Rep: Reputation: 4410
Autonomous systems: why they might not work as planned


I've just seen an item in the BBC News tech magazine programme (it's called Click) which worried me. They were talking about autonomous systems and why the current model, in which these systems carry out routine operations but hand over control to humans in an emergency, will not work in practice.

The problem is simple: humans can't cope with emergencies unless they have plenty of experience of working the machinery in non-emergency situations. Sullenberger was able to land his plane on the Hudson because he was a very experienced pilot. But automation will take away the opportunity for people to acquire this kind of experience.

Imagine it! You're cruising along in your car, a cup of coffee in one hand, a bagel in the other, reading the newspaper, while the car drives itself as it has done for months. Suddenly a voice from the console says "Emergency! Automated system shutting down. Over to manual control." You look up in horror and there's a bus coming straight at you.

What do you do? You crash, that's what.
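
Put it in programmer's terms and the model being criticised looks roughly like this (a toy Python sketch with made-up names and timings, nothing like anyone's real autopilot code):

Code:
import time

REACTION_WINDOW_SECS = 2.0   # made-up figure: how long the startled human gets

class HandoverAutopilot:
    """Toy model of 'automation does the routine work, human takes over in emergencies'."""

    def __init__(self):
        self.mode = "AUTONOMOUS"
        self.handover_time = None

    def tick(self, situation_understood: bool, human_has_taken_control: bool):
        if self.mode == "AUTONOMOUS" and not situation_understood:
            # The system hits something it cannot handle and dumps the problem on the human.
            print("Emergency! Automated system shutting down. Over to manual control.")
            self.mode = "MANUAL"
            self.handover_time = time.monotonic()
        elif self.mode == "MANUAL" and not human_has_taken_control:
            # The design quietly assumes a driver who is alert and still in practice.
            if time.monotonic() - self.handover_time > REACTION_WINDOW_SECS:
                print("Driver did not recover control in time.")

The whole design stands or falls on that last assumption: the less the human drives, the less likely they are to be ready inside that window.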
 
Old 12-10-2016, 04:09 AM   #2
Myk267
Member
 
Registered: Apr 2012
Location: California
Posts: 422
Blog Entries: 16

Rep: Reputation: Disabled
People already get crashed into all the time, even when they're driving themselves and paying attention.

I think it's interesting that the predicted failure case in the given scenario is the computer handing over control to the driver during a catastrophic system failure. A few months back there was some robo-car news about some poor guy whose car auto-drove through the side of a truck, not because the computer gave up control, but because the computer didn't sense anything in the first place.

How often does a computer suffer a catastrophic failure? They're pretty reliable. How often does software crash* because something unexpected happened? All the time.

*Edit: Not always a crash. Sometimes it's using up all the CPU or RAM, or some other odd behavior, like the program ending up in some "impossible" state.
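
For instance, something like this (a contrived Python toy, not taken from any real system) never crashes at all -- an input the author never expected just leaves the program in a state that was supposed to be impossible:

Code:
def classify_speed(kmh):
    # The author "knows" speed readings are always between 0 and 200.
    if 0 <= kmh <= 50:
        return "town"
    elif 50 < kmh <= 200:
        return "motorway"
    # A negative or absurd reading from a flaky sensor falls through here.
    # Nothing is raised, nothing crashes -- the caller silently gets None.

reading = -3                     # e.g. a glitched sensor value
zone = classify_speed(reading)
print(zone)                      # prints "None"; every decision based on it is now garbage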

Last edited by Myk267; 12-10-2016 at 11:24 AM. Reason: I should have gone to sleep before posting. :)
 
Old 12-10-2016, 05:07 AM   #3
273
LQ Addict
 
Registered: Dec 2011
Location: UK
Distribution: Debian Sid AMD64, Raspbian Wheezy, various VMs
Posts: 7,680

Rep: Reputation: 2373
To me the issue with automated systems is mainly one of liability. With a car I'm driving (for example), if it crashes and kills a few people, I am likely liable; but if an automated car does the same, who is liable?
Until that fundamental question is resolved, for most people it is likely safer to take control oneself, since at least cause and effect will be under one's control.
When the first driverless car kills somebody that will be a very important legal event.
 
Old 12-10-2016, 06:51 AM   #4
hazel
LQ Guru
 
Registered: Mar 2016
Location: Harrow, UK
Distribution: LFS, AntiX, Slackware
Posts: 7,494

Original Poster
Blog Entries: 19

Rep: Reputation: 4410
Quote:
Originally Posted by Myk267
I think it's interesting that the predicted failure case in the given scenario is the computer handing over control to the driver during a catastrophic system failure.

How often does a computer suffer a catastrophic failure? They're pretty reliable. How often does software crash because something unexpected happened? All the time.
He wasn't talking about the system failing. He was talking about the system doing what it was supposed to do. People always promise us that autonomous systems will let us take back control in potentially dangerous situations. The point is that we won't be able to cope with those situations if we're too accustomed to machines doing everything for us.

He gave the example of that French plane that crashed into the Atlantic in 2009 with the loss of all aboard. When it was finally located and the black boxes retrieved, it was found that the pilot had crashed the plane because he got confused while flying at a high altitude. He wasn't used to having to do that because the autopilot usually did it.

Last edited by hazel; 12-10-2016 at 07:14 AM.
 
Old 12-10-2016, 08:54 AM   #5
ntubski
Senior Member
 
Registered: Nov 2005
Distribution: Debian, Arch
Posts: 3,774

Rep: Reputation: 2081
Quote:
Originally Posted by hazel
People always promise us that autonomous systems will let us take back control in potentially dangerous situations.
Um, have they? I don't recall such promises. Here are some promises for the opposite:

Coming in 2021: A self-driving Ford car with no steering wheels or pedals

Google self-driving car: "We removed the steering wheel and pedals, and instead designed a prototype that lets the software and sensors handle the driving."
 
Old 12-10-2016, 09:05 AM   #6
michaelk
Moderator
 
Registered: Aug 2002
Posts: 25,592

Rep: Reputation: 5880
The Airbus crash is an extreme example because it was caused by both human error and mechanical problems. It was much more complicated than the pilot simply flying at high altitude without the autopilot. Pilots do practice simulated emergency conditions to help them quickly recognize an emergency when it happens for real and to develop an automatic response.

In the US, after we get a driver's license we don't have to take recurrent training or do much of anything else, except maybe retake the written test (for what it is worth), to make sure we drive safely. I agree that it would take many seconds for the human mind to figure out what to do, and by that time it is too late to prevent an accident. On the other hand, if all vehicles and traffic signals communicated with each other, an accident would be less likely to happen. Most accidents are caused by human error rather than mechanical failure.
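
The vehicles-talking-to-each-other idea is easy to sketch. Something like this toy Python (an invented message format, not the real V2X/DSRC standards) is conceptually all it takes for a following car to react to hard braking ahead before any human could:

Code:
from dataclasses import dataclass

@dataclass
class SafetyBroadcast:
    # Invented fields, loosely in the spirit of a periodic "here I am" message.
    vehicle_id: str
    position_m: float    # distance along the road, in meters
    speed_mps: float     # meters per second
    hard_braking: bool

def should_brake(me: SafetyBroadcast, other: SafetyBroadcast, gap_limit_m: float = 50.0) -> bool:
    """React if the car ahead is braking hard and the gap is closing."""
    gap = other.position_m - me.position_m
    closing = me.speed_mps - other.speed_mps
    return other.hard_braking and 0 < gap < gap_limit_m and closing > 0

# Example: the car 30 m ahead slams on its brakes while we close at 20 m/s.
ahead = SafetyBroadcast("CAR-A", position_m=130.0, speed_mps=10.0, hard_braking=True)
mine = SafetyBroadcast("CAR-B", position_m=100.0, speed_mps=30.0, hard_braking=False)
print(should_brake(mine, ahead))   # True -- the machine reacts in milliseconds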
 
Old 12-11-2016, 07:31 PM   #7
sundialsvcs
LQ Guru
 
Registered: Feb 2004
Location: SE Tennessee, USA
Distribution: Gentoo, LFS
Posts: 10,610
Blog Entries: 4

Rep: Reputation: 3905
Well, we have already experienced human death(!) caused by "self-driving computers."

As I recall the event, a white truck turned out in front of the car ... whose human driver was not paying attention. The computer's scanner did not recognize the presence of the truck, so the car ran underneath it ... killing the human (non-)driver.

I have no objection to "autonomous systems," as long as they do not share a roadway with me.

Because, after all ... "that non-driven(!) vehicle just smashed into me!"

How lucky we are, that this collision, although fatal to the driver of the car, was not also fatal to the driver of the other vehicle ... which happened to be a great big (white) truck. What if it had instead been a white minivan filled with a family and their four kids, all of whom were "wiped out by a driverless vehicle that slammed into them," when any human driver would have just hit the horn(!), stepped on the brakes, and changed lanes?
 
Old 12-12-2016, 12:45 AM   #8
273
LQ Addict
 
Registered: Dec 2011
Location: UK
Distribution: Debian Sid AMD64, Raspbian Wheezy, various VMs
Posts: 7,680

Rep: Reputation: 2373
I think the fact the car killed the "driver" and nobody else probably allowed Tesla to deny all responsibility. It's when a third party is killed and the driver is facing huge costs or jail time that things will heat up or, if they don't, it will be obvious that the government is being bribed by somebody.
 
Old 12-12-2016, 11:28 AM   #9
sundialsvcs
LQ Guru
 
Registered: Feb 2004
Location: SE Tennessee, USA
Distribution: Gentoo, LFS
Posts: 10,610
Blog Entries: 4

Rep: Reputation: 3905
I'm not willing to be present on any public highway in which any of the drivers around me are not drivers.

I'm also not willing to share my breathing-space with flying things that are not being piloted by people.
 
Old 12-12-2016, 11:31 AM   #10
hazel
LQ Guru
 
Registered: Mar 2016
Location: Harrow, UK
Distribution: LFS, AntiX, Slackware
Posts: 7,494

Original Poster
Blog Entries: 19

Rep: Reputation: 4410
Quote:
Originally Posted by sundialsvcs
I'm not willing to be present on any public highway in which any of the drivers around me are not drivers.

I'm also not willing to share my breathing-space with flying things that are not being piloted by people.
So what exactly are you proposing to do about it?
 
Old 12-12-2016, 12:19 PM   #11
273
LQ Addict
 
Registered: Dec 2011
Location: UK
Distribution: Debian Sid AMD64, Raspbian Wheezy, various VMs
Posts: 7,680

Rep: Reputation: 2373
To my mind the solution could be simple -- make the human designated as "driver" responsible. Up to them whether they self-pilot, go full-auto or choose some other option.
Now, I think I can anticipate "What happens if the producer of the vehicle is negligent or criminal?". Well, we know they are, so..?
 
Old 12-12-2016, 01:54 PM   #12
sundialsvcs
LQ Guru
 
Registered: Feb 2004
Location: SE Tennessee, USA
Distribution: Gentoo, LFS
Posts: 10,610
Blog Entries: 4

Rep: Reputation: 3905
Dunno ... daresay that "our first human example" had just a couple of seconds to realize ... "oh, sh*t!!" ... that the high-technology of which he apparently was so fond had just fatally failed him.

How, then, could we seriously have held him responsible (had he survived, of course ...), for having merely "embraced" a technological innovation(?) that he had, apparently, "not only 'embraced,' but even 'championed?'"

After all: "we proffered this high-technology to him, in the first place." We encouraged him to use it. We implicitly told him that it was 'safe,' as evidenced by the fact that he embraced it (publicly!), and actually believed it ... up to, of course, that fatal "oh, sh*t!!" moment.

"Yeah, he was a high-tech fool." But ... who, exactly, induced him(!!!) ... to become so foolish? His family should right now be engaging lawyers, and those lawyers should be filing multi-billion dollar lawsuits ... (IMHO).

- - -
"Y'know, yeah," on the one hand, "all this new-fangled technology is 'just wunnerful.'" Sure, it definitely is 'exciting' that "we now live in interesting times." But are we showing that we are actually prepared, as a human society to live (and, to survive(!)) in such times?

Frankly, I wonder . . . . .

Last edited by sundialsvcs; 12-12-2016 at 02:03 PM.
 
Old 12-12-2016, 02:59 PM   #13
273
LQ Addict
 
Registered: Dec 2011
Location: UK
Distribution: Debian Sid AMD64, Raspbian Wheezy, various VMs
Posts: 7,680

Rep: Reputation: 2373
Nobody induced him, in the same way that nobody induces one to get drunk and drive at 200MPH.
If somebody wants to commit suicide then so be it, but if they're happy to contemplate manslaughter that's a different story.
To my mind liability has to lie somewhere, and there isn't a chance in hell that the corporations in charge of our countries will let it lie anywhere near them. So, if you're happy to inflict on the world, for your own gratification, a new technology created by an entity that exists purely to make money for its shareholders, then I deem you responsible if that technology kills somebody.

Don't get me wrong, I'm sure in a few hundred years*, when every vehicle is automated and enough people have lost their lives and families, we'll be in a better situation, but for now it just looks grim.


*Make up your own timeline; I just mean "Definitely not in my lifetime and, I think, longer than it ought to take."
 
Old 12-13-2016, 08:33 AM   #14
cynwulf
Senior Member
 
Registered: Apr 2005
Posts: 2,727

Rep: Reputation: 2367
As with electric cars, driverless cars are controversial. There is a fear of losing control and independence. For many the car is about much more than just four wheel transport.

I sit on the fence on this issue, but the status quo is far from perfect. Deaths on the roads in any given country in the world are already unacceptably high. It's an accepted problem with no solution because the gains of having and driving your own car are "worth it" and it's always others who die (and of course the global multi-billion $ automobile and oil industries profit, so it's all good).

Sharing your roads with driverless cars may seem objectionable to some, but you also share them with drunks, overworked and tired people, partially sighted octogenarians, plus the multitudes of idiots who are 'better' drivers than everyone else and can of course drive while on the phone (and faster).

On any given motorway, you are trusting all of these personality types to do the right thing - you just don't think about the risks when you're hurtling along at 70MPH because the bad things happen to others.
 
Old 12-13-2016, 01:13 PM   #15
273
LQ Addict
 
Registered: Dec 2011
Location: UK
Distribution: Debian Sid AMD64, Raspbian Wheezy, various VMs
Posts: 7,680

Rep: Reputation: 2373
OK, as a slightly contentious point -- what are humans doing walking on roads?
I would need statistics, but I can make a pretty decent guess that a significant number of people hit by cars were hit because they were being idiots -- I lump myself in with the idiots here.
Making roads more like railways, though, is almost impossible to do usefully -- at least as far as I can think.
 
  

