Self-Driving Cars, aka Accidents Waiting to Happen
The Sunday New York Times had an interesting article as to why the paradise of self-driving cars has not materialized.
The short version is that, when the rubber hits the road, creating an automatic auto that can coexist with humans--drivers, cyclists, pedestrians, highway workers, delivery persons, and so on--is a lot more complicated than the rosy-eyed optimists working on them envisioned in their computer simulations.
Quote:
The Sunday New York Times had an interesting article as to why the paradise of self-driving cars has not materialized...
And, I sincerely hope, never will; bearing in mind that the industry has a history of deciding, by cost analysis, that being sued by victims is cheaper than fixing a known safety problem.
(thinks about mentioning recent aircraft software no-nos, dismounts and stables hobby horse instead)
Actually, IMHO, it will happen in time. Human drivers already cause a staggering number of accidents, injuries, and deaths as it is. I think it is likely that at first self-driving will be allowed on certain roads and disallowed on others, but more and more roads will become allowed as the AI improves. My concern is for a future I won't live to see. At some point almost nobody will know how to drive, and then most humans will be completely at the mercy of technology, and of those who control it, for any manner of mobility. That will likely have very serious consequences.
A lot of the London rail system has been automated with no problems, and that's on routes with a train every few minutes. I suspect that such a service is more carefully checked than a Tasmanian freight train.
1) What will stop the car system getting hacked by ill-intentioned persons?
2) If someone gets killed by a self-driving car, who is legally responsible? The car owner? The manufacturer? The AI programmer?
Quote:
1) What will stop the car system getting hacked by ill-intentioned persons?
Most self-driving vehicle manufacturers and software companies have vague "safety" plans, including cybersecurity.
The problem is that not enough information is being shared among companies. They are all desperate to get their vehicles out first. More collaboration would help ease the public's skepticism about these vehicles.
Quote:
2) If someone gets killed by a self-driving car, who is legally responsible? The car owner? The manufacturer? The AI programmer?
That is a great question. I haven't seen any news media or research groups look into that. I'm sure the problem lies in the varying rules and regulations each state imposes on self-driving cars.
Last edited by WideOpenSkies; 07-24-2019 at 09:42 AM.
Have they? Then why do the train drivers keep striking? Surely we can do without them.
They are on board to open and close the doors, and they could always lead people to safety if the train broke down in a tunnel. They go on strike just to flex their muscles and remind us to continue paying them twice the income of the average passenger. Completely driverless trains are common on the Continent, as on the Paris Metro.
As with the subway and other automation, as soon as solid evidence shows that there would be fewer injuries and deaths from automated vehicles than from human-controlled vehicles, even if at first only under very specific conditions, we are on the way to growing adoption. That will very likely occur rather soon, in my estimation.
Quote:
2) If someone gets killed by a self-driving car, who is legally responsible? The car owner? The manufacturer? The AI programmer?
I guess it depends on the legislation of each country, but I'd say that if it's a human error (by the AI programmer), the manufacturer should be legally responsible for not establishing proper procedures to avoid this type of incident. However, if it's a deed done in bad faith, then the employee should be legally responsible (both to the law and to the company itself).
Quote:
1) What will stop the car system getting hacked by ill-intentioned persons?
2) If someone gets killed by a self-driving car, who is legally responsible? The car owner? The manufacturer? The AI programmer?
1) Safety precautions and countermeasures? There has to be a way of preventing hacking in the first place. Why not create a system similar to a LiveCD/DVD OS? Read-only?
2) That I don't know, but those who do can answer. It's a legal issue.
3) https://www.youtube.com/watch?v=iHzzSao6ypE
Just FTR, Arcane, absolute security is only possible if the system is entirely isolated and cannot communicate with anything. If you think a Live OS, by virtue of being loaded from a read-only disk, is impenetrable, you are quite mistaken. First, there is the firmware of the underlying hardware, which must run for the system to work at all; second, the fact that Live OSes are loaded into a RAM drive creates vulnerability. Being on a read-only disk only ensures that most evidence that you were hacked will disappear when the system is shut down or rebooted.
Self-driving cars cannot be any more secure than any other computer service, but that does not mean they can't be far more reliable than people.