General — This forum is for non-technical general discussion, which can include both Linux and non-Linux topics. Have fun!
Self-driving cars have already logged a million and a half driverless miles on the roads here in California, and the state already has legislated rules for self-driving vehicles.
Quote:
Why would you want a driverless car in the first place? Driving is one of life's great pleasures.
Enjoying the scenery that you're driving through is also a pleasure, and you certainly can see more as a passenger. And LA freeway driving usually doesn't qualify as "one of life's great pleasures."
Anyway, consumer self-driving cars are in the works. Mercedes exhibited a self-driving car at CES recently. Forecasts are for 10-20 million on the road in the next decade.
I am sure that self driving cars still have to be insured.
So, the insurance company would be responsible.
OK, whose insurance company? I can see the car owner's insurance refusing to pay up on the grounds that it wasn't the car that had failed but the onboard computer or the software it was running. Those things might well be excluded by the policy conditions.
There is already a documented case of a driver being killed when his self-driving car slammed into a white truck.
The driver of the truck was not seriously injured, because it was, well, a truck.
It would have been an entirely different matter if it had slammed into a small car, or a child. . . .
In any case, "the legal rights of everyone else on the road" include "being protected from this mad, Tom Swift idea." Likewise robot package-delivery drones, et al. Someday, someone's gonna get killed, just for having stood in the wrong place at the wrong time. And their families are going to have "a prodigious legal case," and they are going to win it.
Anything new has bugs to work out, but over time change is going to happen. (Like maybe not needing billions of frivolous gas guzzlers in the first place, and whatever happened to staggering shifts? Damn rush hours!)
Imagine a closed system where all the vehicles are required to communicate and are confined (e.g., to freeways): the chance of accidents would drop staggeringly!
Imagine a closed system where all the vehicles are required to communicate and are confined (e.g., to freeways): the chance of accidents would drop staggeringly!
Uh huh ... but only if the software worked, flawlessly, 100% of the time.
Trouble is, you're now putting thousands of human lives at risk ... at risk of death ... whether or not(!) they are driving or riding in one of those cars. You're exposing people who merely "happen to be driving at the wrong place at the wrong time" to the risk of being killed by a software implement that does not possess human judgment.
This, to me, should be made illegal. National-level laws should be passed throughout the country mandating that all vehicles present on public roadways must be controlled exclusively(!) by human operators, who are legally responsible for their actions.
(To ensure that the legislation actually gets passed, the public should "Crowd-Fund" an initiative to openly bribe the public officials in question, in an amount greater than Google (etc.) is capable of or willing to pay to them to do otherwise. "Yes, public law-making has de-evolved into a simple auction, wherein only Money talks." Therefore, let the public purchase common sense, so that billions of dollars of "Google Money" cannot inflict life-threatening dangers upon the unsuspecting general public. It will take billions of dollars to counter the billions that are already being spent to buy influence.)
But way less than now (especially by those who cheat the DMV, or on costs, &c). The "cars" would not need to be on the Web, just short-range communication with the immediately surrounding cars. "We" build and run dangerous machines every day, but automation (built by humans) fails way less, plain and simple. IMHO, beyond that, some seem to be just superstitious, or perhaps paranoid, fearing change; like we all need to get to the knick-knack factories ("they took our jobs?"). (Or, becoming fat like on WALL-E?)
Transparency is always an awesome tool for us (except where basic human privacy is concerned); too bad we can't legally get it from many "corporations.gov" ... Mono(Irony)poly!
Basically, I assert that it is my(!) prerogative to demand that my life not be forfeit to a programming error in a machine that is hurtling past me at 70 miles per hour.
And, I would include(!) in that assertion a great amount of the software that today is already embedded in vehicles ... software, the powers of which the vehicle owners (and lawmakers) "know not of."
We have wholesale-swallowed the mechanical-Utopian notion that "anything that we are technically capable of inventing and deploying, should be." Not so. Not so.
We have wholesale-swallowed the mechanical-Utopian notion that "anything that we are technically capable of inventing and deploying, should be." Not so. Not so.
Yes ... "Just because you can, doesn't mean you (necessarily) should."
I am highly sceptical that driverless (or robo-driven) cars and human-driven cars can coexist on the same roadways. (I can see driverless cars in a segregated environment. Heck, the Washington, D.C., Metro is for practical purposes a driverless train system; there are train drivers, but they are there to make sure the computers do their jobs.)
Humans will always come up with ways to drive stupidly that even the latest algorithm will not anticipate.
In the case of the poor guy in the Tesla, he disregarded Tesla's own advice and took his eyes off the road, then encountered a situation that the algorithm did not anticipate: a white truck-trailer sitting directly crosswise, against a background from which it did not stand out.
If you live by the algorithm, you may die by the algorithm.