Self driving cars and morality
I think this is something that never occurred to me before, but it's a real issue for self driving cars.
And that raises some difficult issues. How should the car be programmed to act in the event of an unavoidable accident? Should it minimize the loss of life, even if it means sacrificing the occupants, or should it protect the occupants at all costs? Should it choose between these extremes at random?
Humans will generally put themselves above other people, but the question is: should their computerised chauffeur?
I was always under the impression that the driver of a vehicle must be in total control of the vehicle when it is in motion. How does that square with the parking systems offered by Ford / Jeep etc.? Yes, we are now in the 21st century, but unless the law has changed (it may well have, so apologies if that is the case), surely a driver using park assist could still be charged with not being in control of the vehicle.
We were talking about this at work the other day. With the ethical dilemma, it's not as simple as "it's better that 1 person dies than more than 1".
Does age come into the equation, gender etc.? What about calculating the age, gender, and number of people in the vehicle vs. potential targets? Would the time taken to calculate all that be too long?
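Just to picture how crude any such calculation would have to be, here's a minimal sketch. Every name, weight, and scenario in it is invented for illustration; no real manufacturer's logic looks like this:

```python
# Purely hypothetical sketch of a "minimise harm" chooser.
# All weights and scenarios here are made up for illustration.

def harm_score(people):
    """Estimated harm of an option: here, a plain head-count.

    This is where the dilemma lives -- weighting by age, gender,
    or anything else would go here, and each extra factor adds
    data to gather and time to compute.
    """
    return len(people)

def choose_action(options):
    """Pick the option with the lowest estimated harm score."""
    return min(options, key=lambda opt: harm_score(opt["people_at_risk"]))

options = [
    {"name": "brake straight", "people_at_risk": ["child"]},
    {"name": "swerve into van", "people_at_risk": ["occupant", "van driver"]},
]
print(choose_action(options)["name"])  # -> brake straight
```

The uncomfortable part is that a simple head-count picks hitting the child over risking two adults, which is exactly why "fewest casualties wins" is not the whole story, and why anything more nuanced raises the question above of whether it can be computed in time.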
The article that article links to mentions similar issues:
As the technology advances, however, and cars become capable of interpreting more complex scenes, automated driving systems may need to make split-second decisions that raise real ethical questions.
At a recent industry event, Gerdes gave an example of one such scenario: a child suddenly dashing into the road, forcing the self-driving car to choose between hitting the child or swerving into an oncoming van.
“As we see this with human eyes, one of these obstacles has a lot more value than the other,” Gerdes said. “What is the car’s responsibility?”
Gerdes pointed out that it might even be ethically preferable to put the passengers of the self-driving car at risk. “If that would avoid the child, if it would save the child’s life, could we injure the occupant of the vehicle? These are very tough decisions that those that design control algorithms for automated vehicles face every day,” he said.
I think most people would hit the brakes, swerve out of the way, do whatever necessary to avoid hitting a child, or anyone else, without thinking, because you may not have time to check mirrors etc. :(
Morality is an interesting question, but that is the future. Consider the present.....
The age of losing control of your car to car electronics is here now, and you have to choose whether you want these aids or not.
Start/stop tech.... It is now being forced on you by green legislation. Most new cars have it, and most do have an off button, but the setting is not permanent and must be pressed every journey to disable it.
Some manufacturers, such as BMW with the 2 Series, have been pushed by car owners' forums to change this, so that the driver can decide whether to leave it disabled for good.
I personally hate start/stop, having owned pathetic Fords in the past which cut out at the lights due to crummy and damp ignition systems, so I never want an engine to stop on me. I also think there are dangers in the car stopping and starting outwith your control, as it has to meet computer parameters, and could cause delays at dangerous busy junctions, etc.
Auto parking...... Watching, for example, the Range Rover Evoque forum website, most owners who have this option do not use it, as they are worried it will go wrong during reverse or parallel parking and hit another car, a person, or a kid. In fact, they are not sure how insurance covers an automated accident. The same would apply to future driverless cars.
OnStar..... GM in the US over the recent decade, and now GM in Australia and Vauxhall in the UK, are running this through their 2015 range of cars, and it is big brother at its worst: a car can be located via GPS and have its engine cut out remotely by police. It has been glossed up as a satellite aid with internet and Bluetooth connectivity, all wonderful stuff, etc., but it means your car's engine can be cut off remotely. Would you want that?
So I will not be going near a Vauxhall dealer soon.
Cars with internet connections........ Dangers of remote hacking and disruption of car electronics and even controls.
DRLs... daytime running lights..... These were supposed to be for safety, but it's a nest of worms, as legislation in each country is not clear on whether DRLs should be on at the back as well as the front. There have been a number of accidents where drivers think all their lights are on at night, but in fact they have no lights at the back: only their front DRLs were on, and a lit-up dash fooled the driver. Hence the chance of getting hit up the back with no lights on increases.
Overall..... A lot of new technologies over this decade have created more problems for drivers than they have solved. God help us all with future technologies in cars, as they make driving a lot less safe.
This item was edited on Sunday, 25th October 2015, 01:32
Now let us consider the OP's original question, which is in our near future, and I already know the answer to the dilemma of the morality of a self-driving car.
The answer is the technology of the cell phone you have on you today.
Do you know what all the electronics within your phone do? The answer is no. Did you know you could be GPS tracked? Well, it took a while until Joe Public found out that this could be done. Did you know your microphone can be remotely switched on? Again, we found this out only recently, as described in a US incident where criminals were recorded by remote activation of a cell phone's microphone.
So the simple answer for the driverless car, just like the mobile phone, is that you will not know what algorithms and programs are hidden within the electronics on board.
So if a sneaky programmer (and cell phone tech has proved to be very, very sneaky) decides that in some situations your car should smash you into a brick wall, then you will not know about it until it's too late, as you will be dead.
As I said above, god help us all with new technologies that are totally unnecessary.
Maybe bad taste after the recent Tesla self-driving car incident?
I test-drove a Google X self-driving car and ended up in jail.
Mark this one down in the calendar.
I agree with Bandi.
Would be interesting to know if a human driver could have avoided another vehicle jumping the lights?
Google’s self-driving car was today in what appears to be its worst accident yet
Google’s self-driving cars are no stranger to accidents, but rarely are the autonomous cars at fault and rarely do those accidents cause any significant damage. Today, it seems, may be an exception for the latter case, with one of the Mountain View company’s Lexus self-driving vehicles sustaining major damage in an accident involving a commercial van…
As far as we know, this is yet another case where the human — driving what appears to be a commercial van (as you can see being towed in the background) — was at fault. It’s still notable, however, as one of the worst — if not the worst — accidents one of Google’s cars has ever been in. As you can see in the image above, the entire right door on the Lexus is crumpled in along with a broken window or two.