I'm sure you could read the obituaries in the newspaper and have a similar response of "Meh." Haha. However, that does not mean their lives were unimportant, or that they never impacted someone else's life. Know what I mean?
You know? - My car has a climate control temperature slider, and I set the temperature. I also control the speed with the accelerator, select the limiter or cruise control, and even choose how much concentration I invest in driving versus solving the problems at work.
So - how about we put a "morality slider" in the car? After an hour's Buddhist meditation, I slide it over to "100% the other guy". But after a big fight with the girlfriend I reset it to "20% the other guy".
We make these moral decisions all the time when we drive a car. We will usually swerve to avoid the child in the pushchair - even though a huge truck is coming the other way - because we still think we can survive.
So, as the pilot of the car (the person who decides why he is in the car and where he is going), we can set the morality slider ourselves! That relieves the manufacturer of responsibility and puts the moral decision firmly on the pilot - where it should always be!
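The "morality slider" above could be read as a weight the pilot sets between the occupants' risk and the other party's risk. Here is a minimal sketch of that idea in Python; the function name, the risk numbers, and the cost formula are all illustrative assumptions, not anything from a real vehicle system.

```python
# Hypothetical sketch of the "morality slider": a weight in [0, 1] that the
# pilot sets, trading the occupants' risk against the other party's risk.

def maneuver_cost(occupant_risk, other_risk, slider):
    """slider = 1.0 means '100% the other guy' (fully protect them);
    slider = 0.2 means '20% the other guy'. Lower cost = preferred maneuver."""
    return slider * other_risk + (1.0 - slider) * occupant_risk

# After meditation (slider = 1.0), a swerve that endangers the occupants but
# spares the pedestrian scores lower than staying the course.
swerve = maneuver_cost(occupant_risk=0.6, other_risk=0.0, slider=1.0)
stay = maneuver_cost(occupant_risk=0.0, other_risk=0.9, slider=1.0)
print(swerve < stay)  # True: with the slider at 100%, the car swerves
```

With the slider reset to 0.2 after the fight with the girlfriend, the same comparison flips, which is exactly the moral responsibility the comment wants to leave with the pilot rather than the manufacturer.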
If you're on foot, you probably can't afford a car because you have no job and are homeless... either that or you are a health enthusiast (biker, jogger, etc.). In either case, you are an obstacle and should perish.
The article was not meant to be read. It starts by having you imagine a scenario that would never happen, then asks you to continue reading. Since there is no reason to imagine the suggested scenario, there was no reason to continue reading. Just because many people read it doesn't mean it was supposed to be read.
Yes, the laws are fictional, but they have rationality within this fictitious premise: the car would be unable to destroy itself or any human being.
The idea that a driverless car would kill its passengers is also a purely fictitious philosophical ideation, like science fiction; it would mean the car has the self-awareness to decide whether to kill the pedestrians or the passengers.
Either way, neither event could possibly occur, because driverless cars would necessarily be programmed to cease functioning in the case of a failure or impending adverse event.
The car doesn't have to be self aware. It just needs software telling it how to handle certain situations.
The car has software that basically says, "If there is something in your way and there is enough stopping distance, slow down or stop. If there isn't enough stopping distance, swerve right (in order to avoid having to deal with on-coming traffic). If there is a cliff on the right swerve left. If there is on-coming traffic, go to the beginning of this block of code."
Now, notice that the block of code cannot handle the following scenario.
Imagine a tall mountain with a road up high. It is a two-way road. There is an accident around the blind side of a turn. A person gets out of the car and walks to where a car coming around the blind turn could hit and kill him. The car comes around the blind turn and "sees" the pedestrian in the way. There isn't enough stopping distance. The car wants to swerve right, but there's a cliff. The car wants to swerve left, but there is on-coming traffic. The car's options are:
1. hit and kill the person in the street,
2. go off the cliff and kill the people inside the car, or
3. swerve into on-coming traffic and kill the people inside the car and those inside the other car in a head on collision.
The software, as it stands, does not make a decision. A human needs to decide and put the decision into the software.
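The block of code described above can be sketched directly. This is a minimal illustration, not real autonomous-driving software; the function name, parameters, and the iteration cap are all assumptions added for the sketch. It shows the point being made: in the mountain-road scenario, every branch fails and the logic loops back without ever choosing.

```python
# Illustrative sketch of the avoidance logic quoted above. The scenario
# flags are simplified to booleans for clarity.

def plan_maneuver(stopping_distance_ok, cliff_on_right, oncoming_traffic,
                  max_iterations=10):
    """Return a maneuver string, or None if the logic never decides."""
    for _ in range(max_iterations):
        if stopping_distance_ok:
            return "brake"          # enough room: slow down or stop
        if not cliff_on_right:
            return "swerve_right"   # avoid dealing with oncoming traffic
        if not oncoming_traffic:
            return "swerve_left"    # cliff on the right, so go left
        # oncoming traffic: "go to the beginning of this block of code",
        # i.e. loop back and re-evaluate - nothing has changed.
    return None  # no branch ever applied: the software made no decision

# The blind-turn scenario: no stopping distance, a cliff on the right,
# and oncoming traffic on the left.
print(plan_maneuver(False, True, True))  # None
```

The `None` result is exactly the gap the comment identifies: the software cycles through its rules forever, and a human has to decide which of the three fatal options to encode.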
As we continue to discuss a fictional philosophical scenario: the problem with your synopsis of an incident is that it does not allow for the possibility that, by the time we have driver-less cars, we will also have passenger safety pods within all vehicles that totally protect everyone inside from injury. The car would then be able to be involved in a crash scenario without doing any harm to anyone...?
Uh. What? Why would a driverless car be forced into such a situation? AI response capabilities are far superior to a human's. It would never need the option to choose. AI is not capable of functioning in a moral capacity. If you follow the rules of the road, you will not have to worry about choosing who should live or die.
Edit: read the article. Again such a hypothetical situation like this is pretty far fetched, to the point of why even debate?
There are too many unanswered "what ifs" to address this ethically. So I'll take the question directly: should AI be able to choose life or death in certain events? The answer is no, it shouldn't.
The ethical situations in this scenario are practically endless. To allow a machine to make an ethical decision without understanding ethics itself doesn't make any sense.
For fun: the passenger is a single parent of 3 kids with no other living relatives. However, at the time of the accident the kids were at school. (The parent was the only passenger.)
The pedestrians are convicted criminals assigned to highway cleanup. These criminals were all (let's say just two of them) convicted of DUIs that resulted in the deaths of others. These criminals have no children, but a vast number of siblings - let's say 8 total.
Who should be saved? If we cannot even calculate the moral ramifications of this situation, how would a computer be able to?
I say pick the solution that requires the least amount of energy. If stopping the car and getting it back up to speed requires more energy than just running them over, run them over. If going off a cliff is more energy-efficient than plowing through them, then go off the cliff. Simple!
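Tongue in cheek as it is, the rule above is at least well-defined. A sketch of it, with completely made-up energy figures (the option names and joule values are illustrative, not measurements):

```python
# Satirical "least energy" maneuver selection. All numbers are invented
# for illustration only.
options = {
    "stop_and_reaccelerate": 5.0e5,  # braking, then regaining highway speed
    "run_them_over": 1.0e4,          # minor deceleration on impact
    "drive_off_cliff": 0.0,          # gravity does all the work
}

# Pick the option with the smallest energy cost.
choice = min(options, key=options.get)
print(choice)  # drive_off_cliff
```

Which neatly demonstrates the thread's actual point: a rule that is trivial to compute can still be morally absurd.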