“I pull this lever and suddenly it’s not my problem anymore”
Someone needs to stop tying people to those train tracks or this trolley problem will never go away.
Step in front of the train: Tell your manager this whole project is dumb, provide a list of reasons why it’s a bad idea and explain you are prepared to resign rather than enable its further development.
MULTI-TRACK DRIFTING!! Which also kills the other lever guy, bonus!
Half-pull the lever so that the points get stuck midway between the two tracks. That should derail the trolley. Someone could conceivably still get hurt, but it improves everyone’s chances.
(What? You mean it isn’t a literal trolley that has to obey the laws of physics? Damn.)
Philosophy problems vs. all real-world problems
News next day, 10 dead in derailment.
There is one person in danger.
Now I pull the lever.
Now there are two _______
person in dangers
I’m afraid you failed the wug test, or rather one of many wugs test.
Napkin math, from the last time I saw this:
I’ve been thinking about this. I estimate a few people per 1000 would commit an atrocity for no reason if they were guaranteed no consequences, and the deaths if the nth switch is pulled are 2^(n-1). Since expected deaths are outcome × chance, they cross 1 somewhere in the high single digits, so the death-minimising strategy is actually to pull yours if the chain is at least that long.
Edit: This assumes the length of the chain is variable but finite, and the trolley stops afterwards. If it’s infinite obviously you pull the switch.
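A quick Python sketch of that napkin math (the 0.002 rate is an assumed stand-in for “a few per 1000”, not a figure from the thread):

```python
# Expected deaths if the person at switch n pulls, where the body count
# doubles each switch: 2**(n - 1) people are on the tracks at switch n.
p = 0.002  # assumed: 2 per 1000 would pull for no reason

for n in range(1, 13):
    deaths = 2 ** (n - 1)
    expected = p * deaths  # outcome * chance
    print(f"switch {n:2d}: {deaths:5,d} on the tracks, expected deaths {expected:6.3f}")
```

With p = 0.002 the expected deaths from a single switch first exceed 1 at switch 10 (0.002 × 512 ≈ 1.02), which roughly matches the estimate above.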
If we all collectively agree to just pass it on, then either:
- It’s infinite, and it just passes on forever, or…
- It’s not infinite and somebody at the end has no choice, in which case nobody in charge of a lever has killed anyone
So yeah, I say pass it on.
Except that somewhere down that chain someone is almost certainly going to choose to kill people, so by passing the trolley on down to them you’re responsible for killing a lot more than if you ended it right now.
And since every rational person down the line is going to think that, they’ll all be itching to pull the “kill” lever first chance they get. So you know that you need to pull the kill lever immediately to minimize the number of deaths.
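To put rough numbers on the pass-vs-pull argument (the chain length of 30 and the 0.002 pull rate are my assumptions, not from the thread):

```python
# Expected deaths if you pass at switch n, assuming each later person
# independently pulls with probability p and the trolley rolls away
# harmlessly if all N switches pass it on.
def expected_if_pass(n: int, N: int, p: float) -> float:
    total, reach = 0.0, 1.0  # reach = P(trolley arrives at switch k unpulled)
    for k in range(n + 1, N + 1):
        total += reach * p * 2 ** (k - 1)  # person k pulls, killing 2**(k-1)
        reach *= 1 - p
    return total

p, N = 0.002, 30  # assumed pull rate and chain length
for n in (1, 5, 10):
    print(f"switch {n:2d}: pulling now kills {2 ** (n - 1):4d}, "
          f"passing kills {expected_if_pass(n, N, p):12,.1f} in expectation")
```

Because the body count doubles faster than the chance of reaching the next switch decays (2 × 0.998 > 1), the expectation is dominated by the far end of the chain: even at switch 1, passing kills millions in expectation under these assumptions, so the “pull immediately” conclusion follows.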
Only the person pulling the lever is responsible for his/her action, though. There is a difference between passively passing it on and actively murdering someone.
Deontological ethics: you have a duty not to murder people, so you don’t pull the lever.
Utilitarian ethics: pulling the lever will kill fewer people.
If I hand a machete to Jason Voorhees I think I’m at least partly responsible for the people he hits with it. I know what he’s going to do with that thing.
Except you’re not passing a machete to Jason Voorhees. That would be “double it and pass it to the next person who you know is going to pull the lever.”
You’re passing a machete to the next person in line. You don’t know who that is. They may or may not pass the machete down the line. Considering I would not expect a person chosen at random to kill someone when handed a machete, it seems unethical for me to kill someone with a machete just to prevent handing it to someone else.
I know Jason is somewhere down that line I’m handing the machete off to. And the farther down the line he is the more people he’s going to kill.
If we keep doubling, will I eventually be a person on the tracks? There are a finite number of people, so eventually I would be, right? So, passing the buck would be equivalent to handing my fate to a stranger.
OTOH, if there are an infinite number of people, then this thought experiment is creating people out of thin air. Do these imaginary people’s rhetorical lives even matter?
Either way, it seems better to kill 1 person at the start.
If it creates an infinite number of people, it could solve world hunger with some good ol’ Soylent Green thinking. Although you might want to figure out how to slow down the trolley at some point.
I’d pull the lever to kill one person immediately. Assuming the decision maker at each stage is a different person with different opinions on moral, ethical, religious, and logical questions, then it’s a near certainty that someone is going to pull the lever to kill the people at their stage. If you’re lucky, it’s the very next guy. If you’re not, it’s the guy killing a million people a couple of iterations later. If I’m the first guy, I’ll take the moral hit to save the larger number of people.
I think this is a good metaphor for how humanity has “dealt” with problems like climate change.
If you make a tough decision, it causes hardship now, but prevents hardship in the future. If you don’t make a tough decision now, someone in the future has to either kill a lot of people, or just pass the buck to the next guy. People justify not making the tough decisions by saying that maybe eventually down the line someone will have an easy decision and there will be nobody on the side path, even though all observable evidence says that the number of people on that path just keeps growing exponentially.
Just keep doubling forever until the number is more than everyone alive: a free s-risk emergency button.
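For scale (assuming 2^(n-1) people tied at switch n and a world population of roughly 8.1 billion), the doubling outruns humanity after only a few dozen switches:

```python
# How many switches until the people on the tracks outnumber everyone alive?
population = 8_100_000_000  # assumed rough world population

n = 1
while 2 ** (n - 1) <= population:
    n += 1
print(f"switch {n}: 2**{n - 1} = {2 ** (n - 1):,} people on the tracks")
# switch 34: 2**33 = 8,589,934,592 people on the tracks
```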
This might cause a buffer overflow that crashes the simulation, and we can escape the Matrix together once and for all.