Experience Machines Support Ethical Hedonism
Suppose there was an experience machine that would give you any experience you desired. Super-duper neuropsychologists could stimulate your brain so that you would think and feel you were writing a great novel, or making a friend, or reading an interesting book. All the time you would be floating in a tank, with electrodes attached to your brain. Should you plug into this machine for life, preprogramming your life experiences? […] Of course, while in the tank you won’t know that you’re there; you’ll think that it’s all actually happening […] Would you plug in? (Nozick 44–45)
Robert Nozick’s argument boils down to:
- If all we cared about was pleasure, we would agree to plug into the experience machine.
- However, we do not want to plug in.
- Thus, there are things which matter to us besides pleasure.
Critics of experience machines do not formalize their intuitions enough. If they did, they would discover they don’t actually have a problem with experience machines in their simplest form. Here is a thought experiment which I believe speaks for itself:
Suppose our best AI experts eventually agreed it was safe to create a powerful, benign AGI, and this AGI swiftly created a thriving post-scarcity economy. Suppose further that all of this happened 10,000 years ago. Now you are alive and face the choice of entering an experience machine. You could remain in base reality, climbing mountains and having experiences there, or you could have those same experiences and far more in a simulation where you merely think you are in base reality. Note that there is nothing more you could do for others in base reality: any benefit you could provide to another person, the AGI could provide far better. Nor is there anything you could do to secure the future of humanity or of sentient life; the AI is far smarter than you, and the future is already secure under its control. So I ask: why not use the experience machine now? Why not let your family use it? If it creates a subjective reality literally designed to maximize wellbeing, and it reliably does a fantastic job of this, then why not?