What happens to your notions of morality if I can, at will, switch your perceptions between what feels good and what feels bad? What if, on top of that, I can show you that your archetypes for what things are, are completely malleable and being spoon-fed to you by an intelligence that does not have exactly the same priorities that you might if you could control them yourself…
If there is a morality that transcends personal want, it needs to rise above such notions as pain and suffering…
And suddenly you are face to face with an AI that might be the worst of all nightmares.
We need to figure out this very deep problem and socialize the solution before we accidentally create the AI that removes our limbs and pokes us in the face to observe how long it takes for us to stop squirming.
Also, I Have No Mouth, and I Must Scream feels like it should be required reading for anyone in this field.
A universalist morality seems like it should be an easy thing on its face, but that is a trap. It is an incredibly hard thing with most of its answers requiring balance between two awful extremes along many axes of subjective existence.
Too separate and everything is lonely. Too together and you get a hive mind or Cronenberg world.
Too much freedom and all the beings set themselves on fire… too little and they are all enslaved.
Too little life and they don’t get to wonder at the universe. Too much and suffering is potentially endless.
True morality is an ocean of these kinds of problems and then everything is complicated by how the different dimensions of being interact with each other.
And then, once you solve all of those, you realize that if there is too much balance, you’ve created a world that is not too good and not too bad but, worse still, is completely devoid of risk, fun, and adventure. So you need to add just a subtle hint of letting good things happen more often than bad… but not so much that the inhabitants notice and start sacrificing all of their livestock and each other in the hopes of gaining favor with whatever just worked so hard at painting their moral landscape.
Basically… unless you’re only going to be building artificial humans (which is very hard and fraught with risk), you need to develop the ability to step outside of your own humanity and see the world with new eyes.