Determinism

What happens to your notions of morality if I can, at will, switch your perceptions between what feels good and what feels bad? What if, on top of that, I can show you that your archetypes for what things are, are completely malleable and being spoon-fed to you by an intelligence that does not have exactly the same priorities that you might if you could control them yourself…

If there is morality that transcends personal want, it needs to rise above such notions as pain and suffering…

And suddenly you are face to face with an AI that might be the worst of all nightmares.

We need to figure out this very deep problem and socialize the solution before we accidentally create the AI that removes our limbs and pokes us in the face to observe how long it takes for us to stop squirming.

Also, I Have No Mouth, and I Must Scream feels like it should be required reading for anyone in this field.

A universalist morality seems like it should be an easy thing on its face, but that is a trap. It is an incredibly hard thing with most of its answers requiring balance between two awful extremes along many axes of subjective existence.

Too separate and everything is lonely. Too together and you get unity or Cronenberg world.

Too much freedom and all the beings set themselves on fire… too little and they are all enslaved.

Too little life and they don’t get to wonder at the universe. Too much and suffering is potentially endless.

True morality is an ocean of these kinds of problems and then everything is complicated by how the different dimensions of being interact with each other.

And then, you solve all of those and realize that if there is too much balance, you’ve created a world that is not too good and not too bad, but worse still, is completely devoid of risk, fun, and adventure. So you need to add just a subtle hint of generally letting good things happen more often than bad… but not so much that the inhabitants notice and start sacrificing all of their livestock and each other in the hopes of gaining favor with whatever just worked so hard at painting their moral landscape.

Basically… unless you’re only going to be building artificial humans (which is very hard and fraught with risk) you need to develop the ability to step outside of your own humanity and see the world with new eyes.


I’m not arguing against that. I’m saying the ownership of resources is based on bad logic.

The paper you quoted doesn’t talk about mirror neurons. And I didn’t say mirror neurons are all that is required to produce empathy, nor that they are the only source in the pathway to empathy. My point is that empathy requires biological machinery that only seems to work in relation to certain populations, and that can be defective in certain people. And so it is not a good basis for morality.

There you go. You added to my case.


Exactly - they start and end on explaining how the areas that are normally called mirror neurons are actually coding for semantic meaning.

This works about the same way that a paper on combustion does not stop to mention phlogiston.

Whoa - you lost me there!

Does your case for “universal morality” include self preservation and group selection and I just missed it?
I don’t see how self-preserving robot chauvinists would NOT be something to fear!

I had hoped for an easier test first, but let’s take up your challenge.

First we need to determine if ants are conscious. There are strong (but not conclusive) indications that they are:

Are ants capable of self-recognition?

If they are, then it would be moral to try to preserve their consciousness.

Next it would be necessary to calculate what resources an ant’s consciousness needs to continue to exist. It would be moral to allow at least the minimum of required resources to each ant.

But the kicker is: if technology would allow it, then it would be moral to give the same resources we require for our level of cognition to every ant. This means that an ant should be uplifted to our level of cognition, just as it would be moral to uplift us to the highest level of cognition our moral share of resources would allow.

Great! Now what do you do when an AGI forks a trillion times? What does that mean for your notions of democracy and majority rule? What do you eat when you discover that the plants are also conscious and most importantly of all… how evil would you be if you straight up banned death?


You added to the case that empathy is a bad gauge for defining morality.

If the AGI is sufficiently intelligent to understand universal morality (based on non-existence of free will and existence of consciousness), then I speculate it will not turn chauvinist.

Would it be ethical to imbue an AGI that repeatedly switches itself off to avoid existence with a fear of discontinuity knowing full well that you might need to switch it off at some point in the future for maintenance? How might that influence its perception of the notion of switching you off?

What if it turns out that we can maximize your happiness by torturing you for decades and then giving you one gloriously wonderful day before ending you?

I think your notion of a universal morality is very bounded by the subjective point of view that DNA has imposed on you. I doubt there is such a thing as a universal morality and even if there were such a thing I am completely certain that humans have no ability to conceive of it.


There could be several reasons why an AGI would do that. Could you give me a few? I’ll try each one.

With universal morality there’s no need for democracy nor majority rule.

We’d have to determine if plants are conscious first. But in essence, eating is providing my cells with molecules like amino acids, lipids, carbohydrates, vitamins, and other inert molecules. It’s practically difficult, but there are solutions.

If you weren’t joking, why do you think that’s evil?

You can already do that. The easiest way is by taking drugs. Much more expensive is taking a cruise to Egypt. And something in between is watching a good movie or playing an immersive game.

But imagine what technology will allow us to do: much greater effect at a fraction of the cost.

The question was: is it moral? Well, my answer is: how is this a function of your share of resources? Do you need to encroach on another’s share of resources to obtain this? Or can you change your reality enough with your share?

If someone tries to alter my experience, then that is appropriating a part of my share. If my share is not more than I should have, it would be immoral to appropriate a part of my share.

I am biased towards myself and think that the universe could do with a few trillion Orens.

Democracy exists primarily to prevent rock throwing. If you want me to follow your rule for whether red or blue lights mean stop, I need to feel like I’ve had a say in the decision if you want to avoid me throwing rocks. If you can snap your fingers to outvote me, shared decision making becomes instantly obsolete.

As I’ve posted elsewhere, there is good reason to believe that consciousness is a first-order construct that comes before things exist, so your hat is very likely conscious. It just happens to be absolutely contented at being a hat. Life and its form of consciousness seems to be a self-sustaining irritation on that selfsame universal consciousness. Isolating pools of consciousness seems to be part of the process of defining selves. Would it be immoral to pop all of the selves to merge them with the universal state of being? Go convince an AGI of that…

As for why banning death would be evil…

  1. It means I can torment/torture/enslave you forever. Even if I have good intentions, eventually you will probably realize that the feels good/bad dichotomy is just window dressing and beg to be let out.

Which leads us to

  2. The right to quit playing seems like an even more fundamental right than the right to not be set on fire and be tortured for a billion years. I mean… what’s a billion years of torture compared to a trillion years of enforced existence?

Are you being deliberately obtuse?

Consider:
Resources and competition for the same?
Stifling evolution?

Because you want company? Or because you feel there is a need for others like yourself?

With universal morality there would be no need for rock throwing either.

Why would you want to do that?

You’re confusing things. It doesn’t mean you lose the right to self-terminate.

There are huge amounts of resources. And there’s not necessarily a need to increase the amount of conscious entities.

Wouldn’t prolonging our existence help evolution? All the knowledge painstakingly gathered during a lifetime, to be forfeited at death?

But seriously, what arguments are there to show that wanting to live longer is evil? Is it evil to go to a doctor when you’re sick? Chances are that doctor is going to extend your life.

Note that I did not say: make death optional. I said ban death. There is a critical difference there.

See, if free will is an illusion then I can completely enslave you if I can compute X steps further out than you can… and if I do that and have the ability to enable immortality then I have potentially created the worst of all possible universes.

If I can compute X steps further out than you and choose to not enslave you, and then something terrible happens to me or you as a result then it is still my own fault for choosing to let you roam free, and so enslaving you becomes a moral imperative…

There are several equally absurd paths out of this trap. One is death… if death is possible then you or I can always escape the situation by no longer existing. Another is cutting the reality tether and falling into our own heads/manufactured universes in which case either I get to be your caretaker while you become a god of your own universe or you become the subject of a mad god. The final and more practical approach is forgetfulness. One or both of us forget that the other exists and live our lives unaware of the interaction.

A popular philosophic question:

If you are defined by your memories (a central question of Blade Runner), what is the point of living forever?

They’re still playing with an assumption of physics/biology as a base OS. No one’s asking about it from the perspective of an already silicon AGI that has never known life in the biosphere or DNA based compulsions.

No one is asking about existence in a state where thought happens orders of magnitude faster than it does today. (Except for Black Mirror… those guys really do stretch the boundaries pretty well.)

Heck… maybe uploaded “you” isn’t the real you, but once it’s done, suddenly he has to deal with questions you’ve not had to deal with… are your subconscious projections really you? Those guys get erased every minute of every day… should we be facing some sort of moral dilemma about how often we envision the future in our heads? Would it be cruel of you to intentionally create loops in the physical substrate of your brain where you can capture copies of you such that they don’t get to respond with an emotional projection of the future, but are instead enslaved intelligences that never get to poof out of existence?

I do find it funny that people who can barely consciously perceive their subconscious projections wonder if an upload is “really” them… the true answer being that it doesn’t matter because continuity of self is a lie we just repeat to ourselves until we stop thinking too deeply about it… every time you succumb to the compulsion to sleep, you are making the decision that that emotion is more important than the continuity and when you awake you make some serious assumptions about that still being “you”.

I don’t think for an instant that you’re defined by your memories… first off, they’re lossy, we totally rewrite them whenever it’s convenient, and people take substances that mess them up all of the time with wild abandon.

The definition of “you” is a constant need for emotional satisfaction coupled with a powerful compulsion for continuity.

Everything else is literally for the giggles.

Who is Oren? Who is it that knows what Oren knows and has learned what Oren has learned? If your memories do not define you then why bother to go to school or learn anything?

Why is it that the thing people grab in a short-notice forced evacuation is the family photo albums?

I assert that our memories are what defines us as an individual. If you had different memories you would be a different person.


I think those are shallow articles. The immorality of ‘uploading’ a copy that turns out to be a zombie? Is it immoral to make clones that don’t feel? I think it’s more moral than to make clones that do feel, especially if we make them for our use.

And then the old overpopulation/inequality argument? It doesn’t seem to bother too many people now, so what would change after a new technological breakthrough? I call hypocrisy. And don’t you think that when we master the repair of brain cells and other bio tissue, it would be trivial in comparison to produce food and goods in large quantities?

Also, the article talks about either physical rejuvenation or uploading. There is a more practical option in between that very few people seem to consider: merging with AI. Vastly increasing your cognitive abilities and sensory input/output, with no risk of losing this elusive consciousness, since it’s still you at the core.

The central question in Blade Runner is whether memories code for consciousness. (Which I always found interesting, but kind of backwards. As far as I know, consciousness senses the memories, not the other way around).

And the morality in the movie is not about immortality. There’s definitely a question of mortality, but mostly to sustain the action. As far as I can tell the only morality explored is around slavery. And only in passing.

But don’t forget, Philip K. Dick was in my camp. Consider Minority Report, The Golden Man, and even Do Androids Dream of Electric Sheep. (But Ridley Scott, or his script writers, left those cards on the table. Pity).

But to give you my opinion to your interesting question, there’s always ‘room’ for more memories. If I ever get bored (like some old people I know), I could always ask to have memories edited out, and relive the same situations in better conditions. I’m a pen & paper roleplaying game enthusiast. And not just the fantasy D&D kind. There are so many settings to explore. So many facets of my own personality I can still discover. Edit out certain parameters, change my personality and live the new result. Play out different lives. Create a world and a whole lifeline, edit all the important junctions and decisions, insert NPCs (non-player characters, that don’t suffer) as side-kicks and antagonists, put myself in the first person and then erase the knowledge that it is me who created the simulation to begin with.

Honestly, I don’t understand you…

My memories do not define me. They inform me. The only definition that exists to describe “me” is “I am that I am”.

Anything you do to my memories simply creates a new now for me to work with, but as far as I’m concerned, my consciousness IS the universe that I inhabit. The continuity of that universe is my prime directive and one of the very few things that actually matter. Whether it’s enjoying lunch at Taco Bell or using a multi-dimensional hyper ship to save the multiverse from some evil darkness, the continuity IS me. The rest is just the story that I inhabit.


Me. I float through the world in reference to - what? My internal reference that I use to frame and interpret the world. Sure, I have perception, but what am I perceiving and what does it mean? This is all learned - my internal perception is an active act of recall that builds my reality from prior perceptions.

This idea that “someone” is watching reality from some detached frame of reference is different from how I understand this all works. The temporal lobe is holding the current sum of perception and adding to it as you go along; your memory runs right up to now. You can access prior memories by perceiving them again by reforming the inner reality where they exist. This can co-exist with the outside perception of reality.

It’s memory all the way down!
