
Consciousness

Background

What is consciousness? Sentience? Sapience?

I'm not particularly interested in any of these questions, but I'm very interested in a closely related query: what determine an entity's moral worth? The natural background assumption to this line of inquiry isis that these concepts are closely related.

I've been influenced by a variety of people on this subject, but (unfortunately) at this point who I heard what from has been lost to the mists of my memory. That being said, I give Brian Tomasik a lot of credit - see, for instance, Tomasik (2013).

Some Thought Experiments

  1. Suppose I took two twins and put them each in a windowless spaceship. I freeze both, leave ship A be, and accelerate ship B to 99% the speed of light. When each twin reaches the age of 60, I freeze them again. Because of time dilation, twin B was in space for 425 Earth years while twin A was in space for only 60 Earth years. However, both twins had the exact same subjective experience; for this reason, it seems obvious that the twins' lives have equal moral value. An alternative phrasing is that each Earth-year of twin B's life was worth ~1/7th of an Earth-year of twin A's life - that is, the fact that twin A's brain ran 7x faster gave an Earth-year of twin A's time 7x the value. From this we can conclude that moral weight scales with processing speed [EDIT: actually, all we can really conclude is that it scales with reality speed; it's entirely consistent to believe Brain B is 2x faster than Brain A, but that they have ~equal moral weight since they are both encountering reality at the same rate].
  2. Suppose I took a sentient AI computer and made a replica 1000x larger but with the exact same informational properties - same processing speed, same memory, same storage, same program. If I give you terminal access to each machine, you won't be able to tell the difference. Neither machine will be able to guess how large it is via introspection alone. Finally, from an information-theoretic perspective the two machines are identical. It seems clear, then, that mere size doesn't grant moral weight.
  3. Suppose I took a tiny brain (like a flatworm's) and kept adding cells until I got a human brain. This seems to obviously make the brain more valuable. So, even though size doesn't matter, complexity (at least in some sense) does.
  4. Suppose I took a deterministic brain and made two perfect copies. I hook all three brains up to the same input sources, and I hook them up to the same output sources such that the three brains vote on each output bit. Again, from an external, internal, and information-theoretic perspective, this fusion-brain is completely equivalent to one of those three brains, so it seems clear the fusion-brain gains no moral worth from this redundant circuitry. But not so fast - as soon as I modify a single input channel to one of those three brains, the brains can start disagreeing, at which point it seems like they should be accorded separate moral weight.
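The dilation factor in (1) is just the Lorentz factor. A quick sanity check of the arithmetic, assuming the stated speed of 99% of c:

```python
import math

v_frac = 0.99  # velocity as a fraction of the speed of light

# Lorentz factor: gamma = 1 / sqrt(1 - v^2/c^2)
gamma = 1 / math.sqrt(1 - v_frac**2)

subjective_years = 60                    # twin B's experienced time
earth_years = subjective_years * gamma   # Earth time elapsed during those 60 years

print(round(gamma, 2))     # ~7.09, i.e. the "7x" in the text
print(round(earth_years))  # ~425, matching the 425 Earth years above
```

So each of twin B's Earth-years really does carry roughly 1/7th of the subjective experience of one of twin A's.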

Finally, to go back to (3), it seems incredibly bizarre to believe either (a) that a flatworm has the same moral value as a human or (b) that a single synaptic connection is the difference between flatworm-level value and human-level value. From this it's obvious that moral value grows with brain complexity in some way and isn't simply an on-off switch.

That being said, it's obvious that not all complexity has moral value. For instance, we can replicate any brain's actions via a lookup table, and (from an information-theoretic perspective) a lookup table is as complex as it gets.
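To make the lookup-table point concrete, here is a toy sketch (the 3-bit `tiny_brain` function is of course a made-up stand-in): any deterministic input-output mapping can be exhaustively tabulated, and the table reproduces the behavior exactly while the "complexity" of its contents reflects nothing morally interesting.

```python
from itertools import product

def tiny_brain(bits):
    """A stand-in deterministic 'brain': maps a 3-bit input to 1 output bit."""
    a, b, c = bits
    return (a and b) ^ c  # arbitrary fixed behavior

# Exhaustively tabulate every input -> output pair.
lookup_table = {bits: tiny_brain(bits) for bits in product([0, 1], repeat=3)}

# The table is behaviorally indistinguishable from the function it mimics...
assert all(lookup_table[bits] == tiny_brain(bits)
           for bits in product([0, 1], repeat=3))

# ...but for an n-bit input it needs 2**n entries, even when the mechanism
# it replaces has a tiny description.
print(len(lookup_table))  # 8 entries for n = 3
```

For anything brain-sized the table is astronomically large, which is exactly why raw information-theoretic complexity can't be what carries moral weight.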

It has been suggested that an important component of consciousness is the ability of a brain to think about itself (Hofstadter, 1999).

Hofstadter, D. R. (1999). Gödel, Escher, Bach: An Eternal Golden Braid. https://smile.amazon.com/G%C3%B6del-Escher-Bach-Eternal-Golden/dp/0465026567

Munroe, R. (2008). A Bunch of Rocks. https://xkcd.com/505/

Tomasik, B. (2013). Is Brain Size Morally Relevant? https://reducing-suffering.org/is-brain-size-morally-relevant/

Wikipedia contributors. (2020, October 20). Twin paradox. In Wikipedia, The Free Encyclopedia. Retrieved 20:00, October 24, 2020, from https://en.wikipedia.org/w/index.php?title=Twin_paradox&oldid=984427474