Q: What is consciousness?
A:
We say that an entity X is conscious if X
- Has some complex pattern of thoughts that corresponds to a "sense of self"
- and has some thoughts that correspond to desires.
Consciousness isn't binary; your level of consciousness is determined by the pattern of thoughts that you have.
In short, I don't really believe in p-zombies. I think that a system complicated enough to simulate Alek is at least as conscious as Alek is. However, goodness(universe) is not simply proportional to "how conscious are the things that inhabit the universe". It also matters what types of goals and experiences these entities have, and there's also something important about interacting with other conscious beings.
Q: Are animals conscious?
- I think animals are a little bit conscious.
- They can play, they have desires.
- I know some cats that like to sleep in my sister's room and always ask to be petted.
Q: Is GPT4 conscious?
- I'm more conflicted about GPT, because it seems more alien in a lot of ways.
- But I'll say that no, GPT4 is not very conscious.
Q: What would need to change to make GPT conscious?
- I think adding some more continual learning / persistence / agentic-ness would make me feel like GPT is more conscious.
- I don't think it's fundamentally impossible for us to create artificial conscious beings.
- I think that a "forward pass through a neural net" is actually reasonably similar, at some high level, to what happens in my brain when I want to decide on something to speak or write (see the sketch below).
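
To make the analogy concrete, here is a minimal sketch of a forward pass through a tiny neural net. The weights are random, the three-option "vocabulary" is made up, and none of this is meant to reflect how GPT or a brain is actually wired; it just shows the shape of the process: input in, activations flow through a couple of layers, and the "decision" is whatever option scores highest.

```python
import numpy as np

# Toy set of possible outputs -- purely illustrative.
options = ["speak", "write", "stay quiet"]

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))   # made-up first-layer weights
W2 = rng.normal(size=(8, 3))   # made-up second-layer weights

def forward(x):
    """One forward pass: input -> hidden activations -> scores over options."""
    h = np.maximum(0, x @ W1)                      # ReLU hidden layer
    scores = h @ W2                                # one score per option
    probs = np.exp(scores) / np.exp(scores).sum()  # softmax over options
    return probs

x = rng.normal(size=4)                 # stand-in "sensory input"
probs = forward(x)
print(options[int(np.argmax(probs))])  # the net's "decision"
```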
Note that "death" for GPT-N would look something like wiping its memory / its accumulated learned updates. Just turning it off is more akin to it going to sleep, as long as you can easily load up the same weights again. But you should feel bad about actually killing GPT-N (for N large enough that we decide GPT is pretty conscious). Of course, sometimes it's necessary to do things that feel bad. But it should be avoided unless the benefits outweigh the costs.