Proving (literally) that ChatGPT isn't conscious

12 points, posted 23 days ago
by paulpauper

8 Comments

NemoNobody

22 days ago

I'm not sure anything has been made any clearer at all. In fact, the paper appears to contradict itself, as it uses the most complicated possible language for everything in order to make this "simpler to understand."

Was that not the intent??

I think you went around in circles too much.

"Consciousness theories must be falsifiable and inputs non-trivial" — that is maybe the worst way to say that. Much of this could have been said better.

It reads like you are flexing your discipline's special words and language for normal stuff -> a plumber can talk at you and you'll have no idea what they said either, especially if they try to do that.

Coming from an intelligent person who apparently understands this topic, an article like this can only be a flex for "job security" (like a plumber who describes plunging a toilet as "hydrodynamic pressure rebalancing accomplished by a manual, oscillatory pressure-differential induction cycle")...

OR... they cannot simplify further due to lack of understanding of the subject matter.

I honestly am unsure what this is.

That is the nicest thing I can say.

NemoNobody

22 days ago

I also want to know what substitute could swap in that can hallucinate the absolutely insane things we've all seen AI do??

I can't come up with one (unless we program something to hallucinate ;)

Is that a failure of the example or the rule in neuroscience?

I disliked that "rule" immediately and intensely. I'm quite suspicious of it.

NemoNobody

22 days ago

For the record: I'm very aware that AI is not awake.

hambes

23 days ago

Is that first sentence entirely broken, or am I having a temporary lapse in cognition?

goodmythical

23 days ago

Nah, it says: imagine that nothing exists that is similar to being ChatGPT.

That is, that ChatGPT cannot _be_, because if it could, there would in fact _be_ something that is like _being_ ChatGPT.

> Imagine we could prove that there is nothing it is like to be ChatGPT

You could rephrase it as: "Imagine that we could prove that there is no existence equal to the existence of ChatGPT."

alimw

23 days ago

> The analogical form of the English expression "what it is like" is misleading. It does not mean "what (in our experience) it resembles," but rather "how it is for the subject himself."

— Nagel, "What Is It Like to Be a Bat?"

manuelmoreale

23 days ago

I think it's not broken, just worded in a way that feels broken. But it sure does look weird.