Because I don’t think we have a sure methodology.
I don’t think there’s an agreed definition.
Strong AI, AGI, or whatever you want to call it, is usually talked about in terms of intellectual ability. It’s not quite clear why this would require consciousness. Some tasks, such as chatbot conversation, are aided by or maybe even necessitate self-awareness. But it seems to me that you could leave out such tasks and still have something quite impressive.
Then, of course, there is no agreed definition of consciousness. Many will argue that the self-awareness of chatbots is not consciousness.
I would say most people take strong AI and similar terms to mean an artificial person, for which they take consciousness as a necessary ingredient. Of course, it is impossible to engineer an artificial person. It is like creating a technology to turn a peasant into a king: a category error. A less kind take would be that stochastic parrots string words together based on superficial patterns, without any understanding.
But we may be able to prove that it is NOT conscious, which I think is clearly the case with current-level AI. Although you don’t accept the example I provided, I believe it is clear evidence of a lack of consciousness behind the high level of intelligence it clearly has.
Indeed, I do not see the relation between consciousness and reasoning in this example.
Self-awareness means the ability to distinguish self from other, which implies computing from sensory data what is oneself and what is not. That could be said to be a form of reasoning. But I do not see such a relation for the example.
By that standard, are all humans conscious?
FWIW, I asked GPT-4o mini via DDG.
Screenshot
I don’t know if that means it understands. It’s how I would have done it (yesterday, after looking up Peano Axioms in Wikipedia), and I don’t know if I understand it.
Well, that’s the same situation I was in and just what I did. For that matter, Peano was also in that situation.
Not quite. It’s a fundamental consequence of tokenization. The LLM does not “see” the individual letters. By adding spaces between the letters, for example, one could force a different tokenization and get a correct count (I tried back then). It’s interesting that the LLM counted 2 "r"s, as that is phonetically correct. One wonders how it picks up on these things. It’s not really clear why it should be able to count at all.
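To make the tokenization point concrete, here is a toy sketch. I’m assuming the word in question was "strawberry" (the well-known case), and the subword vocabulary below is purely hypothetical; real BPE vocabularies differ per model. The point is only that the letter "r" never appears as a unit unless you space the word out:

```python
# Toy greedy longest-match tokenizer over a fixed (hypothetical) vocabulary.
# Real LLM tokenizers (BPE etc.) are more involved, but the effect is similar.

def toy_tokenize(text, vocab):
    """Split text into the longest vocabulary matches, falling back to
    single characters when nothing in the vocabulary matches."""
    tokens = []
    i = 0
    while i < len(text):
        for j in range(len(text), i, -1):  # try longest match first
            if text[i:j] in vocab:
                tokens.append(text[i:j])
                i = j
                break
        else:
            tokens.append(text[i])  # unknown character: emit it alone
            i += 1
    return tokens

VOCAB = {"str", "aw", "berry", " "}  # hypothetical subword vocabulary

print(toy_tokenize("strawberry", VOCAB))
# -> ['str', 'aw', 'berry']  (no token is the letter "r" by itself)

print(toy_tokenize("s t r a w b e r r y", VOCAB))
# spacing forces single-character tokens, so each "r" becomes visible
```

Under this (made-up) vocabulary, the model would receive three opaque token IDs for "strawberry" and would have to have memorized, not computed, how many "r"s each token contains.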
It’s possible to make an LLM work on individual letters, but that is computationally inefficient. A few months ago, researchers at Meta proposed a possible solution called the Byte Latent Transformer (BLT). We’ll see if anything comes of it.
In any case, I do not see the relation to consciousness. There are certainly plenty of people who cannot spell or count, and one would not say that they lack consciousness, I assume.
That’s true. We need to observe the LLM in its natural habitat. What an LLM typically does is continue a text. (It could also be used to work backwards or fill in the middle, but never mind.) A base model is no good as a chatbot; it has to be instruct-tuned. In operation, the tuned model is given a chat log containing a system prompt, text from the user, and text that it has previously generated. It will then add a reply and terminate the output. This text, the chat log, could be said to be the sum of its “sensory perceptions” as well as its “short-term memory”. Within this, it is able to distinguish its own replies, those of the user, and possibly other texts.
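The chat-log setup above can be sketched like this. The role/content layout is the common convention; the `<|role|>` markers below are generic placeholders, since each real model family uses its own special tokens and template:

```python
# Sketch of the chat log an instruct-tuned LLM operates on.
# The rendering template is illustrative, not any specific model's format.

chat_log = [
    {"role": "system",    "content": "You are a helpful assistant."},
    {"role": "user",      "content": "How many r's are in the word?"},
    {"role": "assistant", "content": "Two."},
    {"role": "user",      "content": "Are you sure?"},
]

def render(log):
    """Flatten the structured log into the single text stream the model
    continues. The trailing marker cues the model to write its reply."""
    parts = [f"<|{m['role']}|>\n{m['content']}" for m in log]
    return "\n".join(parts) + "\n<|assistant|>\n"

prompt = render(chat_log)
print(prompt)
```

Everything the model "perceives" at generation time is this one string: the system prompt, the user's turns, and its own earlier replies, each delimited so it can tell which text is whose.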
Can you lay out which abilities are connected to consciousness? What tasks are diagnostic of consciousness? Could we use an IQ test and diagnose people as conscious or not?
The brain is a physical object. Consciousness is both an emergent property and a construct; like, say, temperature or IQ.
You are saying that there are different levels of consciousness. So it must be something measurable and quantifiable. I assume a consciousness test would be similar to an IQ test in that it would contain selected “puzzles”.
We have to figure out how consciousness is different from IQ. Which puzzles are diagnostic of consciousness but not of academic ability?