People are probably the worst candidates for judging whether they themselves are conscious. Some believe they are the only conscious being in the world. Others believe nothing is conscious and that it is all an illusion. Still others believe that everything is conscious.
At its most basic level, being conscious means being aware of yourself and your surroundings. By that definition, lots of things are conscious, including current LLM-based AI systems. I doubt it will ever be possible to pin down a more philosophical definition of consciousness, such as whether people (or AIs) have a soul, as such questions are more religious in nature and cannot be proven either way. A more realistic (but still tricky) goal would be to determine under what conditions AI systems and (eventually) robots should have rights, and what those rights should be, regardless of whether they are conscious.