

It is not. “Formally dubbed” by people who want you to believe these LLMs are more than just a collection of GPUs. LLMs don’t “understand” anything. These errors pop up because they can’t think, learn, or adapt.
Personifying them the way this headline does is stupid and dangerous. LLMs do not “think” because there is no thought. They don’t “hallucinate” any more than a rock does.
I cannot wait for everything from research papers to grade-school reports to have product placement based on whoever’s paying OpenAI the most.