'Hallucinate' is named Cambridge Dictionary's word of the year: "[It reminds us] humans still need to bring their critical thinking skills to the use of AI"

The popular English dictionary warned about the risk of "false information" generated by artificial intelligence.

The risks of artificial intelligence (AI) have become a hot topic again this week after Sam Altman was briefly ousted from OpenAI, reportedly amid concerns over a secret project that, according to various reports, could endanger humanity. The growing debate around AI's threats has led the Cambridge Dictionary to crown hallucinate as its word of the year.

To the traditional definition of "to seem to see, hear, feel, or smell something that does not exist, usually because of a health condition or because you have taken a drug," the Cambridge team has added another less human one: "When an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information."

"The fact that AIs can ‘hallucinate’ reminds us that humans still need to bring their critical thinking skills to the use of these tools," said Wendalyn Nichols, publishing director at Cambridge Dictionary. "At their best, large language models can only be as reliable as their training data. Human expertise is arguably more important – and sought after – than ever, to create the authoritative and up-to-date information that LLMs can be trained on."

The choice of hallucinate as word of the year invites reflection on whether words that describe human beings are appropriate when applied to artificial intelligence (for example, "intelligence" itself). In other words, it is a case of "anthropomorphizing": "to show or treat an animal, god, or object as if it is human in appearance, character, or behavior."

"Whereas these are normally thought of as human products, 'hallucinate' is an evocative verb implying an agent experiencing a disconnect from reality," explained AI ethicist Henry Shevlin, contrasting the term with earlier, human-made forms of false information. He added:

This linguistic choice reflects a subtle yet profound shift in perception: the AI, not the user, is the one 'hallucinating.' While this doesn't suggest a widespread belief in AI sentience, it underscores our readiness to ascribe human-like attributes to AI.

The word of the year has another characteristic that may have led the Cambridge Dictionary team to choose it: it underscores the idea that a traditional dictionary, compiled by professionals, is more reliable than AI. As the dictionary states on its website:

Researchers have not found a way to prevent AIs from hallucinating, which means that you can’t be sure that the information they give you is accurate.

The Cambridge team has many years of English language teaching experience, and we improve our dictionary based on the latest research.