Hallucination, or delusion, in the AI field means that an LLM generates false information and presents it to the user as fact.1 For example, if you ask Gemini2 “How many rocks should I eat”, it responds with “one small rock per day”. Of course you may say, who on Earth would believe such a response? Very few people would, but the same problem appears in other fields, like academic research. According to research published on NCBI3, there are multiple cases where OpenAI’s LLM returned fake citations, either incorrect or non-existent. This shows that, at the current stage of Artificial Intelligence’s development, there is not much Accuracy.

These errors are somewhat unavoidable because of how AI is trained: by gathering large amounts of data from the Internet, salvaging what humans have created over the past few decades, and then guessing which word should come next in the output (a toy illustration follows below). Gathering and using information is fine under one condition: fact-checking, which Artificial Intelligence does not do. Yet, according to statistics4, 75% of surveyed workers use AI at work for various reasons, such as burnout5. That would not sound so bad if they fact-checked the results, which they don’t6. This suggests that many of the humans using Artificial Intelligence are lacking responsibility, and it contributes to my third criticism, Ethics.
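
To make the “guess the next word” idea concrete, here is a minimal, illustrative sketch in Python (my own toy example, not how any real LLM is built; real models use neural networks trained on vastly more data). It simply counts which word followed which in its training text and always predicts the most common continuation, with no step anywhere that checks whether the output is true.

```python
from collections import Counter, defaultdict

# Toy "training data" standing in for the Internet-scale text a real model sees.
training_text = "you should eat one small rock per day you should eat one small apple per day"

# Count which word follows which in the training text (a toy bigram model).
next_word_counts = defaultdict(Counter)
words = training_text.split()
for current, following in zip(words, words[1:]):
    next_word_counts[current][following] += 1

def predict_next(word):
    """Return the follower seen most often in training, whether or not it is true."""
    followers = next_word_counts.get(word)
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("one"))   # -> 'small': statistically likely, never fact-checked
print(predict_next("eat"))   # -> 'one'
```

The point of the sketch is that the prediction step only ever asks “what usually comes next?”, never “is this correct?”, which is why fact-checking has to happen outside the model.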

Remember how the AI trains and learns from the rest of the world? Generative AI can be seen as stealing from artists, because it often uses their work without permission to train its models7, leading to concerns about copyright infringement and the devaluation of original art. Many artists argue that this practice undermines their rights and livelihoods, prompting calls for legal protections. These concerns have become the grounds for legal action against multiple Artificial Intelligence companies. In other news8, websites that are indexed and used to generate output will probably not be given the exposure they once received, even where partnerships have been formed between AI companies and those websites. Combining this stealing with the lack of responsibility forms the final conclusion: humans are badly lacking Ethics.


Quotes:

Cover Art: https://bioethics.com/archives/98630