One person’s hallucination may be another’s masterpiece. In our daily lives, we humans hallucinate all the time. We call it “daydreaming” or “being creative”. We make things up when we can’t answer our children’s questions. We adopt personas that are not “who we are”, and we roleplay in various ways depending on social expectations. In humans, we praise this as “creativity”, “thinking outside the box”, or being different from others. Generally, such diversity is deemed good for society.
Because AI runs on computers, we expect it to be “hallucination-free”. But generative AI is not a traditional computing task; it is a stochastic process – a gigantic pinball machine that shuffles words or images into plausible results. Hallucination is inherent to the methodology. There are ways to reduce it and ways to detect it, but maybe the best course is to embrace it. Maybe the next generation of IT systems will use hallucination creatively: systems with “personality” that hallucinate their way to new solutions to problems. To borrow a trope: hallucinate early, hallucinate often.
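To make the stochastic nature concrete, here is a minimal sketch in Python of temperature-based sampling, the mechanism most text generators use to pick the next token (the scores and vocabulary are entirely hypothetical). The model assigns probabilities and the output is a random draw, so a fluent-but-false continuation can always win the draw; lowering the temperature reduces the randomness but never removes it.

```python
import math
import random

def sample_next_token(logits: dict[str, float], temperature: float = 1.0) -> str:
    """Draw the next token from a softmax over the model's raw scores.

    `logits` maps candidate tokens to hypothetical model scores.
    Temperature < 1 sharpens the distribution, > 1 flattens it; either
    way the result is a random draw, not a lookup of facts.
    """
    scaled = {tok: score / temperature for tok, score in logits.items()}
    max_s = max(scaled.values())  # subtract max for numerical stability
    weights = {tok: math.exp(s - max_s) for tok, s in scaled.items()}
    total = sum(weights.values())
    r = random.uniform(0, total)  # spin the pinball machine
    for tok, w in weights.items():
        r -= w
        if r <= 0:
            return tok
    return tok  # fallback for floating-point edge cases

# Hypothetical scores for continuing "The capital of Australia is ...":
logits = {"Canberra": 2.0, "Sydney": 1.6, "Melbourne": 1.1}
print(sample_next_token(logits, temperature=0.7))  # usually, not always, "Canberra"
```

Even at a modest temperature, the “wrong” answers keep nonzero probability: that residual chance is the hallucination, and it is the same chance that occasionally produces a surprising, creative continuation.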