
A faulty reward signal during training caused ChatGPT models to start dropping goblins, gremlins, and other mythical creatures into their answers at a surprising rate. OpenAI says the incident is an example of how small, poorly tuned training incentives can produce unexpected side effects.
The article ChatGPT's goblin obsession may be hilarious, but it points to a deeper problem in AI training appeared first on The Decoder.