ChatGPT-5 finally says ‘I don’t know’, and here is why that is a big problem



Large language models have an awkward relationship with the truth, especially when they cannot provide a real answer. Hallucinations have plagued AI chatbots since the technology debuted a few years ago. But ChatGPT-5 appears to be taking a new, humbler approach to not knowing an answer: admitting it.

Although most of the chatbot’s responses are accurate, it is hard to interact with an AI chatbot for long before it serves up a partially or entirely fabricated answer. The AI presents its answers with the same confidence regardless of their accuracy. These hallucinations have frustrated users and have even led to embarrassing moments for developers during live demonstrations.
