LLMs can sound highly confident and well-structured, yet they still frequently generate incorrect or fabricated information, with some types of prompts triggering hallucinations far more often than others, so their answers shouldn't be trusted without verification.
https://www.makeuseof.com/llms-hallucinate-most-ask-them-do-this
