ramirezmike@programming.dev to Technology@lemmy.world • Chat GPT appears to hallucinate or outright lie about everything (English)
17 days ago
It doesn’t matter that it was correct. There isn’t anything that verifies what it’s saying, which is why it’s not recommended to ask it questions like that. You’re taking a risk if you’re counting on the information it gives you.
what the fuck