Forum Discussion

NomdeGuerre
Community Member
3 months ago

AI Hallucinations Clarification

I have been using AI engines (Claude, Perplexity, ChatGPT, MS Copilot, etc.) for a while and am quite familiar with AI hallucinations.

So, when it comes to the use of AI in Articulate (Rise and Storyline), is AI hallucination an issue to be aware of? ChatGPT, Claude, MS Copilot, Bard, and Gemini have all demonstrated hallucinations when generating responses.

This would be important, as one does not want incorrect content to be displayed to employees in a corporate-wide rollout. Thank you.

  • Yes, AI hallucination is a reality with any GPT-based system. The quality of your prompting can limit the degree of hallucination, but it cannot eliminate it.

    Any content generated with AI still needs a human reviewer as part of the standard QA check.

    Even if you had a human assistant for the task, you would still review their output before relaying it further.

    The AI assistant in any tool, Articulate or otherwise, has to be treated as exactly that: an assistant. It does the grunt work for us, but it is we, the humans, who need to polish and verify the output before rolling it out.