Forum Discussion
Can we use AI to assess assessment answers?
I would love to bring in the ability for AI to provide answers to free text (Short or Essay questions) as this will increase the personalization of the course.
Has anyone managed to utilize AI in this way on any Articulate product?
4 Replies
- GarimaGupta-0f5 (Community Member)
Hi Seanx!
I am a big proponent of personalized learning experiences and I love that you want to use AI for this.
Here's how I would approach this problem.
Step 1: Create a text entry field to capture your learners' answers to the essay questions and name its variable learnerAnswer.
Step 2: Add an "evaluate" button next to the text entry field which the learner can click to receive their evaluation.
Step 3: Add a Storyline "Execute JavaScript" trigger to this "evaluate" button.
Step 4: Copy and paste this JavaScript into the trigger. (This code works with our service but can be modified to work with any AI API with a little bit of coding.)

```javascript
var player = GetPlayer();
var learnerAnswer = player.GetVar('learnerAnswer');

var systemPrompt = `
Roleplay as an evaluator. You will receive an answer to an essay question from a learner.
The question is this <insert question here>. Respond to user in 10 lines.
Ensure that your feedback follows this rubric <insert rubric here>.
Do not include "Learner: " tag in your response.
`;

// Here, you can add your AI key from the LLM provider of your choice.
// If you would like to purchase one, contact us at aiready@arthalearning.com
var AIKey = 'YOUR AI API KEY';

var AIPrompt = systemPrompt + "\n\n" + "Learner: " + learnerAnswer;

// API call
fetch(AIKey, {
  method: 'POST',
  body: JSON.stringify(AIPrompt),
  headers: {
    "Content-Type": "application/json",
  },
})
  .then(response => {
    if (!response.ok) {
      return response.text().then(body => {
        throw new Error(`HTTP ${response.status} - ${body}`);
      });
    }
    return response.json();
  })
  .then(data => {
    data = JSON.parse(data.body);
    player.SetVar('GPT_Response', data);
  })
  .catch(error => {
    console.error('Error fetching GPT response:', error.message);
    // Provide a standard error response
    const gptResponse = "We can't analyse your answer right now. Please try again later. In the meantime, you could review and reflect on your course content.";
    // Set a variable in Articulate Storyline to store the response
    player.SetVar('GPT_Response', gptResponse);
  });
```

Step 5: Edit the code to add your variable name if you used something different, and add your API key for the AI. If you would like to purchase an API key, reach out to us at aiready@arthalearning.com.
Step 6: Add an output box for the AI feedback, create a variable named "GPT_Response", and insert it there.
Step 7: Preview, test, and iterate on the prompt.
If you would like to view a demo of another implementation to see how this works, check out this link: https://arthademos.s3.us-east-2.amazonaws.com/AIReady+Demo+by+Artha+Learning/index.html.
Happy to answer any other questions you might have!
- JeanMarrapodi-c (Community Member)
Oooooooo. I like this @GarimaGupta-0f5. I wonder if this could also be set to remove the random extra space that Storyline (or something!) adds so the answers are marked as incorrect when they are really correct. Thanks for the inspiration.
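The whitespace cleanup Jean mentions could be sketched like this. Note the helper name normalizeAnswer is my own choice, not part of Storyline's API:

```javascript
// Trim leading/trailing whitespace and collapse internal runs of spaces,
// so a stray extra space doesn't mark a correct answer as incorrect.
function normalizeAnswer(text) {
  return text.trim().replace(/\s+/g, ' ');
}

// In a Storyline "Execute JavaScript" trigger it would be applied like:
// var player = GetPlayer();
// player.SetVar('learnerAnswer', normalizeAnswer(player.GetVar('learnerAnswer')));
```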
- JohnCooper-be3c (Community Member)
@GarimaGupta-0f5 Please be VERY careful with this approach. By running your demo I could see your API service key, i.e. the key you are using to call your arthalearning.com AI proxy service. It is definitely NOT OK to include an API service key in your client-side JavaScript code. Anything that runs in the browser means:
- Learners can see it
- Anyone can copy it
- Anyone can reuse it outside the course
That means:
- Unauthorized API usage
- Possible abuse of the AI service
If you were to expose an OpenAI API Key in this way you would almost certainly be breaching their terms of use.
The correct way to do this is:
Client (Storyline / browser):
- Sends learner text only
- NO API keys
Server (a secure backend app) that:
- Stores the AI key safely (as an env variable)
- Validates requests
- Calls AI Service
- Returns sanitized results
- GarimaGupta-473 (Community Member)
Hi John,
You are absolutely right: direct API calls via JavaScript from Storyline expose the API key. There is no way to send it as a secure environment variable from the client side, though, so you need a dedicated server that adds the API key for you.
The way we handle it in AIReady is an interesting workaround. API keys can be "tied" to your domain, and that is checked on the server end. Traffic is only let through if it comes from a whitelisted domain, such as your LMS. Also, the API key is not OpenAI's key; it is a key to our own server, and it has hard limits set to avoid abuse.
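A sketch of the domain check described here: the server compares the Origin header the browser sends against an allowlist before forwarding the request. The domain names are placeholders, and it's worth noting that non-browser clients can forge this header, which is why the hard usage limits on the key are still needed:

```javascript
// Allowlist of domains permitted to call the proxy (placeholder values).
const ALLOWED_ORIGINS = new Set([
  'https://your-lms.example.com',
]);

// Check the Origin header before letting a request through.
function isAllowedOrigin(originHeader) {
  return typeof originHeader === 'string' && ALLOWED_ORIGINS.has(originHeader);
}
```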
Happy to chat more about it, it is fascinating stuff balancing security, innovation and learner experience.
Cheers,
Garima.