Forum Discussion
Adding ChatGPT functionality to Storyline
Hi,
I am operating at the limit of my small knowledge about JavaScript here, but I hope someone can help.
I am trying to integrate a Q&A function into a course using ChatGPT. I have set up a test page with the relevant information and secret key from ChatGPT (and yes, I am aware that having it client-side is not best practice, but I want to get it working in principle). I have a script that I have checked (using ChatGPT, ironically) and am told it looks fine. I have published it, and it doesn't work. I have been through it time and again and can't see anything wrong. I opened the console in the browser and saw an error message, 'status 429'. I looked this up, and it doesn't seem to apply here, but there is also a response from the OpenAI API telling me that it can't see the secret key, which I know is there. The JS is 'looking' in the right place, but the key doesn't appear to be transmitted to ChatGPT.
Does anyone have any idea what I am doing wrong? I hope it may be obvious to someone more experienced with JavaScript than me (not difficult), but any help is appreciated. I have attached the script if it helps.
- MathNotermans-9 (Community Member)
As is, this is hard to debug. The code indeed looks OK. You might want to add several console.log("Now this happens" + somevariable); statements so you can check where the error is.
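For anyone following along, this is roughly where such logging could sit in a fetch chain like the one attached; the url and options values here are placeholders, not taken from the script:

var url = "https://api.openai.com/v1/chat/completions";
var options = { method: "POST", headers: { "Content-Type": "application/json" }, body: "{}" }; // placeholders
fetch(url, options)
    .then(response => {
        console.log("HTTP status", response.status); // a 401 or 429 here points at auth or quota problems
        return response.json();
    })
    .then(data => {
        console.log("OpenAI response", data);        // OpenAI's error objects also land here, not only in catch()
    })
    .catch(error => {
        console.log("Request failed", error);        // only network-level failures end up here
    });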
Do you get something back in the console.log(data); that's in your code?
- MichaelOSull894 (Community Member)
Thanks for the quick response. The AI mentioned this when I asked it to check the code:
"You are logging the error in the catch block, which is good for debugging, but you might want to handle errors more gracefully, e.g., show a message to the user in the event of an API failure". I assumed it meant that any error would show some sort of message - which I don't get.
I published the SL page to the web, and when I opened the console I saw this as a URL - I know the key is there and was only set up today. Does this make any sense to you?
{ "error": { "message": "You didn't provide an API key. You need to provide your API key in an Authorization header using Bearer auth (i.e. Authorization: Bearer YOUR_KEY), or as the password field (with blank username) if you're accessing the API from your browser and are prompted for a username and password. You can obtain an API key from https://platform.openai.com/account/api-keys.", "type": "invalid_request_error", "param": null, "code": null } }
- MichaelOSull894 (Community Member)
- Jürgen_Schoene_ (Community Member)
error message ...
"You didn't provide an API key.
means the fetch command is working, but with the wrong value in 'Authorization'
add after line 21
console.log( "auth", auth );
Do you see your (paid) ChatGPT API key in the console?
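For readers hitting the same message: the request OpenAI expects looks roughly like the sketch below. The OPENAI_API_KEY placeholder and the model name are illustrative, not taken from the thread.

var OPENAI_API_KEY = "sk-...";          // placeholder; in the broken script this value apparently ends up empty
var auth = "Bearer " + OPENAI_API_KEY;  // the literal word "Bearer", a space, then the key itself
fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
        "Authorization": auth,          // if auth is undefined or just "Bearer ", OpenAI answers "You didn't provide an API key"
        "Content-Type": "application/json"
    },
    body: JSON.stringify({
        model: "gpt-3.5-turbo",
        messages: [{ role: "user", content: "Hello" }]
    })
}).then(r => r.json()).then(data => console.log(data));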
- AdamScormit (Community Member)
Having your API key exposed is really not best practice. It's almost like sending your credit card number and expiration date through email. You can get the code to work, but as soon as it goes semi-live, you're transmitting the key on an open network.
What will happen is that your key will be revoked by OpenAI if the key becomes exposed, and it will eventually become exposed.
You really need an endpoint to work with, and have the code on your endpoint work directly with OpenAI and your Storyline project.
- MichaelOSull894 (Community Member)
Thanks for this. I am aware it's not best practice, and I saw in the GPT notes that they will invalidate the key if they think it has been compromised, etc. But this is not being published; it's a proof of concept. Then I will look at a more secure method.
- MichaelOSull894 (Community Member)
No, the API key doesn't appear; it is sitting as the default value in a variable called Token.
- NatalieMohr (Community Member)
Can you password-protect the lesson so that only people you authorize can use it? For example, if you wanted to use it as a demo for your portfolio, you don't want your key floating out there, but you do want a prospective employer to be able to view it. Or, I guess, you could create a video of your Storyline course with the ChatGPT integration? How will people ever be able to use it if it's a security risk?
- Jürgen_Schoene_ (Community Member)
If you use a JavaScript trigger to save the API key, it can easily be extracted just before the first slide is started - user.js (with the 'secret' key) is loaded very early.
You need a second server if you want to use ChatGPT on any website or any Storyline course:
LMS server -> send prompt to special Backend server (with the secret API key)
Backend server -> OpenAI server
OpenAI server -> Backend server
Backend server -> LMS server (with the Storyline course)
There is no alternative that I know of.
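To make the first hop concrete, the course-side call would then point at your own backend instead of OpenAI. A minimal sketch; the backend URL and the /api/chat route are placeholders, not from the post:

// LMS/course side: the prompt goes to your backend; no OpenAI key appears anywhere in the course
fetch("https://your-backend.example.com/api/chat", {    // placeholder backend URL
    method: "POST",
    headers: { "Content-Type": "application/json" },    // note: no Authorization header here
    body: JSON.stringify({
        messages: [{ role: "user", content: "What is a variable in Storyline?" }]
    })
})
    .then(response => response.json())
    .then(data => {
        // if the backend forwards OpenAI's reply unchanged, the shape matches a direct call
        console.log(data.choices[0].message.content);
    });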
- MathNotermans-9 (Community Member)
An easy solution is creating a Google Cloud Function (Node.js) that you can call for an OpenAI request. Then it is secure and works well.
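A rough sketch of what such a Cloud Function could look like using the Node.js Functions Framework; the function name openaiProxy and the OPENAI_API_KEY environment variable are illustrative assumptions, not from the post:

const functions = require("@google-cloud/functions-framework");

// Requires a Node 18+ runtime for the global fetch.
functions.http("openaiProxy", async (req, res) => {
    // The secret key lives only in the function's environment, never in the course.
    const upstream = await fetch("https://api.openai.com/v1/chat/completions", {
        method: "POST",
        headers: {
            "Authorization": "Bearer " + process.env.OPENAI_API_KEY,
            "Content-Type": "application/json"
        },
        body: JSON.stringify({
            model: "gpt-3.5-turbo",          // or forward whatever model the course sends
            messages: req.body.messages
        })
    });
    res.json(await upstream.json());
});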
The code listed below works when using it directly in either Lectora or Storyline. For Storyline you would need to change several lines; in particular, you need to change the getting and setting of variables... and the aiBearer is of course your own key. But using it like this IS NOT SAFE!
/*
  OpenAI API related functions
*/
// Shared between the functions below
var aiCompletions, DALEurl, aiModel, aiBearer, aiInstructions, inlineQuestion, chatMessages, savedQuestionsArray, AIresponse;
var OpenAIArray = []; // chat history buffer; must exist before setAIsettings() pushes to it

function setAIsettings(){
    aiCompletions = "https://api.openai.com/v1/chat/completions";
    DALEurl = "https://api.openai.com/v1/images/generations";
    aiModel = VaraiModel.getValue(); // Lectora variable
    aiBearer = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"; // Bearer code (secret API key) from OpenAI
    aiInstructions = VaraiInstructions.getValue(); // Lectora variable
    console.log("instruct: " + aiInstructions);
    inlineQuestion = VarinlineQuestion.getValue(); // Lectora variable
    chatMessages = [
        { role: "system", content: aiInstructions },
        { role: "user", content: inlineQuestion },
    ];
    OpenAIArray.push({ role: "system", content: aiInstructions });
    window.localStorage.setItem('AIcalls', JSON.stringify(OpenAIArray));
}
function OpenaiFetchAPI() {
    console.log("Calling OpenAI and getting localStorage");
    /*
      For history in chatGPT we use localStorage.
      So we first get all previous questions and then we create a new 'message' for chatGPT.
    */
    savedQuestionsArray = JSON.parse(window.localStorage.getItem('AIcalls'));
    console.log("savedQuestions " + savedQuestionsArray);
    inlineQuestion = VarAIprompt_01.getValue(); // Variable in your tool. This is for Lectora. Storyline uses other syntax
    savedQuestionsArray.push({ role: "user", content: inlineQuestion });
    window.localStorage.setItem('AIcalls', JSON.stringify(savedQuestionsArray));
    var url = aiCompletions;
    var bearer = 'Bearer ' + aiBearer;
    fetch(url, {
        method: 'POST',
        headers: {
            'Authorization': bearer,
            'Content-Type': 'application/json'
        },
        body: JSON.stringify({
            messages: savedQuestionsArray,
            max_tokens: 1000,
            temperature: 1,
            model: aiModel,
            top_p: 1,
            n: 1,
            stream: false
        })
    }).then(response => {
        return response.json();
    }).then(data => {
        console.log(data);
        console.log(data.choices[0].message.content);
        AIresponse = data.choices[0].message.content;
        console.log("response: " + AIresponse);
        showResults();
    })
    .catch(error => {
        console.log('Something bad happened ' + error);
    });
}
/*
  To ensure the AI remembers questions and answers given, we use localStorage.
*/
function setLocalStorage(_arr){
    window.localStorage.setItem('AIcalls', JSON.stringify(_arr));
}
function getLocalStorage(){
    let tmpArray = JSON.parse(window.localStorage.getItem('AIcalls'));
    return tmpArray;
}
function clearLocalStorage(){
    window.localStorage.clear();
}
Do remember you have to change these things. It will not work just copying and pasting.
When you have this code set up properly, you can set your variables. Then call setAIsettings(); on the start of a slide... and then, when a user asks a question, fill the variable 'inlineQuestion' and call OpenaiFetchAPI(). The function showResults() does exactly what it says ;-)
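For the Storyline-specific changes mentioned above, the Lectora VarX.getValue() lines would roughly become calls to Storyline's JavaScript API. The project variable names (aiModel, aiInstructions, AIprompt_01, AIresponse) are illustrative assumptions, not prescribed by the post:

// Storyline flavour of the variable get/set lines
var player = GetPlayer();                               // Storyline's built-in JavaScript API
aiModel = player.GetVar("aiModel");                     // instead of VaraiModel.getValue()
aiInstructions = player.GetVar("aiInstructions");       // instead of VaraiInstructions.getValue()
inlineQuestion = player.GetVar("AIprompt_01");          // instead of VarAIprompt_01.getValue()
// ...and inside showResults() the answer would be written back, e.g.
// player.SetVar("AIresponse", AIresponse);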
- AdamScormit (Community Member)
This easily exposes your API key, which is against the OpenAI TOS and can be easily hacked by anybody using your course. This is NOT the proper way to use ChatGPT inside of a course. You should recall this message.
- MathNotermans-9 (Community Member)
I already stated that it's not safe, and that people should use it server-based or on a cloud service. Nevertheless, this is the principle to get it working.
- Dave-Ruckley (Community Member)
The 429 error is probably because you don't have any API credits. I had the same issue with that code until I added some credits to my OpenAI account: https://platform.openai.com/account.
Once I did that, the code worked perfectly.
- NatalieMohr (Community Member)
That happened to me too.