Adding ChatGPT functionality to Storyline
Aug 09, 2023
Hi,
I am operating at the limit of my small knowledge about JavaScript here, but I hope someone can help.
I am trying to integrate a Q&A function into a course using ChatGPT. I have set up a test page with the relevant secret key from OpenAI (and yes, I am aware that having it client side is not best practice, but I want to get it working in principle). I have a script that I have checked (using ChatGPT, ironically) and am told it looks fine. I have published it, and it doesn't work. I have been through it time and again, and I can't see anything wrong.
When I opened the console in the browser, I saw a 'status 429' error message. I looked this up, and it doesn't seem to apply here. But there is a response from the OpenAI API telling me that it can't see the secret key, which I know is there. The JS is 'looking' in the right place, but the key doesn't appear to be transmitted to ChatGPT.
Does anyone have any idea what I am doing wrong? I hope it may be obvious to someone more experienced with JavaScript than me (not difficult), but any help is appreciated. I have attached the script if it helps.
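For context, since the attached script isn't shown here: a minimal client-side call to the OpenAI Chat Completions API usually has this shape. This is a sketch, not the poster's script; `buildChatRequest` is a hypothetical helper, and the model name is just an example.

```javascript
// Sketch of a typical client-side call to the OpenAI Chat Completions API.
// buildChatRequest is a hypothetical helper, not taken from the attached script.
function buildChatRequest(apiKey, userMessage) {
  return {
    url: "https://api.openai.com/v1/chat/completions",
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // The "Bearer " prefix is required; omitting it is a common cause
        // of "can't see the key"-style authentication errors.
        "Authorization": "Bearer " + apiKey
      },
      body: JSON.stringify({
        model: "gpt-3.5-turbo", // example model name
        messages: [{ role: "user", content: userMessage }]
      })
    }
  };
}

// Usage (would run in the published Storyline page):
// const { url, options } = buildChatRequest(myKey, question);
// fetch(url, options).then(r => r.json()).then(data => console.log(data));
```

If the key never reaches OpenAI, the first place to look is how the `Authorization` header is assembled.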
7 Replies
As it is, this is hard to debug; the code does indeed look OK. You might want to add several console.log("Now this happens: " + someVariable); statements so you can check where the error occurs.
Do you get anything back from the console.log(data); that's in your code?
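The suggestion above, sketched out. The function and variable names are examples only (the attached script isn't shown), but the idea is to log a checkpoint before the request, the HTTP status after it, and the raw response body:

```javascript
// Hypothetical instrumented version of the fetch call; names are examples only.
async function askChatGPT(apiKey, question) {
  console.log("About to send request; key starts with: " + apiKey.slice(0, 5));
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": "Bearer " + apiKey
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo", // example model name
      messages: [{ role: "user", content: question }]
    })
  });
  console.log("HTTP status: " + response.status); // e.g. 429 = rate/quota limit
  const data = await response.json();
  console.log(data); // full API response, including any error object
  return data;
}
```

Seeing which checkpoint is the last one to print tells you where the call breaks down.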
Thanks for the quick response. When I asked the AI to check the code, it mentioned this:
"You are logging the error in the catch block, which is good for debugging, but you might want to handle errors more gracefully, e.g., show a message to the user in the event of an API failure." I assumed it meant that any error would show some sort of message, which I don't get.
I published the SL page to the web, and when I opened the console I saw this as a URL. I know the key is there and was only set up today. Does this make any sense to you?
Not sure if this helps, but I have just reset and run the sequence, and this appeared in the console:
Having your API key exposed is really not best practice. It's almost like sending your credit card number and expiration date through email. You can get the code to work, but as soon as it goes semi-live, you're transmitting the key over an open network.
What will happen is that OpenAI will revoke your key if it becomes exposed, and it will eventually become exposed.
You really need an endpoint to work with: have the code at your endpoint talk directly to OpenAI, and have your Storyline project talk to the endpoint.
The error message ...
means the fetch call is working, but with the wrong value in 'Authorization'.
Add a check after line 21 of your script: do you see your (paid) ChatGPT API key in the console?
No, the API key doesn't appear; the variable Token is still sitting at its default value.
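That symptom (the Token variable still holding its default) usually means the script never reads the Storyline variable's current value before building the request. A hedged sketch, using Storyline's GetPlayer()/GetVar API; the variable name "Token" comes from the reply above, and the helper takes the player object as a parameter so it can be checked outside the course:

```javascript
// Sketch: read the key from a Storyline variable right before the call.
// 'player' is the object returned by Storyline's GetPlayer(); passing it in
// (rather than calling GetPlayer() inside) is just to make the helper testable.
function getApiKey(player, variableName) {
  const value = player.GetVar(variableName);
  console.log("Variable '" + variableName + "' starts with: " +
              String(value).slice(0, 5)); // confirm it's not the default
  return value;
}

// In the published course this would be:
// const apiKey = getApiKey(GetPlayer(), "Token");
```

If the log still shows the default value, the variable is being set after this code runs (or on a different trigger), not at all, or under a different name.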
Thanks for this. I am aware it's not best practice, and I saw in the OpenAI notes that they will invalidate the key if they think it has been compromised, etc. But this is not being published; it's a proof of concept, and then I will look at a more secure method.