Forum Discussion
Using OpenAI from within Articulate Storyline - a working Story
In this discussion I share how to query and get responses from OpenAI within Storyline.
Try out my Review360 upload: https://360.articulate.com/review/content/ff1ca535-74d2-4389-a79a-33609210ae76/review [this is updated with encryption for storing browser variables.]
You can find the Story for this example attached in the resources below [a revised Story is found in a response below which incorporates encryption for browser variables]. You might like to follow along with this discussion.
The resources below also contain links to enable you to find out more about OpenAI, the makers of ChatGPT.
1. My example here uses Javascript to make the OpenAI calls, invoked from within a Story.
The story contains two slides.
2.1 We’ll start with slide 1.2 webObject
2.2 This slide is never run, but is a container for a webObject. That webObject contains files, bundled when the webObject is created, and included when the story is published. You can find a link in Resources below on how to build Stories that access the content of webObjects.
2.3 The webObject is given the address of a sub-directory on my laptop when it is inserted into the slide. That sub-directory looks like this:
…/webObject/
|--> index.html
|--> globals/
     |--> globalScripts.js
2.3.1 The index.html is empty in this case (but required for the webObject).
2.3.2 The sub-directory, globals, contains my global JavaScript code, globalScripts.js. This file contains functions for querying the OpenAI models and for parsing the responses. This global code is available to all slides in a Story. It is also attached in the resources below for you to follow along with and rework into your own stories [a revised version is found in the responses below which includes encryption for storing browser variables].
2.4 Main slide 1.1
When the Story runs, it starts on the base layer.
2.4.1 The base layer is more or less a container to execute Javascript which dynamically loads the jquery library and my globalScripts.js personal library from the webObject on slide 1.2.
2.4.2 These are loaded asynchronously, and a Story variable is triggered when the dynamic loading is done. Once done the Story flows on to the Introduction layer.
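The base-layer loading described above can be sketched roughly as below. This is an illustrative sketch, not the Story's actual code: the loadScript helper, the file paths, and the Storyline variable name "librariesLoaded" are assumptions; GetPlayer().SetVar is Storyline's published JavaScript API for setting a Story variable.

```javascript
// Illustrative sketch of dynamically loading libraries, then flipping a
// Storyline variable so a trigger can advance past the base layer.

function loadScript(url) {
  // Resolve once the browser has fetched and executed the script.
  return new Promise(function (resolve, reject) {
    var s = document.createElement("script");
    s.src = url;
    s.onload = resolve;
    s.onerror = reject;
    document.head.appendChild(s);
  });
}

function loadLibraries(webObjectBase) {
  // Load jQuery first, then the personal library that may depend on it.
  // Paths are assumed; the real webObject path must be discovered at runtime.
  return loadScript(webObjectBase + "/jquery.min.js")
    .then(function () {
      return loadScript(webObjectBase + "/globals/globalScripts.js");
    })
    .then(function () {
      // A Storyline trigger watching this variable moves the Story on.
      GetPlayer().SetVar("librariesLoaded", true);
    });
}
```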
2.5 Introduction layer
2.5.1 The Introduction layer plays a spoken outline, informing you what to expect from the interaction.
2.5.2 It then checks if you have a saved OpenAI key. If so, it shows the Query layer. If not, it shows the Get Key layer.
2.6 Get Key layer
2.6.1 The Get Key layer in essence contains an input field, where you enter your Open AI key.
2.6.2 If you do not yet have an Open AI key, you can follow the link provided to get one. The link is also provided in the resources below (api-keys).
2.6.2.1 An Open AI key is freely obtainable. There is an available usage credit that will expire after a number of interactions with the A.I.
2.6.2.2 Obtaining a key is very straightforward and consists of a couple of steps. You do not need to give your personal details to obtain a key. And your usage of the A.I. is not recorded.
2.7 Query layer
2.7.1 The query layer consists of one text input box, called Query, and two text boxes, called Response and Receipt.
2.7.1.1 Query is where you type your queries to be sent to the A.I.
2.7.1.2 If you type 'help' then the Help layer is displayed (see 2.8)
2.7.1.3 After you enter a Query value, the Ask button will become visible.
2.7.2 Clicking the Ask button executes Javascript.
2.7.2.1 This selects the hard-coded "text-curie-001" engine, which, together with your text Query, is sent to the queryOpenAI() function. This is found in my globalScripts.js file.
The available engines can be found from the resources below (see models). They differ in capability and price. Price is the cost of processing and responding to each query.
2.7.2.2 queryOpenAI() is the core of the Open AI interface.
2.7.2.3 It runs asynchronously (meaning it waits for a reply while other code may continue to run).
2.7.2.4 It posts the query to the OpenAI completions endpoint, with various parameters (refer to the resources below), and awaits a response.
2.7.3 When data comes back from the OpenAI engine, as JSON content, it is parsed into a response part and a receipt part.
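The post-and-parse flow above can be sketched as follows. This is a hedged sketch, not the attached globalScripts.js: the helper names buildRequestBody and parseCompletion and the max_tokens value are illustrative assumptions. The endpoint, the choices[0].text field, and the usage totals are the standard shape of OpenAI's (legacy) completions API.

```javascript
// Illustrative sketch of a queryOpenAI()-style round trip.

function buildRequestBody(engine, prompt) {
  // Minimal parameter set; the real file may set more (temperature, etc.).
  return {
    model: engine,   // e.g. "text-curie-001"
    prompt: prompt,
    max_tokens: 256  // assumed cap on the reply length
  };
}

function parseCompletion(data) {
  // Split the JSON reply into the "response" and "receipt" parts:
  // the generated text, and the token usage that determines the cost.
  return {
    response: data.choices && data.choices[0] ? data.choices[0].text : "",
    receipt: data.usage ? data.usage.total_tokens : 0
  };
}

function queryOpenAI(apiKey, engine, prompt) {
  // Asynchronous POST; the Story continues to run while this awaits.
  return fetch("https://api.openai.com/v1/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": "Bearer " + apiKey
    },
    body: JSON.stringify(buildRequestBody(engine, prompt))
  })
    .then(function (r) { return r.json(); })
    .then(parseCompletion);
}
```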
2.7.4 Back on the Query layer, Response and Receipt are the text boxes where the reply from the A.I. is displayed.
2.7.4.1 Response is a textual reply generated by the A.I. Generated responses make for the dynamic element of using OpenAI in training material.
2.7.5 Receipt is the 'price' of the Query. This 'price' or calculated cost is the approximate amount that will have been deducted from your Open AI key credit.
2.7.5.1 To minimise cost, you should phrase your queries precisely.
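Since OpenAI bills per 1,000 tokens, a receipt of this kind reduces to simple arithmetic over the usage totals in the reply. The sketch below is illustrative; the per-1K-token rate shown is an assumed example, not a quoted price (check the OpenAI pricing page for current rates).

```javascript
// Rough receipt calculation: cost follows from total tokens consumed.

var ASSUMED_PRICE_PER_1K_TOKENS = 0.002; // USD, illustrative rate only

function estimateCost(totalTokens, pricePer1kTokens) {
  // e.g. a 500-token exchange at an assumed $0.002/1K rate costs ~$0.001
  return (totalTokens / 1000) * pricePer1kTokens;
}
```

Shorter, more precise queries consume fewer tokens at both ends, which is why precise phrasing keeps the receipt down.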
2.8 Help layer
2.8.1 This contains some example starter questions, to get you going.
2.9 About layer
2.9.1 This says a little bit about the Story and use of OpenAI.
3. Modifying
3.1 You can copy the Story and the globalscripts.js file to familiarise yourself and try out your own ideas.
3.2 See the resources below for how to build stories with web objects; you will need to do this when you first build the story, and each time you make changes to globalscripts.js.
3.3 My story hardcodes the curie engine. You can try different engines (see models in the resources).
4. Pointers to Articulate features wanted:
4.1 It would be nice to have a text-to-speech capability at runtime. For example, to "say" a response, rather than display it as text.
4.2 It would be nice to have a GetPlayer( ).GetState( "object" ) and .SetState( "object", "state" ) capability. I had to make do with setting Storyline variables from within Javascript, which invoked triggers in Storyline. This works fine, but after a time it is hard to remember what triggers what.
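The workaround described in 4.2 can be sketched as below: JavaScript cannot change an object's state directly, but it can set a Storyline variable via the published GetPlayer() API, and a "when variable changes" trigger can then change the state. The variable name "responseReady" is an illustrative assumption.

```javascript
// Bridge from JavaScript back into Storyline: flip a variable that a
// Storyline trigger is watching.

function notifyStoryline(variableName, value) {
  // GetPlayer(), GetVar() and SetVar() are Storyline's published
  // JavaScript API; they exist only at Story runtime.
  var player = GetPlayer();
  player.SetVar(variableName, value);
}

// Usage (inside a published Story), e.g. after a reply arrives:
//   notifyStoryline("responseReady", true);
// A trigger "when responseReady changes" then shows the Response layer
// or changes an object's state.
```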
Resources:
https://platform.openai.com/docs/introduction
https://platform.openai.com/account/api-keys
https://platform.openai.com/docs/models
https://360.articulate.com/review/content/ff2e4ae0-051a-430b-a1b3-0b677eb6e63c/review How to build Stories with webObjects that contain files after you change the file content.
Parts of the code in globalscripts.js are derived from Ivan Campos' GitHub project https://github.com/IvanCampos/OpenAI-API, and in particular the jumpstart example.
- JulianRose-c8fd (Community Member)
Hei Angela,
You might like to check this too: https://community.articulate.com/discussions/building-better-courses/chatgpt-integration-for-articulate-storyline-360-resources-for-developers#reply-864497
- JulianRose-c8fd (Community Member)
The openai node.js solution (and an equivalent python/flask solution) are described at: https://platform.openai.com/docs/quickstart/build-your-application. (I built and ran the python example a couple of weeks back. It is 'equivalent' to a node.js solution.)
The openai node.js library (including an outline of how it works) is here: https://www.npmjs.com/package/openai
With the node.js solution, the OpenAI key is saved in an environment variable accessible only to that built server process. So it is secure. But - assuming a multi-user host - you would need to run one server process for each user (each one with a unique IP port) if you intend to run a different proxy for each user key. (You can of course use a single corporate key instead of personal keys. Or you can develop your node.js server to store multiple keys.)
The node.js solution might work if you want to build a small web application. In effect you are building proxy web server(s), which communicate with the OpenAI server using your key(s).
For discussion - I suggest the node.js solution might not be so well-suited to a (single-user host) SCORM or xAPI target. These targets are the broad output of Articulate tools. (While it would work on handhelds, I'm not sure how users would fare on those devices, also targeted by Storyline and Rise.)
If the community is saying node.js is the only acceptable approach, then there are implications for how Articulate might provision OpenAI in Storyline / Rise. Which is clearly wanted.
The problem is not accessing the OpenAI APIs; my concept demo shows that can be done. I think Chris has a solution along the lines of node.js (see also the openai tutorials). One problem is storing the OpenAI key securely, as others have focused on. I think the main question should be how to integrate OpenAI use cases within SCORM deliverables.
Feedback and corrections welcome.
- MustafaSnger-71 (Community Member)
Thx for the guide.
I will look deeply soon.
- AngelaDunn-b8d7 (Community Member)
Thank you, I'm following.
- JulianRose-c8fd (Community Member)
In another thread Chris pointed out the open AI key was not secure.
To allay that concern, please find attached revised versions of the story and of the embedded-functions file globalScripts.js, which encrypt the Open AI key in the browser. Transmission uses the https protocol, thus ensuring the key is also encrypted during network traversal.
- Jürgen_Schoene_ (Community Member)
Your example is very good for testing and learning, but not for real use.
NEVER use functions like these to store the API key:
function writeVal( key, value ) {
    const serializedData = localStorage.getItem( NAMESPACE );
    const data = serializedData ? JSON.parse( serializedData ) : { };
    data[ key ] = value;
    localStorage.setItem( NAMESPACE, JSON.stringify( data ));
}

function readVal( key ) {
    const serializedData = localStorage.getItem( NAMESPACE );
    const data = JSON.parse( serializedData );
    return ( data ? data[ key ] : undefined );
}

The API key is stored on the client computer (-> localStorage).
Everyone (every learner) who has access to the course can copy the API key in seconds and use or sell it - you will pay for the 'illegal' usage.
The only solution is to store the API key on a separate web server (e.g. with node.js):
client browser <-> your web server with the API key <-> OpenAI server
- JulianRose-c8fd (Community Member)
Hei Juergen,
Thank you for your feedback.
Indeed my example is just a concept demonstrator for using openAI. It is far from a complete application or deliverable course.
I should point out the functions you critique are not actually those used to store the OpenAI keys. The functions used to do that are writeKey and readKey, which are similar; in fact, basically the same.
Following Chris' earlier observation on key security, I have added encryption prior to calling these two functions. So what people will see in the runtime browser variables will be those encrypted strings. You can find this in the revised globalScripts.js file and .story attached in my post above yours; see the revisions to openaiGetKey and openaiSetKey.
I import the SimpleCrypto library on startup, see the first exec javascript on slide 1.1 of the .story.
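The revised key handling might look roughly like this sketch: encrypt before writing to localStorage, decrypt after reading. SimpleCrypto here is the simple-crypto-js library mentioned above; the secret string and storage key name are illustrative assumptions, not the values in the attached file, and (as noted) the secret itself is recoverable from a published web bundle.

```javascript
// Sketch of encrypted key storage using simple-crypto-js.
// Only the ciphertext ever reaches the browser's localStorage.

var SECRET = "an-app-specific-secret"; // assumption; visible in a web bundle

function openaiSetKey(apiKey) {
  var crypto = new SimpleCrypto(SECRET);
  localStorage.setItem("openaiKey", crypto.encrypt(apiKey));
}

function openaiGetKey() {
  var cipher = localStorage.getItem("openaiKey");
  if (!cipher) return undefined;
  var crypto = new SimpleCrypto(SECRET);
  return crypto.decrypt(cipher);
}
```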
I would be interested if you think this encryption is also deficient.
(Just to make it clear you can find the encryption key in the output file data.js if you build a web target from Storyline. So you wouldn't want to deliver a web bundle. But if you deliver a SCORM / xAPI, or deliver to Review360, then the encryption key would be no more exposed than by using node.js.)
(I have deleted the original globalScripts.js and .story files.)
(I would also point out the openai command line interface tool makes the openai key visible through an environment variable in the client process: https://platform.openai.com/docs/guides/fine-tuning/installation. But we can say that is not best practice.)