Updated "Reveal" codes
I've been experimenting with the original HTML code blocks included in Articulate 360's built-in examples and wanted to share how far you can extend that base structure using GenAI to iterate and refine interactions. Starting with the default image-reveal index provided by Articulate, I used GenAI to progressively develop three new versions. I supplied my own images, created meaningful alternative text for screen readers, and introduced additional UX and accessibility improvements. Every version is fully tailorable if you want to adapt the formatting, colours, spacing or behaviour.

The three examples are:

1. Enhanced Image Reveal Grid. Uses the original Articulate structure. Adds a hover zoom, a click-to-zoom state, and high-contrast purple letter tiles for accessibility.
2. Fanned "Deck of Cards" Flip Interaction. A dynamic fanned layout, more like a real card hand. Cards lift and reveal their letter on hover, flip on click, and reset if clicked again. Includes chevron navigation for easier cycling.
3. Plain Flip Grid with Navigation. A clean, accessible flip-card grid with navigation chevrons. Mirrors the deck behaviour but with a simplified layout.

All three examples are linked below, along with the downloadable files. If you have suggestions, improvements or alternative approaches, I'd really love the feedback. And if you'd like to use or remix any part of this, feel free — I'd love to see what you create with it.

Review 360 | Zip files GDrive location: HERE
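For anyone curious how the flip behaviour in the deck and grid versions hangs together, here is a stripped-down sketch of the idea. The markup, class names and image path are hypothetical illustrations, not the code in the downloadable files, which are fully styled and accessible.

```html
<!-- Minimal flip-card sketch: hover lifts the card, click (or Enter/Space) flips it, a second click resets it. -->
<div class="card" tabindex="0" role="button" aria-pressed="false">
  <div class="card-face card-front">A</div>
  <div class="card-face card-back"><img src="letter-a.jpg" alt="Describe the revealed image here"></div>
</div>

<style>
  .card { width: 160px; height: 220px; position: relative; cursor: pointer;
          transition: transform 0.3s; transform-style: preserve-3d; }
  .card:hover { transform: translateY(-10px); }   /* lift on hover */
  .card.flipped { transform: rotateY(180deg); }   /* flip on click */
  .card-face { position: absolute; inset: 0; backface-visibility: hidden;
               display: flex; align-items: center; justify-content: center; }
  .card-front { background: #4b0082; color: #fff; font-size: 48px; } /* high-contrast purple letter tile */
  .card-back  { transform: rotateY(180deg); background: #fff; }
</style>

<script>
  // Toggle the flipped state on click or keyboard activation, keeping aria-pressed in sync for screen readers.
  document.querySelectorAll('.card').forEach(function (card) {
    function toggle() {
      card.classList.toggle('flipped');
      card.setAttribute('aria-pressed', card.classList.contains('flipped'));
    }
    card.addEventListener('click', toggle);
    card.addEventListener('keydown', function (e) {
      if (e.key === 'Enter' || e.key === ' ') { e.preventDefault(); toggle(); }
    });
  });
</script>
```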
💡 Confidence Self-Check: A Reflective Benchmark Tool
Hi everyone, UPDATED 151125 - see comments below! 👇

Here's a quick show-and-tell example I've been experimenting with — a Confidence Self-Check tool built in Storyline 360 and embedded into Rise 360 as a formative reflection block. The goal was to give learners a way to benchmark their confidence and awareness before and after a session, helping them see their own progress and prompting metacognitive reflection — without the need for LMS data capture.

I wanted something that:
✅ Supports metacognition — helping learners think about their own learning
🔄 Tracks progress with "before" and "after" self-checks
🧠 Encourages reflection rather than testing knowledge
💬 Uses local storage only (no data collection) to keep it private and learner-centred

💻 How it was created
This build was produced through an iterative Generative AI-assisted workflow, where I coached an AI (ChatGPT – GPT-5) step-by-step through design reasoning, JavaScript development, accessibility checks, and instructional alignment. The focus was on human-assured prompting — using AI to accelerate build logic while maintaining learning design intent, tone, and pedagogy. The project was inspired by JoeDey's "Perpetual Notepad" (huge kudos for the original concept!), and extended to include weighted confidence scoring, dual checkpoints, and adaptive feedback messages.

⚙️ Known limitation
Because this tool is designed to be session-specific, each new deployment requires:
1. Updating the SESSION_LABEL and STORAGE_PREFIX variables in the JavaScript to give that session its own ID.
2. Editing the question text to match the focus of that session.

These edit points are clearly marked in the script with:
>>> EDIT SESSION METADATA HERE <<<
>>> EDIT QUESTIONS FOR THIS SESSION HERE <<<

It's a simple one-minute change, but worth noting if you plan to scale this across multiple modules or courses (see the sketch after this post for how the session scoping works).

You can explore the working example here: 👉 Rise Review link
A downloadable .story file is included inside the review for anyone who wants to look under the hood, edit the JavaScript, or adapt the design for their own learners.

💬 Open for feedback
I'd love to hear from other e-learning designers — especially anyone experimenting with AI-supported authoring or reflective learner tools. How might you extend or refine this concept? I'd love your thoughts or suggestions, particularly around:
- How you'd extend this for different learner profiles
- Ideas for alternative feedback messages or visual treatments
- Whether you've built similar "confidence meter" interactions in your own work

Feel free to reuse, remix, or expand the concept. Always happy to connect and collaborate with other learning designers!
🔗 Portfolio: forgedframeworks.co.uk/
📧 Contact: dan.boyland@forgeframeworks.co.uk

Thanks in advance for any feedback, and again, credit to Joe Dey and the Articulate community for sharing the foundation idea that made this possible.
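Here is a minimal, hypothetical sketch of the session-scoping idea. The SESSION_LABEL and STORAGE_PREFIX names (and the edit markers) come from the actual script; everything else below is illustrative rather than the production code in the .story file.

```javascript
// >>> EDIT SESSION METADATA HERE <<<  (marker as used in the real script)
// Illustrative sketch: scope "before" and "after" confidence scores to one session
// using localStorage only, so nothing is sent to an LMS or server.
var SESSION_LABEL  = "Module 3: Difficult Conversations"; // hypothetical display name for this session
var STORAGE_PREFIX = "confidence_m03_";                   // unique ID so sessions don't collide

function saveConfidence(checkpoint, score) {
  // checkpoint is "before" or "after"; score is the learner's self-rating (e.g. 1 to 10)
  localStorage.setItem(STORAGE_PREFIX + checkpoint, String(score));
}

function getConfidenceChange() {
  // Compare the two checkpoints and return a simple adaptive feedback message.
  var before = Number(localStorage.getItem(STORAGE_PREFIX + "before"));
  var after  = Number(localStorage.getItem(STORAGE_PREFIX + "after"));
  if (!before || !after) { return "Complete both self-checks to see your progress."; }
  var change = after - before;
  if (change > 0) { return "Your confidence in " + SESSION_LABEL + " went up by " + change + " points."; }
  if (change < 0) { return "Your confidence dipped - worth revisiting the areas you flagged."; }
  return "Your confidence stayed the same - reflect on what would shift it.";
}
```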
Crossword in Rise
I vibe coded a crossword puzzle interaction in the new Articulate Rise custom HTML block to support our Accounts Review training. It took about an hour of back-and-forth with Copilot to get it working. Check it out here: https://rise.articulate.com/share/OHzJApuSIhFcNe4GLwmto58-5dg_-j-C#/lessons/3cT6ydJmoggnBlDSVsXmKaxp11ASrlKp

The full HTML code is below the preview; feel free to adapt it and repurpose it for your own projects.
Card Reader - Fun new drag and drop interaction
Working with a new client, I was tired of the same old click-and-reveal content interactions, so I used the drag-and-drop feature to come up with something different. Learners drag cards to a virtual reader that scans, beeps, and reveals colorful insights—turning simple content into an engaging experience. It's a fun way to explore information.

Review link: https://360.articulate.com/review/content/3582f784-ef45-416a-a8a0-73c9f0380ae7/review?version=1
Escape Room
We've built escape room interactions for clients a few times, but haven't been able to share them publicly. That's why I've taken the time to create this demo. I created a fictional drug, SynebroVax, and built a scenario where the learner plays a pharmaceutical sales rep preparing for a high-stakes meeting with stakeholders. The goal is to explore the environment, uncover insights, and respond to realistic stakeholder concerns, all within a timed, gamified experience. It's designed to show how immersive storytelling, decision-making, and interactivity can turn complex product knowledge into something memorable and engaging.
Level Up English Quiz
Hi everyone! I'd like to share my project "Level Up English Quiz", which I created for the English Speaking Club at the company where I work. The quiz was created to attract new members and speakers. Feel free to share your feedback or ideas for improvement. 👉 Level Up English Quiz

How it's built:
I created three variables: FiftyUsed, HintUsed, and SkipUsed. Each hint icon has two states: Normal (active) and Disabled (used). When a hint is clicked, the corresponding action is performed:
- 50/50 — changes the state of two incorrect answers to Hidden;
- Hint — opens a layer with a text clue;
- Skip — automatically skips the current question and moves to the next one.

After a hint is used, its variable changes to True, and the icon becomes inactive. When each question loads, triggers check the variable values and disable any hints that have already been used.
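The quiz itself is built entirely with native Storyline triggers and variables rather than code, but for anyone who prefers scripting, roughly the same hint logic could be expressed through Storyline's JavaScript API. This is a hedged sketch: the FiftyUsed/HintUsed/SkipUsed names match the post, while the companion "Disabled" variables are hypothetical.

```javascript
// Illustrative only - the published quiz uses native triggers, not JavaScript.
// Assumes the Storyline true/false variables FiftyUsed, HintUsed and SkipUsed described above.
var player = GetPlayer();

// Run from a "when the timeline starts" trigger on each question slide
// to re-disable any hints that were already spent on earlier questions.
function syncHintButtons() {
  ["FiftyUsed", "HintUsed", "SkipUsed"].forEach(function (name) {
    if (player.GetVar(name) === true) {
      // In the real build a trigger switches the icon to its Disabled state;
      // here we just flag a hypothetical companion variable the slide can react to.
      player.SetVar(name.replace("Used", "Disabled"), true);
    }
  });
}

// Run from the 50/50 icon's click trigger: mark it used so it can't fire twice.
// Hiding two incorrect answers is still done with state-change triggers in Storyline.
function useFiftyFifty() {
  if (player.GetVar("FiftyUsed") === true) { return; } // already spent
  player.SetVar("FiftyUsed", true);
}

syncHintButtons();
```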
Capturing & Consolidating Learner Notes
I created an example of adding learner notes functionality using the new Rise Code Block. It lets learners capture notes at various places in a course and then consolidates them all into a printable format or an email. The demo also includes a complete code design walk-through and has all of the code at the end for easy copy/paste into your own projects. Enjoy!

https://360.articulate.com/review/content/4c2a7e7f-09ca-4d4f-890d-86cec26bf48b/review
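The full walk-through and the real code live in the demo itself; purely as a rough sketch of the underlying idea, per-block capture plus consolidation could look something like this (the key prefix and function names here are hypothetical, not the demo's code):

```javascript
// Hypothetical sketch of per-lesson note capture with consolidation into email or print.
var NOTE_PREFIX = "courseNotes_"; // assumed shared key prefix so all code blocks can find each other's notes

function saveNote(lessonId, text) {
  localStorage.setItem(NOTE_PREFIX + lessonId, text);
}

function getAllNotes() {
  var notes = [];
  for (var i = 0; i < localStorage.length; i++) {
    var key = localStorage.key(i);
    if (key.indexOf(NOTE_PREFIX) === 0) {
      notes.push(key.replace(NOTE_PREFIX, "") + ":\n" + localStorage.getItem(key));
    }
  }
  return notes.join("\n\n");
}

// Consolidate into an email draft...
function emailNotes() {
  window.location.href = "mailto:?subject=My course notes&body=" + encodeURIComponent(getAllNotes());
}

// ...or a print-friendly page.
function printNotes() {
  var w = window.open("", "_blank");
  w.document.write("<pre>" + getAllNotes() + "</pre>");
  w.document.close();
  w.print();
}
```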
Cooking Game (Jeopardy style + Gamification)
Hello Articulate Heroes! I'm excited to share my second personal project with you — a cooking-themed, Jeopardy-style game: Cooking Frienzy.

This project was inspired by two fantastic webinar series shared here:
- How to Create A Jeopardy! Style Game
- Gamification series

I started with the "Jeopardy!" template and added the following custom features:
- Cooking-themed questions and answers — 5 questions across 5 categories
- Custom visuals — including characters, backgrounds, UI, and tokens
- The ability to choose one of three characters at the start of the game (and replay with a different chef assistant!)
- Personalized feedback and questions — with character-specific images and voiceovers
- A 20-second Pomodoro-style timer with a "wiped" animation (see the sketch after this post)
- Tokens awarded when the user completes a certain number of questions

The characters were created using AI. Thank you for taking the time to check out the game! I'd love to hear your thoughts — feel free to share any comments or suggestions! You can check out the game via this link: Cooking Frienzy
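The game itself drives its countdown and "wiped" animation with Storyline's own triggers and animations; purely as an illustration of the same idea in code, a 20-second countdown could also be run from a JavaScript trigger, assuming a hypothetical Storyline number variable called timeLeft that the slide reacts to.

```javascript
// Hypothetical sketch only - not the approach used in Cooking Frienzy.
// Assumes a Storyline number variable "timeLeft" that on-slide objects (e.g. the wipe bar)
// respond to via variable-change triggers.
var player = GetPlayer();
var timeLeft = 20;
player.SetVar("timeLeft", timeLeft);

var countdown = setInterval(function () {
  timeLeft -= 1;
  player.SetVar("timeLeft", timeLeft); // the slide can scale its wipe from this value
  if (timeLeft <= 0) {
    clearInterval(countdown);
    player.SetVar("timeLeft", 0);      // a variable-change trigger can fire the "time's up" feedback
  }
}, 1000);
```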
Javascript - Free text box with answer
AI helped me write code to create a quiz question using a free-text box answer, with a suggested answer and a requirement to complete it before continuing. Screenshots are attached to show what it looks like when the learner doesn't enter text and when they do. The code is:

```html
<div class="custom-container">
  <h1 class="custom-header">ENTER QUESTION HERE</h1>
  <div class="input-container">
    <textarea id="myTextArea" class="custom-textbox" placeholder="Type your answer here..." rows="5"></textarea>
    <button id="submitBtn" class="custom-submit-button">Submit</button>
    <p id="submissionMessage" class="message"></p>
  </div>
  <!-- New element to display the suggested answer, hidden by default -->
  <div id="suggestedAnswer" class="suggested-answer" style="display: none;">
    <p><b>Answer:</b></p>
    <p>ENTER ANSWER HERE</p>
  </div>
</div>

<script>
  document.addEventListener('DOMContentLoaded', function() {
    const submitBtn = document.getElementById('submitBtn');
    const textArea = document.getElementById('myTextArea');
    const submissionMessage = document.getElementById('submissionMessage');
    const suggestedAnswer = document.getElementById('suggestedAnswer');

    submitBtn.addEventListener('click', function() {
      // Check if the text box has content
      if (textArea.value.trim() !== '') {
        // Sends the completion message to the parent Rise window
        window.parent.postMessage({ type: 'complete' }, '*');

        // Update UI to show completion
        submissionMessage.innerText = 'Submission successful!';
        submissionMessage.style.color = '#001655';
        submitBtn.disabled = true;
        textArea.disabled = true;

        // Show the suggested answer
        suggestedAnswer.style.display = 'block';
      } else {
        // Provide feedback to the learner if they didn't enter text
        submissionMessage.innerText = 'Please enter your response before submitting.';
        submissionMessage.style.color = '#d32f2f'; // A red color for errors
      }
    });
  });
</script>

<style>
  .custom-container {
    background-color: white;
    font-family: Arial, sans-serif;
    text-align: center;
    padding: 0px;
    border-radius: 8px;
    margin: 0 auto; /* Center the container */
    max-width: 850px;
  }
  .custom-header {
    color: #001655; /* Use the hex code for the header color */
    font-size: 20px; /* Updated header font size */
    margin-bottom: 20px;
  }
  .input-container {
    display: flex;
    flex-direction: column;
    align-items: center;
  }
  .custom-textbox {
    width: 100%;
    max-width: 850px;
    background-color: #e2f0fa;
    color: #001965;
    border: 1px solid #c8dbe6;
    padding: 15px;
    font-size: 17px; /* Updated text box font size */
    line-height: 1.5;
    border-radius: 5px;
    box-sizing: border-box; /* Ensures padding is included in width */
  }
  .custom-submit-button {
    background-color: #001965; /* Dark blue for the button */
    color: white;
    border: none;
    padding: 12px 24px;
    font-size: 18px;
    cursor: pointer;
    margin-top: 10px;
    border-radius: 5px;
    transition: background-color 0.3s ease;
  }
  .custom-submit-button:hover {
    background-color: #003366; /* A slightly lighter blue on hover */
  }
  .message {
    margin-top: 15px;
    font-weight: bold;
    min-height: 20px; /* Prevents layout shift */
  }
  .suggested-answer {
    text-align: left;
    margin-top: 30px;
    padding: 10px;
    background-color: #f0f8ff;
    border: 1px solid #c8dbe6;
    border-radius: 5px;
  }
  .suggested-answer p {
    margin: 0 0 10px 0;
    color: #001965;
  }
  .suggested-answer p:last-child {
    margin-bottom: 0;
  }
</style>
```
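One small, optional adaptation (not part of the original code above): if you want to stop learners unlocking the continue gate with a single character, the condition inside the click handler could require a minimum length before treating the answer as complete. The threshold here is an arbitrary illustration.

```javascript
// Illustrative variation: swap this in for the `if (textArea.value.trim() !== '')` check above.
const MIN_CHARS = 20; // hypothetical threshold - adjust to suit the question

if (textArea.value.trim().length >= MIN_CHARS) {
  window.parent.postMessage({ type: 'complete' }, '*');
  // ...same success handling as in the original code...
} else {
  submissionMessage.innerText = 'Please write at least ' + MIN_CHARS + ' characters before submitting.';
  submissionMessage.style.color = '#d32f2f';
}
```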
Buckle Up
With the additional functionality exposed by the JavaScript API, I've been wanting to implement more specific mechanics of design, e.g., movement/traversal, action/combat, puzzles/problem-solving, etc. With objects (and their coordinates) being so easy to reference now, I wanted to demonstrate that movement/traversal can be implemented with only some modest code. In this particular design, I wanted to explore how sliding textured images horizontally from one side of the screen to the other could simulate motion, especially with a simple red rectangle standing in as a car controlled by you.

Note: The Up/Down and Left/Right controls cancel each other out accidentally. My intent in this project was animating graphics and connecting audio, not so much exploring controls, so I left it as it is. I do recognize, though, that additional learner control would likely improve that aspect of the experience.

Graphics
For graphics, I constrained myself to nearly every visual asset being either a basic shape styled a particular way or an image sourced from the content library, so that the bulk of the emergence would come from how those simple objects were manipulated in complex ways. Below is a reduced version of the slide's main update loop:

```javascript
update(() => {
  // Calculates deltaTime (dt), a variable common in frame-rate-dependent content like games,
  // to constrain the content from rushing or dragging unintentionally.
  const now = performance.now();
  const dt = (now - lastTime) / 1000; // seconds since last frame
  lastTime = now;
  const elapsedSeconds = now - lastRecordedTime;

  // Flipping this flag to false naturally stops the road from moving.
  if (isMoving) {
    // Applies/simulates friction against the speed variable, increased elsewhere (not here)
    // in other code related to the keyboard keys.
    speed = Math.max(1, getVar('speed') - (getVar('isOffroad') ? 2.5 * gsap.ticker.deltaRatio(60) : 0.2 * gsap.ticker.deltaRatio(60)));
    turnSpeed = Math.ceil(speed / 150) + (getVar('isOffroad') ? 2 : 0);
    // The gsap.ticker calls above allow for gradual changes in the speed over time
    // (as opposed to sudden changes), better simulating friction. In practice, increase and
    // decrease speed however you want so long as its final value makes it down below to
    // whichever lines change the x-position of the image.

    // Move roads to the left every frame
    road1.x -= speed * dt;
    road2.x -= speed * dt;

    // Reset to right side once entirely off-screen, leap-frogging one road image over the other.
    if (road1.x < (slideWidth() * -1)) {
      road1.x = road2.x + road1.width;
    }
    if (road2.x < (slideWidth() * -1)) {
      road2.x = road1.x + road2.width;
    }
  }
});
```

Audio
The "car" features three sound emitters:
1. A repetitive sound emulating exhaust coming out of a car's muffler
2. A slower, lower repetitive thump that plays when "driving" on grass
3. A horn (press Spacebar or click the car)

All three "emitted" sounds are generated at runtime (not sample-based) using the Tone.js library. The horn is instrumental and silly by design, demonstrating a simple sine wave from a default synth that one can use to play music. The other two emitters demonstrate a more complex understanding of waveform modulation. The offroad rumble and muffler synthesizers demonstrate the contextual simulation of simple rhythmic, mechanical sounds.
The offroad sound is produced with a MetalSynth, a synthesizer considered clangy and dissonant, but adding a low-pass filter (LPF) at a warm 90 Hz softens its clang into a warm rumble:

```javascript
const rumbleLPF = new Tone.Filter(90, "lowpass");
const rumblePanner = new Tone.Panner();

if (!window.tireRumble) {
  window.tireRumble = new Tone.MembraneSynth({
    envelope: { attack: 0.1, decay: 0.0, sustain: 1, release: 0.1 }
  })
    .connect(rumbleLPF)
    .toDestination();
}
```

The muffler exhaust sound was a bit more nuanced. Being personally cognizant of the importance of harmonics in vehicle vibrations, both desirable and not, the FM synth seemed a fitting choice:

```javascript
const exhaustHPF = new Tone.Filter(45, "highpass");
const exhaustLPF = new Tone.Filter(125, "lowpass");
const exhaustPanner = new Tone.Panner();

if (!window.exhaust) {
  window.exhaust = new Tone.FMSynth({
    envelope: { attack: 0.001, decay: 0, sustain: 1, release: 0.005 },
    modulationIndex: 40,
    harmonicity: 3,
    volume: -6
  })
    .connect(exhaustHPF)
    .connect(exhaustLPF)
    .connect(exhaustPanner)
    .toDestination();
}
```

Both of these synths run in Storyline's update() loop, with some logic dictating when and how they activate. Of particular note (indeed, my solitary seed of this whole project) is scaling the pitch played by the synthesizer with the speed variable:

```javascript
// triggerAttackRelease(note, duration, time?, velocity?)
window.exhaust.triggerAttackRelease(5 + getVar('speed') / 100, "8n", now);
```

The minimum of 5 guarantees a lovely purr. "8n" refers to eighth notes, the type of note played over and over again by Storyline. A gentle logarithmic function might better emulate a gear nearing a maximum, and a few conditional ones stair-cased together could emulate the sound of changing gears…

The explosion sound effect and music are stock I found online, though I did adjust the song's stereo field to carve more space in the center for the sound effects. Naturally both the music and the explosion can also be crafted in Tone.js--I entertained the idea of the off-road sound effect only ever being out of sync with the music to further layer in negative feedback--but I did what I aimed to do. I've wanted for a very long time to procedurally synthesize responsive audio at runtime, and this proof-of-concept hopefully helps demonstrate such utility.
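As a quick illustration of that last idea (purely hypothetical, not in the published file), the linear pitch mapping could be swapped for a logarithmic curve, and "gears" could be faked by stair-casing the same curve across speed thresholds:

```javascript
// Hypothetical extension of the pitch scaling above - not part of the published project.
// A logarithmic curve rises quickly at low speeds and flattens out near the top,
// a little like an engine approaching the limit of a single gear.
function pitchForSpeed(speed) {
  return 5 + 20 * Math.log1p(speed / 100); // still bottoms out at the "lovely purr" of 5
}

// Faking gear changes: each "gear" restarts the same curve from a slightly higher base,
// so the pitch drops and climbs again as the speed crosses each threshold.
function pitchWithGears(speed) {
  const gearSize = 200;                        // arbitrary speed range covered by one gear
  const gear = Math.floor(speed / gearSize);
  const speedInGear = speed - gear * gearSize; // how far into the current gear we are
  return 5 + gear * 3 + 20 * Math.log1p(speedInGear / 100);
}

// Used the same way as before inside the update loop:
// window.exhaust.triggerAttackRelease(pitchWithGears(getVar('speed')), "8n", now);
```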