Welcome video
This example was created using a new feature from Rise Labs. I used an AI Avatar designed to welcome learners and introduce the objectives of the training. From a pedagogical perspective, AI avatars can be a powerful tool for engaging learners at the start of a course, creating a sense of presence, and clearly communicating learning goals. However, this type of tool should be used carefully to avoid cognitive overload. With this feature, we are given 1,200 credits, which correspond to 1,200 seconds of video content. That limit encourages intentional design choices and reinforces the importance of focusing on high-value moments, such as course introductions or key transitions. From a technical standpoint, there is still room for improvement, particularly around mouth synchronization and eye movement. Nonetheless, this is a very promising start, and I'm curious to see how this technology evolves and how it can be integrated meaningfully into learning experiences.

I Built a Microlearning About AI Avatars Using AI. It Got Recursive Fast.
When Articulate dropped AI Avatar as a Labs feature in Rise 360, I did what any reasonable instructional designer would do: I decided to build a microlearning about AI avatars using AI avatars. Because nothing says "I understand this tool" like immediately using it to explain itself.

The result is a five-slide Rise 360 course comparing Articulate AI Avatar and Synthesia — built entirely in Code Blocks, hosted by a custom AI avatar I created using Articulate's chat prompt interface, sitting side by side with interactive elements. The irony was fully intentional. The chaos was not.

The Five Slides (Quick Version)

Each slide uses a two-column layout with a styled vertical divider — video on one side, interaction on the other, alternating left and right so it doesn't feel like a template repeating itself.

Slide 1 — Overview: Two flip cards, one per tool. Click to reveal what makes each approach distinct. Simple, satisfying, takes ten seconds to understand.

Slide 2 — Feature Comparison: An expandable table. Click each row to get the real detail plus links to official docs. Because a static comparison table is just a spreadsheet wearing a costume.

Slide 3 — Known Limitations: Six hotspot cards, color-coded by tool, each revealing a limitation and a workaround. My favorite slide. There's something deeply satisfying about making limitations feel like useful information rather than a product failing a job interview.

Slide 4 — Best Use Cases: A branching scenario. Pick a real-world situation, get a tool recommendation with actual reasoning. No "it depends" cop-outs.

Slide 5 — Knowledge Check: A three-question quiz with modal feedback and links to documentation. No video on this one — by slide five, learners have earned the right to just answer questions without another avatar staring at them.

Just a heads-up: if your project includes videos, the upload might take a minute. Don't panic — it's a great time to go grab a coffee and a snack!
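If you want to try the same structure, here is a minimal sketch of what one of those two-column Code Blocks might look like. This is not the author's actual markup — the file name (intro.mp4), class names, and styling are illustrative. It assumes the exported avatar MP4 sits in the same folder as the HTML file, which is what lets Rise serve the two together.

```html
<!-- Minimal two-column Code Block: avatar video on one side, interaction on
     the other, separated by a styled vertical divider. Assumes the exported
     avatar video ("intro.mp4" here is a placeholder name) sits in the same
     folder as this HTML file. -->
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <style>
    /* Avoid hardcoded pixel widths on body: fixed widths tend to break
       centering once the page is loaded inside Rise's iframe. */
    body { margin: 0; width: 100%; font-family: sans-serif; }
    .row { display: flex; align-items: center; gap: 1rem; }
    .col { flex: 1; }
    .divider { width: 2px; align-self: stretch; background: #ccc; }
  </style>
</head>
<body>
  <div class="row">
    <div class="col">
      <!-- Video referenced by filename only, no absolute path -->
      <video src="intro.mp4" controls width="100%"></video>
    </div>
    <div class="divider"></div>
    <div class="col">
      <!-- Interactive element goes here: flip card, quiz, hotspots, etc. -->
      <p>Interaction placeholder</p>
    </div>
  </div>
</body>
</html>
```

Swapping the order of the two columns from slide to slide gives the alternating left/right rhythm described above.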
☕🍩

Where Claude Came In (Yes, I Used AI to Build the AI Course)

Full transparency: this was a collaboration between me and Claude (Anthropic's AI). I used it throughout, and I'm not going to pretend otherwise.

Narration scripts. Claude drafted the avatar scripts optimized for AI delivery — short sentences, natural pauses, clean endings. Of course, I read through the scripts and adjusted where needed. 😁

The HTML. Claude handled the code; I directed the design and tested obsessively in Rise. We went through more iterations on the flip card centering than I care to admit publicly.

The honest version of AI collaboration: the AI does the parts that would have eaten your afternoon, so you can spend your brain on the parts that actually require a brain.

What Articulate AI Avatar Is Actually Like

The Labs label is accurate — this is a feature in active development, not a finished product. Generation is slow. Characters gesture like they're conducting an invisible orchestra (which is sometimes funny). Backgrounds occasionally do things backgrounds shouldn't do. Videos can trail off without a clean ending if your script doesn't close well.

None of these are dealbreakers. Workarounds exist: keep scripts short, end with a complete sentence, use simple backgrounds, test before committing to a full render.

The thing that actually surprised me: Articulate supports illustrated characters in addition to photorealistic ones. Synthesia is photorealistic only. I created a fully custom avatar for this course using Articulate's chat prompt interface — described the role, the look, the setting — and used that same character consistently across all five slides. If your brand has a style that isn't "stock photo human," Articulate is currently your only option. That's a real differentiator and it doesn't get enough attention.

The newest addition — video download — is what made this whole layout possible. Before, avatar videos lived only inside Rise.
Now you can export the MP4 and actually do things with it. I took the intro video into Camtasia and added supporting images directly into the original avatar footage before dropping it into the course — something that would have been impossible when the video was locked inside Rise. Once it's an MP4 it's just a video, and your normal post-production workflow applies. Place it in the same folder as your HTML file, reference it by filename in the Code Block, and Rise serves them together. That's the technical trick the whole course is built on.

Three Things I'd Tell Past-Me

Test in Rise constantly. What looks right in a browser often breaks in the iframe. The centering problem I spent way too long on was one hardcoded pixel width on the body element. One line.

Write scripts for the avatar, not for yourself. Short sentences. Periods where you'd normally use a comma. The script that reads slightly clunky on paper is usually the one the avatar delivers best.

The ironic thing about this project is that I used AI to build a course about whether you should use AI. The answer? It depends on your use case and whether you've read the documentation. Conveniently, that's also the core lesson of the course.

Tools used: Articulate Rise 360, Articulate AI Avatar (Labs), Synthesia (for comparison), Claude by Anthropic.

Not All AI Avatars Are Created Equal

The unlikely return of Jobsworth
Hello! For this week's challenge, I used the 'Upload Character' tool in the new AI Avatar feature to resurrect Jobsworth, my robotic compliance officer from Challenge #412. It took only a few minutes to generate the voiceover and video. I have to say, the results are pretty impressive! Looks like my days of animating characters using state changes might be numbered. Check it out for yourself here: The unlikely return of Jobsworth
Using AI Avatars to Enhance Immersive Learning - Inside Tesla's World
This challenge was an opportunity to explore how AI avatars can be used to enhance engagement within a learning experience, not by over-guiding the learner, but by setting the tone and creating a stronger sense of immersion. Rather than using a traditional narrator or static introduction, I wanted to introduce the experience through a character.

To achieve this, I first generated an expressive avatar. This allowed me to create a consistent visual identity that felt aligned with the theme of Tesla and the overall environment. The avatar is used intentionally and sparingly to introduce the experience and reappear at key moments, while the learner remains in control of the exploration. This approach helps maintain immersion without overwhelming the experience.

The aim was to demonstrate how AI-generated avatars can be used in a simple, practical way within tools like Rise and Storyline to elevate storytelling, create presence, and make digital learning feel more human and engaging. Check out what I created in response to this week's challenge by clicking here.