
SMcNicol
Community Member
1 day ago

I Built a Microlearning About AI Avatars Using AI. It Got Recursive Fast.

When Articulate dropped AI Avatar as a Labs feature in Rise 360, I did what any reasonable instructional designer would do: I decided to build a microlearning about AI avatars using AI avatars. Because nothing says "I understand this tool" like immediately using it to explain itself.

The result is a five-slide Rise 360 course comparing Articulate AI Avatar and Synthesia — built entirely in Code Blocks and hosted by a custom AI avatar I created with Articulate's chat prompt interface, with avatar video and interactive elements sitting side by side. The irony was fully intentional. The chaos was not.

The Five Slides (Quick Version)

Each slide uses a two-column layout with a styled vertical divider — video on one side, interaction on the other, alternating left and right so it doesn't feel like a template repeating itself.
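For anyone curious what that looks like inside a Code Block, here's a rough sketch of the pattern. The class names and divider styling are my simplifications, not the exact markup from the course:

```html
<!-- Two-column slide: video on one side, interaction on the other.
     Swap the column order (or use flex-direction: row-reverse)
     to alternate left/right from slide to slide. -->
<style>
  .slide-row { display: flex; align-items: stretch; gap: 24px; }
  .slide-col { flex: 1; }
  /* The styled vertical divider between the columns */
  .divider   { width: 3px; background: #6c5ce7; border-radius: 2px; }
</style>

<div class="slide-row">
  <div class="slide-col">
    <video src="avatar-slide1.mp4" controls width="100%"></video>
  </div>
  <div class="divider"></div>
  <div class="slide-col">
    <!-- flip cards, hotspots, or quiz markup goes here -->
  </div>
</div>
```

Reversing the two columns on every other slide is what keeps the five slides from feeling like one template stamped five times.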

Slide 1 — Overview: Two flip cards, one per tool. Click to reveal what makes each approach distinct. Simple, satisfying, takes ten seconds to understand.

Slide 2 — Feature Comparison: An expandable table. Click each row to get the real detail plus links to official docs. Because a static comparison table is just a spreadsheet wearing a costume.

Slide 3 — Known Limitations: Six hotspot cards, color-coded by tool, each revealing a limitation and a workaround. My favorite slide. There's something deeply satisfying about making limitations feel like useful information rather than a product failing a job interview.

Slide 4 — Best Use Cases: A branching scenario. Pick a real-world situation, get a tool recommendation with actual reasoning. No "it depends" cop-outs.

Slide 5 — Knowledge Check: A three-question quiz with modal feedback and links to documentation. No video on this one — by slide five, learners have earned the right to just answer questions without another avatar staring at them.
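The expandable rows on Slide 2 can be approximated with native `<details>` elements, no JavaScript needed. A hedged sketch (the row content here is condensed from this post, not copied from the course):

```html
<!-- One expandable comparison row. Click the summary to reveal detail. -->
<details>
  <summary><strong>Avatar styles</strong> — click for detail</summary>
  <p>
    Articulate AI Avatar supports illustrated characters as well as
    photorealistic ones; Synthesia is photorealistic only.
    See each tool's official documentation for current specifics.
  </p>
</details>
```

Stacking one `<details>` per feature gives you the "click each row" behavior with almost no code, which is part of why a Code Block beats a static table here.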

Just a heads-up: if your project includes videos, the upload might take a minute. Don’t panic—it’s a great time to go grab a coffee and a snack! ☕🍩

Where Claude Came In (Yes, I Used AI to Build the AI Course)

Full transparency: this was a collaboration between me and Claude (Anthropic's AI). I used it throughout, and I'm not going to pretend otherwise.

Narration scripts. Claude drafted the avatar scripts optimized for AI delivery — short sentences, natural pauses, clean endings. Of course, I read through the scripts and adjusted where needed. 😁

The HTML. Claude handled the code; I directed the design and tested obsessively in Rise. We went through more iterations on the flip card centering than I care to admit publicly.

The honest version of AI collaboration: the AI does the parts that would have eaten your afternoon, so you can spend your brain on the parts that actually require a brain.

What Articulate AI Avatar Is Actually Like

The Labs label is accurate — this is a feature in active development, not a finished product. Generation is slow. Characters gesture like they're conducting an invisible orchestra (which is sometimes funny). Backgrounds occasionally do things backgrounds shouldn't do. Videos can trail off without a clean ending if your script doesn't close well.

None of these are dealbreakers. Workarounds exist: keep scripts short, end with a complete sentence, use simple backgrounds, test before committing to a full render.

The thing that actually surprised me: Articulate supports illustrated characters in addition to photorealistic ones. Synthesia is photorealistic only. I created a fully custom avatar for this course using Articulate's chat prompt interface — described the role, the look, the setting — and used that same character consistently across all five slides. If your brand has a style that isn't "stock photo human," Articulate is currently your only option. That's a real differentiator and it doesn't get enough attention.

The newest addition — video download — is what made this whole layout possible. Before, avatar videos lived only inside Rise. Now you can export the MP4 and actually do things with it. I took the intro video into Camtasia and added supporting images directly into the original avatar footage before dropping it into the course — something that would have been impossible when the video was locked inside Rise.

Once it's an MP4, it's just a video, and your normal post-production workflow applies. Place it in the same folder as your HTML file, reference it by filename in the Code Block, and Rise serves them together. That's the technical trick the whole course is built on.
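Concretely, "reference it by filename" is just a relative `src`. A minimal sketch, assuming the MP4 sits in the same folder as the HTML file (`intro.mp4` is a placeholder name, not the actual file from my build):

```html
<!-- index.html and intro.mp4 live side by side in the same folder,
     so a bare filename resolves as a relative path. -->
<video src="intro.mp4" controls preload="metadata" width="100%">
  Sorry, your browser doesn't support embedded video.
</video>
```

Because the path is relative, the pairing survives wherever Rise hosts the files, which is what makes the whole video-plus-Code-Block layout portable.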

Three Things I'd Tell Past-Me

Test in Rise constantly. What looks right in a browser often breaks in the iframe. The centering problem I spent way too long on was one hardcoded pixel width on the body element. One line.
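A hardcoded width on `body` is exactly the kind of thing that looks fine in a full browser tab and breaks inside Rise's narrower iframe. A sketch of the bug and the fix (the pixel value is illustrative, not the exact one from my build):

```css
/* Before: centers in a full browser window, overflows in the iframe. */
body {
  width: 960px;      /* the one bad line: a hardcoded pixel width */
  margin: 0 auto;
}

/* After: let the iframe decide the width. */
body {
  max-width: 100%;
  margin: 0 auto;
}
```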

Write scripts for the avatar, not for yourself. Short sentences. Periods where you'd normally use a comma. The script that reads slightly clunky on paper is usually the one the avatar delivers best.

The ironic thing about this project is that I used AI to build a course about whether you should use AI. The answer? It depends on your use case and whether you've read the documentation. Conveniently, that’s also the core lesson of the course.

Tools used: Articulate Rise 360, Articulate AI Avatar (Labs), Synthesia (for comparison), Claude by Anthropic.


