Localization News!
Hello! I just wanted to take a moment to share a couple of announcements that I'm super excited about. The big news is that we just launched Articulate Localization, which allows you to translate, validate, and manage your courses from within the Articulate 360 platform. 😍 Check out this article for more information. Along with that, I'm happy to share that we just launched a new Localization group where you can ask questions, discuss localization best practices, connect with others who work with localization, and more. Just hit the Join Group button to get started! I'm curious: have you started a trial of Articulate Localization yet? Let us know in the comments!

Localization Publish to Word
Hi, I'm currently using the Localization tool to translate my modules into Spanish and Chinese. Some of our slides contain images that have a different version for each language. For example, a state publication is translated into each language, so we want to show the cover that matches the course language. To handle this, we've been adding states to the images so that the Spanish state shows when the language is set to Spanish. However, when publishing to Word, the outcome is unpredictable: often the English version shows in the Spanish Word output, and sometimes the Spanish or Chinese version shows in the English Word output. The slide master items also don't appear correctly in the Spanish Word output. The text at the top is in English, as are the play buttons, even though they display in Spanish in Storyline. Is there a best practice for this?

Parity Between AI and Manual Translation Workflows, and AI Translation Quality Concerns
I want to start by saying that the new Articulate Localization tool is genuinely impressive. The ability to manage multiple language versions as a single course package is exactly the kind of workflow improvement our team has needed, and the in-context validation via Review is a great touch. That said, I'm running into significant gaps that are creating real problems for a current project, and I want to raise three interconnected issues.

1. Poor AI quality forces a manual workflow that breaks the multi-language learner experience

We're building a course that requires both Hindi and Bengali for our client's learners, with the requirement that learners can select their preferred language within the course itself — a single item in the LMS, not two separate courses. Both languages are available through AI translation. However, following a formal Language Quality Assessment of the Hindi AI output (detailed in point 2), the quality is not at a standard we can publish. As a result, we'll be using our internal globalisation team to provide human translations for both Hindi and Bengali — at significantly higher cost to our client.

Here's where it becomes a compounding problem: the manual XLIFF process produces standalone duplicate courses. It doesn't slot into the multi-language course stack the way AI translations do. That means we cannot offer learners a language toggle within a single course — we would have to publish two entirely separate courses in the LMS and ask learners to self-select the right one. That is a worse learner experience, harder to manage, and not what our client asked for.

To be clear: this situation was directly caused by the AI translation quality not being fit for purpose. We started this project intending to use Articulate Localization end-to-end. The tool's own output has pushed us onto a manual workflow that the tool doesn't fully support — and our client is the one bearing the cost of that, both financially and in terms of experience.
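For readers less familiar with the manual workflow described above: an XLIFF export pairs every source segment with a target segment, human translators fill in the targets, and the edited file is re-imported. A minimal sketch of that round-trip, assuming XLIFF 1.2 (the segment text, IDs, and file name here are invented for illustration, not taken from an actual Articulate export):

```python
import xml.etree.ElementTree as ET

NS = {"x": "urn:oasis:names:tc:xliff:document:1.2"}

# A tiny XLIFF 1.2 export with one untranslated segment (invented example).
xliff = """<?xml version="1.0" encoding="UTF-8"?>
<xliff version="1.2" xmlns="urn:oasis:names:tc:xliff:document:1.2">
  <file source-language="en-US" target-language="hi-IN"
        datatype="plaintext" original="course.story">
    <body>
      <trans-unit id="s1">
        <source>Flip all cards before moving on.</source>
        <target></target>
      </trans-unit>
    </body>
  </file>
</xliff>"""

# The human translation step fills in the <target> of each trans-unit;
# the completed file is then re-imported into the authoring tool.
human_translations = {"s1": "आगे बढ़ने से पहले सभी कार्ड पलटें।"}

root = ET.fromstring(xliff)
for unit in root.iterfind(".//x:trans-unit", NS):
    target = unit.find("x:target", NS)
    target.text = human_translations[unit.get("id")]

# After the round-trip, every unit carries a source/target pair.
pairs = [(u.find("x:source", NS).text, u.find("x:target", NS).text)
         for u in root.iterfind(".//x:trans-unit", NS)]
print(pairs[0])
```

Because the file format already carries the source and target languages per file, there is no structural reason a re-imported XLIFF couldn't be attached to the same multi-language stack as an AI-generated translation.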
The fix we need: allow manually translated XLIFF files to be imported into the multi-language course stack, not just as standalone duplicates. If we're providing validated human translations, we should be able to manage them within the same course package and give learners the language-selection experience the tool is designed to deliver.

2. AI translation quality for Hindi (hi-IN) is below acceptable thresholds

We had the Hindi AI output formally assessed by our globalisation team using a Language Quality Assessment (LQA) framework against the WalkMe + Training profile (≥1,000–4,000 words), applied to a 2,860-word electrical safety course (en-US → hi-IN). The overall verdict: the translation is not suitable for customer-facing content.

Error summary by category:
- Fluency: 14 minor (+2 repeated), 0 major. Largest volume; affects naturalness throughout.
- Omission: 2 minor, 1 major (+1 repeated). Most severe — content is dropped.
- Inconsistency: 2 minor (+2 repeated), 0 major. Systemic terminology variance.
- Inconsistent with termbase: 3 minor (+1 repeated), 0 major. Termbase not followed.
- Punctuation: 1 minor (+3 repeated), 0 major. Devanagari punctuation misused.
- Mistranslation: 2 minor, 0 major.
- Grammar: 1 minor, 0 major.

What this means in practice

Our localisation team reviewed the output qualitatively alongside the LQA scorecard and identified five compounding issues:

Clarity and readability. Much of the content is technically understandable but does not read like natural, professional Hindi. Sentences are awkward or overly literal, which makes the training hard to follow. The LQA flagged 16 fluency errors across the sampled content — including several that required substantial rewrites in the corrected version, not for accuracy but for basic readability.

Missing critical information. In multiple places, important safety instructions are partially or fully absent.
The clearest example: the navigational instruction "Please ensure you have flipped all cards, watched the video, and opened the transcript before moving on" was rendered as "Please ensure you have flipped all the cards before moving on" — the video and transcript steps dropped entirely. This segment appeared twice in the course, and the omission occurred both times. This is not a cosmetic issue: learners following the Hindi version could skip key actions or misunderstand safety procedures as a direct result.

Meaning changes. Some phrases are mistranslated, particularly around risk and mitigation. The LQA flagged two mistranslation errors in the sampled content alone. Even small wording changes in this context can weaken or alter safety messages — which is unacceptable in high-risk electrical-safety training.

Inconsistent use of key terms. Key concepts — equipment names, safety gear, risk terminology — are not used consistently. "High Voltage" alone appears both as हाई वोल्टेज (transliterated) and उच्च वोल्टेज (translated) across different parts of the same course, with no consistent rule applied and the provided termbase not followed. The same idea appearing in different forms across a course is genuinely confusing for learners.

Overall brand and safety risk. The combined effect is a course that does not meet the standard of a polished, trustworthy training product. For a safety-critical topic, this introduces reputational risk for the content owner, and potential compliance and safety risk if learners misunderstand or fail to fully absorb the guidance. We would not be comfortable publishing this output without significant human rework.

Our recommendation

We recognise that the in-context validation feature in Review is Articulate's human-in-the-loop step, and we appreciate that it exists. The problem is that it can only be effective if the base AI output is of a standard that a reviewer can reasonably work with. What we received was not that.
When a translation has major omissions, meaning changes, and systematic terminology failures throughout, the Review step stops being a validation pass and becomes a full retranslation effort — one carried out by people who may not be professional translators, without the tooling or context that a language service provider would have. That's not a sustainable or safe quality-control mechanism for safety-critical content. Our globalisation team's recommendation is that Articulate either improve the AI output quality to a standard where Review can function as intended, or explicitly position the output as a machine translation post-editing (MTPE) starting point — and set user expectations accordingly. Right now, the workflow implies a level of AI quality that our experience suggests isn't there, at least for Hindi.

3. "Bangla" should be labelled as "Bengali" (or both)

A small but important usability point: the language is listed in the tool as "Bangla" rather than "Bengali." While Bangla is the correct native name for the language, Bengali is the standard English name used across the L&D industry, by language service providers, in ISO language codes (bn), and in most professional translation contexts. In practice, this caused real confusion on our project — we initially concluded that Bengali wasn't supported at all and were prepared to raise it as a missing language. We only discovered it was available by chance. If that happened to us, it will happen to others, and some won't catch the error before making decisions based on it. A simple fix would be to list it as "Bengali (Bangla)" or add "Bengali" as a searchable alias. This is a discoverability issue, not a technical one — but it has real consequences for users trying to plan multilingual projects.
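The alias fix requested here is cheap to implement. A minimal sketch of an alias-aware language search (the data and function names are illustrative, not Articulate's actual picker):

```python
# Minimal alias-aware language search. The table is illustrative:
# ISO 639-1 codes with a display label plus searchable aliases.
LANGUAGES = {
    "bn": {"label": "Bangla", "aliases": ["Bengali"]},
    "hi": {"label": "Hindi", "aliases": []},
}

def search_languages(query: str):
    """Return ISO codes whose label or any alias contains the query."""
    q = query.strip().lower()
    return [
        code
        for code, info in LANGUAGES.items()
        if q in info["label"].lower()
        or any(q in alias.lower() for alias in info["aliases"])
    ]

# A user searching the industry-standard English name still finds the
# language, instead of concluding it isn't supported:
print(search_languages("Bengali"))  # ['bn']
print(search_languages("Bangla"))   # ['bn']
```

The display label can stay "Bangla" (or "Bengali (Bangla)"); the point is that the search index should match both names.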
Summary of our requests:
- Allow manually translated XLIFF imports to be added to the multi-language course stack (not just as standalone duplicates).
- Investigate and address Hindi (hi-IN) AI translation quality — particularly around omission, fluency, and termbase compliance.
- Consider clearer guidance or workflow support for MTPE as an intermediate option between raw AI output and full human translation.
- Relabel "Bangla" as "Bengali (Bangla)" or add "Bengali" as a searchable alias — the current labelling causes users to incorrectly conclude the language isn't supported.

We're genuinely invested in making Articulate Localization work for our projects. These issues are the main barriers right now. Thanks for the tool and for taking this feedback seriously.

Articulate Localization: Tracking Language Validator Suggestions
Hello, we are working with the translation feature in Storyline and Rise. We need to update our translation glossary as we review the language validator suggestions. Is there a way to export the following information from Articulate Review:
- Source language text
- Original translation from Articulate
- Language validator translation suggestion

There are enough changes to our modules that tracking them manually is not really feasible given the timeframe we have to work with. Thanks, all!

Localization File Issue
I’m hoping someone can help me troubleshoot an issue I ran into with the localization tool. Last week, I used the localization feature to translate one of my courses into French. While working on the version saved to my desktop, I was able to switch back and forth between the English and French versions without any trouble. I published the course to Review 360 on Wednesday for feedback from my boss, and I didn’t return to update the file again until this morning. Now, the option to switch to the French version has completely disappeared. I’ve checked my desktop and our shared folders, but I can’t locate the French version anywhere. I’ve attached a screenshot of what I’m currently seeing in Storyline in case it helps diagnose the issue. If anyone has experienced this before or has ideas on how to recover the translated version, I’d really appreciate your help!

Rise Translate Feature - Turn off
Rise has introduced the Translate feature (1 April 2026): "New: Translate, validate, and manage multi-language projects with Articulate Localization, then activate a paid plan when you’re ready to publish." I gave it a try on a test course. Is there a way an admin can actually turn it off or delete it once it's been applied to a course? For example, so that the "Choose your course language" popup doesn't appear when opening the course share link, and so that the Quick Share options can be reverted and all accessed again. Also, when the course is duplicated, all the translations are duplicated too. I think there needs to be an option to keep only the original (in my case, the English version).

AI Assistant & Localization in Rise - WONDERFUL!!!
I have created two courses using AI Assistant and Localization, and I am over the moon excited! The process was not at all time-consuming, and the translation, per my language-expert checker, was spot on! I'm a department of one supporting multiple car rental brands in 28 countries. Hooray for Localization! At least for the Spanish translation. I'm working on my third course now, using AI Assistant and Localization. I could never have produced three interactive courses in two different languages, in under two weeks, before these new enhancements in Articulate 360. Never, never, never! Thank you, Articulate 360 team, not just for the two enhancements, but for making them very easy to use. You certainly kept it simple, streamlined, and focused on your user audience's needs. AI Assistant is not an ordinary AI Assistant; it is a Trainer's AI Assistant! Okay, Luminaries, you have another Articulate 360 cheerleader! (smile)🤩 Pat The Trainer

Remove / Undo Localisation
Hey, since it is not possible to use localisation on Custom HTML Blocks, I wanted to remove or undo the process, but it seems that isn't possible. That leaves me with a permanent side tab prompting me to continue with localisation, and I can't update my shared Review link; I have to generate a new one. Why isn't it possible to abandon the localisation attempt and go back to normal? Thanks in advance.

Lost Text Animations by Paragraph After Translation
I just translated a Storyline 360 course into Spanish using the Word doc import. Nearly every slide has an animated text box with the setting "By Paragraph" so that each bullet point is timed with the audio narration. After translation, that menu option is gone and all the text comes in at once, with no way to adjust it. I've tried several things to fix it (turning all the bullets off and on again, removing and re-adding the line breaks, etc.), but nothing brings that option back under Effect Options (Entrance Animations) on the Animations menu. I've never had this problem with translations before. Any ideas?

Localization validating updates
Is anyone else having problems getting updates validated? I was working on a few courses and went through the process to edit the source language and translate them. I created the review copy to send so the validator could look at the updates, but one course shows 177 updates, one shows 0, and the other shows 9, even though each course had about four sentences edited in the source language. I followed the instructions: https://www.articulatesupport.com/article/Articulate-Localization-Translating-Updates-to-Your-Course#request-validation Has anyone else seen this problem, and if so, how did you resolve it?