Level 1 Reaction Evaluation Data

Sep 10, 2015

Hi Heroes,

Outside of design, I am curious to know what type of data everyone gathers for Level 1 feedback on eLearning courses, and what you do with the data once it is collected. This is the basic evaluation that is very easy to implement, but how are you using it? Contractors, corporate eLearning developers, and freelancers are all welcome to answer!

Patti Bryant

Hi Brian!

Below are the questions I generally ask:

- The content was clear and easy to understand
- The course was not too long
- The course was easy to navigate
- The course helped me improve

- The course loaded correctly
- The sound quality was OK
- The videos played OK

- What could we do to make this course better?
- What did you like about the course?
- What did you not like about the course?

After receiving the results, it's easy to scan and see if there are any issues with the course audio/video, etc. Since I use a Likert scale, I can also see if learners think the course is too long, etc. Finally, the free-form questions offer suggestions on how to improve the course for the next version.

I hope this helps!

Keepin' the joy,

Bob S

Patti's list of questions is pretty good. I typically add a couple around content relevance and likelihood of applying it. These two questions (also on a Likert scale) help us determine whether we have hit the mark... after all, the best-functioning course in the world isn't effective if it didn't cover what the learner needed and/or they aren't inspired or likely to go use what they learned.

As far as using the collected data... it's currently done as a quarterly review (quantitative data only) across our 300+ courses to see where our opportunities are. Honestly, due to resourcing, the free-form data is typically reviewed only when the ratings flag a potential problem, or annually when considering additional learning topic areas for the coming year.
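The quarterly review Bob describes can be automated fairly easily: average each course's Likert ratings and surface only the courses that fall below a cutoff, so the free-form comments are read where they matter. A minimal sketch follows; the course names, scores, and the 3.5 threshold are all made-up examples, not values from this thread.

```python
# Hypothetical sketch of a quarterly Level 1 review: flag courses whose
# average Likert rating (1 = strongly disagree, 5 = strongly agree) falls
# below a chosen threshold, so their free-form feedback gets reviewed.

ratings = {
    "Onboarding Basics": [5, 4, 5, 4, 5],
    "Safety Refresher": [2, 3, 2, 3, 2],
    "Sales Skills 101": [4, 4, 3, 5, 4],
}

FLAG_THRESHOLD = 3.5  # example cutoff; tune to your own rating distribution

def quarterly_flags(ratings, threshold=FLAG_THRESHOLD):
    """Return the courses whose mean rating falls below the threshold."""
    return sorted(
        course
        for course, scores in ratings.items()
        if sum(scores) / len(scores) < threshold
    )

print(quarterly_flags(ratings))  # → ['Safety Refresher']
```

With 300+ courses, a script like this turns the quarterly pass into a short exception list instead of a full read-through.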

Hope this helps!

Bob S


One more thought for you... If you are offering multiple courses to the same learners, you may find that "less is more" when it comes to your Level 1 evals. I've seen learners experience burnout on evaluations after multiple courses.

To combat this, consider keeping your Level 1s super short and easy... preferably just a half-dozen or fewer simple rating questions and one free-form question (max). Even though it may seem counterintuitive, you are likely to wind up with MORE and better data.

Wanting Zou

Hello Everyone, 

Not sure if this is too late to ask under this post. Do any of you put this Level 1 reaction evaluation in the course itself? If so, how do you do it?

We are trying to build this level of evaluation into our courses as well. I am wondering if there is a way to do it without referring the learners to another webpage.


