Students failing test after FTF course

This is not an e-learning question; I just want the opinion of the community as educators.

I ran a two-day face-to-face course a couple of weeks ago, which involved lectures, students using on-line reference manuals to retrieve technical information, and practical activities that combined the theory and referencing we did in the classroom (it was automotive training).

I decided to drop the usual multiple-choice final test in favour of an on-line test a week later, which required the students to access and find information in the on-line manuals. The reason for this change was that I shifted the objective of the course from "remembering information" (impossible) to "finding information" (useful).

There were only 5 questions, which had to be completed in 30 minutes, but many students are failing the test!

I have set the required pass mark at 100%, but I don't think that is too stringent, as we practiced using the on-line manual at least 15 times during the training (although the students were sitting two to a computer). All they had to do in the test was look up the answers in the on-line manual!

How would you go about finding the reason for this failure rate? I would love to hear any suggestions. I am fluent with the reference material, so I can't really objectively judge how difficult the questions were. Should I ask the students why they think they failed? This is a real shock to me, as I thought I was the best trainer since sliced bread (don't we all?).

Any ideas on how I can get to the bottom of this while still respecting myself (and keeping the students' respect) would be much appreciated. I do understand it is possible that I just didn't give them enough practice with the on-line manuals during the face-to-face training (perhaps that needs its own separate session), and also that it may just be a sign of the effort the students are, or aren't, making.

For the record, one not-so-experienced student who passed the test said it was tight, but they got there in the end. Maybe I should be talking to them first.

This is horrible for me. I was hoping to start freelancing next year and this has knocked my confidence right back...

4 Replies
Steve Flowers

Hey Steve - 

I wouldn't take this as a reflection on your instruction. There may be something going on with the test. I'd start with the test and work backwards from there. You may want to investigate the test to see whether any questions were ambiguous, whether each correct choice was valid, and whether any other answers "could have been right" under certain circumstances.

We have used 100% mastery tests in the past. Unfortunately, people make mistakes even when they know the right choice. Providing some grace may help to resolve most of the problem with folks failing. 

After running some test item analysis, you might try either boosting the number of questions and lowering the cut score, or providing some remediation feedback or guidance along with at least one additional try.

Bob S

Steve M,

First... don't lose confidence! The instruction may have been just fine.

As per Steve F's thoughts, I would look hard at the testing side. A couple of thoughts....

  • Consider starting the test off with a "practice" question or two. This will remind the learners what is expected and let them practice what they learned before being tested on it.
  • While 30 mins for 5 Qs sounds fine, remember some folks have "test anxiety" and their performance can tank when put under the artificial pressure of a hard time stop. Is the 30 mins a regulatory requirement? If not, consider removing it unless it serves some clearly defined purpose.
  • 100% mastery is always the goal, but I don't know many perfect people. Most folks I know make silly mistakes, misread a question, get questions confused with others, etc. Would answering 4 of 5 correctly (80%, or a "B" in college) still show mastery? If so, consider setting that as your standard.
  • Finally (but you should start here!), look at the quiz and the answers you've gotten so far and try to analyze where they are struggling. Is there a question or two that most people miss? Do they run out of time and not finish all of them? Do they answer correctly but for a slightly different question than you asked? Is there a common "wrong" answer that many of them give? This kind of analysis can prove extremely helpful in terms of ensuring your instruction, practice, and assessments are clear and aligned.
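If it helps, that last bullet can be automated with a few lines of code once you can export each student's chosen answers from the LMS. This is just a sketch with made-up response data and question/answer labels, not your actual test:

```python
from collections import Counter

# Hypothetical export: one dict per student attempt,
# mapping question -> chosen answer (None = unanswered / ran out of time).
responses = [
    {"Q1": "A", "Q2": "C", "Q3": "B", "Q4": "D", "Q5": None},
    {"Q1": "A", "Q2": "C", "Q3": "B", "Q4": "A", "Q5": "B"},
    {"Q1": "B", "Q2": "C", "Q3": "B", "Q4": "A", "Q5": None},
]
answer_key = {"Q1": "A", "Q2": "B", "Q3": "B", "Q4": "A", "Q5": "B"}

for q, correct in answer_key.items():
    chosen = [r[q] for r in responses]
    misses = [c for c in chosen if c != correct]
    # Per-question miss rate plus the most common wrong answers:
    # a question everyone misses with the SAME wrong answer is a
    # strong hint of ambiguity or a bad key.
    print(q, f"missed {len(misses)}/{len(chosen)}",
          "common wrong answers:", Counter(misses).most_common())
```

Even with a class of 8, a pattern like "everyone picked C on Q2 when the key says B" jumps out immediately and points at the question rather than the students.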

Hope this helps and let us know where you land!

Bob

Steve McAneney

Thanks for the responses.

On review, one question was still ambiguous (I had corrected it, but the fix wasn't updated in our LMS for already-assigned tests). Everyone was getting that one wrong.

Adding a 'practice' question or two would definitely have been a good idea, as this is the first time students have had this sort of delayed test.

Including a contingency factor in pass rates is definitely a must. I will do that from now on.

Golly, that was a learning experience! For me! I'm glad this particular class was only for 8 students.

Thanks again for your help.