Writing good quiz/test question answers/distractors

Sep 08, 2011

Feel free to ask questions about these, or add any other questions/comments about writing quiz/test questions.

Writing good MCQ answers/distractors:

1. NEVER include silly distractors.

2. You can vary the number of distractors. Three to five distractors is ideal. A smaller number of answers/distractors increases the probability that a guess will be correct, however.

3. ALL distractors must be plausible. These are the best types of plausible-but-incorrect distractors:

    a. Common errors and commonly held myths or misconceptions (for those with less knowledge or skill)

    b. Statements that are true, but do not answer this question

    c. Content that is paraphrased incorrectly

4. If answers/distractors include best and not-as-good alternatives (“Select the best answer…”), make sure that there is an unambiguously correct answer or answers. Provide enough detail to differentiate best from not-as-good.

5. Keep answers/distractors about the same length.

6. Avoid answers/distractors that combine distractors (“b and c”).

7. Avoid using “all of the above” and “none of the above.”

8. Vary the placement of the correct answer (see the sketch after this list). The most common placement of the correct answer is c, and test-wise learners know this.
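To make number 8 concrete, here is a minimal Python sketch of shuffling each question's options so the correct answer lands in a different position each time the quiz is assembled. The question structure below is invented for illustration; it is not the format of any particular quiz tool.

```python
import random

# Hypothetical question structure (not any quiz tool's real format):
# each question stores its options and the index of the correct one.
question = {
    "stem": "Which kind of distractor works best?",
    "options": [
        "A silly, obviously wrong statement",
        "A common misconception held by novices",   # the correct answer
        "A statement that repeats the stem",
        "A statement unrelated to the topic",
    ],
    "correct": 1,
}

def shuffle_options(q):
    """Return a copy of the question with options shuffled and the correct index updated."""
    order = list(range(len(q["options"])))
    random.shuffle(order)
    return {
        **q,
        "options": [q["options"][i] for i in order],
        "correct": order.index(q["correct"]),
    }

shuffled = shuffle_options(question)
print(shuffled["options"][shuffled["correct"]])  # still the same correct answer text
```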
Patti Shank

I think what Catherine is saying is that if you can't come up with an answer from the stem, the stem probably isn't worded clearly enough. Is that your message, Catherine?

Of course, if you are asking learners to select the "best" answer, you must make sure all of the answers/distractors are extremely clear as well, because you are asking them to make fine distinctions.

Catherine Conley

Yes, that's what I wanted to pass along. Improperly worded question stems can lead to unnecessary confusion and frustration. A properly worded stem should let the test taker come up with possible answers before even looking at the answer choices.

If the test taker has to look at the answer choices to figure out what the question is asking, then something is probably wrong with the stem. The example given was the question stem "Lightning is?". This stem does not give the test taker enough information to know what is really being asked and tested.

Ramesh P

Hi,

I need a clarification: to set up a quiz, we need to give the questions and answers in Quiz Maker. If I have 50 questions in an Excel sheet along with the answers, is it possible to import all of them using the import option? Or is there a specific quiz template available for this?

If so, and the requestor provides the details in that particular template, can I add those to my content just by clicking the import option?

Kindly clarify this for me. I am really getting stressed creating quizzes continuously.

Your help will be much appreciated.

Thanks & Regards,

Ramesh.P
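Quiz Maker's actual import template isn't shown in this thread, so the following is only a rough sketch of the batch-conversion idea being asked about: the spreadsheet columns and the CSV layout are assumptions, and the real template from Quiz Maker's documentation should be used instead. The sketch reads one question per row from an Excel sheet and writes a flat CSV.

```python
import csv
from openpyxl import load_workbook  # third-party package for reading .xlsx files

# Assumed spreadsheet layout (one question per row):
# stem | correct answer | distractor 1 | distractor 2 | distractor 3
wb = load_workbook("questions.xlsx", read_only=True)
ws = wb.active

with open("quiz_import.csv", "w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out)
    writer.writerow(["stem", "correct", "distractor_1", "distractor_2", "distractor_3"])
    for row in ws.iter_rows(min_row=2, values_only=True):  # min_row=2 skips the header
        if row and row[0]:  # skip blank rows
            writer.writerow(row[:5])
```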

Bryan Tregunna

Here are some other best practices (I apologise if they have been mentioned previously):

· make sure you tell the trainee whether there is one correct answer or more than one, but not necessarily how many are correct.

· if you have to qualify an answer, qualify them all, otherwise the qualified option is likely to be the correct one.

· avoid options that intersect in a way that points to the correct option (e.g., When do the French celebrate Bastille Day? A: 4 June, B: 4 July, C: 14 July, D: 14 August).

I dislike the "All of the above" option for two reasons: often, when the option appears, it is the correct one; and if it is a multiple choice (one correct answer) question and the learner knows that option A is correct, he/she may select this without reading the other options. If more than one answer is correct, a Multiple Response question is better to use. By the way, I have seen "All of the above" used in a Multiple Response question - now that really confuses the learner, especially if there are double negatives!

Many years ago I suggested changing the wording of a question as many learners were misunderstanding/misinterpreting the requirements and the response was to leave it in as "it really catches them out!"  I hope the learning industry has moved on since those days.

Daniel Brigham

Hi, everyone:

Here's an idea from William Horton's E-Learning by Design. Start your development process with the tests.

So often designers develop all the content and THEN build the tests. Tests (or whatever you wanna call them) are more important than that. At the very least, e-learning designers should be thinking of test questions and what tests will include as they are developing topics, learning objectives, and so on.

And here's another thought: most designers make tests too easy. Most teachers and professors of course do the same thing, and for similar reasons. --Daniel

Bryan Tregunna

Yes, Daniel, it is often a case of "I've told them something, so I'll ask a question about it." You quote William Horton, and he is right. You start with a set of learning objectives, and this will give you the end test. You may not be able to complete all the questions, as you may not know enough about the subject matter to come up with plausible distractors, but if the objective is to identify the difference between a splip and a splop, then your test question is exactly that.

This is why the first part of ADDIE is so important; getting the learning objectives right is key to the success of a learning event. All too often, the learning designer has had no involvement in the first stage and has been given a set of less-than-satisfactory objectives with a tight time scale in which to weave the magic. Ideally, we should challenge the objectives, and I would do so when appropriate, but it is often more pragmatic to do what you have been asked: provide clients with what they want - even if it is not what they need.

Although evaluation is the last letter of ADDIE, evaluation begins at the beginning and pervades throughout the process. Perhaps it should be AEDEDEIE (but that would be less memorable).

James Brown

I'm going to kick the hornet's nest and run. I have never been fond of pure Likert exams. You have a 25% or 50% chance of getting a question right, and they really are not a truly accurate measure of a learner's knowledge of the subject matter. I am not making this statement out of opinion but based on various studies that I have read on this subject. That is why you need to have a very specific goal in mind when you design your training and develop a broad spectrum of multiple guess, fill in the blank, drag and drop, and essay questions to determine if the learner truly possesses the knowledge after your course is complete. Based on personal observation, most e-learning courses that I have seen tend to revolve around psychomotor skills, which really need to be tested by both Likert exams and hands-on demonstration, which in e-learning may or may not be a rather difficult thing to accomplish, depending on the subject matter.

Bryan Tregunna

I agree. eLearning should not be seen as a panacea. If you say that most eLearning courses you have seen revolve around psychomotor skills then I am horrified. eLearning is ideal for the knowledge aspect of a learning need. Technology is getting better all the time but I would expect a blended event where the learning is other than pure knowledge. If you want someone to build a wall, then you need them to practise building walls and assess them building walls.

I suggest that pure multiple choice tests are done for the ease of the examiner, rather than truly testing the individual. But multiple choice questions can be used as an assessment. If the question is crafted skilfully, has plausible distractors and follows Patti's best practice list then it can be a true test. A learner would have to be a very lucky guesser to meet the standard given 10 multiple choice questions. The lottery is a kind of multiple choice question, and I'm still working for a living!

However, going back to what I said in an earlier post, the learning objective gives you the assessment. Objectives such as identify, select, and differentiate can be assessed with multiple choice questions, whereas objectives such as explain and describe cannot (at least, they should not).
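To put a rough number on the "lucky guesser" point above (the ten questions, four options, and pass mark of eight below are assumptions for illustration, not figures from the thread), the chance of passing by pure guessing works out like this:

```python
from math import comb

def p_pass_by_guessing(n_questions=10, n_options=4, pass_mark=8):
    """Probability of getting at least `pass_mark` of `n_questions` right by random guessing."""
    p = 1 / n_options
    return sum(
        comb(n_questions, k) * p**k * (1 - p) ** (n_questions - k)
        for k in range(pass_mark, n_questions + 1)
    )

print(f"{p_pass_by_guessing():.4%}")  # about 0.04% for 8 out of 10 with four options
```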

