Writing Effective Multiple-Choice Questions: Lessons from NASBA Standards


Which of the following best describes the concept of quantum superposition?

  A. Particles exist only in one state at a time.
  B. Particles remain stationary until observed.
  C. Particles rapidly switch between two states.
  D. Particles can exist in multiple states simultaneously until measurement collapses the system into one definite state.

If you selected option D, you are correct. But why? Perhaps you have some knowledge of quantum physics—or maybe you simply followed a common test-taking strategy: choosing the longest answer. This tendency is a documented response bias. Analyzing 35 multiple-choice test item files, Mentzer (1982) found that in 17 of them the longest option was disproportionately likely to be the correct answer (Mentzer, T. L. (1982). Response biases in multiple-choice test item files. Educational and Psychological Measurement, 42(2), 437–448. https://doi.org/10.1177/001316448204200206). This suggests that when uncertain, test-takers often opt for the longest response. While that may be a useful hack for students, it presents a challenge for question writers: those who craft assessment items must be mindful of such biases to avoid inadvertently guiding learners to the correct answer.
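Mentzer's finding suggests a simple audit a question writer can run over a question bank: how often does the key happen to be the longest option? Here is a minimal Python sketch; the `longest_option_bias` helper and the sample bank are my own illustration, not taken from the study or from NASBA.

```python
# A minimal sketch (with hypothetical data) of the kind of audit
# Mentzer's finding invites: how often is the longest option the key?

def longest_option_bias(questions):
    """Return the fraction of questions whose longest option is the key.

    Each question is a dict with 'options' (a list of answer strings)
    and 'answer' (the index of the correct option).
    """
    hits = 0
    for q in questions:
        lengths = [len(opt) for opt in q["options"]]
        if lengths.index(max(lengths)) == q["answer"]:
            hits += 1
    return hits / len(questions)

# A hypothetical three-item bank, for illustration only.
bank = [
    {"options": ["Short.", "Also short.",
                 "A much longer, overly specific correct answer."],
     "answer": 2},
    {"options": ["Plausible distractor.", "The key.",
                 "Another distractor."],
     "answer": 1},
    {"options": ["Wrong.", "A long and overly detailed correct option.",
                 "Wrong too."],
     "answer": 1},
]

# With three options per item, chance alone would put the key at the
# longest position about a third of the time; a much higher rate
# across a large bank signals length bias.
print(f"Longest option is correct in {longest_option_bias(bank):.0%} of items")
```

Running such a check over a full item bank, rather than eyeballing individual questions, is what makes the bias visible.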

Lessons from NASBA: Crafting Fair and Effective Questions

My experience as an instructional designer at a professional services firm exposed me to the rigorous standards set by the National Association of State Boards of Accountancy (NASBA). These standards ensure that Continuing Professional Education (CPE)-eligible training programs—whether eLearning, virtual instructor-led training (VILT), or traditional instructor-led training (ILT)—adhere to strict guidelines that enhance learning effectiveness. Together they provide a comprehensive framework for the development, presentation, measurement, and reporting of CPE programs, including detailed requirements for constructing multiple-choice questions.

Let’s examine key elements of question design and how NASBA standards apply.

The Question Stem

The stem should be clear and neutral, free from hints or unintentional cues. Avoid incomplete sentences, as they may provide subtle indicators of the correct answer.

Answer Choices

Correct Option: As noted above, the correct answer should be comparable in length and specificity to the distractors. Overly precise wording or definitive phrasing can make it stand out.

Distractors: Incorrect answers should be plausible—neither obviously incorrect nor too easy to eliminate. As a best practice, a multiple-choice question should include at least two distractors, with research supporting a three-option model for optimal assessment (Apicella, A. (n.d.). Comparison of 3-option and 4-option multiple choice questions and related issues and practices. Credentialing Insights. https://www.credentialinginsights.org/Article/comparison-of-3-option-and-4-option-multiple-choice-questions-and-related-issues-and-practices).

Avoiding “All of the Above”: While tempting, this option often fails to challenge learners. If test-takers identify more than one correct choice, they may default to selecting “All of the Above” without fully engaging with the question.
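These answer-choice rules lend themselves to a simple automated check. Below is a minimal lint sketch; it is my own illustration rather than an official NASBA tool, and the `lint_question` name and the 1.5 length threshold are arbitrary choices. It flags too few distractors, an "All of the above" option, and a key that stands out by length.

```python
# A lint sketch for one multiple-choice item, flagging three issues
# discussed above. Names and the threshold are illustrative, not NASBA's.

def lint_question(options, answer, length_ratio=1.5):
    """Return a list of warnings for one item.

    `options` is the list of answer strings; `answer` is the index of
    the correct option; `length_ratio` is an arbitrary cutoff for
    flagging a key that visibly outgrows its distractors.
    """
    warnings = []
    if len(options) < 3:  # the key plus at least two distractors
        warnings.append("fewer than two distractors")
    if any("all of the above" in opt.lower() for opt in options):
        warnings.append('contains an "All of the above" option')
    distractors = [opt for i, opt in enumerate(options) if i != answer]
    if distractors:
        avg = sum(len(d) for d in distractors) / len(distractors)
        if len(options[answer]) > length_ratio * avg:
            warnings.append("correct option is much longer than the distractors")
    return warnings

# The quantum question from the opening trips the length check.
print(lint_question(
    ["Particles exist only in one state at a time.",
     "Particles remain stationary until observed.",
     "Particles rapidly switch between two states.",
     "Particles can exist in multiple states simultaneously until "
     "measurement collapses the system into one definite state."],
    answer=3,
))
```

A check like this cannot judge whether distractors are plausible—that still takes a subject-matter reviewer—but it catches the mechanical giveaways cheaply.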

Providing Meaningful Feedback

For Correct Responses: Feedback should reinforce learning with a rationale, not just a simple “Great job!” Strong feedback strengthens comprehension and challenges learners.

This guideline pushed me to rethink many overly simple questions, refining them into items that challenge learners and align with deeper learning objectives.

For Incorrect Responses: Each distractor should have distinct reasoning explaining why it’s incorrect. Vague or repetitive feedback undermines the effectiveness of the question. Most importantly, incorrect feedback should clarify misconceptions without hinting at the correct answer.

Practical Example of NASBA-Compliant Question Design

To illustrate these principles, here’s an example of a well-structured multiple-choice question that adheres to NASBA’s standards:

Question:

Which of the following best describes situated cognition in instructional design?

  A. Learning occurs in isolated environments and is later applied to real-world tasks.
  B. Knowledge is constructed through interaction with the environment and social context.
  C. Instruction focuses solely on memorization without contextual applications.
  D. Learners acquire skills through passive observation rather than active engagement.

Answer Explanation and Feedback:

Correct Answer: B – Knowledge is constructed through interaction with the environment and social context. Feedback: Situated cognition emphasizes that learning is deeply connected to the environment and social interactions in which it occurs. Rather than treating knowledge as separate from real-world experiences, this approach fosters practical application and deeper understanding.

Incorrect Answers and Feedback:

A – Learning occurs in isolated environments and is later applied to real-world tasks. Feedback: Learning is most effective when integrated into authentic contexts. Situated cognition suggests that knowledge should not be detached from its application but developed through meaningful engagement within relevant settings.

C – Instruction focuses solely on memorization without contextual applications. Feedback: Memorization alone does not support deep learning. Situated cognition rejects rote learning in favor of strategies that encourage meaningful, applied understanding.

D – Learners acquire skills through passive observation rather than active engagement. Feedback: Effective learning happens through participation and interaction. Situated cognition supports active engagement rather than passive observation to ensure knowledge retention and practical application.

Honing the Craft of Question Writing

Adhering to these principles helped me refine my instructional design skills. The years I spent designing NASBA-compliant training strengthened my ability to craft effective assessments—so much so that I now evaluate every multiple-choice question I encounter. When I find a well-structured question that meets these criteria, I silently offer a “NASBA nod” to the writer.

By following these best practices, we can move beyond simple assessments and create meaningful learning experiences that challenge and engage learners.

Tanaya Pandeya
Tanaya Pandeya is a seasoned instructional designer with more than 20 years of experience crafting effective, learner-centered solutions. She is skilled in adult learning, blended learning, and instructional design strategy. Connect with her via e-mail: tanayapandeya@gmail.com