Reimagining Content Feedback

Is your approach to content feedback CRISP? Or does it sometimes crumble?

So you’ve just spent weeks designing a fantastic piece of training content. You’ve conducted a thorough analysis of the training needs, crafted impactful learning objectives, drafted a detailed storyboard, and poured your heart and soul into building a course that will captivate your audience. All that’s left to do is ask a few willing colleagues to review your hard work and confirm that you’ve produced a world-class deliverable, right?

Well, we all know that’s not exactly how it usually goes. Remember that time you eagerly submitted your masterpiece for feedback, only to receive that soul-destroying e-mail with a wall of bullet points highlighting where your content could be improved? You know this feedback is helpful. You understand it was sent with positive intent. You’re fully aware that you asked for this feedback in the first place. So why does it still feel like criticism?

That’s because we’re human, and traditional content feedback processes often fail to consider how humans receive feedback. When delivered poorly, feedback can be triggering, overwhelming, confusing, and discouraging for individuals and teams. Paired with other tricky environmental factors, it can amplify challenging team dynamics, expose gaps in knowledge, or even call into question the overall quality of the content. But the good news is that we now have a better way to manage this.

How? By writing feedback for content in a way that is Constructive, Relevant, Impact-led, Streamlined, and Prioritized—in other words, ensuring our feedback approach is…CRISP.

CRISP Content Feedback Protocol

The CRISP content feedback protocol brings objective structure and clarity to feedback, ensuring it remains positive, productive, and time-efficient. The protocol ranks feedback points into four tiers (graded A to D) based on their overall impact on the learning objectives and learner experience:

A – Essential Feedback (Impact on Learner 80-100 percent):

  • These points are non-negotiable. They are crucial for achieving learning objectives and must be addressed.

Example: Correcting factual inaccuracies or errors. 

B – Important Feedback (Impact on Learner 60-80 percent):

  • Significant but not critical. Addressing these points can substantially improve the content.

Example: Recommendations based on past experiences and backed by data.

C – Suggested Feedback (Impact on Learner 20-60 percent):

  • These are optional suggestions that can enhance the content if time and resources allow.

Example: Proposing alternative learning activities or interaction points.

D – Nice-to-Have Feedback (Impact on Learner 0-20 percent):

  • Minor tweaks with minimal impact. These recommendations should generally be avoided to prevent unnecessary workload and stress.

Example: Personal preferences on color choices or graphic placements.

By using this ranking system, we ensure that feedback is actioned according to priority. This structure also enables the designer to learn from the rationale provided. High-priority feedback points are addressed first, ensuring the most critical improvements are made, while less impactful suggestions are considered as time and resources allow.

Balancing Suggestive and Directive Styles

Once we have ranked our feedback points appropriately, we can then consider our writing style, finding the right balance between suggestive and directive feedback. The distinction below should enable a conversation with team members to identify the style that best suits their preferences. Each style has its own benefits and drawbacks:

Suggestive Feedback:

Encourages creativity and autonomy, allowing individuals to explore and innovate.

  • Pros: Fosters independence, promotes critical thinking, maintains positive relationships.
  • Cons: Can be vague, requires self-direction, may lead to missed opportunities.

Directive Feedback:

Provides clear and actionable steps, ensuring quick and precise improvements.

  • Pros: Ensures clarity, saves time, maintains standards, allows for immediate implementation.
  • Cons: Limits autonomy, potential for resistance, fewer learning opportunities, may strain relationships.

Transforming the Feedback Process

The CRISP content feedback protocol transforms how we give and receive feedback. It brings structure, objectivity, and prioritization while enhancing the quality of our training content and fostering a more collaborative and efficient working environment. We’ve found it to be highly valuable when applied to our global curriculum projects at Indeed. So the next time you are reviewing someone’s training content or requesting feedback on your own work, consider applying the CRISP protocol and experience a stress-free content design process.

Jugal Vansia
Jugal Vansia is the manager for Global Enablement Programs at Indeed, leading a global team and setting the priorities for the global curriculum strategy. He is dedicated to fostering a culture of high performance and continuous learning, while ensuring that teams at Indeed have the optimum environment where they can thrive, and ultimately help people get jobs. An accomplished Learning & Development leader, Vansia is passionate about empowering individuals and organizations to reach their full potential. With a keen interest in the fields of artificial intelligence and psychometrics, Vansia brings a unique perspective to his work. He has a wealth of experience in designing and delivering impactful global training programs, with a track record in driving learning effectiveness and organizational success.