The Magic of Develop and Implement
Welcome back to our review of the “basics.” We’ve made our way through Assessment and Analysis, as well as the Design plan. This Last Word column is devoted to Develop and Implement.
A quick thought on the sequence of events as it relates to project planning methodologies such as Agile and SAM (the Successive Approximation Model): those methods compress or overlap the design, develop, and implement phases. Overlaying the ADDIE phases ensures that nothing is missed, that solid plans for execution are built, and that the team is in agreement before the develop phase begins.
The develop phase is where the plans begin to take shape and come to life (it is a bit like magic!).
Once stakeholders and decision-makers sign off on the full design plan, including the evaluation strategy, the post-learning support plan, and the plan for future revisions, it is time to build the learning content according to what was decided. The people involved will engage with the process, the content, the tools, the subject matter experts, and the schedule, and they will manage the cycles of feedback, refinement, and revision. The focus of meetings and discussions shifts from what might be to what soon will be.
Almost seamlessly, develop moves into implement (and back again). Think of this phase as implement to iterate; iterate to implement. What does that mean? The work should never be considered “done.” Ever. Move into every project with this mindset built into the overall plan (or at least firmly placed in the back of your mind). Heck, go nuts! Put the revision plan on the Learning and Development (L&D) team’s calendar right now.
Never be in a hurry to deploy the learning content and move on to the next project. Instead, test it with a small group before the full rollout to the “masses.” Regardless of the type of learning content, the topic, or the learner group’s level of experience, it is wise to recruit a group of beta testers who can provide valuable qualitative feedback. For example, ask them to:
- Rate the level of difficulty accessing the learning content.
- Rate the ease of navigation of the registration process, the course itself, and any assessments.
- Rate the method used to deliver the learning content.
- Rate the level to which the learning content is directly applicable to the work.
- Rate how the learning content aligns to its intended purpose and expected outcomes.
Once you’ve gathered the qualitative feedback from the beta group, revise the learning content accordingly. Depending on the beta group’s experience, it may be necessary to ask them to participate in another round of testing before assuming everything is final and good to go. After all, qualitative feedback is magical, too!
Dawn J. Mahoney, CPTD, owns Learning in The White Space LLC, a freelance talent development (“training”) and instructional design consultancy. She is passionate about developing people through better training, better instructional design, and better dialog. Mahoney asks the tough questions to ensure the training content is relevant to the work and performance expectations. She does this work because she loves to see the moment when the learning “dawns” on her learners. If you need help, get in touch with her at: firstname.lastname@example.org.