Among the many things that can be placed at the feet of COVID-19, the explosion of remote working and virtual learning is near the top. As lockdowns forced people to stay home, workers began doing their jobs remotely, and the switch to virtual corporate training logically followed. The difference is that while the switch to remote work was abrupt and challenging, the shift to virtual training had been in progress since well before the pandemic. The pivot to virtual training is being made for a multitude of reasons, including:
- Lower cost
- Ease of scheduling
- Elimination of geographic challenges
With the right virtual training technology, instructors can use multiple techniques to engage learners. Engagement is crucial if companies want to ensure learning effectiveness. Engaged learners are actively involved, experience a higher learning level, have a stronger retention rate, and overall achieve greater performance and productivity levels.
One of the key components of any training is understanding the results from the learner’s perspective. Measuring learning success is generally done through assessments, quizzes, and, in some cases, role-playing or group project work.
Many companies don’t assess learners’ knowledge at any point during or after training. But knowing what level of learning occurred provides organizations and Learning and Development (L&D) professionals with critical information that goes beyond how an individual did on a quiz. It also informs improvements to the instructional design of the training itself.
Virtual In-Class Assessments
In-class assessments are a critical step in any training session and are used for several different reasons. Assessments provide confirmation that learners are engaged and understand the information, so they can apply it in the real-world workplace. In-class assessments offer a quick, real-time confirmation that the instruction is successful because the assessment results are communicated immediately to the instructor.
Some virtual learning solutions offer in-class assessments, so you can address any misunderstandings in real time. Virtual platforms also help keep training session participants engaged in their learning: learners are more likely to pay attention if they know questions will be asked during a session. Given the prevalence of online meetings and most people’s limited attention spans, adding an assessment every 5 to 10 minutes helps keep learners engaged.
Such engagement leads to higher learning results overall. The key to learning and information retention is active engagement: during active learning, students shift from being passive recipients of information to active participants working with new material.
Evaluating Learning Results
Measuring learning results is as critical as engaging your learners during training. Assessments include quizzes, surveys, polls, and evaluations that help you understand what people are gaining from the time spent in training.
As reported by Forbes at the beginning of the pandemic, the change to virtual learning was happening at a “plodding pace” until COVID-19 forced a change. “The expense and time of bringing together groups of employees for in-person training are exorbitant in comparison to high-quality online versions, and, frankly, the poor quality and unmeasurable outcomes of in-person corporate training have always been complaints. These complaints are greatly amplified now in comparison to the virtual alternative,” the article notes.
A big issue among corporate audiences is that it can be challenging to measure training effectiveness. Indeed, one popular model companies often use to measure training is nearly 60 years old, and some organizations never make it past the model’s second level.
During the 1950s, University of Wisconsin Professor Donald Kirkpatrick devised this model for measuring training effectiveness. It is split into four levels, each of which assesses the efficacy of a training session after completion:
- Level 1 (Reaction) focuses on participants’ engagement in the training: the questions asked of trainees to assess whether they found the training valuable.
- Level 2 (Learning) is a knowledge assessment that measures what trainees did or did not know before and after the training.
- Level 3 (Behavior) focuses on how trainees’ behavior has changed in response to the training and is preferably validated by someone other than the learner.
- Level 4 (Results) evaluates training outcomes as they affect the business overall.
As explained by MindTools, Levels 3 and 4—which arguably yield the most useful information for the business—can be time-consuming, resource-intensive, and expensive to implement if the L&D function does not work with business unit leaders to set specific business goals for the training and then evaluate the results against those goals. As such, MindTools concludes, “the model may not be practical for all organizations, especially if you don’t have a dedicated Training or HR department to conduct the analysis.”
Learning, behavior, engagement, and performance analytics are critical to the L&D team and to the organization as a whole, which need to know the value of their training programs and how training activities relate to employee performance.
Jigsaw Interactive’s virtual learning technology can help you evaluate a training’s effectiveness by providing detailed analytics on participants’ learning, behavior, performance, and engagement. For example, Jigsaw collects more than 450 data points on each person in a session, including levels and areas of engagement: where each learner clicks, how much time they spend on each activity, how often they review information, where they focus, and what type of learner they are. Jigsaw also collects and reports on all levels of activity, poll and assessment results, attendance time, and more. This information has value throughout the organization, from talent development to instructional design.
It’s all about the learning, and learning is all about engagement. Measuring training effectiveness in an objective way provides tremendous value to organizations. After all, you can’t manage what you don’t measure.