Virtually There: Learner Engagement—Why Is It Important?

And how can we measure the effect of engagement on instructional outcomes?

In my previous article, learner engagement was defined in terms of three factors:

  • An emotional response to the training—How does the learner “feel” about the content and its presentation/treatment?
  • An intellectual response to the training—Does the instructional experience require and involve the learner’s intellect?
  • An environmental response to the learning—Do the learners interact with the learning environment and is the environment changed because of the training?

With this as a definition of learner engagement, I want to explore why engagement is important in the learning environment. As you read this, your initial thoughts likely are drawn from your own instructional experience (“Of course, engaged learners are better!”), but how can we measure the effect of engagement on instructional outcomes?

Outcome #1—Learner Performance in What We Seek to Teach

In the past, instructional approaches have been measured primarily by how learners did on the test. While this may be an appropriate start, this approach ignores personal biases relating to the learning experience in favor of data (“a 93 percent on the test is a 93 percent on the test”). Based on the definition of learner engagement established above, this measurement alone isn’t enough. To assess learner outcomes properly, you need to take all aspects of learner engagement into consideration, including evaluating the effect of engagement on learner behaviors and attitudes after the instructional experience. Only then will you be able to see big differences between two participants, as in this example of a training program teaching a software skill:

Student 1 loves her job and its tasks (Emotion Factor High), but finds the content fairly slow and unchallenging despite its being critical to her job (Intellect Factor Low). As a consequence, she doesn’t participate much in the Web-based seminar, ignoring other students’ questions and the demonstrations and activities using the software (Environment Factor Low). Test Score: 93 percent.

Student 2 is so-so about her job (Emotion Factor Low), but the content is challenging and important to her job (Intellect Factor High). The instructional treatment is directly applicable to managing her personal finances, so she participates and asks questions about how the content applies to the task in the software (Environment Factor High). Test Score: 89 percent.

Judging by test scores alone, it would be easy to conclude that Student 1 performs better than Student 2. By doing that, however, we lose insight into learner outcomes down the road, and we may be measuring the wrong thing.

Which student in the example is more likely to go on to perform the actual task more effectively? The key to answering the question of learner performance with respect to engagement is to define measure(s) of effect, establish rubrics based on them that are situated in real-world contexts, and provide instruction in different learning environments. If the right environment with situated content is presented to the learner, the learner’s engagement goes up, learner self-efficacy improves, and achievement (even as measured by summative testing alone) improves.

Outcome #2—Learning How to Learn in a Different Environment

Not much research has been conducted with a diverse learner audience to understand the effects of learning environments on the learner’s attitudes, behaviors, perceptions, and performance while learning.

In the examples above, the learner who’s actually trying to apply the content in a real-world context in the classroom enters the world ready to perform the task…but what else does she leave the classroom with?

Applying the above definition of learner engagement, the learner who actively participates (and is, therefore, more “engaged”) experiences changes in all three factors, and completes a particular learning experience with a sense of how to learn and act within that learning environment. This competency is a skill of learning in and of itself. Within any particular learning environment, learners develop methods to navigate their way through the learning experience by learning how to interact with the learning environment (“if I raise my hand, the instructor will call on me” or “if I click this question mark button, I’ll access a help function”). Learners refine their understanding of what works and what doesn’t within the learning environment to achieve the outcome they want.

The big take-away here is that the learner learns not only the subject matter being taught, but also how to interact in a particular learning environment to learn (hence the need for an Environment Factor in the construct). While perhaps not earth-shattering when most instruction is face-to-face, in the context of Modern Workplace learning, this is a big deal. Modern Workplace learning provides, quite literally, hundreds of potential environments in which to deliver instruction. If learners understand how to interact in a particular learning environment, their engagement will improve (Environment Factor goes up), and their learning will be enhanced. Additionally, if the learning environment is based on—or, even better, uses—a real-world context, the learner develops true competence by participating in the learning experience; the knowledge and skills developed in the instructional experience are directly transferable to the real-world task.

Measuring Learner Engagement—Next Steps

If learner engagement is defined around the three factors, how do we measure it?

The answer is: It depends. Here are a few relevant things we do know:

  • Learner engagement is dynamic within the learning experience.
  • Modern Workplace learning posits myriad learning environments.
  • Each learner comes to the instructional treatment with his or her own experience in a particular learning environment.
  • Post-delivery affective instruments are of limited utility in assessing Emotion Factor in the moment.
  • Intellectual response to subject matter can be evidenced by the learner’s behavioral change within the environment.

To measure learner engagement, we need to take both the learner AND the environment into consideration as units of analysis. This means that in every distinct learning environment, the methods and means of measuring learner engagement are different. That’s a challenge—we need to develop a set of measures for each learning environment and then develop an inventory of the learner’s ability to interact within that environment before instruction begins. Only then can we hope to measure (and, more importantly, predict) learner engagement with any degree of accuracy.

However, there is some good news: Each learning environment shares some similarities with others, so the development of a set of measures isn’t quite so daunting. In addition, we can develop proxies for the learner’s abilities within a particular learning environment based on prior learner experience. In fact, as the research underway provides more insight, you might imagine that the approach to optimizing learner engagement would follow much the same protocol as defining the elements of a blended instructional treatment—each instructional objective would establish one or more learning environments that provide the best instruction possible and define how to engage learners within that environment. Instructional design and delivery take on an added dimension in establishing and maintaining learner engagement throughout the experience.

In the coming years, educational technology now under development will provide new and robust methods for assessing and capturing the learner experience, yielding meaningful insight into learner engagement. Coupled with achievement data, these methods will let us develop models of the effects of learner engagement on performance and definitively answer that question. In the meantime, developing a robust model and environment-specific measures is a great first step toward determining the causal effects of learner engagement and its consequences for learning outcomes.

Charles (Chip) Dye is a senior executive with experience growing technology-based service companies in the e-learning industry. His primary functional expertise lies in enterprise learning, learner-centric universal design, situated cognition, training function automation, and learner community development and optimization. At InSync Training, LLC, his responsibilities include development of key personnel, advising on strategic implementations of e-learning systems, and professional development of technical staffs. Dye currently is engaged in doctoral research focusing on the development of skills and mastery with an eye toward return on investment, whether in public education, industry, or military preparedness training. Research areas include:

  • Assessment of unstructured and structured constructivist learning.
  • Virtualization technology and its impact on learner behaviors and expectations.
  • Structured modeling for return on investment in educational technologies.
  • Combinatorial uses of educational technologies to facilitate particularized learning trajectories/outcomes.

 
