L&D Best Practices: Strategies for Success (Nov./Dec. 2019)

Training magazine taps 2019 Training Top 125 winners and Top 10 Hall of Famers to provide their learning and development best practices in each issue. Here, we look at how ADP is revisiting instructional system design and how Mastercard’s Sales Excellence team is using certification to drive confidence and sales performance.

Revisiting Instructional System Design

By Jordan Birnbaum, Chief Behavioral Economist, ADP

Irrationality is having a moment. As the field of behavioral economics emerges from its classical parent, there is growing acknowledgment that much of human behavior is, in fact, irrational. Of course, “irrational” can have a pretty broad meaning, so for our purposes, we’ll define “irrational” as describing people who do NOT make decisions in their own best interests (whereas the classical approach assumes they do).

The good news is that irrational does not mean unpredictable. In fact, we can predict irrationality (in some cases remarkably well), and as such, account for it. But doing so requires that we take a new look at old models in order to find any potential for irrationality we might have missed along the way.

From a behavioral perspective, the world of industrial/organizational (I/O) psychology (the science behind Human Resources, or HR) is remarkably advanced. I/O psychologists have been practicing behavioral economics since before the term “behavioral economics” existed. Nonetheless, there remain countless opportunities to tweak I/O psychology approaches to better account for a more nuanced understanding of human irrationality in all facets of the workplace. And training is no exception.

As readers of this article are likely well aware, instructional system design describes a process for designing and developing training programs. There are many different versions, but the most widely adopted is ADDIE: Analysis, Design, Development, Implementation, and Evaluation. Let’s consider each stage to look for irrationality (under the surface) that might lead us to make some significant tweaks to our execution.

Analysis

Training design usually begins with an analysis of the situation, which requires two fundamental questions to be answered, each an opportunity for irrationality to enter the fray.

1. Why is the training necessary? When the need for training is tangible and objective (for example, new legislation requiring new procedures), the path forward is pretty clear, and the risk of irrationality is relatively low.

But when the need for training is harder to define (for example, low customer satisfaction), “solutions” can get irrational quickly. People are often more motivated to manage impressions (“it’s not my fault”) than to uncover actual causes, even though leaving those causes unaddressed ensures the problems will continue. Failing to account for this potential irrationality can lead to trainings that solve for nothing.

2. Are people ready for the training? Assessing readiness goes beyond the practical questions of whether people will have the time, resources, skills, and knowledge to complete the training. Arguably, the most important consideration is the question of employee motivation.

In theory, if a company is introducing a new training, rational people will be motivated to complete it. (After all, they want to know how to do their jobs moving forward.) But as we know, people are not rational. They may not only disagree with the need for the training but even resent it. And that ensures the training isn’t going to go well.

As such, it is crucial to get a sense of genuine employee motivation, and to address anything you find before beginning the training. This could be as simple as reframing the training in a way learners find more appealing, or as complex as addressing long-unresolved personnel issues.

But one last warning: Don’t assume the feedback you get from people is accurate. They might be telling you what they think you want to hear. You’ll likely have to dig a little deeper. (I know, people complicate everything.)

Design and Development

When designing and developing training content, it is tempting to create for a user who has unlimited capacities, unwavering commitment, and a singular focus. Why is that tempting? Because it makes the work significantly easier for you. But for the training to be effective, its design must be dictated by the limitations of the human condition and not by the speed or skill with which the training can be delivered.

Of course, this is much easier said than done. Time is the most precious resource, and training requires a lot of time, so training design often begins with a significant bias for speed over effectiveness. (Good rule of thumb: Any time a bias is involved, the potential for irrationality is extremely high.) Fittingly, the rational question is whether the training is actually worth it given the time it will require to be effective. But, to be fair, rationality is a lot easier when you don’t have deadlines and numbers to hit.

As for the training content itself, there are many traps along the assumed rationality path. A rational person doesn’t require a case to be made for why he or she should care about the training in the first place. A rational person doesn’t require the content to draw on past experience to be willing (or able) to process it. A rational person doesn’t require recognition, encouragement, and/or incentives to complete a learning goal. And one step above rationality, a superhero doesn’t require learning to happen in manageable chunks over time, nor need to practice.

But human beings? We need all that stuff. And to be most effective, it has to be baked into the training development from its inception. For behavior change to happen, the information has to be delivered in a way that accounts for human motivation, emotion, and limitations, aka irrationality.

Implementation

To analog or to digital? Is that the question?

After all, isn’t everyone excited about what digital learning has meant, and what virtual reality will mean, for training? The combination of scalability, efficiency, and effectiveness is intoxicating to those who are passionate about this. Who the heck wants to talk about analog? Actual human connection went out of style in the mid-1990s (the Internet), for goodness’ sake.

It’s easy to become so enamored with digital learning’s appeal and potential that we forget the inherent challenges of self-directed learning. In other words, it’s a person and a machine, and the person is in charge. And that becomes a lot more complex when accounting for human irrationality.

A rational person would learn as effectively whether sitting alone at a computer or in a room full of colleagues. A rational person would take breaks when losing focus, instead of continuing through the material without learning it. A rational person wouldn’t require extra explanation, nor raise idiosyncratic, unrelated questions. A rational person would be sufficiently self-disciplined to practice as needed following the training.

But…you know the “irrational” drill by now. All of these considerations must be accounted for as part of a holistic training implementation, and doing so usually requires some degree of human interaction. None of this is to diminish the incredible impact of new technology on training (and the best is yet to come). It is merely to suggest that digital training is not a panacea, in large part because human beings are irrational. And it takes irrational (people) to deal with irrational (other people).

Evaluation

Irrationality in the evaluation of training programs usually arises from two main sources: ability (making honest mistakes due to cognitive limitations) and intent (trying to accomplish something other than the stated intention, whether consciously or not).

As for ability, there is unfortunately a seemingly endless list of examples of how human error can undermine training effectiveness. Sometimes we use self-evaluations to measure impact, despite their insurmountable challenges with objectivity and accuracy. Sometimes we word survey items in suggestive ways that nudge users into providing specific feedback. Sometimes we measure learning instead of impact/behavior change. And the challenges of causation versus correlation hardly need to be articulated here.

Moving from ability to intent, the problem becomes even harder to address. To a completely rational person, there is only one intention in a training evaluation—to measure actual impact. To a real person, it’s a lot more complicated than that.

Let’s pretend you are in the business of providing leadership development. When you complete the training, would you rather:

a) Distribute a survey asking participants whether they found the training useful?

b) Return in six months to run a comprehensive quantitative analysis of key performance indicators (KPIs) to measure leadership performance?

The first approach will be easy to do and likely make you look really good. The second approach will be far more difficult to do and may prove your training didn’t affect a thing. Which approach do you think most practitioners employ?
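For those curious what option (b) could actually involve, here is a minimal, hypothetical sketch: compare how a leadership KPI moved for trained participants versus a comparison group over the same six months. The groups, field names, and numbers are invented for illustration and are not drawn from any real program.

```python
# A minimal sketch of option (b): a simple before/after KPI comparison for
# trained participants versus an untrained comparison group (a basic
# difference-in-differences). All field names and values are hypothetical.

from statistics import mean

def avg_change(records):
    """Average change in the KPI from before the training to six months after."""
    return mean(r["kpi_after"] - r["kpi_before"] for r in records)

def estimated_training_effect(trained, comparison):
    """Trained group's average KPI change minus the comparison group's change."""
    return avg_change(trained) - avg_change(comparison)

if __name__ == "__main__":
    trained = [
        {"kpi_before": 3.1, "kpi_after": 3.9},
        {"kpi_before": 2.8, "kpi_after": 3.4},
    ]
    comparison = [
        {"kpi_before": 3.0, "kpi_after": 3.2},
        {"kpi_before": 2.9, "kpi_after": 3.0},
    ]
    print(f"Estimated training effect: {estimated_training_effect(trained, comparison):+.2f}")
```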

Now imagine you are a person who took the training. A completely rational actor would provide honest feedback so a proper evaluation could be conducted. But how about a human actor? A more common reaction may be, “Wait, if I say the training didn’t work, will I look like I wasn’t trying? Or even worse, that I’m stupid? What might that do to my career?”

Now imagine you are the person who bought the training. It’s pretty easy to say you want an objective analysis of impact. But do you really? If the training proves to be ineffective, will you count it as a victory that you used your company’s money and your colleagues’ time to prove a product or service is a waste? (Hopefully, you did a small rollout first, in which case you do, in fact, have reason to celebrate it as a victory!)

When you consider the evaluative motivations of trainers, trainees, and purchasers, it’s easy to see how each of their individual actions could be considered rational from the perspective of self-interest and self-protection. Yet when you consider those actions from the perspective of the actual intention—measuring training effectiveness—they are the height of irrationality.

Conclusion

Much in this world is premised on the idea that human beings are rational actors capable of consistently making decisions in their own self-interest. Since that is proving to be an invalid assumption, we have a lot of work to do.

Fortunately, the field of I/O psychology is way ahead of the game, because human irrationality has been incorporated into best practices as the result of a century’s worth of data. But the main point this article seeks to make is that even I/O psychology standards (like training) can benefit from an “irrational” review. Hopefully, as our understanding of human behavior becomes more nuanced, our ability to design effective training will become even more refined. Until then, may the irrational path of training design lead you to wonderful results.

Mastercard’s Priceless Learning Experience

By Brian Gontarski, Director, and Paul Nisco, Manager, Learning & Development | Sales Excellence, Mastercard

Consider how rapidly technology solutions emerge, develop, are redesigned, and then replaced. Put yourself into the shoes of a sales professional in the rapidly evolving payments industry. New solutions in digital payments, cyber and intelligence, loyalty, consumer credit, and more are emerging at a breakneck pace. Maintaining your expertise with the latest solutions is an ongoing effort. Now imagine you are the Learning organization needing to provide an ongoing stream of offerings to keep pace with the latest industry information, product development, and value propositions—all on a global scale.

Mastercard is a technology company in the global payments industry that connects consumers, financial institutions, merchants, governments, digital partners, businesses, and other organizations worldwide, enabling them to use electronic forms of payment instead of cash and checks. With the depth and breadth of its offering across the globe, it is imperative that Mastercard’s sales force have the resources, tools, and knowledge to engage with customers and partners.

While the team at Mastercard had a robust product knowledge learning program, the results of an internal survey of sellers revealed they still wanted more. The Sales Excellence team, equipped with the analysis, spearheaded efforts across all regions to determine the underlying need. Ultimately, sellers wanted a more structured approach to product knowledge mastery.

Sales Excellence and Global Learning & Development responded with an internal sales certification program to elevate product knowledge and conversational fluency. It would need to engage learners directly and at scale across the enterprise. The program’s blended learning model provides employees with:

  • Just-in-time learning
  • Knowledge and skills practice
  • Proficiency assessment
  • Personalized coaching
  • Recognition

“Being certified is about our sales teams feeling confident and prepared to solve customer needs with Mastercard solutions. And it all starts with knowing our products,” says Mike Cyr, executive vice president, Sales Excellence, and program sponsor.

Don’t Start From Scratch

The team’s starting point was to leverage its best-in-class collateral and marketing support to serve as foundational building blocks and the main drivers for product knowledge. The first step was to ensure alignment between all teams. Core concepts in sales materials were expanded upon in live, interactive learning events. Perspectives from both the product and sales teams were shared to build understanding and dialogue about Mastercard’s solutions from an outside-in customer perspective. Attendees were encouraged to ask questions, and recordings of all events were made available and easily accessible on their devices.

Time for a Challenge

With the content and learning well aligned, it was time to challenge sellers’ current knowledge, identify gaps, and reinforce learning. Knowledge assessment quizzes were developed and launched to sellers across the enterprise. The quizzes spanned the full range of a given product’s story. For incorrect answers, detailed reinforcement was provided, including where to locate more information within the sales materials. These questions were reintroduced to sellers a few days later to gauge stickiness and proficiency. Over the last two years, Mastercard has seen a boost in product knowledge proficiency. For the first half of 2019, engagement in the certification program was 99 percent across the globe.
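As a concrete illustration of the re-quiz idea described above, here is a minimal, hypothetical sketch of how missed questions might be queued to reappear a few days later. The question IDs, the three-day delay, and the data structures are assumptions made for this example, not a description of Mastercard’s actual platform.

```python
# Hypothetical sketch of "reintroduce missed questions a few days later."
# Question IDs, the delay, and the storage format are illustrative only.

from datetime import date, timedelta

RETRY_DELAY_DAYS = 3  # assumed delay before a missed question reappears

def schedule_retries(quiz_results, taken_on):
    """
    Given one seller's quiz results ({question_id: was_correct}) and the quiz
    date, return {question_id: retry_date} for each missed question, so it can
    be reintroduced later to gauge stickiness.
    """
    retry_on = taken_on + timedelta(days=RETRY_DELAY_DAYS)
    return {qid: retry_on for qid, correct in quiz_results.items() if not correct}

def due_today(retry_schedule, today):
    """Return the question IDs scheduled to reappear on or before a given day."""
    return [qid for qid, when in retry_schedule.items() if when <= today]

if __name__ == "__main__":
    results = {"digital-payments-01": True, "loyalty-02": False, "cyber-03": False}
    schedule = schedule_retries(results, taken_on=date(2019, 6, 3))
    print(due_today(schedule, today=date(2019, 6, 6)))  # ['loyalty-02', 'cyber-03']
```

The exact delay matters less than the separation it creates between initial recall and retained knowledge, which is what “stickiness” is meant to capture.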

Lights, Camera, Action!—Video Challenge Accepted

Now that sellers had demonstrated their knowledge, it was time to provide a forum to both practice sales messaging and assess their ability to articulate a product’s story with confidence. Sales Excellence evaluated a number of concepts, including group role-plays, manager one-to-one coaching, videoconference coaching, and more. Scalability, consistency, and speed were paramount, but not achievable using traditional methods. The final answer was a technology-based sales enablement platform that delivered a series of customer questions to sellers and then challenged them to record a video response. The responses were then evaluated by sales coaches, who provided personalized feedback to the seller. Since 2017, team members have practiced sales messaging on myriad products more than 21,000 times and received personalized coaching on 2,000-plus videos submitted for review.

Sellers who pass all related challenges are granted a Product Knowledge Certification.

The Secret to Success—Data-Driven Change Management and Communications

The program model, which combines collateral, training, knowledge assessments, and video assessments, is a powerful integrated performance solution. At the same time, it ran the risk of ending up like other corporate-sponsored initiatives that eventually fade into obscurity. However, after two years, Product Knowledge Certification is running strong and continues to expand. Certification supports regionally specific products and 11 local languages, and is receiving greater investment.

The secret: Mastercard’s Sales Excellence team delivers ongoing management reporting on program engagement and performance to sales leaders globally. On a weekly basis, sales leaders at the region and division levels receive updates on the program. They subsequently connect with their local teams to inspire and reinforce the importance of getting certified. Participating in and completing the program has become a fundamental aspect of being a seller at Mastercard.
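To make the reporting idea concrete, here is a minimal, hypothetical sketch of a weekly engagement roll-up by region and division. The record layout and figures are invented for illustration and do not reflect Mastercard’s actual pipeline.

```python
# Hypothetical sketch of a weekly certification-engagement roll-up by
# region and division. All records and group names are invented.

from collections import defaultdict

def engagement_by_group(sellers):
    """Percent of sellers who completed certification, per (region, division)."""
    totals = defaultdict(lambda: [0, 0])  # (region, division) -> [certified, total]
    for s in sellers:
        key = (s["region"], s["division"])
        totals[key][0] += 1 if s["certified"] else 0
        totals[key][1] += 1
    return {key: 100 * done / total for key, (done, total) in totals.items()}

if __name__ == "__main__":
    sellers = [
        {"region": "NAM", "division": "East", "certified": True},
        {"region": "NAM", "division": "East", "certified": False},
        {"region": "AP", "division": "South Asia", "certified": True},
    ]
    for (region, division), pct in engagement_by_group(sellers).items():
        print(f"{region}/{division}: {pct:.0f}% certified")
```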

The Results—Priceless

Sales Excellence has identified several different areas where the program has delivered a measurable impact. Increases in the number of learning events and recordings, utilization of internally and externally focused sales materials, and marked increases in knowledge proficiency are just a few. Additionally, the organization is seeing increases in sales pipeline for the products covered in certification. Finally, in the most recent sales survey, 80 percent of sellers self-reported greater confidence and fluency with product knowledge.

The team has started experimenting with other technologies, including two-way live role-plays and using virtual reality with artificial intelligence to assess video submissions for the program. The program’s moniker—Learn It, Know It, and Sell It—has become instrumental in the company’s efforts to maximize the capabilities of the sales teams as they deliver on their customers’ strategic plans.