Measuring Leadership Development

Leadership development programs are commonplace; less commonplace is an effective way to measure whether they work. Companies are making an effort to determine whether and how leaders—and their organizations—are benefiting from these programs.

The impact of a toxic executive can be substantial, including the loss of star employees and departments that fail to reach business goals. Well-liked but ineffective leaders can be equally damaging, impeding a company’s profitability.

With so much at stake, organizations are wondering how to gauge the effectiveness of their leadership development programs. Does the coursework result in great leaders, or are these programs nice to have but not essential to high-performing leadership? Here is what five Training Top 125 companies and a learning technology developer are doing to measure whether often-expensive leadership development programs deliver the desired results.

Formal Assessments and Leader Feedback

Before you can assess if a leadership development program worked, you need to define success. “Effective leadership development means empowering our leaders by providing them with the tools they need to lead their group in a direction that meets the overall objectives of the company,” says Breanna C. White, project manager at C&A Industries. “Simply put, alignment and execution. If we are effective in our training, leaders can think critically, align their goals with the corporate direction, and execute with confidence, all while needing little guidance to do so. Lastly, effective leadership development means creating a culture that is open to learning; fosters continuous growth through the sharing of ideas; and never stops trying to get better.”

For a formal assessment, the company uses the DiSC 363 assessment tool. “Since the DiSC for Managers is a tool utilized from the start of that leader’s training, we incorporate this measurement to act as a continuous thread of common ideals and traits associated with their management style. Managers and their teams are assessed utilizing the DiSC 363 to gain an understanding of the top three things their team needs—more feedback, rallying the troops, positive workplace, whatever the assessment says,” White says. “Training then partners with the manager to design goals aimed at improving the identified areas; these goals then are reassessed after a minimum of six months. We should see improvement in the areas they were lacking. We constantly seek feedback through one-on-ones, which are utilized across the organization, not just for vertical communication, but laterally, as well.”

As valuable as a structured assessment such as DiSC 363 is, White says person-to-person observations of leaders are equally important. “Although formal assessments and evaluation of programs are utilized, the best metrics for successful programs are the ‘in-the-moment’ items such as body language, current market trends, team feedback, etc. We never do the same program twice. As we obtain feedback throughout the program, we adjust. Sometimes this means training on the fly and allowing for flexibility. Even if we have an agenda, and there is something more pressing the leaders want to discuss, we adapt and make sure the topics are timely and relevant. We do surveys frequently to ensure we are timely with our topics.”

Prompt Reflection and Self-Assessment

At AlloSource, employees in leadership development programs are asked to analyze their own progress on a daily basis, says Director of Training and Development Mark Lenahan. In March 2018, AlloSource partnered with Avanoo to introduce a new training program for managers. The program begins with a daily e-mail from Avanoo with a link to that day’s video. Upon clicking the link, employees are asked to complete a “Check-In” where they rate their ability to focus on the day’s lesson. Once employees make their choice, the daily video loads. The videos are never more than three minutes and consist of a story, a lesson, and an action. Upon viewing the video, employees are taken to the “Action Board,” where they can reflect on the lesson or engage in discussions with their colleagues.

The daily reflection is coupled with more formal assessments. Lenahan explains that AlloSource uses three different methods:

1. Kirkpatrick Level 1 evaluations are completed by each participant to ensure the training met his or her development needs.

2. The Training team follows up with the requestor of the program to ensure the class delivered what he or she was looking for, and will be implemented and maintained by the individual or his or her team/department.

3. AlloSource believes the true test of any training is that the employee changes his or her behavior. This change is tracked using the appropriate analytics for that individual/team/department (e.g., customer service scores, sales, employee retention, etc.).
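The behavior-change tracking in method 3 can be illustrated with a minimal pre/post comparison. The sketch below uses hypothetical metric names and values, not AlloSource's actual analytics:

```python
def behavior_change_report(baseline, post_training):
    """Percent change per tracked metric (e.g., customer service scores,
    retention) between a pre-training baseline and a post-training reading."""
    return {
        metric: round((post_training[metric] - before) / before * 100, 1)
        for metric, before in baseline.items()
    }

# Hypothetical team metrics before and after the program.
baseline = {"customer_service_score": 7.2, "retention_rate": 0.80}
post = {"customer_service_score": 8.1, "retention_rate": 0.88}
report = behavior_change_report(baseline, post)
print(report)  # {'customer_service_score': 12.5, 'retention_rate': 10.0}
```

A positive change on the chosen metrics is the signal that the new behavior stuck; a flat or negative change prompts follow-up with the individual or team.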

Ask Direct Reports What They Think

Learning technology developer Skillsoft values the input of direct reports. The Skillsoft Leadership Development Program (SLDP) uses feedback from employees to gauge leaders’ progress. “Effective leadership development is all about substantive behavioral change,” says Senior Vice President of Content Product Management Heidi Abelli. “If leaders do not change their daily behavior and practices as a result of having gone through the development program, then we have not met our objectives. The key is changed daily practices and behavior that enable the leader to lead a team of individuals more effectively to accomplish business goals and objectives; drive employee engagement and commitment; and create an atmosphere of trust, loyalty, and responsibility within the team.”

For SLDP, Abelli and her team developed a multi-rater assessment containing approximately 65 questions that takes a leader’s direct report about 30 minutes to complete. This survey is intended to be completed by all of the leader’s direct reports and administered at least once every three months.

“Our approach is to aggregate the results of the survey data into a summary report,” Abelli explains. “The results of the summary report then provide the basis for a personalized learning plan for the leader that emphasizes areas where the leader may still require improvement, whether it is in the area of coaching or providing feedback or driving execution, etc. Over time, the summary report should show steady improvement in the ratings summarized in the aggregated report.”
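As a rough sketch of how such multi-rater results could be rolled up, the snippet below (hypothetical competency names and ratings, not Skillsoft's actual survey or scoring) averages per-competency ratings across direct reports and flags the weakest areas to seed the personalized learning plan:

```python
from collections import defaultdict
from statistics import mean

def summarize_ratings(responses):
    """Roll up multi-rater survey responses into per-competency averages.

    `responses` holds one dict per direct report, mapping a competency
    name to a 1-5 rating. Returns the summary plus the lowest-rated
    competencies, which would anchor the personalized learning plan.
    """
    buckets = defaultdict(list)
    for response in responses:
        for competency, rating in response.items():
            buckets[competency].append(rating)
    summary = {c: round(mean(r), 2) for c, r in buckets.items()}
    focus_areas = sorted(summary, key=summary.get)[:2]  # weakest first
    return summary, focus_areas

# Hypothetical: three direct reports rate their leader on three competencies.
responses = [
    {"coaching": 3, "feedback": 4, "execution": 5},
    {"coaching": 2, "feedback": 4, "execution": 4},
    {"coaching": 3, "feedback": 5, "execution": 5},
]
summary, focus = summarize_ratings(responses)
print(summary)  # {'coaching': 2.67, 'feedback': 4.33, 'execution': 4.67}
print(focus)    # ['coaching', 'feedback']
```

Re-running the same roll-up each quarter is what lets the summary report show the steady improvement Abelli describes.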

Measure How Their Department’s Doing

By definition, a leader leads. So measuring how well leaders are doing means not just measuring their own performance, but that of the department or line of business they’re leading. At SpawGlass, metrics used to measure leader effectiveness look at the department’s overall performance, says Team Member Development Manager Charles Mogab. “Some metrics might include: How many of your team members have been promoted, moved to other key positions, or moved up in the company? How many, how quickly, and how often do you meet your strategic goals? How many of your goals are you modifying or increasing during the year because you met the original or previous goal?” says Mogab. The company has 10 operating groups, and all 10 leaders have been promoted from within and have attended its leadership programs.

Part of observing how whole teams of employees—rather than just the leader—are doing is adapting course material as needed. To ensure leaders continue to meet their department goals, SpawGlass is making adjustments to help leaders adapt to the expectations of younger employees. “Although the core content of both of our leadership workshops has remained basically the same over the years, we have modified the courses to reflect changes in our strategic plan, the size of our company, and the changing workforce,” says Mogab, noting that the multiple generations in SpawGlass’ workforce create a need for leadership development programs on how best to manage Millennials and how to incorporate coaching techniques. “Continuous change and continuous improvement is what keeps our leadership development workshops fresh and impactful.”

The expectations younger employees have of bosses differ from those of older employees, so the company wants to be sure its coursework is keeping up with those changes. “Feedback from some of the younger participants has helped us develop the new material around Millennials and coaching,” Mogab says.

At AAA, leadership development also offers guidance on working with employees, so the whole department, rather than just the leader, succeeds. The coursework includes modules such as “Coaching for Performance,” “Delivering Feedback,” “Communicating Performance Expectations,” “Employee Engagement, Conversations that Matter” (how to hold career development conversations), and “Influencing Others,” says AAA Director of Learning and Development, Human Resources Keri Borba. Formal assessments and feedback for learners allow the company to continuously look for ways to improve these modules. “We include a Kirkpatrick Level 1 Evaluation with all leadership development courses and take feedback provided seriously,” says Borba. “We take action on feedback that serves to improve the quality of the content, delivery, or performance support tools (e.g., job aids, simulations).”

Measure the Trajectory of the Leader’s Career

How long leaders stay with your organization and how far they go in their careers while there can be other good measures of leadership development success. At The Guardian Life Insurance Company, measuring the success of leadership development includes tracking the retention and mobility of participants within the enterprise. “We use a variety of measurement strategies, depending on the length and intensity of the program. For shorter programs, such as individual workshops, we focus on Net Promoter Score (NPS) and Value for Time Spent, along with pre- and post-confidence self-assessments. For longer, more robust programs, we measure retention of talent, advancement and mobility of talent, and business impact,” says Head of Learning and Career Development Gail Kelman. For instance, Guardian has measured that for its Emerging Leader Development Program (ELDP), it has retained 98 percent of graduates since the program’s inception three years ago. Some 95 percent of graduates indicate an increase in their network; 70 percent indicate an increase in their knowledge of the businesses that represent the enterprise; and 55 percent of graduates have been promoted or moved on to other roles within the organization. Of the 55 percent of graduates who moved to other roles, 68 percent were promotions and 7 percent moved to vastly different parts of the organization.
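Figures like Guardian's can be derived from a simple cohort roster. The records below are hypothetical, purely to show the arithmetic behind retention, mobility, and promotion-share percentages:

```python
def cohort_stats(graduates):
    """Retention and mobility summary for a program cohort.

    Each record notes whether the graduate stayed with the company,
    whether they moved roles, and whether that move was a promotion.
    """
    total = len(graduates)
    retained = sum(g["retained"] for g in graduates)
    moved = [g for g in graduates if g.get("moved_roles")]
    promoted = sum(g.get("promotion", False) for g in moved)
    return {
        "retention_pct": round(100 * retained / total),
        "mobility_pct": round(100 * len(moved) / total),
        "promotion_share_of_moves_pct": round(100 * promoted / len(moved)) if moved else 0,
    }

# Hypothetical five-person cohort.
graduates = [
    {"retained": True, "moved_roles": True, "promotion": True},
    {"retained": True, "moved_roles": True, "promotion": False},
    {"retained": True, "moved_roles": False},
    {"retained": False, "moved_roles": False},
    {"retained": True, "moved_roles": True, "promotion": True},
]
print(cohort_stats(graduates))
# {'retention_pct': 80, 'mobility_pct': 60, 'promotion_share_of_moves_pct': 67}
```

Tracking the same cohort over successive years is what turns these snapshots into the trajectory measures Guardian reports.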

“In addition, many of our programs, like ELDP, have business-related projects embedded within the coursework,” Kelman says. “We measure the investment, implementation, and results of those projects. For example, our Commercial and Government Markets organization recently invested $350,000 to pilot one project, which leverages artificial intelligence to create an exceptionally engaging client experience.”

The measurement of concrete numbers and percentages is only part of the story in assessing the effectiveness of programs. Surveys and focus groups of leadership development program graduates also are conducted. As a result of discussions with recent graduates, Guardian launched a new leadership series called “Ignite.” “This new series will expand on the previous themes by including emotional intelligence, trust and credibility, navigating a new role, unconscious bias, and strengthening relationships,” Kelman says. “This approach will leverage expanded timing and cadence to allow participants to focus more fully on applying the learning. This design allows us to measure more of the impact of the program, by including assessments after each module.”

QUICK TIPS

  • Use formalized assessments such as Kirkpatrick’s Four Levels of Evaluation, and combine them with feedback from leadership development participants and those who report to them.
  • Make it part of each leadership development course to require participants to reflect on their own progress, and where they need to continue improving.
  • Find out how the leader’s behavior in day-to-day work at the office is benefiting from the coursework, and how you can help change behaviors that may be holding the leader and his or her team back.
  • Measure the success of the leader’s entire department or line of business. Determine how many of the leader’s employees have made progress in their careers, and have stayed with the company long term, and whether the department has reached its business goals.
  • Track how long leaders stick with the organization and how far they go in their careers. See how you can adjust programming to give leaders more of what they need to reach career goals.

Measuring the “New” Leadership Development Training Performance

By André Politzer, Founder and Managing Partner, Majestery (www.majestery.com) and Coaching Path (www.coachingpath.com)

We believe in dashboards to scale the quality of learning, especially for leaders who are executives, entrepreneurs, or professionals. Let’s get ready for the new era of artificial intelligence (AI) and machine learning. Applied to online tutoring, chatbots, Web, and mobile, this kind of technology opens a large scope of possibilities to understand learners’ behaviors and patterns of learning and to personalize content, teaching styles, and methodologies. We provide the TMS (Majestery Training Management System) with our training and coaching programs, giving learners and HR/Learning stakeholders real-time customized monitoring tools, including:

ADAPTATION TO TECHNOLOGY

  • Innovation awareness and execution
  • Corporate culture adaptation to artificial intelligence (AI)
  • Learning adaptivity
  • Data security and confidentiality
  • Thorough reinforcement learning

DASHBOARD MONITORING PERFORMANCE

  • Matching learners’ profiles
  • Adapting data and analytics
  • Support quality
  • Clarity/transparency
  • Quality of learning
  • Impact and adaptation (AI captures data to mimic learners’ behaviors)
  • Feedback (frequency, relevance, priority)
  • Compliance (policies, alignment, best practices, communication)
  • Classroom (performance, participation, instructors’ persona, collaboration)
  • Certification (completion, scores, progress, ETA)

RETURN ON INVESTMENT (ROI)

  • Impact on the organization’s image and branding
  • Extension of culture, content, and methodology
  • Revenue growth
  • Learners’ improvements impacting customer satisfaction and loyalty
  • Time maximized
  • Inclusion and diversity
  • Learning reinforcement
  • Regulatory costs, audit effectiveness, and compliance

Measuring Leadership Development Effectiveness

By Jim Kirkpatrick and Wendy Kayser Kirkpatrick, Co-Authors, “Kirkpatrick’s Four Levels of Training Evaluation” (kirkpatrickpartners.com)

One misconception in the training industry is that evaluating specific outcomes of leadership development and other soft skills programs is nearly impossible. The reality is that doing so is just as straightforward as evaluating the effectiveness of technical skill training, as long as outcomes and performance standards are specifically defined.

Start by creating a common understanding of what training effectiveness means. A good definition is training and related activities that support specific on-the-job behaviors that contribute to high-level outcomes the organization values.

Leadership training requests tend to develop in one of two ways:

1. Key organizational metrics—such as turnover, sales, or processing time—are not meeting standards.

2. The organization wants to proactively invest in its emerging leaders and talent pipeline.

Whatever the impetus, select a handful of key metrics a successful program would positively influence. For example, a leadership development program for sales managers at a consumer products company might target sales volume, profitability, salesperson turnover rate, and client satisfaction.

From there, a crucial—and often overlooked—step is identifying exactly what sales managers should be able to do as a result of the leadership training to generate the desired outcomes. Many factors influence key company metrics; the goal is to define a few specific, observable, measurable behaviors the training graduates should perform that are likely to positively impact the identified metrics.

For example, to reduce the salesperson turnover rate, one critical behavior for the sales manager could be conducting one-on-one meetings with each salesperson direct report at least once per week. An outline of topics to be discussed could be provided in the form of a job aid. Whether or not the meetings were held could easily be tracked. The individual attention provided during the meeting would help the salespeople feel valued by the organization and provide an opportunity for coaching and support, which would contribute to their success and thereby positively influence their decision to stay.

Once critical behaviors are identified for the most important organizational metrics, the basis of the training content is defined. For example, if managers are to conduct one-on-one meetings with their direct reports to provide coaching and support, during training they should learn coaching techniques and communication skills to use. These are common leadership development topics; the difference is that they now are directly connected to behaviors that will be tracked. If sales managers are not conducting the one-on-one meetings, this can be identified and corrected.

If the one-on-one meetings are being held, the sales, profitability, and client satisfaction for the sales manager and the salespeople they lead can be correlated to the critical behavior and the coaching points discussed during the one-on-ones.

While attempting to isolate the impact of a single behavior is inadvisable, a reasonable story supported by metrics can be told that sales managers who coach their employees regularly yield a team that produces better results. The connection between the training, performance, and results is credible.
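A minimal version of that story, relating a tracked critical behavior to team results, could look like the following. The data and the simple Pearson correlation are illustrative only, and a correlation is evidence of a connection, not proof of causation:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient for two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical: share of scheduled weekly one-on-ones each sales manager
# actually held, alongside that manager's quarterly team sales ($000s).
one_on_one_rate = [0.90, 0.40, 0.75, 0.60, 0.95]
team_sales = [410, 280, 360, 330, 430]
r = pearson(one_on_one_rate, team_sales)
print(f"r = {r:.2f}")  # strongly positive in this toy data
```

Because the critical behavior (meetings held) is tracked independently of the training itself, a pattern like this supports the credible story the Kirkpatricks describe.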

Resist the common urge to focus the majority of your time and resources on learning objectives and competencies. If you define critical behaviors through needs analysis activities and dedicate the majority of your resources to supporting them, you will be the proud participant in a leadership development program that works.

Harnessing the Power of Predictive Analytics

By Stephen Young, Senior Research Scientist, and Stephen Jeong, Senior Faculty Member, Leader Analytics, Center for Creative Leadership (CCL)

Today’s workplace can be challenging for Training and Development professionals. The pace of change is unrelenting, and what was once “tried and true” is being upended continually by new competitors, new technologies, and new ways of working. These mounting complexities can pose a serious challenge when it comes to determining where to focus your efforts for the best payoff.

To make leadership development investments that benefit your business in impactful ways, it’s time to harness the power of data and embrace scientific methods that go beyond subjectivity to yield tangible results. That means turning to predictive analytics. It’s the same approach used by Google and other online search innovators. They use sophisticated algorithms to sort through mountains of information and serve up relevant results that match your needs.

Now similar tools can help your team predict the leadership development initiatives most important to your organization’s current and future success. No more guessing. Instead, HR analytics experts use algorithms to determine those aspects of leadership actions, company policies, culture, employee experience, and engagement that are most closely linked to your strategic business results. You get a clear roadmap that shows you where to target your people investments for the greatest impact.

To harness the power of prediction, follow this five-step process:

1. SET PRIORITIES. Start by determining the strategic priorities most critical to your organization’s future. Where are you headed? What metrics are being tracked? What would “better business results” look like? Your company’s strategic plan can be a great starting point to facilitate discussions with your senior leadership team. The goal is to identify two to four business outcome metrics your executives agree are most critical to your collective success.

2. GATHER DATA. For the most effective analysis, you’ll want to leverage multiple sources of qualitative and quantitative data about your leaders, teams, and workforce—from 360-degree leader assessments to surveys of culture, employee experience, and engagement. The goal is to get a well-rounded, data-based view of your current state, with both your people data and business data integrated within a single data file.

3. ANALYZE. When it comes to analysis, sophisticated algorithms do the heavy lifting—determining statistical links between your strategic business outcome metrics (e.g., sales, customer satisfaction) and the people data you’ve gathered. Rather than focusing on gaps that may not be relevant, you benefit from a new level of precision. You see which specific people factors are most likely to impact your highest-priority business outcomes.

4. MAKE SMART DECISIONS. Your analysis is likely to suggest that new and more targeted approaches are needed to lead your organization into the future—not just doing more of what led you to where you are today. Use the insights your analysis uncovers to design and launch highly targeted leadership development initiatives that focus first on what matters most to your organization’s success. Plan the steps you will take to bring key organizational strategies to life and produce optimal results.

5. ASSESS YOUR IMPACT. Follow-up matters. Track your strategic impact and return on investment using check-in surveys and additional metrics statistically linked to business results. Documenting improved earnings and cost reductions can tangibly demonstrate the positive impact of your development efforts and help you build your reputation. You will have the information you need to fine-tune your efforts, document results, and build support for future people initiatives.
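As a toy illustration of step 3, candidate people factors can be ranked by the strength of their statistical link to a business outcome. This sketch uses absolute Pearson correlation over hypothetical per-business-unit data; the algorithms CCL describes are, of course, more sophisticated:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient for two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def rank_drivers(people_data, outcome):
    """Rank people factors by |correlation| with a business outcome."""
    links = {factor: pearson(values, outcome)
             for factor, values in people_data.items()}
    return sorted(links.items(), key=lambda kv: abs(kv[1]), reverse=True)

# Hypothetical survey factors and sales, one value per business unit.
people_data = {
    "engagement":       [3.9, 3.1, 4.2, 3.5, 4.4],
    "coaching_quality": [4.1, 2.8, 4.0, 3.2, 4.5],
    "tenure_years":     [6.0, 7.5, 5.0, 8.0, 4.5],
}
sales = [510, 390, 540, 430, 560]
ranked = rank_drivers(people_data, sales)
for factor, r in ranked:
    print(f"{factor}: r = {r:+.2f}")
```

The top-ranked factors become the targets for the focused development initiatives described in step 4, and re-running the analysis later supports the impact assessment in step 5.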

NOTE: The authors work with clients to implement CCL Fusion (https://www.ccl.org/leadership-solutions/analytics-evaluation/ccl-fusion/), a new analytics offering that brings together experts in predictive analytics, leadership development, surveys, and assessments to help businesses produce results that matter.