Deloitte LLP: Learning Impact Evaluation
At the end of fiscal year 2015, Deloitte LLP implemented a new program evaluation instrument, designed to yield more telling and actionable data about the impact of its learning programs. Used for all nationally sponsored courses and programs across all delivery types (live, virtual, e-learning), the new Learning Impact Evaluation features a revised set of questions that focus on measuring learning gained, on-the-job application, and impact on performance. By analyzing the new evaluation data, Deloitte is able to better align its learning strategy with business and talent outcomes, and determine any necessary continuous improvement efforts for its learning programs.
This initiative affects all 78,000 personnel in the organization, as everyone who completes a learning program receives the new evaluation. The Learning Impact Evaluation initiative is part of the broader Development Measurement Strategy, designed to improve how Deloitte Talent Development assesses, reports on, and acts upon measures of its value to, and impact on, the business. The initiative had three primary objectives:
1. Provide greater insight into learning program effectiveness and impact
2. Better align learning strategies and solutions with business needs and outcomes
3. Enhance Deloitte Talent Development’s role as a performance consultant to the business
Program Evaluation Details
Members of Deloitte’s Talent Development Strategy and Innovation, Consulting Talent Development, and Talent Analytics teams developed the new program evaluation. They leveraged Bersin by Deloitte’s Learning Measurement Framework to determine the learning impact dimensions included on the new evaluation:
- Satisfaction: Feedback on course structure, materials, and instructor (mainly required for continuing professional education, or CPE)
- Learning: Learning gained
- Utility: Applicability of the content to the job
- Performance: Impact on individual performance areas

There is also a “General” category that includes a net-promoter question (would you recommend this program?) and one that is open-ended for general comments. The evaluation questions were developed against each dimension with the following guidelines:
1. Use only questions that are universal to all delivery types, that reflect feedback pertinent to program effectiveness and impact, and whose answers the firm wouldn’t already know (ruling out, for example, the adequacy of the training facility/environment).
2. Do not include multiple versions of the same question, or “double-barreled” questions that ask learners to respond to more than one statement at once.
3. Pose questions as absolute statements so respondents can more easily indicate their level of agreement (e.g., “What I learned in this program is essential to my work”).
4. Limit the number of questions, and use plain language and simple sentence structure so it is clear what is being asked, to encourage quality answers and healthy response rates.
The majority of evaluation questions use a 5-point Likert scale from “strongly disagree” to “strongly agree.” There are a few open-ended questions, as well as one that asks respondents to select which of five individual performance areas the program supports.
Deloitte Talent Development conducted three rounds of revisions with all of the CLOs to review, discuss, and refine the question set. The final questions were also informed by a pilot deployment of the evaluation across Deloitte’s Consulting business. Finally, Deloitte Talent Development defined a validation approach to test the evaluation questions for construct validity (are we measuring what we intended to measure?) and criterion validity (are we seeing correlation across questions and between dimensions?), and to assess response rates. Deloitte conducted validity testing on more than 35,000 of the new evaluations.
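To make the criterion-validity check concrete, one common approach is to average each dimension’s items per respondent and inspect the pairwise correlation matrix across dimensions. The sketch below illustrates this on synthetic data; the column names, two-questions-per-dimension layout, and pandas-based analysis are assumptions for illustration, not Deloitte’s actual methodology.

```python
# Illustrative criterion-validity check on synthetic 5-point Likert data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 500  # hypothetical number of completed evaluations

base = rng.integers(2, 6, size=n)  # shared signal so dimensions correlate

def likert():
    # One synthetic Likert item: shared signal plus item-level noise,
    # clipped to the 1 ("strongly disagree") to 5 ("strongly agree") scale.
    return np.clip(base + rng.integers(-1, 2, size=n), 1, 5)

responses = pd.DataFrame({
    "satisfaction_q1": likert(), "satisfaction_q2": likert(),
    "learning_q1": likert(),     "learning_q2": likert(),
    "utility_q1": likert(),      "utility_q2": likert(),
    "performance_q1": likert(),  "performance_q2": likert(),
})

# Average each dimension's questions per respondent, then correlate dimensions.
dimensions = ["satisfaction", "learning", "utility", "performance"]
scores = pd.DataFrame({d: responses.filter(like=d).mean(axis=1) for d in dimensions})

# Strong positive off-diagonal correlations are the pattern a
# criterion-validity test looks for.
print(scores.corr().round(2))
```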
Deloitte offers two versions of the evaluation: a post-delivery version that goes to all program participants, and a follow-up version sent 30 to 90 days after delivery for key programs. The follow-up includes only select questions from the Utility and Performance dimensions, and is intended to compare what learners thought would happen with what actually did happen with respect to the content’s applicability to their role and its impact on performance. Deloitte Talent Development uses a variety of reports to depict evaluation data across impact dimensions, multiple deliveries, audience demographics, and related programs within a curriculum.
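A minimal sketch of that expected-versus-actual comparison appears below; all scores and labels are made up for illustration and do not reflect Deloitte’s data or reporting tools.

```python
# Hypothetical comparison of post-delivery ("expected") vs. 30-90 day
# follow-up ("actual") mean scores for the Utility and Performance dimensions.
import pandas as pd

post_delivery = pd.Series({"utility": 4.4, "performance": 4.2})  # made-up means
follow_up = pd.Series({"utility": 3.9, "performance": 4.0})      # made-up means

gap = (follow_up - post_delivery).rename("actual_minus_expected")
print(gap)
# Negative gaps flag dimensions where on-the-job application fell short of
# learners' expectations, signaling a continuous-improvement opportunity.
```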
Results
A key result from the new evaluation has been the quantity and quality of the data. Response rates increased by 5 percent across live instructor-led programs between the first quarters of FY’15 and FY’16. More importantly, program managers now have more telling and actionable data. To date, there have been numerous instances where the evaluation data has been used to validate that a program is having its intended impact or to shape continuous improvement efforts. This has allowed program managers to fulfill the role of performance consultant, moving beyond program reporting to taking corrective actions so learning aligns with capability needs and helps enable business outcomes.
For example, the new evaluation was administered following the delivery of a milestone program for senior consultants in the Human Capital Consulting practice. The data and open-ended comments revealed that while learners enjoyed the program (Satisfaction) and learned from it (Learning), the content was not comparably applicable to the job (Utility) or as impactful in driving improved performance (Performance). Based on the evaluation data, the program team revamped the program for FY’16, reducing the duration from five to three days, replacing a simulation with a conference-style, elective-based program, adding a pre-assessment that defines recommended elective courses, and including a formal mini-case administered virtually. The evaluations provided the direction for revising the program based on the needs of the learners and the business, spurred innovative thinking around program structure and content that challenged the status quo, and increased buy-in from business leadership, reflecting Deloitte Talent Development’s commitment to continuous improvement and to optimizing program impact.
The Ritz-Carlton Leadership Center: Business Principles Symposium
The Ritz-Carlton Leadership Center created the Business Principles Symposium to give more people the chance to benchmark the hotel chain’s business principles. Full-day courses are capped at 30 people, but the symposium accommodates 150. Two symposia were held in 2015, one in April and the other in November.
Program Details
The Culture Transformation Group (CTG) had an initial kick-off meeting to discuss possible topics and learning methods. Each member was assigned a topic and was responsible for developing a training program for that topic. Learning objectives focused on sharing The Ritz-Carlton’s DNA by showing:
- How culture is the foundation of every organization
- How civility relates to customer service
- The psychology behind memorable service
- The connection between leadership and motivation
- How culture, service, and passion positively impact the bottom line
After three months, the CTG members sent their core content and outlines to Jeff Hargett, the senior corporate director of Culture Transformation. He shared the outlines with the group, and they all reviewed content to make sure their training initiatives would work together.
At six months, Diana Oreck, then-vice president of The Ritz-Carlton Leadership Center, sat down with each member of the CTG individually to make final content choices and ensure the new training initiatives were clear, fresh, effective, and relevant.
Once the content of each training program was complete, each CTG member worked with a learning designer to produce visuals. The CTG chose to use Prezi presentation software to create a more engaging and impactful visual learning experience.
During the symposium, attendees were given cards and encouraged to write down any additional questions. CTG answered a question on its blog each week until all questions were answered. CTG also sent a follow-up e-mail after the symposium encouraging attendees to subscribe to its blog and follow its social media channels in order to reinforce the symposium learnings.
Results
The feedback forms indicated that 97.9 percent of attendees felt their goals were met, and attendees rated the overall content 4.79 on a scale of 1 to 5 (with 5 being the highest rating).