The Growth of Learning Analytics

Five identifiable stages of learning analytics in the learning reporting market.

By Stacey Harris, VP, Research and Advisory Services, and David Grebow, Principal Learning Analyst, Brandon Hall Group

We are experiencing a revolution—a data revolution. Since the dawn of the computer era, our businesses, organizations, and personal lives increasingly have been directed by data.

This data revolution is a result of several forces colliding in today’s business environment:

  • The growing interest in leveraging learning as a tool to engage both internal and external audiences.
  • The growing demand to leverage learning data to inform organizations’ critical business decisions concerning talent.
  • The recent attention given to ensuring that today’s learning functions are so effective that they become a competitive advantage for their organizations.

Brandon Hall Group, a research/analyst firm serving the performance improvement industry, recently completed several research efforts on the evolution of learning analytics, and how that growth has been affected by both technology advancements and industry demand for change.

The Evolution of Learning Analytics

Our research has uncovered five identifiable stages in the learning reporting market. Most organizations evolve through them over time.

Stage 1: TMS to LMS: Administrator Focused. When Training Management Systems (TMS) evolved into the early Learning Management Systems (LMS), the primary person needing reports was the training administrator. A “flat file” was one of the earliest types of reports the LMS could generate. The flat file was a simple list of names and course completion dates for each individual. It was useful for tracking the people who took and completed a course. There was no need for any analysis other than the number of people who completed a prescribed course. As organizations grew and the numbers of people and courses increased, flat files became too long and unmanageable. Reporting needed to move to the next level.

Stage 2: More Data Means More Answers: Learning Manager Focused. As the size of the learner base and learning function responsibilities increased, learning managers realized there was a more comprehensive picture of learning that could be developed from the reporting data. Exportable data files replaced flat files and enabled learning managers to query, analyze, and answer, for example, the following questions inside third-party analytics tools:

  • How many people out of a total (say 10,000) did not complete a course?
  • What was the increase in course completions year over year?
  • How do the registration rates compare to the completion rates?
  • How many people started and completed the surveys on courses and programs?
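Questions like those above become a few lines of analysis once the LMS data is exported. A minimal sketch in Python (the field names "registered," "completed," and "year" are hypothetical; real export layouts vary by vendor):

```python
# Sketch: basic completion metrics from an exported LMS data file.
# Field names ("registered", "completed", "year") are hypothetical.

def completion_rate(records):
    """Fraction of registered learners who completed the course."""
    registered = sum(1 for r in records if r["registered"])
    completed = sum(1 for r in records if r["completed"])
    return completed / registered if registered else 0.0

def yoy_completions(records):
    """Course completions tallied by year, for year-over-year comparison."""
    by_year = {}
    for r in records:
        if r["completed"]:
            by_year[r["year"]] = by_year.get(r["year"], 0) + 1
    return by_year

# Example: 3 of 4 registered learners completed.
sample = [
    {"registered": True, "completed": True, "year": 2023},
    {"registered": True, "completed": True, "year": 2024},
    {"registered": True, "completed": False, "year": 2024},
    {"registered": True, "completed": True, "year": 2024},
]
print(completion_rate(sample))   # 0.75
print(yoy_completions(sample))   # {2023: 1, 2024: 2}
```

In practice the same counts would be computed inside a third-party analytics tool or spreadsheet; the point is that exported data makes these comparisons possible at all.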

Stage 3: Graphical Treatment of Data: Learner Focused. As the analysis became more complex, interpreting the analysis needed to become easier. The result was the graphical representation of the data in dashboards and other dynamic interfaces. The dashboard could quickly show personal training records, course completion rates, gaps in personal certificates or certifications, levels of competency, mapping of skill levels to job descriptions, and more. The deeper levels of data and the resulting analysis now required a faster way of finding answers. Learning data were now in the hands of everyone with access to the LMS, so the tools could begin to shift from an administrative platform to a career development and business tool.

Stage 4: Dynamic Reports: Line Manager and Business Manager Focused. As line managers and business leaders began to use learning data to make daily decisions, they needed to speed up the analysis and reporting provided by the LMS. The result is more dynamic reports and ad hoc reporting tools that can update, analyze, and report information whenever needed. These dynamic reports allow a manager to request a weekly review of a team’s learning, a tally on a Wednesday about a group-level certification, or a same-day report on an employee for a performance review. Not only has the speed of the requests and reports increased, the level of data that is being analyzed has increased to cover the entire lifecycle of employees’ learning and achievement—from the day they start, through all their promotions, until the day they depart.

Stage 5: To Big Data and Beyond: Business Focused. Learning has become a mission-critical business metric that often can show how “adaptable” an organization is by looking at the overall level of skills and qualifications. This “big picture” requires the input of big data that is much more informative than the previous levels of analytics. Big data is a collection of data sets so large and complex that it is difficult to process or analyze using traditional database management tools. It has data characteristics that need to be managed, such as volume, velocity, variety, veracity, variability, and complexity. Big data for learning incorporates every data point across the organization, including the following:

  • Demographics
  • Feedback
  • Course starts/course completions
  • Test results
  • Skill levels
  • Performance reviews
  • Course access points
  • Time on system
  • Clicks and scrolling

Big data analytics provide answers to a new set of learning-related questions that many organizations would have never thought to ask, such as:

  • Is there a connection between when and where course content is accessed and the level of employee engagement or performance?
  • What courses or certifications are correlated with improved performance in a specific division?
  • Which programs resulted in the greatest measured improvements in productivity, by employee role?
  • When we see audit support reports on certifications, how much time do we have before we begin to see problems?
  • With the number of new employees, what does the compliance gap look like across all the regulatory courses?

Every organization needs to assess its own readiness for diving into deep levels of learning analytics, but a good foundation can be built on solid data capture practices and by working on reporting standards that meet the needs of particular audiences.

Key Practices for Learning Analytics

Learning analytics have proven to be valuable tools and, like any tool, require a certain level of skill and understanding to be properly used. Here are some of the best practices we discovered during our research:

  • Educate and train everyone who needs reports on how to use your reporting capabilities and tools. Organizations too often buy LMSs with great capabilities but fail to use them effectively to get the best data in and out.
  • Clearly define data input requirements and parameters for all stakeholders. Data must be consistent across the board. Even something as small as a one-letter difference in an employee code can skew the resulting data.
  • Data can be more than names and numbers. Organizations that get the most out of their learning analytics also have the ability to include use-case studies as part of their datasets. The resulting picture includes real stories about usage to support the more objective data.
  • Understand the nuances and opportunities posed by data warehousing. Big data means you will be pulling data from the LMS and many of the other enterprise software systems (e.g., HRIS) in the organization. It is up to everyone, not only people focused on learning, to maintain the integrity of the data.
  • Scheduling can become complicated with access to real-time data. To keep the cost of operations down, system users need to make a clear distinction between reports that can be scheduled (e.g., weekly or monthly) and reports that need to be available on an “as requested” basis. The question is, “What is the ROI for each report?” The answer is not, “Let’s run this on a daily basis now because we can.”
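The employee-code point above is worth making concrete: records that refer to the same person but are keyed inconsistently simply fail to join. A minimal sketch of normalizing codes before matching (the code format and records are hypothetical):

```python
# Sketch: normalizing employee codes before joining learning data with
# HR data, so "emp-1042 " and "EMP-1042" match. The code format and
# records are hypothetical examples.

def normalize_code(code: str) -> str:
    """Canonical form: trimmed and uppercased."""
    return code.strip().upper()

lms_records = {"EMP-1042": "Safety 101 completed"}
hr_code = "emp-1042 "   # same person, inconsistently entered

# Without normalization the lookup fails; with it, the records join.
print(hr_code in lms_records)                   # False
print(normalize_code(hr_code) in lms_records)   # True
```

Agreeing on one canonical format at data entry, rather than cleaning up at report time, is what the best practice above is really asking for.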

Stacey Harris is vice president of Research and Advisory Services, and David Grebow is the principal learning analyst for Brandon Hall Group, a research and analyst group serving the performance improvement industry, with more than 10,000 clients globally. Brandon Hall Group has an extensive repository of thought leadership research and expertise in its primary research portfolios—Learning and Development, Talent Management, Sales Effectiveness, Marketing Impact, and Executive Management. At the core of its offerings is a Membership Program that combines research, benchmarking, and unlimited access to data and analysts. Members have access to research and connections that help them make the right decisions about people, processes, and systems, coalesced with analyst advisory services tailored to help put the research into daily action. For more information, visit http://go.brandonhall.com/home and http://go.brandonhall.com/membership_TM
