Measuring Organizational Learning: Going Beyond Measuring Individual Training Programs
Posted: 07/22/2009 12:00:00 AM EDT
Whenever the subject of measuring learning comes up in discussions, particularly within corporate circles, the conversation inevitably turns to the Kirkpatrick Four Levels model. From there it often migrates to the question of establishing the Return on Investment (ROI) of training, with attempts to use models designed for business finance to attribute the corporation’s success to the training program in question. The fact that such attempts negate other environmental variables behind the corporation’s success, and minimize or sometimes even totally ignore the efforts of marketing, sales, supply chain, etc., is never brought into the discussion. An exaggerated, but nevertheless not far from the truth, comment would go something like: “This training we launched in November resulted in a 10,000 percent ROI as seen from the sales of our toys going up 30 percent over the December holiday sales period. This is compared to a 'control group' that didn’t go through this training in June.”
Regardless of the direction of the conversation, the focus is almost always on specific training initiatives. Seldom, if ever, does the dialogue evolve into one around measuring the quality of learning across the enterprise.
The Kirkpatrick Four Levels Model
The Kirkpatrick model looks at the following:
- Level 1: Reaction
- Level 2: Learning
- Level 3: Behavior
- Level 4: Results
While not inherently a bad or invalid model, it was designed primarily to examine the effectiveness of individual training programs and, when first developed, was focused on training programs within a manufacturing context. Its expansion for use with soft-skills and office-oriented training programs isn’t always bad either. However, the model is wholly inadequate for the measurement of organizational learning, which goes beyond the simple aggregation of different training programs.
The 5-Pillars Model
Through the sponsorship of the Sloan Foundation’s Sloan Consortium, universities and other institutions of higher education have been using a 5-Pillars model that has been adapted by companies such as Johnson & Johnson and Amway to measure the quality of organizational learning.
The model is depicted as a set of pillars that hold up the quality of organizational learning. That is, all the pillars support learning equally, and no one pillar is more or less important than another. Although it takes a different approach from the Kirkpatrick model, it is not in competition with it, in that it is not proposed as an alternative to the Kirkpatrick approach. Instead, elements of the Kirkpatrick model are embedded within the pillars.
The following is a brief description of each of the 5-Pillars of the Sloan Consortium model for measurement of learning quality.
The first pillar, Access to Learning, addresses the reach of learning opportunities across the organization. Metrics that can be used include the percentage of all learning programs, however you define “learning programs,” that are accessible to all employees across the enterprise. If you utilize some kind of learning technology to deliver these learning experiences, such as an LMS or a learning portal, you could also look at the accessibility of that technology across the enterprise as well as technology reliability metrics.
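As a concrete illustration, the two metrics named above can be computed quite simply. The sketch below is hypothetical: the program names, counts, and uptime figures are invented for illustration, not drawn from any real enterprise.

```python
# Hypothetical sketch of an "Access to Learning" index.
# All program names and figures are illustrative assumptions.

programs = [
    {"name": "New Manager Essentials", "open_to_all": True},
    {"name": "Plant Safety Refresher",  "open_to_all": True},
    {"name": "Executive Offsite",       "open_to_all": False},
    {"name": "Sales Negotiation Lab",   "open_to_all": False},
]

# Share of programs accessible to every employee across the enterprise
access_rate = sum(p["open_to_all"] for p in programs) / len(programs)

# If an LMS or learning portal delivers the programs, pair the access
# rate with a simple technology-reliability metric such as uptime.
lms_hours_up, lms_hours_total = 8_700, 8_760  # illustrative annual figures
uptime = lms_hours_up / lms_hours_total

print(f"Program access rate: {access_rate:.0%}")
print(f"LMS uptime: {uptime:.1%}")
```

In keeping with the advice later in this piece, both indicators stay simple enough to refresh quarterly without a full-scale study.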
The Learner Satisfaction pillar is about the satisfaction of learners with their learning and personal growth opportunities. This pillar could include the Level 1 metrics of the Kirkpatrick model. Additionally, if your organization has some form of employee opinion survey, an index of how employees perceive their personal growth and development needs are being met can easily be used as an indicator of the strength of this pillar. Other possible metrics for this pillar are an index of repeat usage of the learning technology as well as an index of usage of the learning Help Desk.
The third pillar, Cost Effectiveness, examines the capital efficiency of learning investments. The basic issue examined by this pillar is how much is accomplished with the given resources and budget. It will be tempting to lapse into an ROI initiative here, but other data would provide better indicators of the strength of this pillar. Furthermore, the ROI of individual programs does not necessarily mean that capital was invested effectively within the context of the larger organizational picture. Some example metrics would include how much learning resources have been leveraged across the enterprise, or demonstrations of the scaling of learning resources.
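The leverage and scaling indicators suggested above can be reduced to two simple ratios. The following sketch is a hypothetical illustration: the asset names, business units, budget, and learner-hour figures are assumptions made up for the example.

```python
# Hypothetical sketch of two "Cost Effectiveness" indicators that
# avoid program-level ROI. All names and figures are illustrative.

assets = {
    # learning asset -> business units that reuse it
    "Onboarding curriculum":   {"Sales", "Ops", "R&D"},
    "Compliance e-learning":   {"Sales", "Ops"},
    "Leadership case library": {"R&D"},
}

# Leverage: share of learning assets reused by more than one unit
leverage = sum(len(units) > 1 for units in assets.values()) / len(assets)

# Scaling: cost per learner-hour, which should fall as audiences grow
budget, learner_hours = 250_000, 40_000  # illustrative annual figures
cost_per_hour = budget / learner_hours

print(f"Asset leverage: {leverage:.0%}")
print(f"Cost per learner-hour: ${cost_per_hour:.2f}")
```

Tracking cost per learner-hour over several periods, rather than computing a one-off ROI, keeps the focus on whether capital is being deployed efficiently across the whole enterprise.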
The next pillar, Learning Effectiveness, examines indicators of the impact of learning on the organization’s strategic direction. Some of the Kirkpatrick model’s Level 2 and Level 3 approaches could be adapted for use within this pillar. Also, linkages could be made between the organization’s innovation or speed-to-market indicators and employee perceptions of their growth and development. If some form of correlation analysis is used here, caution must be taken not to attribute specific cause and effect, since correlations often reveal connections but not necessarily causal direction.
Finally, the pillar of Management Satisfaction looks at indicators of management’s satisfaction with the learning that is occurring across the enterprise. An obvious indicator would be the trend in learning budgets over time relative to a business performance indicator such as sales revenues. Surveying management’s satisfaction with learning would be another approach to measuring this pillar.
In applying the 5-Pillars model, again keep in mind that the pillars support enterprise learning equally, and no one pillar is more or less important than another. Also, as you bring metrics together, keep them simple. Avoid the temptation to launch full-scale statistical studies or to over-stretch the use of business finance tools. These not only require resources that detract and distract from true organizational learning, but they often take so long to assemble that the information is out of date by the time it finally becomes available.
Finally, another aspect of the 5-Pillars model for measuring organizational learning is that the definition of learning can be as broad or as narrow as you want. Even if learning is defined in the narrowest possible way, as just training programs, the model still applies and is better able to take into account more than the simple aggregation of the measurements of individual training courses.