The Big Data evolution continues, yet many organizations I speak with feel they are moving further away from their learning and talent measurement goals. We generate a lot of data, and we usually have good intentions of using it, but are we really getting value from our leadership development training programs? These might sound familiar:

  • “We put 1,378 people (92% of target audience) through the leadership development program!”
  • “We exceeded our ‘favorable’ course evaluation score with a 4.1 average!”
  • “We achieved our financial target of less than $350 / head!”

Summary: All is good; stay the course!

Now here’s what we generally do not hear:

  • “Consumption of our Business Writing eLearning course is extremely low, although our CBA shows this is a needed skill for almost all of our professionals. Let’s develop a plan to accelerate the promotion and communication of this course.”
  • “Across the program, learners excelled in work product quality, but lagged in innovation, especially in the Southeast region. Let’s dig a little deeper into development content related to this, and also schedule some time with the business leader in the Southeast to get his input.”
  • “Our field performance assessment showed a significant increase in client satisfaction for professionals completing the leadership development training program, leading to add-on projects of $250,000 per client.”

Summary: Our program is clearly generating ROI and business impact, but there are some target areas we can focus on to even further optimize our talent and transform our business.

The difference between these results was driven primarily by two factors: 1) Intelligent Data, which I covered in my last blog, Putting the Smile Back in L&D Program Measurement; and 2) Intelligent Analytics: aligning your analyses to answer specific business questions, and enabling interpretation and action by the target end users. I have seen many analytics models that are near-perfect in accuracy but completely misaligned with the business objectives and the consumers of the results. These are quickly shelved, and stakeholders revert to their original reports (in our world, the good old smile sheet). In learning and development, an intelligent analytics model will answer questions like the following:

  • Are we hiring the right Talent?
  • Are learners actually learning in our programs?
  • Is their learning translating into performance in the field?
  • Are we advancing and placing our people effectively?
  • Are we improving business value for the organization?

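To make one of these questions concrete, here is a minimal, hypothetical sketch of checking whether learning is translating into field performance. The records, field names, and numbers are all made up for illustration; it simply compares a field metric (here, a client-satisfaction rating) for program completers against non-completers.

```python
from statistics import mean

# Hypothetical learner records: program completion status and a field
# performance score (e.g., a 1-5 client-satisfaction rating).
# All values here are illustrative, not real program data.
records = [
    {"completed": True,  "client_satisfaction": 4.6},
    {"completed": True,  "client_satisfaction": 4.2},
    {"completed": False, "client_satisfaction": 3.8},
    {"completed": False, "client_satisfaction": 3.5},
]

def satisfaction_lift(records):
    """Mean field score of completers minus that of non-completers."""
    completers = [r["client_satisfaction"] for r in records if r["completed"]]
    others = [r["client_satisfaction"] for r in records if not r["completed"]]
    return mean(completers) - mean(others)

print(round(satisfaction_lift(records), 2))
```

The point is not the arithmetic but the alignment: the output directly answers a business question a stakeholder actually asked, rather than reporting a model metric.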
In parallel, we need to identify the end consumers who will interpret the business performance analytics results and take the appropriate follow-up action. In the world of Human Capital Management, they generally fall into one of three categories/roles:

  1. Talent: Recruits, Employees, Learners, etc.
  2. Function: CLOs, Talent Management Director, University Relations Leader, etc.
  3. Business: Learning Program Stakeholder, Functional Leader, Talent Coach, etc.

These personas are not only the end users of the solution; they should also be part of your design and development team. Involve them throughout the process to ensure your leadership development training solution is properly aligned.
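One way to keep this alignment explicit is to map each persona category to the result views it should receive and act on. The sketch below is purely hypothetical; the role names follow the three categories above, but the report names are invented placeholders.

```python
# Hypothetical routing of analytics outputs to the three consumer roles.
# Report names are illustrative placeholders, not a real product catalog.
PERSONA_REPORTS = {
    "Talent":   ["personal skill-gap summary", "recommended courses"],
    "Function": ["program consumption trends", "learning-transfer scores"],
    "Business": ["field performance lift", "ROI by program"],
}

def reports_for(role):
    """Return the result views a given persona should receive."""
    return PERSONA_REPORTS.get(role, [])

print(reports_for("Business"))
```

Building a table like this with the personas themselves, during design, is one practical way to involve them throughout the process.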

So, I’ve now covered two critical foundational concepts for developing effective leadership training and measurement solutions, but you are probably still asking the million-dollar question: how do I do it? In my next blog, we will start our journey down the execution path – stay tuned!

 

Andy Tanner
