Wednesday, October 23, 2013

Koreen Pagano #DevLearn: The 80:20 Rule of Data Analysis

These are my liveblogged notes from Koreen Olbrish Pagano's (@KoreenPagano) session at the eLearning Guild DevLearn Conference, happening this week in Las Vegas. Forgive any incoherence or typos. I am in Vegas, after all...

There's data out there in everything we do -- on our phones, our websites, etc. -- how can we use the data that's available for learning?

Metrics that don't matter...learning orgs spend a lot of energy tracking metrics that don't matter: time spent, courses completed, passing scores, # of attempts...these metrics show the costs of the training, but not its value. This data is about how much time people spend away from their work, not about their performance. Completion data doesn't really mean anything to the business in terms of what it's measuring.

This isn't data about whether they learned anything or changed behavior.

Typically, learning & dev is seen as a cost center.

For most orgs, the metrics that matter: are we fast, cheap, good?

Stakeholders, employees, customers -- the three buckets of people you need to focus on. Try to balance the happiness and satisfaction of those three groups. Stakeholders might care about money, employees about coming into work every day and liking the culture, and customers about getting good service (for example).

Who do you really value? And how do you capture data and metrics that matter?

As training professionals, can we create training solutions that impact those metrics? Those are the metrics that matter to the business. We should be supporting those business objectives.

80% of the sales skills needed for success at pharmaceuticals are the same across all companies; it's only 20% that's unique.  (Koreen cites research from Andy Hartnett on this...)

Think in terms of business and performance metrics and not just learning metrics.

As a learning person, can I communicate the value of what I do back to the rest of the business? If we're working for the organization, we need to interpret what we've done and share it back to the business -- this is what it means for you.

What is the business problem you're trying to solve?

For each person, how do you measure their success? (what are the metrics that matter to that person to help motivate them to be better and to change their behavior?)

Learning & dev people are usually only brought in at the design stage. We miss the analysis stage and miss an opportunity to show our value.

Performance metrics and not learning metrics.

If we're just putting out content and not looking at performance, then we're just content pushers.

80%: common business success metrics (money, the baseline things that make everything successful).
20%: snowflake metrics (what is your org's core value? How does it look to promote that core value? Look at those metrics).

How do you translate the learning programs you're creating back to those business metrics that really matter?

You need to know what's really valuable. And then design learning that supports that business value.
