Tuesday, April 24, 2012

A conversation on measurement and metrics (#ASTDLN)


These are my notes from the final workshop session at ASTD Learn Now in San Francisco.  Bob Mosher and Conrad Gottfredson of Ontuitive are leading a conversation on metrics and measurement. 


The things that we currently measure about learning are things the learner doesn't really care about (% of course completions, pass/fail rate, # of student days, etc.).


Bob Mosher -- there is value in smile sheets.  Research shows that if people like your training, they'll take more.  There is defensible data when 12,000 people say that class was good.


When we measure the first two moments of learning need (new and more), we can measure knowledge and skills gain -- certification, demonstrable skills, compliance.


Gloria Gery, Why Don't We Just Weigh Them? http://www.gwu.edu/~lto/gery.html


When we move to the world of performance support, we need to gather data to show that we help the organization achieve its aims -- we need to measure competency and measure moments 3-5.  We need to tie it to on-the-job performance gains:

  • time to proficiency
  • lower support costs
  • completion of job-related tasks
  • increased user adoption
  • optimized business processes
  • customer/employee loyalty, morale, and/or retention
  • sales close/cycle time
But we need to be measuring what has a critical impact on the business.  We need to be measuring moments 3-5 (apply, solve, change).


There are three ways we can measure:
  • digital monitoring (we can track activity, see where they click, see where they spend time).
  • performer monitoring (quick checks)
  • “others” monitoring
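Not from the session itself, but here's a minimal sketch of what the first bullet, digital monitoring, could look like in practice: a small event log for a web-based support tool. The event names and fields are assumptions for illustration, not anything Bob or Con showed.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class UsageEvent:
    """One hypothetical 'digital monitoring' record: who did what, where, and for how long."""
    user_id: str
    page: str            # e.g. "pricing-job-aid"
    action: str          # e.g. "open", "click", "search"
    dwell_seconds: float
    timestamp: float

def log_event(event: UsageEvent, path: str = "usage_events.jsonl") -> None:
    """Append the event as one JSON line so it can be aggregated later."""
    with open(path, "a") as f:
        f.write(json.dumps(asdict(event)) + "\n")

# Example: a performer spends 42 seconds on a job aid.
log_event(UsageEvent("u123", "pricing-job-aid", "open", 42.0, time.time()))
```

Aggregating a log like this is what makes the "where they click, where they spend time" questions answerable later.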

We can do quick checks and ask the learner how they’re doing – but not for those things that would be catastrophic if we don’t do them right.


Shouldn’t the real measurement be whether or not they’re selling more chairs (assuming they’re selling chairs, of course)? What’s been the business impact?


Critical Skills Analysis – determine along a spectrum where things have critical impact on the business. Work with SMEs to create a rubric for the lines of business and for different skills.  
This is the line of business's perception of critical business actions – not the learning team's perception…Bob and Con show a 1-7 ranked scale – a critical impact rating.  At one end are completely catastrophic results – e.g., someone will die.
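The slides didn't show a data structure, but a rubric like that is easy to picture as a simple table keyed by skill. A hypothetical sketch (the skills and ratings below are invented for illustration):

```python
# Hypothetical critical-impact rubric: 1 = negligible impact, 7 = catastrophic (e.g., someone could die).
CRITICAL_IMPACT = {
    "verify dosage calculation": 7,
    "complete compliance checklist": 5,
    "update CRM opportunity stage": 2,
}

def skills_worth_measuring(rubric: dict[str, int], threshold: int = 5) -> list[str]:
    """Return skills rated at or above the threshold - the ones worth instrumenting and measuring."""
    return [skill for skill, rating in rubric.items() if rating >= threshold]

print(skills_worth_measuring(CRITICAL_IMPACT))
# ['verify dosage calculation', 'complete compliance checklist']
```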


Make sure you’re investing in measurement in the right places.  Figure out what’s happening where it matters – what’s improving performance support and what’s driving critical business impact.


We can’t measure everything. Don’t try to boil the ocean.  Measure what matters.  How deep do I go?


The new analytics:


Chad shares some data you can get from a mobile app:
  • time spent on a page
  • frequency of use
  • sharing info
  • type of info accessed
  • conversion points (are they doing what we’ve designed the experience to do?)
  • other things…does access frequency go up or down over time? Does engagement time go up or down?
These analytics are more in line with what marketing people look at.
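None of this came as code in the session, but here's a rough sketch of how a couple of those numbers (time per screen, frequency of use) could be pulled out of raw app events. The events themselves are invented for illustration.

```python
from collections import defaultdict

# Hypothetical raw events: (user_id, screen, seconds_on_screen)
events = [
    ("u1", "quiz", 120), ("u1", "rules", 5), ("u2", "quiz", 300),
    ("u2", "leaderboard", 40), ("u3", "quiz", 90),
]

time_by_screen = defaultdict(float)
visits_by_user = defaultdict(int)
for user, screen, seconds in events:
    time_by_screen[screen] += seconds   # where the time actually goes
    visits_by_user[user] += 1           # frequency of use per person

total = sum(time_by_screen.values())
for screen, seconds in sorted(time_by_screen.items(), key=lambda kv: -kv[1]):
    print(f"{screen}: {seconds / total:.0%} of time consumed")
```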


Sample analytics that they got:
  • 20-25% of visits lasted between 10 and 30 minutes (this was for a mobile quiz game that took about 2 minutes to play – so people were spending far more time than the game required)
  • users returned to the app in less than one day
  • Game rules accounted for only 1% of the time consumed – this was effectively the manual/user guide – which confirmed for the developers that they had designed a good UI. 
This was data measured outside of the LMS. The digital analytics world – Google Analytics and the like – is a new era in data. 


Yahoo Web Analytics – a free tool used with advertisers to determine what % of business is coming from different channels.  At Yahoo, they are using it to determine what learners are doing – what content they go to, what pages are useful – and it allows them to understand behavior.  If there’s content out there that no one is looking at…then why?  This allows you to determine where learners go and how long they stay – and to view it by country/demographics.  Who’s using it?
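As a hypothetical aside (this is not how Yahoo Web Analytics itself works under the hood), the same kind of slicing by page and country can be done on any exported page-view log, e.g. with pandas:

```python
import pandas as pd

# Invented export of page-view data for illustration.
views = pd.DataFrame({
    "page":    ["onboarding", "onboarding", "pricing-faq", "pricing-faq", "legacy-manual"],
    "country": ["US", "DE", "US", "IN", "US"],
    "seconds": [180, 95, 240, 60, 4],
})

# Which content gets attention (and which content nobody looks at), and from where.
by_page = views.groupby("page")["seconds"].agg(["count", "sum"]).sort_values("sum", ascending=False)
by_country = views.groupby(["country", "page"])["seconds"].sum()
print(by_page)
print(by_country)
```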


Business example – health insurance provider using a performance support system:
  • 84% of sales force used the embedded learning solution DAILY
  • 6% increase in DAILY work productivity – finding the correct info, not waiting for answers, not interrupting others (measurable, observable behaviors)
  • 2.4 hours saved per week per employee
  • So that means they had more time to sell.  $454K saved based on an audience of 3,000 users
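The $454K figure rests on inputs the session didn't share (the hourly value of the time, the weeks counted, how much of the saved time gets credited), so the sketch below is a back-of-the-envelope template with placeholder values, not a reconstruction of their number.

```python
def annual_value_of_time_saved(users: int, hours_saved_per_week: float,
                               working_weeks: int, value_per_hour: float) -> float:
    """users x hours/week x weeks x $/hour - every input here is an assumption to supply yourself."""
    return users * hours_saved_per_week * working_weeks * value_per_hour

# Placeholder inputs: 3,000 users and 2.4 hours/week from the case above; 46 weeks and $30/hour are invented.
print(f"${annual_value_of_time_saved(3000, 2.4, 46, 30.0):,.0f} per year under these assumptions")
```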
How would you go about gathering data at the moment of APPLY?
