Andy Whitaker, Consultant, Rustici Software @tincandy
How can we implement Tin Can (xAPI -- "experience API") in our organizations to get real benefit?
Matching learning to performance with Tin Can
Most "learning activity" is tracked in orgs in an LMS using SCORM.
But if you try to understand how learning is impacting the business, you might be spinning your wheels, cobbling together reports from different systems.
Questions to ask of your org's tracking abilities:
- Are you able to track (in an effective and efficient way) all learning activities, not just those in your LMS?
- Are you able to efficiently and effectively evaluate your learning programs?
- Are you able to understand a learning program's impact on the business?
We're not all the way there, but when we start leveraging this new technology, we can start answering YES more often.
An API is "a shared language for two systems to talk about the things that a person does." An API helps different apps talk to each other (e.g., Yelp uses Google Maps to find your restaurant using Google's API).
With Tin Can, we talk about an Activity Provider and an LRS (Learning Record Store).
Most orgs aren't currently using tech that supports xAPI -- there's work to do. The hope is in a year, two years, more vendors/products will have adopted xAPI.
A Tin Can Statement = Noun...Verb...Object (and some other stuff). See the sketch after the examples below for what one looks like in practice.
Examples:
- "Andy read employee handbook."
- "Andy completed new hire orientation."
- "Tim reviewed Andy."
- "Andy attended TK 2015."
Log in to your employee intranet and download the handbook -- at that point, a statement can be sent to the LRS saying that this activity has been completed. You have to have some kind of tech in place to capture the event when it happens.
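That "tech in place" usually comes down to an HTTP call: whatever system sees the event POSTs the statement to the LRS's /statements resource. A minimal sketch, assuming a fetch-capable runtime (browser or Node 18+) and placeholder endpoint and credentials:

```typescript
// POST a statement to the LRS. Endpoint and credentials are placeholders;
// the /statements resource and the version header are defined by the xAPI spec.
const LRS_ENDPOINT = "https://lrs.example.com/xapi"; // hypothetical LRS
const AUTH = "Basic " + btoa("client-key:client-secret"); // hypothetical credentials

async function sendStatement(statement: object): Promise<string[]> {
  const res = await fetch(`${LRS_ENDPOINT}/statements`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Experience-API-Version": "1.0.1",
      Authorization: AUTH,
    },
    body: JSON.stringify(statement),
  });
  if (!res.ok) throw new Error(`LRS rejected statement: ${res.status}`);
  return res.json(); // the LRS responds with the stored statement ID(s)
}

// e.g. await sendStatement(statement);  // "statement" from the sketch above
```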
Float's Tappestry app lets an individual log their own learning events: "Andy attended TK 2015." -- that's a real world activity that can be tracked.
[Cammy's note on this -- there's no verification of this event, however, so you could make up whatever you feel like and recording that you went to the event doesn't mean you actually learned anything or even paid any attention. Perhaps you were too busy gambling!]
So you can funnel LOTS of data into a learning record store. This is the beginning of connecting learning to performance. First you need all that data.
- Assessments
- Performance Observations
- CRM
- HRIS
- Talent Management
- Point of Sale
- Customer Support
- Surveys
You want all this data coming into one place... and then you want to be able to query it back out (see the sketch below).
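Once the statements are in one place, the same LRS interface lets you pull them back out for analysis, no matter which system sent them. Here is a sketch of paging through the statements resource, reusing the LRS_ENDPOINT and AUTH placeholders from the earlier snippet; the query parameters and the "more" paging link are part of the xAPI spec.

```typescript
// Page through GET /statements and collect everything that matches the query.
// Reuses LRS_ENDPOINT and AUTH from the previous sketch.
interface StatementResult {
  statements: any[];
  more?: string; // relative URL of the next page, per the xAPI spec
}

async function fetchStatements(params: Record<string, string>): Promise<any[]> {
  const all: any[] = [];
  let url = `${LRS_ENDPOINT}/statements?` + new URLSearchParams(params).toString();
  while (url) {
    const res = await fetch(url, {
      headers: { "X-Experience-API-Version": "1.0.1", Authorization: AUTH },
    });
    const page: StatementResult = await res.json();
    all.push(...page.statements);
    url = page.more ? new URL(page.more, LRS_ENDPOINT).toString() : "";
  }
  return all;
}

// e.g. everything reported since the start of the year, from any system:
// const recent = await fetchStatements({ since: "2015-01-01T00:00:00Z" });
```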
What does the New L&D Ecosystem Do? (enabled by Tin Can and an LRS)
[Cammy's sidebar -- "ecosystem" is totally the new industry buzzword. It's everywhere and everyone's using it a little bit differently...]
It makes your learners love you. They get more credit for their learning activities (including things like performance observations). You don't have to funnel everything through the LMS. It's more modern.
It helps you do your jobs better. A more holistic view of the learner. ("This group of successful sales people have all read this book." -- maybe you can find trends that really matter.) Program evaluation.
Success with implementing Tin Can comes from starting small. Understand something narrow and then expand on that. Start with questions in mind. What is the end goal you really want to understand? (e.g., "Does this particular learning program reduce support tickets?")
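For a question like the support-ticket one, the analysis can start very simply: pull completion statements for the program and compare ticket counts for people who completed it versus those who didn't. A sketch reusing the fetchStatements() helper above; the course ID and the ticketsByEmail export from the support system are hypothetical, and this shows the shape of the analysis rather than a rigorous evaluation design.

```typescript
// Compare average support-ticket counts for employees who completed a program
// versus those who didn't. courseId and ticketsByEmail are hypothetical inputs;
// reuses fetchStatements() from the sketch above.
async function compareTicketCounts(
  courseId: string,
  ticketsByEmail: Map<string, number>, // email -> ticket count, from the support system
): Promise<{ completed: number; notCompleted: number }> {
  const completions = await fetchStatements({
    verb: "http://adlnet.gov/expapi/verbs/completed",
    activity: courseId,
  });
  const completers = new Set(
    completions.map((s) => String(s.actor?.mbox ?? "").replace("mailto:", "")),
  );

  const avg = (xs: number[]) =>
    xs.length ? xs.reduce((a, b) => a + b, 0) / xs.length : 0;

  const done: number[] = [];
  const notDone: number[] = [];
  for (const [email, tickets] of ticketsByEmail) {
    (completers.has(email) ? done : notDone).push(tickets);
  }
  return { completed: avg(done), notCompleted: avg(notDone) };
}
```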
Real-world examples
Example: Pandora (the bracelet company)
They wanted to improve sales of a new product through training. They had online journals (through a Drupal/WordPress application), classroom sessions, and traditional e-learning, with Cornerstone as their LMS. (They had to do some legwork to get Cornerstone to integrate with xAPI -- adopting Tin Can is on Cornerstone's schedule for summer 2015, but Pandora had to do some additional work in the meantime. Let your tech partners know that this is important to you! Customer demand will help.)
They wanted to correlate training with performance. Looking at Assessments, mystery shopper, looking at data coming in from their Point of Sale application.
So now we're looking at metrics beyond completions in Excel.
Question -- how do you make sense out of all this data? How do you make the data useful and usable?
Typically, when an LMS vendor says they support Tin Can, they're using an LRS within their LMS. The statements are sent to the LRS, and then they can choose what to do with that data within the context of the LMS. So for a while, you'll see LMSs supporting data from both SCORM and Tin Can. The LRS listens and collects info from many different systems -- "a business intelligence system for learning & dev." You could still have people going to the LMS for transcript-related reports. In these early days, the LRS is more of a learning analytics system.
Example: AT&T
Their question: What types of content best impact completion and retention?
If they confirmed the hypothesis, the company would do X. If they couldn't confirm it, they would do Y.
So this was for code of conduct training. They compared modalities: simulation vs. e-learning. Then they looked at the impact on completions, satisfaction, and retention. Data showed that the higher-fidelity content DID improve retention, etc.
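The comparison itself can be as simple as averaging assessment scores per modality out of the statements' result data. A rough sketch, again reusing fetchStatements(); the two activity IDs in the usage comment are hypothetical stand-ins for the simulation and e-learning versions.

```typescript
// Average the scaled assessment scores reported for one activity; run it once
// per modality and compare. Activity IDs in the example below are hypothetical.
async function averageScaledScore(activityId: string): Promise<number> {
  const stmts = await fetchStatements({
    verb: "http://adlnet.gov/expapi/verbs/scored",
    activity: activityId,
  });
  const scores = stmts
    .map((s) => s.result?.score?.scaled)
    .filter((x): x is number => typeof x === "number");
  return scores.length ? scores.reduce((a, b) => a + b, 0) / scores.length : 0;
}

// const simAvg = await averageScaledScore("http://example.com/activities/coc-simulation");
// const elAvg  = await averageScaledScore("http://example.com/activities/coc-elearning");
```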
Here the end result was not cost savings; it was seeing whether it's worth their time to invest in something that would have a bigger impact.
Example: NexLearn (simulation tool)
Helping a 3rd party outfit their tool to contribute statements.
Innovative and progressive technologies are starting to adopt xAPI.
The larger LMSs are slower to adopt.
Why is this?
- They haven't heard the customer demand.
- The 1.0 version only came out in April of last year -- but the spec will still change -- so the bigger techs don't want to put forth the effort to adopt it if it's just going to change. So the tech community is trying to update the spec without breaking it.
Example: NHS
Tracking formal and informal learning related to dementia care. They were looking at a lot of self-reported data. Within the LRS they have learning questionnaires. The nurse would log into the LRS and do a pre- and a post-assessment (competence questions related to dementia care).
They used a bookmarklet to report back to the LRS. They curated content from across the web that wasn't living in their system, so they could report that nurses visited those destinations and links.
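The rough shape of a bookmarklet like that: grab the current page's URL and title and send an "experienced" statement back to the LRS. This is a sketch of the idea (reusing the sendStatement() helper above), not the NHS's actual implementation.

```typescript
// What a "report this page" bookmarklet boils down to: an "experienced"
// statement whose object is the page itself. Reuses sendStatement() from above.
function reportPageVisit(learnerEmail: string): Promise<string[]> {
  return sendStatement({
    actor: { mbox: `mailto:${learnerEmail}` },
    verb: {
      id: "http://adlnet.gov/expapi/verbs/experienced",
      display: { "en-US": "experienced" },
    },
    object: {
      id: window.location.href, // the curated article being read
      definition: { name: { "en-US": document.title } },
    },
    timestamp: new Date().toISOString(),
  });
}
```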
If people participated in an activity, they could see what their confidence was after completing the assessment.
Learning path analysis through accomplishments. Kind of like badging. Gave nurses a path forward for completing the program.