Wednesday, January 29, 2014

Join me at Learning Solutions for a Pre-Con Workshop! #LSCON

Well, actually TWO pre-con workshops!

Join me in sunny Orlando this March 17-21 for the eLearning Guild's annual Learning Solutions conference.

On Monday, March 17th (I'll be sure to wear green): The Accidental Instructional Designer
A day-long workshop where we aim to get more intentional in our practice as instructional designers who had no idea this is what we'd be when we grew up!

And for extra fun and learning, stick around on March 18 for another day of learning with me, Jane Bozarth, Tracy Parish, and Trina Rimmer: The Essential ID Companion: The Other Skills You Need to Succeed
A day of talking about the nuts and bolts of the instructional designer's job -- from vendor management to project management.

The rest of the conference looks to be a juicy one.  I'm presenting two concurrent sessions on Wednesday and Thursday.  Browse the whole conference here and register soon.

I've booked my flight and hotel already. Have you?

Monday, January 27, 2014

A Tale of Two Cases: Mobile Solutions with Phillip Neal #ASTDTK14

These are my live-blogged notes from a session on the final day of ASTD Tech Knowledge with Phillip Neal, VP of Business Need at Maestro.

One of the struggles with mobile is that organizations are used to designing the way they always have and that fits within their existing models of what training is.

Mobile is different.

(It’s like in the olden days of elearning when we all moved from CD-ROMs to the Internet. Today, people are making a lot of mistakes. Just like we all did back then!)

When deploying mobile you have to think about partnering with marketing and other divisions within your organization.

Don’t eliminate core training – that still needs to take place. Instead, we need to create pull-through. How can we pull learning content out into the field where people can continue to use it?

Phones just don't work for courses.  People might access courses on tablets, but if you want to be that mobile, you probably need to reconsider your design.

What is your pain point? Can mobile help with that? Start small.

Case Study:
Pharma company with lots of field reps.

Core training (LMS, webinar, elearning courses) vs Continuous Training (ways to extend that core training).

Client asked for one thing. Did some consulting and came up with a different solution to solve the business problem.

  • They had no mobile support
  • Limited extended training in the field after formal training
  • Decentralized content
  • No feedback loops to T&D team
  • Slow adoption and IT support
  • And did not want to tie this to their LMS (if they did that, it would have taken WAY TOO MUCH time). Decided instead to build the app native – but tie analytics into the back end for reporting.

  • Did a needs analysis – profile the audience and understand their needs. Not just sitting in the boardroom hearing what stakeholders say, but talking to the people in the field to get the context as it applies to the need.
  • Build a road map
  • T&D education
  • Designed a process workflow and governance to support the app.
  • Socializing change
  • Pilot
  • Build momentum

Case example is an iPad app – Product Knowledge for sales reps:
  • Has access to pharma products package inserts.
  • Competitive section to stack your product against the competition – so you can see your messaging against theirs. (May have gotten this as part of core training, but now you can refresh.)
  • Mini learning lessons tied to the competition -- videos and lessons.
  • Uses a CMS to manage and update content.
  • A Learning Center within the app: flash cards and lessons tied to the application (5-7 minutes in length, around a specific subject tied to that product). The challenge was figuring out how to narrow the bigger formal content down into mini-lessons, mini-quizzes, and drag-and-drops as ways to reinforce retention.
  • Create a glossary with audio tied to it (so sales rep would know how to pronounce things and not make that mistake in front of the customer).
  • Through the application, people can rate the content and give feedback to the T&D group about how effective it is. (Had to figure out a good workflow for the T&D group to handle this feedback.) Users could now search for the BEST content, and the T&D group can look at poorly rated content and improve upon it.
  • Currently, feedback on quizzes given within the app – so user knows immediately – but that data is not tracked or sent back to an LMS. Didn’t want to duplicate an LMS.
  • The goal here is that if we can prove they’re using the app in the field, then that’s the outcome.
  • My Assignments area: your coach can assign you some content (e.g., the coach is on a ride along with the reps and identifies some gaps), you can assign yourself some content.
  • Journal: for reps in the field, important to journal in the field and track what’s going in their cases and reviewing with manager and coach.

Results showed 22 sessions per user at 1.7 minutes per session. So lots of short touchpoints used as reference.

I had to duck out to catch a plane and so missed the second case study on Kellogg's.

Friday, January 24, 2014

Learning Record Stores with Tim Martin #ASTDTK14

These are my live-blogged notes from Tim Martin's session at ASTD Tech Knowledge, wrapping up today in Las Vegas.

Tin Can 101: It’s a shared way for two systems to talk about the things that a person does – unlike SCORM, which was about content in a browser.

Tin Can allows simulators, servers, mobile devices, etc. to communicate.

Tin Can is driven by a web-service-based solution. It uses a RESTful web service (and yes, I had to ask what this is: a more commonly adopted web service architecture; developers, apparently, will like this).

Two parties in xAPI/Tin Can: Activity Provider and the Learning Record Store

How might you use Tin Can today? Start small. Success comes from starting small, understanding something narrow and then expanding. Instead of trying to measure everything in your organization, think about designing experiments.  

This echoes something @reubentozman said in his session yesterday on Designing for Data (sorry – I didn’t blog it, but I did Tweet a lot for Reuben’s session)
"Think about design as an experiment. Design your experiments to capture data you want/need."

Tin Can statements have a noun (the person – identified by name, email address, employee ID, etc.), verb, object structure. “Cammy read a book.” “Cammy ran a marathon.” “Cammy presented at a conference.”
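That noun–verb–object structure maps directly onto an xAPI statement. Here's a minimal sketch in Python of what a statement like “Cammy read a book” might look like as JSON (the email address, verb URI, and activity URI below are illustrative placeholders, not values from the session):

```python
import json

# A minimal xAPI ("Tin Can") statement: actor (noun), verb, object.
# All identifiers here are illustrative placeholders.
statement = {
    "actor": {
        "name": "Cammy",
        "mbox": "mailto:cammy@example.com",  # one allowed actor identifier
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",
        "display": {"en-US": "read"},
    },
    "object": {
        "id": "http://example.com/activities/a-book",
        "objectType": "Activity",
        "definition": {"name": {"en-US": "A Book"}},
    },
}

print(json.dumps(statement, indent=2))
```

An activity provider would POST statements like this to the LRS; the LRS just collects them.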

Learning Record Store = collects statements from Tin Can.

Organizations might use multiple LRSs – each one looking at the same set of data and presenting it in different ways for different parts of the organization. Use the data in the LRS from an analytics view.

This page gives a really nice introduction to how you can approach the design process as an experiment.

So the analytics could help you identify completion rates if a course is designed with higher fidelity vs. lower fidelity (helping you test your hypothesis that the production value of your course does matter).

Or “People who do that and this are more successful than people who do that and not that…” Follow the learning path and watch who succeeds based on the paths.

Experiment Design:
  • The company wanted to look at cultural adoption of the code of conduct. They wanted people to do the right thing because they believed it, not because they were forced to. So they wanted to see the number of calls to the ethics hotline and help desk go up.
  • They had a piece of content from a rapid learning tool. The experiment compares high fidelity content (with video and simulations) to the actual text of the code of conduct. So completion statements for each modality. “Tim completed this.” – can see the depth to which Tim explored the simulation. The sim sends statements.
  • For the text version can just know if they got through all the pages of the document. So text doc sends statement.
  • Quizzes to test comprehension.
  • Pre and Post surveys to self-assess afterwards. Comes in from a survey tool. These make statements.
  • Calls to the help desk can send statements. (“Tim called hotline.” Or “Tim reported violation.”)

So it's a fixed set of statements about a fixed population.

Gave one part of the company the high fidelity content and the other part the text. We think (the hypothesis) that high fidelity content is going to help people get it better.

The high fidelity group follows through on the path and completes different tasks, surveys etc. And if the data proves that high fidelity content creates better outcomes (e.g., more calls to help desk) then the business can make different decisions.
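The cohort comparison above can be sketched with a few lines of analysis code. This is a hypothetical illustration – the statement tuples and cohort labels are invented, and a real implementation would pull statements from the LRS over its REST API:

```python
# Hypothetical results of an LRS query: (actor, verb, cohort) tuples.
# In practice these would come back as full xAPI statements.
statements = [
    ("tim", "completed", "high-fidelity"),
    ("ana", "completed", "high-fidelity"),
    ("raj", "attempted", "high-fidelity"),
    ("lee", "completed", "text"),
    ("mia", "attempted", "text"),
    ("zoe", "attempted", "text"),
]

def completion_rate(statements, cohort):
    """Fraction of a cohort's learners with a 'completed' statement."""
    learners = {actor for actor, _, c in statements if c == cohort}
    completed = {actor for actor, verb, c in statements
                 if c == cohort and verb == "completed"}
    return len(completed) / len(learners) if learners else 0.0

for cohort in ("high-fidelity", "text"):
    print(cohort, completion_rate(statements, cohort))
```

If the high-fidelity cohort's completion rate (and, downstream, hotline calls) comes out higher, that's the evidence the business can act on.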

The first meetings are asking questions like “So where do you think you can make a difference in the organization?”

Some designers just think of themselves as designing content in a tool. Instead, let’s think about designing experiments.

Thursday, January 23, 2014

Creating a Virtual Leadership Academy: A Case Study #ASTDTK14

Virtual learning event for physician leaders.

These are my live blogged notes from ASTD Tech Knowledge session with Eric Heckerson.

I missed his opening setup, so didn’t get some of the context. But the upshot is they created/designed a virtual leadership event. It was “faux live” video of a teacher/speaker, onscreen with PPT slides, plus links to chats and other resources. The moderator was live, but the speakers/videos had been pre-recorded. It used to be all in-person events -- so they were trying out virtual events, with one live event once a year. A bit of a pilot program to see if this was an approach worth continuing.

Here's the process he followed to figure out the right approach and get it developed:

When shooting video of classroom teaching, people preferred the video shot not as a talking head, but where you see the speaker in front of an audience (even if just a few people in the audience).

How they built in interaction and motivation to the live virtual events:

  • Asked the doctors to sign an “I commit” statement (I won’t multitask) – they had to send their initials to the presenter to publicly commit to paying attention
  • Answer occasional polls
  • Enter chance to win an iPad
  • Find a secret word (the secret word animates across the screen twice during the presentation – the learners were told to look for it. They needed that word to fill out a survey at the end in order to win the iPad – 30% of people couldn’t identify the secret word).
Content was around things like how to coach. Then in the follow up in person sessions, they had the chance to practice.  (Coaching, mentoring, etc.)

What they liked:

  • Convenience
  • Focused approach
  • Topics
  • Ease of use
  • The secret word
  • Length (3 focused topics in a short amount of time)

What they didn’t like:
  • Technical issues
  • Wanted more interaction
  • Wanted more diverse topics
  • Some speakers
  • Classroom vs. studio setting

Lessons Learned

  • Brief: 45 minutes broken into 15 minute blocks (Keep your content brief. Try monthly learning nuggets of 10 minutes. No one will be offended by brevity).
  • Easy – simple to access, log-on and learn.
  • Linked to action – What will they do with it.
  • Interactive – polls, chat and questions
  • Engaging – keep their attention
  • Visual – relevant graphics and video
  • Effective Speakers -- make sure they’re strong and engaging
  • Relevance

Overall, people liked virtual and appreciated the convenience – but they don't want to do away with live events entirely.

Making Interactivity Count (Session slides from #ASTDTK14)

I presented yesterday at ASTD Tech Knowledge, "Making Interactivity Count".  It was a great session, with a lot of audience participation and great conversation. For conference attendees, I believe it was recorded and will be part of the post-conference package so you can hear all the banter.

Here are my session slides. Enjoy!

Amy Jo Martin, Thursday Keynote at #ASTDTK14

These are my live blogged notes from the Thursday morning keynote at ASTD Tech Knowledge in Las Vegas.

@AmyJoMartin NY Times Best Selling Author & Founder of ‪@DigitalRoyalty.

It’s social communication, not social media.

Social Communication impacts all elements of businesses. It spreads across recruitment, development, etc.

People don’t buy what you do, they buy why you do it. The goal is to sell to people who believe what you believe.

Through social, we can explain why we do what we do.

How do we monetize?

Access > Connection > Relationship > Affinity > Influence > Conversion
(impressions don't convert, but influence does convert).

Return on Influence

Standard media impressions (cold metrics like FB and Twitter numbers) x Warm Metrics (Sentiment + Engagement) = taking into account both the reach you have and the way people feel about you.

Engagement metrics = likes, retweets, comments – the ways that consumers show they’re engaged.  (Comments take more time and are weighted more heavily than a like or retweet).
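The formula could be sketched roughly like this. The specific weights and numbers are my own illustrative assumptions, not Digital Royalty's actual model – the only constraint from the session is that comments weigh more than likes or retweets:

```python
# Return-on-influence sketch: cold reach x warm metrics.
# Weights are illustrative assumptions; comments take more effort,
# so they're weighted more heavily (per the session notes).
WEIGHTS = {"like": 1.0, "retweet": 1.0, "comment": 3.0}

def warm_score(engagement, sentiment):
    """Warm metrics = sentiment + weighted engagement counts."""
    weighted = sum(WEIGHTS[kind] * count for kind, count in engagement.items())
    return sentiment + weighted

def return_on_influence(impressions, engagement, sentiment):
    """Cold reach (impressions) scaled by how people feel and engage."""
    return impressions * warm_score(engagement, sentiment)

score = return_on_influence(
    impressions=10_000,
    engagement={"like": 50, "retweet": 20, "comment": 10},
    sentiment=0.8,  # assume sentiment normalized to [0, 1]
)
print(score)
```

The point of the multiplication is that raw impressions alone score zero influence if no one feels or does anything about them.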

Influence leads to revenue.

How do we humanize our brand? Humans connect with humans and not with logos.  It’s the individuals behind the logo that matter.

We don’t just need our leaders to be the face of social. Although that’s where it starts (leaders need to embrace and accept that this is what’s happening).

Employees should speak on behalf of the brand.

Zappos social media policy = “Be reasonable and use your best judgment.” If social is mistreated, it’s more of a personnel and HR issue.

Your employees become brand ambassadors who speak on behalf of the brand.

  • People say they don’t have enough time.
  • People say they don’t want to bridge their personal and professional worlds. (But you do have control over what you share).
  • People think it’s too high tech.

Facebook has over a billion users every month, compared to Snapchat's millions. (And yes, millennials are still using FB – just differently.)

Megatrends for social

Predictive analysis

Brand stories might start on TV (commercials) and continue on Twitter. E.g., Trident commercial airs during key shows. They can track and monitor who’s tweeting about those shows (using hashtags for those shows – like #madmen) and then target those viewers to continue the story on Twitter.

Social Media Education:

  • Decreases liability
  • Converts employees to brand ambassadors (they live the brand day to day)
  • It’s a professional development tool
  • Saves companies money (get easy data without spending money)

Digital Royalty University – founded to go out and teach the principles of social media.
  • Use a blended learning approach.
  • Kickoff with an in-person training session to set the stage for what they're about to learn and cover the very basics.
  • Then transition to online learning. First do an assessment to understand what level they’re at before they get into the curriculum. (Just because you’ve been on FB for 3 years doesn’t mean you get it).
  • Curriculum on demand, live webinars, google hangouts, videos.
  • Train the trainer model for the in-person training. Have had to do that to be scalable.
  • Curriculum on demand -- some titles: Twitter, FB 101, Social Event Activation, Social Crisis Communication, Measurement and Monetization, Community Management, Converting employees to brand ambassadors.

Created a custom learning environment for lululemon.
SHRM – progress bars to show how far you have to go, lessons are 4-10 minutes
LPGA golf players are mandated to go through social education

Innovation allergies (“that’s not the way we’ve done it,” “that will take too much time,” “what if it doesn’t work?”) – if we fail to innovate, we will face adversity.

Get people to be the humans behind the brand. Tell the story of the brand, be proactive with customer service, and connect with people.

Social Media Incentive Program

Results of Social Media Education:
  • Humanizing brands
  • Quicker customer service response time
  • Professional development
  • Increase in brand’s positive sentiments… (Promocode for free classes through end of February: ASTDTK)