Friday, January 27, 2012

Learner Experience Design with Julie Dirksen #ASTDTK12

My live blogged notes from Julie Dirksen’s Friday session. Julie wrote the book: Design for How People Learn.

Learner experience design: the overlap between user experience design (UXD) and instructional design.

User experience design: how Amazon makes sure that customers can buy a book. As long as someone can get to the end of the process, it's a success.

With instructional design, we have a higher standard: it's not just about getting to the end of the process, it's about creating behavior change.

Make the user interface invisible to the learner. Reduce cognitive load: you don't want the learner thinking about how to get through the program; you want them spending their cognitive resources on the content.

Elements of User Experience by Jesse James Garrett describes the layers that go into user experience design.

When something doesn’t work in classroom training, you know immediately. You’ve got an immediate feedback loop and you adjust it.

So how do we get that type of feedback into the process?

Question: How do you do analysis? Send out surveys, interview SMEs, job shadowing, observation…contextual inquiry.

Job shadowing/contextual inquiry: following them around. Go in for a 2-3 day engagement to kick off a project – 1st day you talk about the problem you’re trying to solve…then “can you sit me with someone who’s doing it?” (you get a ton of information and it really doesn’t take that long).

You learn things the SME might not have told you: they print out the form to compare it, there are other reference materials kept nearby, they jot down notes they'll need three screens later. If you watch someone do the job, you find out all sorts of interesting things; if you hadn't, you would have missed some big ones.

We do a bad job matching up context of our learning environments to context for use.

The best place to study for a test is not a coffee shop but a windowless classroom with a noisy HVAC system: study in the environment where you'll be taking the test. Context matters. By having more context in our learning environments, we help with retention. (This is very well researched: if you study with a vanilla candle burning, you'll do better on the test if a vanilla candle is burning then too.)

Creating a trigger response: if you hear an angry customer, what triggers you to draw on your angry-customer training? If you see this, do this; if you see that, do that.

High-context vs. low-context training.

Metaphors that are merely cute waste the opportunity (e.g. a course on lean manufacturing built around a World Cup soccer metaphor is a bit of a waste: we want to trigger lean thinking when people are on the job, not when they're watching soccer on the weekend).

User Personas

In User Experience Design, you use PERSONAS to do your audience analysis.

“This is Alice: age, job function, description. She's been with the company for 3 years and started as a tester…she uses this at home, she says this type of thing…” It's a much more fleshed-out story of the person.

You typically have 3-4 user personas.

It’s not fiction writing.

Takes a bit of time, so you typically do it on bigger projects.

Really important when you're creating more interactive learning (not as critical if you're just doing page-turners).

(You're trying to prevent the problem of building something and THEN having people say "oh, that's not going to work…")

Create a wireframe prototype in PowerPoint; it can take as little as an hour. Get feedback on what the interaction is (roughly) going to be.

Keep it quick and dirty: if people get hung up on "it's the wrong font," you're having the wrong conversation.

The act of prototyping helps you uncover design issues…

Usability Testing

Test your designs.

Steve Krug’s books on usability testing: Don’t Make Me Think; Rocket Surgery Made Easy

What it isn't: user acceptance testing, focus groups, demos, or sending it out for feedback.

It IS watching someone use your application. You sit next to them, or you do it over WebEx.

  • Create a test plan (there’s a sample on Julie’s resource page I’ll list below).
  • Recruit 5-6 users; 1-1.5 hours each. By the 3rd user you'll start finding the big issues. You could do 3 users and then make some changes.
  • Write a script. Let them know why you're there ("I'm here to test the interface, not you"). Don't help them as they go through it (don't say "oh, you just click on that…"). Have them talk aloud as they work.
  • Then document your results.

Julie’s resource page:
