Showing posts with label will thalheimer. Show all posts

Wednesday, March 10, 2010

Audio Interview with Will Thalheimer on Common Design Mistakes

I had great fun last week talking with Will Thalheimer about common design flaws in e-Learning.

In case you don’t know Will, he’s an amazing asset to the e-Learning community, providing a bridge between research and practice.

According to Will, the top three mistakes learning designers make are:

1. We’re too focused on information presentation.

2. We fail to minimize forgetting.

3. We isolate our e-Learning.

The conversation continued, and Will added a few more common mistakes in e-Learning design.

Listen to my conversation with Will Thalheimer on e-Learning design.

Friday, February 06, 2009

Building Measurement into Our Training-Development Process

Another lunchtime webinosh with Dr. Will Thalheimer. The topic today: Building Measurement into Our Training-Development Process.

Today's guest speaker is Roy Pollock, CLO of Fort Hill Company. Roy wrote Six Disciplines of Breakthrough Learning and has a new book: Getting Your Money's Worth From Training & Development: A Guide to Breakthrough Learning for Managers/Participants. (A manager can make or break a training investment; you must get managers involved in a training initiative to get the maximum return.)

Roy and Will have been teaching a one-day workshop on measurement. Roy takes the business side; Will takes the professional learning side.

Learning News

  • Microsoft just released Semblio, a new authoring tool
  • Skype 4.0 (much better video)

Question of the Week -- Economy

How has the economy affected your learning unit?

  • Hit Us Hard 31%
  • Hurt A Little 33%
  • Not Much Effect 33%
  • Helped Us 3%
  • (Is "it hasn't hurt us yet, but I'm nervous" an answer?)

Measuring Learning

3 Reasons to Measure Learning

  1. Prove Benefits
  2. Support Learning (testing is a useful learning strategy that helps people retain info)
  3. Improve Design

Outcomes Planning Wheel (Roy's model)

  1. What business need(s) will be met?
  2. What will participants do differently and better?
  3. How can we confirm these changes?
  4. What are the measures of success?

Will's Learning Landscape

  1. Learning Intervention (Learner Learns)
  2. Performance Situation (Learner Retrieves/Learner Responds/Learner Applies)
  3. Learning Outcomes (Learner Fulfillment, Learning Results)

Level 1 evaluation is usually done during the Learning Intervention.

During the Performance Situation (Learner Applies), you could delay the Level 1 smile sheet, or provide Level 3 support at that point.

Level 4 evaluation is traditionally done during Learning Outcomes.

(there were a lot more things on his chart, but I'm not that fast and it was somewhat complicated...)

Most of us are doing Level 1 completion and smile sheets. Some are doing Level 2 recall or decision making. Fewer are doing Levels 3 and 4.

Roy: "If I'm a business leader, I really need to know if training is working in order to invest $$."

The Job Aid: Building Measurement Into Your Training-Development Plan

1. Identifying training opportunities (drive performance by improving knowledge/skills). This can be done by learning and/or business leaders. L&D can be proactive and add real value.

2. Underlying business needs are clearly articulated. Let's get clear about business needs. Training is an investment; it must serve business needs.

3. What will participants do better and differently?

4. Is training the right solution? Often training is the hammer for every nail, but not every business challenge can be solved through training.

5. Besides training, what else is required to produce the desired behavior? Training is rarely the whole solution. What other support mechanisms, systems, etc. need to be put in place? Business managers need to be part of the solution in order to ensure that training sticks and that performance does improve.

6. What are the relevant metrics? Match them to business and learning imperatives. Take time to define the measurement and the outcome -- this must be done up front. If you define success up front, it's easier to be successful in your design. Business leaders want to change performance on the job.

7. Get sign-off from all stakeholders on behavior change goals, resourcing responsibilities, metrics. Talk with stakeholders every time. Don't just do a cookie cutter design.

8. Design and develop the training and the follow-through. Unless learning is taken back to the job and practiced, it won't produce results. It's all about learning transfer, which requires follow-through.

9. Make measurement part of design and development. Must be built in.

10. Pilot test training prototype and improve it.

11. Pilot test measurement instruments and improve them.

12. Deploy training; support on-the-job application.

13. Deploy measurement. Collect data.

14. Analyze data. Report results. Take action. How can we use this data to sell our story (the story of L&D) and assure continued funding?

15. Make improvements and plan future improvements, in a Six Sigma cycle.


A philosophy of measurement in one page.

You can download this job aid and others on Will's site.

Questions

A question was asked about measuring the results of social media and wikis. Roy talked about getting examples and anecdotes from learners.

When training on a new system (IT, software), how do you separate the benefits of the system from the benefits of the training? Understand what the training is supposed to measure... What can training itself produce as part of the process?

Using a blog to self-report application of something new. What's better - the learner's report or the manager's report? It depends. Sometimes the manager has no idea what the learner is doing. Sometimes the manager does know. Don't just limit to asking managers. If program is about improving customer service -- then ask customers. Who would observe this? Who would be in the best position to observe a change? Could be people. Could be a system.

You could go all out and separate the effects of training by using control groups. Give some people training and others no training. etc. But the level of measurement has gotta be proportional to the strategic value of the course and its benefits.

How do you quantify self-reports when there's no business metric? There's always the danger with self-reports that the learner exaggerates. Use Brinkerhoff's Success Case Method: if you got a result, describe it.

What is the simplest, most effective tool to measure learning? The #1 tool was Captivate (in the eLearning Guild Report on Measurement). Will says the #1 tool we should be using is our brains. It's not about the tools.

Tuesday, January 27, 2009

Are You an Order Taker?


When you design learning interventions, are you trying to create better learning or are you just taking the customer's order and delivering what they asked for?

Will Thalheimer, in the follow-up notes to his brown bag on Learning Myths, writes:
"Many of us have been trying for decades to make changes, but I think also that many of us are just doing our little part as order takers. We build learning interventions when asked."

What's your reality?

Photo Credit: mr. bartley's burger cottage waitress by irina slutsky

Friday, January 23, 2009

Learning Myths with Dr. Will Thalheimer

Another lunchtime webinosh with Dr. Will Thalheimer. The topic today: Myths the business side has about learning.

But first, the news.

Learning News

Inaugural oath flub. Justice Roberts tried to administer the oath from memory. He should have had a job aid.

Next Brown Bag will be Friday February 6th. Stay tuned for topic.

Question of the Week -- Economy

How has the economy affected your learning unit?

  • 9% Hit us Hard
  • 47% Hurt a little
  • 41% Not much effect
  • 3% Helped us

MYTHS the Business Side Has about Learning

A client asked Will to develop a course for the business side, to help improve on-the-job learning. He thought it would be good to address myths (he asked clients, posed the question on LinkedIn, and looked at books).

Everybody's got myths: business side, learners, learning professionals

He captured 140 myths and categorized them (not a scientific set of findings...).

He asked participants to submit their own myths they've come across.

Most Popular Categories of Myths (A Top Six List)

6. Managers think learning and development is a low-priority part of their role.

5. Learners know how to learn.

4. Training and instructional design require no special skills or competencies.

3. Information presentation is sufficient as a training design.

2. Training alone produces improvements in on-the-job performance.

1. Bad learning designs are thought to be good learning designs.

Other High-Importance Categories:

  • On-the-Job learning is forgotten or not utilized or not supported.
  • It's a training issue. ("We need a course on this" when it might really be a management issue).
  • Formal training has little impact.
  • Experienced workers don't need training.
  • Learning development is easy and can be shortened or short-changed.
  • Measurement myths
  • Technology is key to learning success (we must use elearning, social media, video etc. -- nothing else is effective).
  • Learning designs don't need to specifically minimize forgetting (enable remembering).
  • Content doesn't need validation. (Do we really know if we're teaching the right stuff?)
  • Particular behaviors are easy to learn.
  • Learning is always beneficial. It is never disruptive or distracting. It never misinforms.
  • Opportunity costs of learning can be ignored.
  • We have to measure ROI.
  • We don't have to measure learning.

Bad Learning Designs Thought to be Good Learning Designs (a partial list, I can only type so fast!)

  • It's good to have new employees take ALL elearning courses right before they start work
  • Employees only learn by doing
  • Reading is always bad and boring
  • Training can be just as effective if we make it short
  • Training doesn't need pre or post work
  • We should and CAN cater to learning styles
  • A six-hour online course is fine
  • Some learning media are inherently better than others
  • More info = more learning
  • People remember 10% of what they read, 20% of what they see
  • Most communication is by body language
  • We need more exciting visuals to grab attention
  • Immediate feedback is always better

What Can We Do About It?

Given that the Business side holds some myths as self-evident, what can we as learning professionals do about it?

55% responded that these myths cause great damage to learning and development.

The business side doesn't understand what we do and doesn't see the value add of learning and development.

From participants:

  • We can be mythbusters
  • Gently guide and present the right solution when presented with the wrong one
  • Need to discuss learning models and theories when appropriate (educate our clients)
  • Have proof and case studies of good design
  • Stick to the truths we know and respond to the business side tactfully (L&D is often seen as arrogant)
  • The best leaders DO understand the value
  • Provide real evidence of success
  • Help management solve problems, don't just do a workshop

Help people understand how learning works.

Learning Intervention --> Performance Situation --> Learning Outcomes

My notes from other Webinoshes in this series:

And don't forget:

Will Thalheimer: The Learning Show: Don't Forget Forgetting

Update: Here are Will Thalheimer's notes on the session.


Friday, November 07, 2008

Webinosh with Dr. Will Thalheimer: Smile Sheets

I sat in on a great lunchtime webinosh with Dr. Will Thalheimer on smile sheets.

My live blogged notes were beautiful and then poof -- my machine crashed and my notes are gone. The sorrow.

(Please oh please can I throw this laptop out the window now?)

I won't attempt to recreate them -- don't have the time.

But it was great, and I highly recommend you attend his next session on context.

Dr. Thalheimer's focus is on bridging the gap between research and practice. That's the kind of theory I can sink my fingers into.

Bottom line on smile sheets:

Favorable returns on smile sheets have little correlation with the overall value of the learning experience.

Dr. Thalheimer talked about ways he's been trying to make smile sheets more effective so that they can be used to tell us something valuable about the learning experience.

Look for a sample smile sheet on Dr. Thalheimer's site (type smile sheet in the search field).

If you're lucky enough to be going to DevLearn next week, Dr. Thalheimer will be presenting. Make a point of sitting in.


Wednesday, February 28, 2007

Confessions of an Instructional Designer

I failed Will Thalheimer's Learning Research Quiz. I was mortified. I've been doing this for over ten years and I did terribly. How can I even call myself an instructional designer?

It turns out that I'm not alone. Will just published the results of the Learning Research Quiz (2002-2007).

The 32% average score---and the stubborn lack of improvement regardless of experience, education, and age---suggests that most people in the learning-and-performance field are unprepared for roles as designers of learning, at least as far as their ability to apply knowledge of learning research.

It was a hard quiz. I took my time and thought carefully about the answers. And I probably only got about 32% (although I didn't track my score).

What I have done -- and what I would recommend everyone do -- is all of the suggested follow-up. I took the quiz then immediately reviewed the feedback and results. I scheduled a follow-up review for myself a few days later in Outlook (repetition, spacing). And then another follow-up a few weeks after that.
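The expanding review schedule described above (a few days out, then a few weeks out) can be sketched in a few lines of Python. This is a minimal illustration, not anything from the quiz or from Will's research; the specific gaps of 3 days and 21 days are my own assumed stand-ins for "a few days" and "a few weeks."

```python
from datetime import date, timedelta

def review_schedule(start, intervals_days=(3, 21)):
    """Return follow-up review dates spaced out from a start date.

    The default gaps (3 days, then 3 more weeks) are illustrative
    stand-ins for the expanding-interval idea, not a validated
    spacing algorithm.
    """
    dates = []
    day = start
    for gap in intervals_days:
        day = day + timedelta(days=gap)  # each review builds on the previous date
        dates.append(day)
    return dates

# Take the quiz on March 1st; review on March 4th, then again on March 25th.
print(review_schedule(date(2007, 3, 1)))
```

You could just as easily feed the resulting dates into Outlook reminders, which is effectively what the manual scheduling above does.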

What I've found is that some of the information has actually stuck. Meaning, I think I done learned me something.

What exactly? Repetition, spacing, prequestions/pretesting, learning objectives/performance objectives, relevance, delayed feedback, etc. The obvious stuff...Instructional Design 101.

Now -- if I can just get back to actually doing instructional design from all the project managing I've been doing....