Multitasking and Making Training Stick

October 20, 2015

Multitasking is rampant today – almost everyone does it at one time or another.  I attended a workshop recently and soon after it began I noticed that at least one-third of the 100ish participants were tapping away on laptops, tablets and/or phones (yes, a few were using more than one device).  I got up and peeked around.  No one seemed to be taking notes on the lecture.  They were reading and responding to emails or texts, posting on social media, and/or surfing the internet.

Is multitasking bad?  Does it interfere with learning and retention?  In study after study, test scores for people who multitasked in learning environments were significantly lower than for those who did not, or, if scores were comparable, the multitaskers required considerably more time to achieve the same learning outcomes.  One study found that people who sat close to multitaskers in the classroom had lower levels of learning even though they themselves weren’t multitasking.  Even worse, in one study multitaskers who performed poorly on a learning assessment believed they performed as well as their non-multitasking peers.

What’s a trainer to do?  Neuroscientists tell us that most people’s maximum attention span is about 20 minutes.  As attention begins to drift the temptation to multitask increases, and the digital device(s) beckon.

A trainer can and should vary the activities every 20 minutes:  small group discussions, Q&A, application exercises, a video, and inward reflection activities are popular choices.

Most trainers ask participants to turn off phones at the beginning of training and to close out all unnecessary websites.  E-learning designers should consider adding a slide with this message at the beginning of each online module.

Consider also these strategies based on current neuroscience research to reduce participants’ tendency to multitask:

  • Tests and quizzes.  Whether printed or on a slide, response required or rhetorical, studies show that when people are tested on material, especially if they know they will be tested, they achieve higher levels of learning and retention.  Flash a few quiz questions on the screen or pass out a written quiz.  Quiz features in most e-learning software make this a snap.  Be sure to make the test reasonably challenging and base it on the learning objectives.

Let participants know in advance they will be tested, or give a “pop quiz” without advance notice.  People will soon figure out they are going to be tested periodically.  Motivation to learn increases if testing results are recorded, which makes the participant personally accountable.

Tests and quizzes need not be confined to the learning module.  Require completing an after-class test as a condition for class credit.  Or use a training transfer technology tool to deliver one question at a time (a scheduling sketch follows the list below).  The newest revision of my Training Transfer Technologies white paper has an overview of the latest tools.

  • Flip the classroom.  Do you really need to deliver the content via lecture?  All of it?  The temptation to multitask is most compelling during lectures.  Consider instead an online module, pre-reading, and/or video viewing ahead of time, followed by in-class discussion, practice, and application.  Of course a typical issue with pre-work is that some participants come to class without having done it.  Guard against this after the first time by giving a test at the beginning of class or starting class with a small group discussion that draws on the pre-work content.  Ask those who did not do well to spend their first break reviewing the material.  (Have a few extra laptops and/or tablets available if needed.)
  • Chunk content into smaller segments of 15-20 minutes.  Obviously this will not work for every class, especially if participants travel to get there, but when possible, shorter training chunks (think stand-up meeting or short online module) mean participants will pay better attention and therefore learn more in less time.
  • Tell participants about the research on multitasking and learning.  Ask them to consider how they and the organization will benefit when they achieve the outcomes of the class, and how their multitasking might interfere with their learning goals.
  • Use available tools on the technology platform. For virtual instructor-led classes, be sure to use the tools available such as raising hands. Prepare a question or two to ask an individual when the system notifies you that a participant is inattentive or has navigated to another screen.   For self-paced e-learning, review key metrics such as time on slide and navigation and consider ways to shorten/revise the content.
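On the quiz-delivery point above, the scheduling logic behind sending one question at a time is simple enough to script with whatever tool you have.  Here is a minimal Python sketch; the question bank and the delivery step are hypothetical stand-ins for your own content and email/LMS tool:

```python
import datetime
import random

# Hypothetical question bank tied to the class's learning objectives.
QUESTION_BANK = [
    "Name two strategies for reducing multitasking during training.",
    "What happens to test scores when learners multitask?",
    "How often should a trainer vary activities, and why?",
]

def build_drip_schedule(class_end, days_between=3):
    """Assign each question a send date, spaced out after the class ends."""
    questions = random.sample(QUESTION_BANK, len(QUESTION_BANK))  # shuffle once
    return [
        (class_end + datetime.timedelta(days=days_between * (i + 1)), question)
        for i, question in enumerate(questions)
    ]

# Usage: print the schedule; in practice, replace print with your email/LMS call.
for send_date, question in build_drip_schedule(datetime.date(2015, 10, 20)):
    print(send_date, "->", question)
```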

These strategies probably won’t completely eliminate multitasking during training, but chances are that the more of them you use, the more you will reduce it.

Until next time…


Big Bang’s Sheldon Shows Us How Not to Teach

August 11, 2015

I enjoy the TV sitcom The Big Bang Theory.  Apparently a lot of other people do too, because reruns seem to appear often – on several channels in different time slots.  Recently I saw a segment where Sheldon teaches – or tries to teach – Penny physics.  His teaching is a great example of how not to engage a learner, and it illustrates many other mistakes some instructors make.

The six-minute version on YouTube provides an opportunity to apply the before-during-after time periods related to transfer and to apply some of my Training Transfer Process model.

Click here to watch the excerpt of the episode.  While you’re watching, I hope you’ll think about what Sheldon should have done differently and what he could have done before and after the session with Penny that might have helped her apply her learning.  Also think about your own teaching/training… I know I recognized a few things that I have to watch and keep myself from doing!

When you’ve finished viewing, read my comments below the clip about how Sheldon could’ve applied aspects of my Training Transfer Process.



A Retention Aid – Interleaving

July 14, 2015

I recently went on a hiking vacation in Utah with a group of like-minded people.  Our guide was a professional wildlife conservationist and geologist.  As we hiked along trails, he pointed out many different species of birds and trees as well as geologic formations.  This information was interesting but a lot to take in as I concentrated on breathing and avoiding boulders on the steep paths.  When we encountered something we had seen previously, he would ask us if we could recall the name or something about it.  Sometimes he would also use the opportunity to ask us a few questions about other things we had seen.

By the end of the trip I was able to identify 5 different types of pine trees, 3 types of hawks, and many different desert mountain flowers and plants.  Pretty cool… considering that my goal was to hike and enjoy the scenery and I really didn’t care whether or not I learned the names of the flora and fauna.

Why was I able to have such good recall?  Our guide, a former teacher, used a technique called “interleaving.”  This teaching/learning technique involves mixing up recall and practice nonsequentially.  It is the opposite of “block practice,” where lesson, practice, and recall for one topic are done all at once.  Think about it this way:  if you want to teach 3 learning points, A, B, and C, a block practice session would look something like this:  AAABBBCCC.  An interleaved practice session would look something like this:  ACBABCBAC (randomized).
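To make the contrast concrete, here is a minimal Python sketch that generates both kinds of practice sequences; the learning points A, B, and C are placeholders for your own content:

```python
import random

def block_practice(points, reps=3):
    """Block practice: all repetitions of one point, then the next (AAABBBCCC)."""
    return [p for p in points for _ in range(reps)]

def interleaved_practice(points, reps=3):
    """Interleaved practice: the same repetitions, mixed up nonsequentially."""
    sequence = block_practice(points, reps)
    random.shuffle(sequence)  # a simple shuffle; it may leave occasional repeats
    return sequence

print("".join(block_practice(["A", "B", "C"])))        # AAABBBCCC
print("".join(interleaved_practice(["A", "B", "C"])))  # e.g., ACBABCBAC
```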

Numerous studies support the effectiveness of interleaving versus block practice for achieving long-term learning and retention.  (Interestingly, in many studies people who used interleaved practice performed worse than their counterparts using block practice during the practice session itself, but performed better when tested at a later date.)

Here are some suggestions for how to use interleaving in your training.  Detailed descriptions of how organizations such as Farmers Insurance and Jiffy Lube use it in their training are in the book Make It Stick (good title!) by Brown, Roediger and McDaniel.

  • Begin your teaching – whether live, virtual, synchronous, or asynchronous – with a story or two that illustrates the application of several learning points.
  • As you present each learning point, relate it back to the story and point out how it was used, could have been used, and so on.
  • As the learning module(s) progresses and you go on to other topics, randomly ask questions about one or more of the previously-presented learning points and relate back to the application story. 
  • After the learning event (class or elearning module) is finished, continue randomly quizzing via occasional emails.  For more information about after-training technologies to do this, see my white paper Training Transfer Technologies.
  • Consider using hard copy aids such as flash cards (digital versions could be created in PowerPoint) to randomize quizzing.

While my hiking guide used this technique intuitively (I asked him), many of us should do it intentionally.  I have made some adjustments to classes I have designed to accommodate this technique.  I’d like to hear from other trainers who use or are trying interleaving.


Note:  The term interleaving is also used in computer disk storage, where it refers to rearranging blocks of digital information on storage disks to improve speed of access.

Until next time…


Action Plans – Do they make training stick?

April 20, 2015

Action plans have long been thought to support training transfer and to make the training stick.  Actually, the truth is… yes, no, and sometimes.

End-of-training goals signal what is important, they provide a sense of direction, and they can be a focus for evaluation and feedback on performance of the task(s).  But simply asking participants to jot down their goals, or what they plan to do to apply their learning, will have limited effectiveness.  Setting goals so difficult that they are unlikely to be reached is similarly limited.  However, goals that focus on specific behavior outcomes can be very effective in producing on-the-job application of skills learned in training, particularly for complex skills.  Also, when feedback is solicited from participants’ colleagues and participants know this will happen, action plans spring to life.

Consider applying the following evidence-based enhancements to your current action planning segments.  You will significantly increase how well participants transfer their learning.  For additional ideas on action planning see my earlier Sticky Note.

  • Develop specific behavior outcomes that identify skills to be applied.  These behavior outcomes might replace the course objectives or they can be used to enhance them and make them more “actionable”.   For related information on sticky class objectives, see my earlier Sticky Note.

Here are a few examples of behavior outcomes (there should be between 3 and 15 of them, depending of course on the length of the class):  “Gives employees feedback on a regular basis” (supervisor training), “Calls the customer by name” (customer service training), “Uses discussion boards for current issues and solutions” (software or help desk).

  • As part of the end-of-class action planning segment, provide participants with a list of the behavior outcomes that they should expect to apply.  Either suggest that they choose among these for inclusion in their own action plans, or provide the action plan pre-completed with the behavior outcomes listed.  Numerous studies have found that when action plans have specific behavior outcomes, participants are more likely to do them.
  • Ask participants to supply the names/contact info for at least 2 coworkers, subordinates, or manager(s) they work with who will be asked later for feedback.
  • Survey these coworkers, subordinates, or managers 3 months after the training, using the behavior outcomes previously developed.  Assure them the results will be confidential.  Consider whether or not to share the feedback from the survey with the participant, and if so, how to protect the confidentiality of those who responded, particularly since there will be a small number of respondents.

Online survey tools are helpful for providing anonymous feedback but may not be enough to assure confidentiality with a small number of respondents (a simple suppression rule is sketched after the list below).  Note:  whether or not these individuals are actually surveyed may matter less than whether the participants believe they will be surveyed.

The purpose here is to make participants accountable for applying the skills, but of course the information collected might be used for evaluation of the class or for follow-up training.

  • Adapt for elearning in the following ways:

o   List the behavior outcomes close to the end of the last module in the training.

o   Provide an action plan form with behavior outcomes listed.  Direct the trainee to fill in a date or milestone, and suggest that they print it for reference.

o   Set up your LMS to send the survey mentioned above to participants’ managers (who are generally already loaded into the LMS).

o   Consider developing an interactive action planning tool for participants.
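On the confidentiality concern raised in the survey step above, one common safeguard is to suppress results when too few people respond.  Here is a minimal Python sketch, assuming each behavior outcome collects simple 1-5 ratings; the threshold of 3 is an illustrative choice, not a standard:

```python
MIN_RESPONDENTS = 3  # illustrative cutoff; below this, answers are too traceable

def summarize_feedback(ratings):
    """Return the average rating, or None when reporting would risk anonymity."""
    if len(ratings) < MIN_RESPONDENTS:
        return None  # suppress: report "not enough respondents" instead
    return sum(ratings) / len(ratings)

print(summarize_feedback([4, 5, 3]))  # 4.0
print(summarize_feedback([5, 2]))     # None; only two respondents
```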

Until next time…


How do you know training has “stuck”?

February 27, 2015

Assessing whether learning is really being used on the job is challenging for many trainers.  End-of-class level 2 evaluation is easy enough to do:  the participant demonstrates skill or knowledge acquisition at the end of the training.  We might also assess pre-training  learning and compare.  But determining  whether skills and knowledge are actually being used on the job is another matter entirely.

The most straightforward way to do this is to simply ask participants if they are using what they learned and to what extent.  But are we actually getting an accurate measure of whether a participant is using their learning?  In a previous Sticky Note I mentioned neuroscience research that points out people often think they know, have experienced, or are experiencing something when in fact they have not.  This illusion calls into question the practice of assessing on-the-job use of skills learned in training by asking participants if they are using them.  The key issues are:

  • Are participants actually aware of how and in what ways they are using what they learned in a training class?
  • Can participants distinguish between what they are applying from a particular class and what they are doing for other reasons, such as previous training, intuition, or trial and error?
  • How much are these self-reports tempered with wanting to provide the appropriate response, to please the trainer, the boss, the organization?
  • Studies on these types of self-reports indicate they are unreliable.  One study found that self-reports were 35% more positive than reports by participants’ managers.

What’s a trainer to do?  What are more accurate ways to test whether learning is being applied?

  • In post-training reaction level 1 evaluation, ask participants about their “intention to transfer”, that is, whether they plan to use what they have learned, and how they plan to use it.  Studies show there is a strong link between intention to transfer and later actual transfer.
  • After training at a point in time when participants should have had an opportunity to use the training, ask their managers (a quick survey, or more detailed focus group) whether their employee is using the skills learned in training and how they are using them.  Six weeks and three months post-training are popular times.
  • To reduce the tendency to give the desired positive response, ask managers and participants specific behavior-based questions using a Behavior Observation Scale (BOS).  Each item describes a specific behavior linked to a class objective and is rated on a 5-point scale (1 = Almost Never, 5 = Almost Always).  For example, for a class on coaching, one behavior observation scale item is “Provides feedback regularly.”  A BOS item for a sales training class: “Reviews individual productivity results with manager.”  (A scoring sketch follows this list.)
  • Instead of – or in addition to – asking participants and their managers, poll a select group of individuals, perhaps one level above participants’ managers, who are in a position to see many participants’ on-the-job behavior.  One study paired an HR rep with each of these individuals, and the role of the HR rep was to assist the manager with completing the Behavior Observation Scales.
  • Instead of assessing the learning application for every participant, assess a sample of participants.  In general, 30% of the total number of participants should provide a reasonably accurate representation of all trainees in a particular training program.
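If the BOS ratings are collected electronically, both the 30% sampling and the item scoring are easy to script.  A minimal Python sketch, with all participant names and ratings invented for illustration:

```python
import random

# Hypothetical BOS ratings: participant -> {item: rating on the 1-5 scale}.
bos_ratings = {
    "participant_1": {"Provides feedback regularly": 4, "Reviews results with manager": 3},
    "participant_2": {"Provides feedback regularly": 5, "Reviews results with manager": 4},
    "participant_3": {"Provides feedback regularly": 2, "Reviews results with manager": 3},
    "participant_4": {"Provides feedback regularly": 4, "Reviews results with manager": 5},
}

def sample_participants(participants, fraction=0.30):
    """Assess a sample (about 30%) rather than every participant."""
    k = max(1, round(len(participants) * fraction))
    return random.sample(sorted(participants), k)

def item_averages(ratings, sample):
    """Average each BOS item (1 = Almost Never ... 5 = Almost Always) over the sample."""
    items = next(iter(ratings.values()))
    return {item: sum(ratings[p][item] for p in sample) / len(sample) for item in items}

sample = sample_participants(bos_ratings)
print(sample, item_averages(bos_ratings, sample))
```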

Don’t rely on your “gut feelings” about whether trainees are using what they learn in training.  Use popular, free survey software or features of your LMS to find out how much of your training is sticking!


Until Next Time…



PS:  Join me at ATD (formerly ASTD) International Conference and Expo May 17-20.  I will be presenting on Evidence-Based Techniques for Training Transfer.  

Sticky Objectives

December 11, 2014

I’ve been working recently on converting training/learning objectives to “sticky objectives,” and I’d like to share a few thoughts with you. As most of us know, good instructional objectives are essential for effective training and evaluation. A couple of points about instructional objectives before I continue:

  • A good instructional objective should include 3 things: 1) the performance (what the trainee should be able to do after the training), 2) the condition (when, e.g., “when conducting a performance evaluation”), and 3) the criteria (how well). If the objective does not contain all three of these elements, it can’t effectively indicate the desired result of the training.
  • Most instructional objectives are preceded by this phrase: “at the conclusion of the training, the participant will be able to:”
  • It may not be useful to share the instructional objectives with the trainees. It may be more helpful to develop instructional objectives for use in the design process with trainers, program sponsors, and other “insiders,” and to write and publish objectives that focus on specific job performance for use with trainees in the learning events, such as “conduct an effective performance review,” “use the 6 key functions in Excel,” and “use the Situational Leadership model to identify the appropriate mix of direction and support for an employee.” In my experience, many trainees are intimidated by or just don’t relate well to instructional objectives (some trainers too, but that’s a different issue).

Now that I’ve commented on instructional objectives, I’d like you to consider this:

  • What is the purpose or point of the training and of the training objectives? Is it to demonstrate knowledge or a skill or possibly even an attitude change at the end of the training? In most cases, the answer to this question is “no.” The purpose of most training is for trainees to apply certain knowledge, skills or attitudes to their jobs so that their performance is more effective in specific, targeted ways.

So if the purpose of the training is for trainees to use certain skills in their job performance,

  • The objectives should be written to describe what trainees should be able to do, on the job, after the training. If we look at instructional objectives from this perspective, the performance, conditions, and criteria may not change. What will change, though, is the statement that precedes the objectives: instead of “at the conclusion of the training, the participants will be able to…” substitute “in their job performance, the participants will….” Remember, our focus should be on what they will do, not what they can or will be able to do. Making these simple adjustments in the wording of instructional objectives – and in the more general objectives shared with trainees – can keep trainer and trainee focused on the true goal of the training: on-the-job performance.

Until Next Time…

The Illusion of Learning

September 19, 2014

Are your participants guilty of the illusion of learning?

I came across an interesting concept as I was reviewing recent studies in neuroscience related to Making Training Stick®. It’s called the Illusion of Knowing and refers to people’s errors in perception. For example, we seem to be hearing more in the news about individuals who have been convicted and incarcerated for crimes that later DNA testing proves they didn’t commit. Eyewitness accounts identified the individual as the perpetrator, only to be proven wrong years later. These memory distortions arise out of our discomfort with ambiguity and our desire to “have the right answer.”

So in a level 1 evaluation when participants are asked “what did you learn?” their responses may well be shaped more by this illusion of what they would like to have learned and what they know they should have learned than by what they have actually learned.

Another illusion is called Imagination Inflation: people who are asked to imagine an event will sometimes begin to believe, when asked about it later, that the event actually occurred. For example, if a level 3 post-training survey asks participants how they are applying their learning, they may believe that they have applied it or are applying it when they actually have not or are not. This is particularly troublesome with complex learning of tasks/behaviors where application is less than straightforward, such as soft skills: customer service, management, communication.

How to overcome these illusions of learning? Feedback. Studies show that when students have an opportunity to reflect on their demonstration of learning and on their performance, their perceptions of their learning and performance become more accurate. This is called metacognition: the awareness and understanding of one’s own thought processes and learning. If you are one of the many individuals who, based on the notion that adults do not want to be tested, have shied away from testing (as I have), I urge you to reconsider. Testing and then providing participants with correct answers is one way to provide feedback and reduce their Illusions of Knowing regarding what they have learned. This of course also serves as level 2 evaluation of learning.

Another feedback tool is the Behavior Observation Scale (BOS), which can reduce participants’ possible illusions about what they are applying or have applied. During or prior to the design phase, develop a set of behaviors that demonstrate successful application of the skill(s). This can be done by the designer and/or other interested stakeholders in the training. Provide a 5-point Likert scale for each behavior. The BOS should then be used by the participant and/or their manager to assess on-the-job application. This of course may double as a level 3 evaluation.

Here’s what you can do to reduce illusions of learning and improve course feedback from level 1, 2, and 3 evaluations:

  • Review your end-of-training level 1 evaluation form and eliminate questions that ask participants what they learned. Consider instead asking a question or two about how they intend to apply their learning. (Research studies have found a strong relationship between reports of intention to apply and actual application.)
  • Consider adding testing to all training. If you currently use tests, incorporate opportunities for participants to review their answers vis-à-vis the correct answers.
  • Develop Behavior Observation Scales for complex learning. Distribute these to participants and their managers post-training, when they are most likely to have had an opportunity to apply their learning.
  • Consider ways to “motivate” them to respond. (My favorite is to withhold credit for the class until post-training feedback has been received.)

Until next time…


Letter to Self – Easy closing activity that Makes Training Stick

June 30, 2014

One of the first closing activities I used was called a “letter to myself.”  At the end of training, participants were asked to reflect on what they learned and how they were going to apply it, and to write a letter to themselves, complete with a self-addressed envelope.  Then I picked up the envelopes, stashed them away for a few weeks, and mailed them back to their authors.  I got a lot of positive feedback from people.  One time when I took a workshop I participated in this activity and experienced first-hand how energizing and motivating it was to receive that letter with the reminders and encouragements I had written.


A recent experimental study has demonstrated what I’ve always believed:  that this activity is more than a “nice to do”.  Trainees in the study who participated in this type of activity had higher levels of self-efficacy (the belief that they could apply the skills they had learned) and they demonstrated application of their training.


Researchers Amanda Shantz and Gary Latham did a study on what they termed “written self-guidance.”  Half of the trainees in a soft skills training program completed a “letter to self” type of activity in which they reflected on what they had learned and how they planned to apply it.  Those who participated in the activity demonstrated significantly higher levels of application of the training than those who did not.  This activity is not the same as having participants write a reflection paper, develop an action plan, or write a class summary, because it requires trainees to write motivational letters directed to the self, and at a later point in time the participants receive a letter written by themselves, to themselves.


Here are some specific guidelines for using this activity in training you facilitate, develop, or administer:

  • After a summary of the training content, ask participants to write a letter to themselves  – “Dear Self” – in which they outline their key learnings and how they plan to apply what they learned.
  • In the instructions, stress that they are the only ones who will see their letters – they will seal them before they leave the class.
  • Ask them not to pay attention to or be concerned about grammar or spelling.
  • Encourage participants to include self-affirming comments that are relevant for them.  Provide examples.
  • As they finish, pass out blank mailing envelopes and ask them to write their full mailing address (interoffice, home address, etc.).
  • Allow approximately 15 minutes for this activity.  At the end of the time, collect the letters.
  • Store them safely (remember, they’re confidential) in your office and set a tickler on your calendar to mail them in 3 weeks.  (The experiment used a 5-week interval, but I’ve found that 3 weeks is better in today’s fast-paced work environments.)
  • Mail them at the appointed time.

This activity can be adapted for live virtual or elearning in the following way:

  • Ask participants to open their email system and type an email to themselves.  Use the same instructions as above.
  • Then ask them to save this email as a draft.
  • Mark your calendar, and 3 weeks later get in touch with each participant (email, text, etc.) and ask them to open their drafts folder and read their letter to themselves.  (The date arithmetic is sketched below.)
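If you run this activity often, even the tickler can be scripted.  A minimal Python sketch of the three-week follow-up date; how you actually record or send the reminder is up to you:

```python
import datetime

FOLLOW_UP = datetime.timedelta(weeks=3)  # the study used 5 weeks; 3 works better for me

def mail_back_date(class_date):
    """When to mail the letters, or to nudge participants to open their drafts."""
    return class_date + FOLLOW_UP

print(mail_back_date(datetime.date(2014, 6, 30)))  # 2014-07-21
```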

Click here to download a copy of the article that describes the details of this study.


I hope to see many of you at Training Mag’s Online Learning Conference Sept 22-25!

Opportunity to Perform

May 1, 2014

I was forced to take an online class recently to learn skills that I will not need to use for at least three months.  A gun wasn’t placed at my head so maybe “forced” is a bit strong, but I certainly felt forced.  The situation was this:  a university for which I teach occasional online classes is in the process of changing over to a new technology platform.  The change-over schedule was announced and the area where I teach will be one of the last to implement the new technology, which will be several months away.  However, all instructors must take the five-day new technology orientation class now. Will I remember what I’ve learned when it is time to use it?  I doubt it.  Fortunately a lot of the instruction is via text documents that can be saved, so I have tucked them away in a digital file for future reference when needed.

Research has consistently shown that transfer is limited when trainees do not have the opportunity to perform, sometimes known as opportunity to practice, newly acquired skills.  In many studies, the opportunity to perform was rated as the highest form of support for learners, and the lack of opportunity to use training was rated as the biggest obstacle to transfer of the training.

Here are some suggestions and reminders to help support trainees’ opportunity to practice and perform newly learned skills:

  • Before training, communicate with trainees’ managers and ask them to plan time and assignments for when training is completed, so trainees can immediately try out their learning.  This communication can be auto-sent to managers at the same time class registration is confirmed.
  • During training, provide opportunities throughout the training – whether live training or self-paced elearning – for the trainee to plan when and how they will begin using what they are learning.  Encourage them to discuss this with their manager.
  • After training, send follow-up reinforcement messages to trainees reminding them to find opportunities to practice their new skills.
  • For certain types of training such as management or compliance training, follow up after training by emailing short “what if” scenarios and case studies and asking or requiring participants to respond.  Note:  several new training transfer technologies are well suited for such follow-up.  My Training Transfer Technologies white paper provides an overview of them.  Request your free copy.
  • Prepare participants who aren’t able to practice or perform right away.  Provide a manual, short documents, and/or web-based support tools for them to refer to when they have the opportunity and the need to use what they have learned.
  • Set up social media communities to provide support and learning at the moment of need.  Send periodic reminders to visit the communities to give and get assistance and advice.

Often there is not a choice as to when training is offered and when new skills can be practiced.  We should all do our best to reduce the time between training and performance.  And when it’s not possible to narrow the gap, provide support tools to bridge it.

Until next time…


Let Learning Sink In – To Make it Stick

April 1, 2014

I’ve found some cognitive psychology research that I think you’ll find interesting because it can be applied to making learning stick.

  • Have you ever wondered why new learning needs to “sink in”?
  • Did you know that students who study right before an exam don’t do as well as those who study at least several days ahead of time?
  • Would you like some ideas for maximizing training retention and transfer?

Read on –

Scientists have known for over a century that we have two types of memory, short-term and long-term, located in two different parts of the brain.  Short-term memory is converted into the more stable long-term memory, which can then be drawn upon to solve problems and make decisions.

The process works this way.  Information is first gathered in the learning event through the senses and is processed in the brain’s short-term memory, where it is related to existing information already stored in long-term memory.  From here the new information is transferred to long-term memory storage, where it becomes encoded into neuron patterns.  New synapses (the junctions between nerve endings) are then formed through protein synthesis.  This process is related to long-term potentiation and was studied extensively by Nobel-winning scientist Eric Kandel.  The important point here is that it takes a few weeks for the protein synthesis and new neuron patterns to form.

This is why we often say something new needs to “sink in”, and why students who cram right before a test don’t do as well as students who study a few days ahead or even the night before.  And this is why we need to provide spaced learning, repetition, and/or practice to help learners retain and apply the learning whether it is face-to-face, live virtual, e-learning, or a combination of these.

Here are a few suggestions to help learning sink in and stand a better chance of being applied:

  • Divide the learning into at least two events, spaced at least 3-4 weeks apart.  For ease in scheduling, consider a live virtual (“webinar”) or e-learning format for one or more of these learning events.  The potentiation research indicates that the longer the spacing, the better the retention.  This of course needs to be balanced against other things that compete for memory space – if the learning is spaced too far apart, the initial learning may be completely lost.  (A scheduling sketch follows this list.)
  • Do not test at the end of the class.  Instead use the test to follow up, no sooner than two weeks after the end of the learning event.  The research clearly shows that allowing some time before testing will result in better learning and retention.  And remember, the goal is not to pass the test… the goal is to retain the learning so it can be used. 
  • Require participation in learning communities – discussion boards, blogs, communities of practice – as part of the class.  Don’t award credit for the class until a required number of posts are made in the community.  This serves to reinforce the initial learning, provide application ideas, and aid the brain in connecting new, short-term memory learning with long-term prior learning so that it can be used.
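Taken together, the spacing guidelines above amount to a simple schedule.  A minimal Python sketch of the date arithmetic; the intervals mirror the suggestions in this post, not hard rules:

```python
import datetime

def spaced_schedule(start, session_gap_weeks=3, test_delay_weeks=2):
    """Second session 3-4 weeks after the first; test no sooner than 2 weeks later."""
    session_two = start + datetime.timedelta(weeks=session_gap_weeks)
    follow_up_test = session_two + datetime.timedelta(weeks=test_delay_weeks)
    return {"session 1": start, "session 2": session_two, "follow-up test": follow_up_test}

for event, date in spaced_schedule(datetime.date(2014, 4, 1)).items():
    print(event, date)
```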

And the next time you need to remember something yourself, keep in mind that your brain needs time for protein synthesis and forming new neuron patterns.  Let some time pass and then revisit the information.

Until next time….





P.S. Follow me on Twitter: @StickyTraining

