Making Training Stick and Bloom’s Action Verbs

May 18, 2016

Action verbs can be a handy tool to help make your training stick.  I’ve discussed sticky objectives in a previous newsletter and in my white paper on transfer technologies.

As I suggest in these publications, “sticky objectives” should replace traditional instructional objectives for trainees and their managers. Instructional objectives remain helpful to trainers during instructional design; however, the objectives shared before and at the beginning of training should specify what participants should be able to do on the job after the training.  These “sticky objectives” signal to participants what they should do with what they are learning.

 

Transforming instructional objectives into sticky objectives usually involves just a few subtle yet specific changes in wording.  The action verbs below which are linked with Bloom’s Taxonomy** application level can help to transform your instructional objectives into sticky objectives that your participants, their managers, and senior leaders see before, during, and after the training.  Naturally, the specific wording will depend on the skills/information being learned.

Start by taking each instructional objective for your training.  Delete “at the completion of this class” if this phrase or a similar one is there.  Replace it with “In your job” or “on the job”.  Then add an action verb from the list below, and complete the sentence with a description of the skill the participant should use on the job:

For example, change “Upon completion of this training, you should be able to…” to “In your job, you should be able to identify the most appropriate leadership style for a particular employee and use it to obtain desired results.”

Note: In some cases you may want to customize some objectives for particular groups of participants.  For example, in a management training program a group of supervisors on the shop floor may have slightly different objective(s) than a group of sales supervisors.

If you start with action verbs like these, the rest of the objective will usually fall into place.

Action Verbs based on Bloom’s Taxonomy application level

  • Apply
  • Calculate
  • Compute
  • Balance…..by…..
  • Demonstrate…..by…..
  • Determine…..by…..
  • Decide
  • Employ…..to…..
  • Illustrate…..by…..
  • Identify…..and…..
  • Indicate
  • Measure…..with…..
  • Operate
  • Schedule
  • Solve…..by…..
  • Use…..to…..
  • Utilize…..to…..

Special thanks to Julie at the Association for Iowa Continuing Nursing Education Fall Conference for suggesting this topic.

Until Next Time…

          

**P.S. Where did Bloom’s Taxonomy come from?  In 1956 a team of faculty members at the University of Chicago, led by Benjamin Bloom, sought to help educators move beyond rote learning of facts.  They developed a taxonomy, or levels, of learning.  These levels of learning are frequently used in educational settings including workplace learning.  The levels are: knowledge (recall), comprehension (understand), application (use), analysis (analyze), synthesis (combine or create), and evaluation (judge or assess).  Bloom and his team also identified three domains of learning:  cognitive (thinking and evaluating), psychomotor (physical and perceptual), and affective (feelings, preferences, and values).  For more information from a variety of sources, Google Bloom’s Taxonomy.

Trump University and Making Training Stick

March 10, 2016

Donald Trump’s Trump University has been in the news lately.  Allegations are that while the organization purported to be an educational institution providing “real estate secrets” from Donald Trump, in reality it was a sales organization focused on selling products and services.  Several lawsuits are pending, and Trump’s political opponents point out that statements and claims by Trump himself about the “university” have proven to be false.  Human resource development professionals, take note:  this seems to be simply a politically charged, high-profile example of level 1 vs. level 3 evaluation.  They liked it, but could they use it, and did they?

Trump claims, according to a November 2015 article in Time magazine, that 98% of the students who took the $1500 real estate course gave it “rave ratings.”  He has copies of the end-of-class surveys students completed to back up claims that students believed they got something of value in the classes.  Attorneys for the plaintiffs, that is, students who took the class and are now suing, state that the survey results were not credible because they were not anonymous and were completed at the end of the class — level 1, in Kirkpatrick terminology.  Trump’s lawyers have pointed out that many of the students filled out favorable end-of-class questionnaires and later “changed their minds”.  They changed their minds about the value of the class when they tried to apply what was taught – level 3.

While most learning evaluations are not as high-profile as this one, it provides an opportunity to pause and take note:

  • Do you conduct level 1 end-of-class evaluations of all classes?  If you do, as 97% of workplace learning professionals do, do you use them to gauge the success of the class?  How is learning success defined in your organization?  Sandra Merwin, noted workplace learning consultant and now artist, did some research years ago that found a negative correlation between level 1 “likes” and level 3 “applies”.  In other words, individuals who did not give high “like” marks at the end of class were more likely to apply what they learned, and those who did give high “like” scores were less likely to apply it.  The explanation might be that when prior knowledge is challenged, it sets up an uncomfortable creative tension.  Just one study, but something to think about.
  • Consider surveying participants and perhaps their managers three months post-training to see how they’re applying what they learned. Doing this will likely provide more value than end-of-class, level 1 evaluations.   My Sticky Note Easy Level 3 Evaluation provides useful tips for doing this.
  • For completed classes with level 1 evaluation data available, conduct a level 3 evaluation and compare the results.  Some LMSs can provide graphic comparisons.  You may be surprised, pleasantly or otherwise. 
  • Add post-class reinforcement to increase application of what has been learned.  Recent transfer technologies make doing this less time-consuming and easier for learners.  For an overview of available technologies, see my Training Transfer Technologies white paper.  My books Making Learning Stick and Making Elearning Stick provide evidence-based, easy-to-use techniques for both before and after learning events to increase application of learning.

While most learning evaluations don’t capture media attention the way Trump University has, the recent publicity provides a helpful reminder that our end goal in workplace learning is not for participants to like the training but for them to use it, to make the training stick.

 

 

Until Next Time…

Barbara

Multitasking and Making Training Stick

October 20, 2015

Multitasking is rampant today – almost everyone does it at one time or another.  I attended a workshop recently and soon after it began I noticed that at least one-third of the 100ish participants were tapping away on laptops, tablets and/or phones (yes, a few were using more than one device).  I got up and peeked around.  No one seemed to be taking notes on the lecture.  They were reading and responding to emails or texts, posting on social media, and/or surfing the internet.

Is multitasking bad?  Does it interfere with learning and retention?  In study after study, test scores for people who multitasked in learning environments were significantly lower than for those who did not, or if scores were comparable, the multitaskers required considerably more time to achieve the same learning outcomes.  One study found that people who sat close to multitaskers in the classroom had lower levels of learning even though they themselves weren’t doing it.  Even worse, in one study multitaskers who performed poorly on a learning assessment believed they performed as well as their non-multitasking peers.   

What’s a trainer to do?  Neuroscientists tell us that most people’s maximum attention span is about 20 minutes.  As attention begins to drift the temptation to multitask increases, and the digital device(s) beckon.

A trainer can and should vary the activities every 20 minutes:  small group discussions, Q&A, application exercises, a video, and inward reflection activities are popular choices.

Most trainers ask participants to turn off phones at the beginning of training and to close out all unnecessary websites.  E-learning designers should consider adding a slide with this message at the beginning of each online module.

Consider also these strategies based on current neuroscience research to reduce participants’ tendency to multitask:

  • Tests and quizzes.  Whether printed or on a slide, response required or rhetorical, studies show that when people are tested on material, especially if they know they will be tested, they have higher levels of learning and retention.  Flash a few quiz questions on the screen or pass out a written quiz.  Quiz features in most e-learning software make this a snap.  Be sure to make the test reasonably challenging and base it on the learning objectives.

Let participants know in advance they will be tested, or give a “pop quiz” without advance notice.  People will soon figure out they are going to be tested periodically.  Motivation to learn increases if testing results are recorded which makes the participant personally accountable.

Tests and quizzes need not be confined to the learning module.  Require completing an after-class test as a condition for class credit.  Or use a training transfer technology tool to deliver one question at a time.  The newest revision of my Training Transfer Technologies white paper has an overview of the latest tools.

  • Flip the classroom.  Do you really need to deliver the content via lecture?  All of it?  The temptation to multitask is most compelling during lectures.  Consider instead:  an online module, pre-reading, and/or video viewing ahead of time, followed by in-class discussion, practice, and application.  Of course, a typical issue with pre-work is that some participants come to class without having done it.  Guard against this after the first time by giving a test at the beginning of class or starting class with a small group discussion that draws on the pre-work content.  Ask those who did not do well to spend their first break reviewing the material.  (Have a few extra laptops and/or tablets available if needed.)
  • Chunk content into smaller segments of 15-20 minutes. Obviously this will not work for every class, especially if participants travel to get there, but when possible, shorter training chunks (think stand-up meeting or short online module) mean participants will pay better attention and therefore learn more in less time.
  • Tell participants about the research on multitasking and learning.  Ask them to consider how they and the organization will benefit when they achieve the outcomes of the class, and how their multitasking might interfere with their learning goals.
  • Use available tools on the technology platform. For virtual instructor-led classes, be sure to use the tools available such as raising hands. Prepare a question or two to ask an individual when the system notifies you that a participant is inattentive or has navigated to another screen.   For self-paced e-learning, review key metrics such as time on slide and navigation and consider ways to shorten/revise the content.
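The one-question-at-a-time idea in the first strategy above can be sketched in a few lines of Python. This is only an illustrative sketch: the sample questions and the 3-day gap are assumptions, and in practice a transfer tool or LMS would handle the actual sending.

```python
import random
from datetime import date, timedelta

def spaced_quiz_schedule(start, questions, gap_days=3, seed=None):
    """Shuffle the quiz questions and assign each one a send date,
    spaced gap_days apart, starting after the class date."""
    order = list(questions)
    random.Random(seed).shuffle(order)
    return [(start + timedelta(days=(i + 1) * gap_days), q)
            for i, q in enumerate(order)]

# Hypothetical post-class questions tied to the learning objectives
questions = [
    "Name the three parts of a complete instructional objective.",
    "How often should you vary classroom activities, and why?",
    "Give one way to hold participants accountable for pre-work.",
]
for send_date, q in spaced_quiz_schedule(date(2015, 10, 20), questions, seed=42):
    print(send_date, "->", q)
```

Randomizing the order and spacing the questions out keeps participants retrieving the material after class, which is where the retention benefit of testing comes from.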

These strategies probably won’t completely eliminate multitasking during training, but chances are that the more of them you use, the more you will reduce it.

Until next time…

Barbara

Big Bang’s Sheldon Shows Us How Not to Teach

August 11, 2015

I enjoy the TV sitcom Big Bang Theory.  Apparently a lot of other people do too because reruns seem to appear often – on several channels in different time slots.  Recently I saw a segment where Sheldon teaches – or tries to teach – Penny physics.  His teaching is a great example of how not to engage a learner as well as many other mistakes some instructors make.

The six minute version on YouTube provides an opportunity to apply the before-during-after time periods related to transfer and also to apply some of my Training Transfer Process model.

Click here to watch the excerpt of the episode.  While you’re watching I hope you’ll think about what Sheldon should have done differently and what he could have done before and after the session with Penny that might have helped her apply her learning.  Also think about your own teaching/training…. I know I recognized a few things that I have to watch and keep myself from doing!

When you’ve finished viewing, read my comments below the clip about how Sheldon could’ve applied aspects of my Training Transfer Process.

Enjoy!

Barbara

A Retention Aid – Interleaving

July 14, 2015

I recently went on a hiking vacation in Utah with a group of like-minded people. Our guide was a professional wildlife conservationist and geologist.  As we hiked along trails, he pointed out many different species of birds and trees as well as geology formations.  This information was interesting but a lot to take in as I concentrated on breathing and avoiding boulders on the steep paths.  When we encountered something we had seen previously, he would ask us if we could recall the name or something about it.  Sometimes he would also use the opportunity to ask us a few questions about other things we had seen.

By the end of the trip I was able to identify 5 different types of pine trees, 3 types of hawks, and many different desert mountain flowers and plants.  Pretty cool….considering that my goal was to hike and enjoy the scenery and I really didn’t care whether or not I learned the names of the flora and fauna.

Why was I able to have such good recall?  Our guide, a former teacher, used a technique called “interleaving.”  This teaching/learning technique involves mixing up recall and practice nonsequentially.  It is the opposite of “block practice”, where lesson, practice, and recall are done all at once.  Think about it this way:  If you want to teach 3 learning points, A, B, and C, a block practice session would look something like this:  AAABBBCCC.  An interleaved practice session would look like this:  ACBABCBAC (randomized).

Numerous studies support the effectiveness of interleaving vs block practice to achieve long-term learning and retention.  (Interestingly, many people in studies who used interleaved practice performed worse than their counterparts using block practice during the practice session, but performed better when tested at a later date.)
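The AAABBBCCC vs. ACBABCBAC pattern above can be generated mechanically. Here is a small illustrative sketch in Python that builds a blocked schedule and an interleaved one from the same learning points:

```python
import random

def blocked_schedule(points, reps):
    # Block practice: all repetitions of A, then all of B, then C
    return [p for p in points for _ in range(reps)]

def interleaved_schedule(points, reps, seed=None):
    # Interleaved practice: the same items, in randomized order
    schedule = blocked_schedule(points, reps)
    random.Random(seed).shuffle(schedule)
    return schedule

print("".join(blocked_schedule("ABC", 3)))  # AAABBBCCC
print("".join(interleaved_schedule("ABC", 3, seed=4)))
```

Both schedules contain exactly the same practice items; only the ordering differs, which is the whole point of interleaving.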

Here are some suggestions for how to use interleaving in your training.  Detailed descriptions of how organizations such as Farmers Insurance and Jiffy Lube use it in their training are in the book Make It Stick (good title!) by Brown, Roediger and McDaniel.

  • Begin your teaching – whether live, virtual, synchronous, or asynchronous – with a story or two that illustrates the application of several learning points.
  • As you present each learning point, relate it back to the story and point out how it was used, could have been used, and so on.
  • As the learning module(s) progresses and you go on to other topics, randomly ask questions about one or more of the previously-presented learning points and relate back to the application story. 
  • After the learning event (class or elearning module) is finished, continue randomly quizzing via occasional emails.  For more information about after-training technologies to do this, see my white paper Training Transfer Technologies.
  • Consider using hard copy aids such as flash cards (digital versions could be created in PowerPoint) to randomize quizzing.

While my hiking guide used this technique intuitively (I asked him), many of us should do it intentionally.  I have made some adjustments to classes I have designed to accommodate this technique.  I’d like to hear from other trainers who use or are trying interleaving.

 

Note:  The term interleaving is also used in computer disk storage, where it refers to arranging blocks of digital information on storage disks to improve speed of access.

Until next time…

Barbara

Action Plans – Do they make training stick?

April 20, 2015

Action plans have long been thought to support training transfer and to make the training stick.  Actually the truth is….. yes, no, and …. sometimes.

End-of-training goals signal what is important, they provide a sense of direction, and they can be a focus for evaluation and feedback on performance of the task(s).   But simply asking participants to jot down their goals, or what they plan to do to apply their learning, will have limited effectiveness.  Also limited in effectiveness is setting difficult goals that are hard to reach.  However, goals that focus on specific behavior outcomes can be very effective in producing on-the-job application of skills learned in training, particularly for complex skills.   Also, when feedback is solicited from participants’ colleagues  and participants know this will happen, action plans will spring to life.

Consider applying the following evidence-based enhancements to your current action planning segments.  You will significantly increase how well participants transfer their learning.  For additional ideas on action planning see my earlier Sticky Note.

  • Develop specific behavior outcomes that identify skills to be applied.  These behavior outcomes might replace the course objectives or they can be used to enhance them and make them more “actionable”.   For related information on sticky class objectives, see my earlier Sticky Note.

Here are a few examples of behavior outcomes.  There should be between 3 and 15 of them, depending of course on the length of the class:  “Gives employees feedback on a regular basis” (supervisor training), “Calls the customer by name” (customer service training), “Uses discussion boards for current issues and solutions” (software or help desk).

  • As part of the end-of-class action planning segment, provide participants with a list of the behavior outcomes that they should expect to apply.  Either suggest that they choose among these for inclusion in their own action plans, or provide the action plan pre-completed with the behavior outcomes listed.  Numerous studies have found that when action plans have specific behavior outcomes, participants are more likely to do them.
  • Ask participants to supply the names/contact info for at least 2 coworkers, subordinates, or manager(s) they work with, who will be asked later for feedback.
  • Survey these coworkers, subordinates, or managers 3 months after the training, using the behavior outcomes previously developed.  Assure them the results will be confidential.   Consider whether or not to share the feedback from the survey with the participant, and if so, how to protect the confidentiality of those who responded, particularly since there will be  small numbers of respondents.

Online survey tools are helpful for providing anonymous feedback but may not be enough to assure confidentiality with a small number of respondents.  Note: whether or not these individuals are actually surveyed may not be as much of a motivator as that the participants believe they will be surveyed.

The purpose here is to make participants accountable for applying the skills, but of course the information collected might be used for evaluation of the class or for follow-up training.

  • Adapt for elearning in the following ways:

o   List the behavior outcomes close to the end of the last module in the training.

o   Provide an action plan form with behavior outcomes listed.  Direct the trainee to fill in a date or milestone, and suggest that they print it for reference.

o   Set up your LMS to send the survey mentioned above to participants’ managers (who are generally already loaded into the LMS).

o   Consider developing an interactive action planning tool for participants.

Until next time…

Barbara

How do you know training has “stuck”?

February 27, 2015

Assessing whether learning is really being used on the job is challenging for many trainers.  End-of-class level 2 evaluation is easy enough to do:  the participant demonstrates skill or knowledge acquisition at the end of the training.  We might also assess pre-training learning and compare.  But determining whether skills and knowledge are actually being used on the job is another matter entirely.

The most straightforward way to do this is to simply ask participants if they are using what they learned and to what extent.  But are we actually getting an accurate measure of whether a participant is using their learning?  In a previous Sticky Note I mentioned neuroscience research that points out people often think they know, have experienced, or are experiencing something when in fact they have not.  This illusion calls into question the practice of assessing on-the-job use of skills learned in training, by asking participants if they are using it.  The key issues are:

  • Are participants actually aware of how and in what ways they are using what they learned in a training class?
  • Can participants distinguish between what they are applying from a particular class and what they are doing for other reasons such as another previous training, intuition or trial and error?
  • How much are these self-reports tempered with wanting to provide the appropriate response, to please the trainer, the boss, the organization?
  • Studies on these types of self-reports indicate they are unreliable.  One study found that self-reports were 35% more positive than reports by participants’ managers.

What’s a trainer to do?  What are more accurate ways to test whether learning is being applied?

  • In post-training reaction level 1 evaluation, ask participants about their “intention to transfer”, that is, whether they plan to use what they have learned, and how they plan to use it.  Studies show there is a strong link between intention to transfer and later actual transfer.
  • After training at a point in time when participants should have had an opportunity to use the training, ask their managers (a quick survey, or more detailed focus group) whether their employee is using the skills learned in training and how they are using them.  Six weeks and three months post-training are popular times.
  • To reduce the tendency to give the desired positive response, ask managers and participants specific behavior-based questions, using what is known as a Behavior Observation Scale (BOS).  Specific behaviors linked to class objectives are rated on a 5-point scale (1 = Almost Never, 5 = Almost Always).  For example, for a class on coaching, one behavior observation scale item is “Provides feedback regularly”.  A BOS item for a sales training class: “Reviews individual productivity results with manager”.
  • Instead of – or in addition to – asking participants and their managers, poll a select group of individuals, perhaps one level above participants’ managers, who are in a position to see many participants’ on-the-job behavior.  One study paired an HR rep with each of these individuals, and the role of the HR rep was to assist the manager with completing the Behavior Observation Scales.
  • Instead of assessing the learning application for every participant, assess a sample of participants.  In general, 30% of the total number of participants should provide a reasonably accurate representation of all trainees in a particular training program.

Don’t rely on your “gut feelings” about whether trainees are using what they learn in training.  Use popular, free survey software or features of your LMS to find out how much of your training is sticking!

 

Until Next Time…

 

 

PS:  Join me at ATD (formerly ASTD) International Conference and Expo May 17-20.  I will be presenting on Evidence-Based Techniques for Training Transfer.  

Sticky Objectives

December 11, 2014

I’ve been working recently converting training/learning objectives to “sticky objectives,” and I’d like to share a few thoughts with you. As most of us know, good instructional objectives are essential for effective training and evaluation. A couple of points about instructional objectives before I continue:

  • A good instructional objective should include 3 things: 1) the performance (what the trainee should be able to do after the training), 2) the condition (when, ex. “when conducting a performance evaluation”), and 3) the criteria (how well). If the objective does not contain all three of these elements, it can’t effectively indicate the desired result of the training.
  • Most instructional objectives are preceded by this phrase: “at the conclusion of the training, the participant will be able to:”
  • It may not be useful to share the instructional objectives with the trainees. It may be more helpful to develop instructional objectives for use in the design process with trainers, program sponsors, and other “insiders,” and to write and publish objectives that focus on specific job performance, such as “conduct an effective performance review,” “use the 6 key functions in Excel,” and “use the Situational Leadership model to identify the appropriate mix of direction and support for an employee”…. for use with trainees in the learning events. In my experience, many trainees are intimidated by or just don’t relate well to instructional objectives (some trainers too, but that’s a different issue).

Now that I’ve commented on instructional objectives, I’d like you to consider this:

  • What is the purpose or point of the training and of the training objectives? Is it to demonstrate knowledge or a skill or possibly even an attitude change at the end of the training? In most cases, the answer to this question is “no.” The purpose of most training is for trainees to apply certain knowledge, skills or attitudes to their jobs so that their performance is more effective in specific, targeted ways.

So if the purpose of the training is for trainees to use certain skills in their job performance,

  • The objectives should be written to describe what trainees should be able to do, on the job, after the training. If we look at instructional objectives from this perspective, the performance, conditions, and criteria may not change. What will change, though, is the statement that precedes the objectives: “at the conclusion of the training, the participants will be able to….” Instead substitute “in their job performance, the participants will….” Remember, our focus should be on what they will do, not what they can or will be able to do.

Making these simple adjustments in the wording of instructional objectives – and in the more general objectives shared with trainees – can keep trainer and trainee focused on the true goal of the training – on-the-job performance.

Until Next Time…

Barbara

The Illusion of Learning

September 19, 2014

Are your participants guilty of the illusion of learning?

I came across an interesting concept as I was reviewing recent studies in neuroscience related to Making Training Stick®. It’s called the Illusion of Knowing and refers to people’s errors in perception. For example, we seem to be hearing more in the news about individuals who have been convicted and incarcerated for crimes that later DNA testing proves they didn’t commit. Eyewitness accounts identified the individual as the perpetrator, only to be proven wrong years later. These memory distortions arise out of our discomfort with ambiguity and our desire to “have the right answer”.

So in a level 1 evaluation when participants are asked “what did you learn?”, their responses may well be shaped more by this illusion of what they would like to have learned and what they know they should have learned than by what they have actually learned.

Another illusion is called Imagination Inflation: people who are asked to imagine an event will sometimes begin to believe, when asked about it later, that the event actually occurred. For example, if a level 3 post-training survey asks participants how they are applying their learning, they may believe that they have applied it or are applying it when they actually have not or are not. This is particularly troublesome with complex tasks/behaviors where application is less than straightforward, such as soft skills: customer service, management, communication.

How to overcome these illusions of learning? Feedback. Studies show that when students have an opportunity to reflect on their demonstration of learning and on their performance, their perceptions of their learning and performance become more accurate. This is called metacognition, the awareness and understanding of one’s own thought processes and learning. If you are one of the many individuals who, based on the notion that adults do not want to be tested, have shied away from testing (as I have), I urge you to reconsider. Testing and then providing participants with correct answers is one way to provide feedback and reduce their Illusions of Knowing regarding what they have learned. This of course also serves as level 2 evaluation of learning.

Another feedback tool is the Behavior Observation Scale (BOS). This can reduce participants’ possible illusions about what they are applying or have applied. Develop a set of behaviors that demonstrate successful application of the skill(s) during or prior to the design phase. This can be done by the designer and/or other interested stakeholders in the training. Provide a Likert 5-choice scale for each behavior. This BOS should then be used by the participant and/or their manager to assess their on-the-job application. This of course may double as a level 3 evaluation.

Here’s what you can do to reduce illusions of learning and improve course feedback from level 1, 2, and 3 evaluations:

  • Review your end-of-training level 1 evaluation form and eliminate questions that ask participants what they learned. Consider instead asking a question or two about how they intend to apply their learning. (Research studies have found a strong relationship between reports of intention to apply and actual application.)
  • Consider adding testing to all training. If you currently use tests, incorporate opportunities for participants to review their answers vis a vis the correct answers.
  • Develop Behavior Observation Scales for complex learning. Distribute these to participants and their managers post-training, when they are most likely to have had an opportunity to apply their learning.
  • Consider ways to “motivate” them to respond. (My favorite is to withhold credit for the class until post-training feedback has been received.)

Until next time…

Barbara

Letter to Self – Easy closing activity that Makes Training Stick

June 30, 2014

One of the first closing activities I used was called a “letter to myself.”  At the end of training, participants were asked to reflect on what they learned and how they were going to apply it, and to write a letter to themselves, complete with a self-addressed envelope.  Then I collected the envelopes, stashed them away for a few weeks, and mailed them back to their authors.  I got a lot of positive feedback from people.  Once, when I took a workshop myself, I participated in this activity and experienced first-hand how energizing and motivating it was to receive that letter with the reminders and encouragements I had written.

 

A recent experimental study has demonstrated what I’ve always believed:  that this activity is more than a “nice to do”.  Trainees in the study who participated in this type of activity had higher levels of self-efficacy (the belief that they could apply the skills they had learned) and they demonstrated application of their training.

 

Researchers Amanda Shantz and Gary Latham did a study on what they termed “written self-guidance”.  Half of their trainees who participated in a soft skills training program participated in a “letter to self” type of activity in which they reflected on what they had learned and how they planned to apply it.  Those who participated in the activity demonstrated significantly higher levels of application of the training than those participants who did not.  This activity is not the same as having participants write a reflection paper, develop an action plan, or write a class summary, because it requires trainees to write motivational letters directed to the self; at a later point in time, participants receive a letter written by themselves, to themselves.

 

Here are some specific guidelines for using this activity in training you facilitate, develop, or administer:

  • After a summary of the training content, ask participants to write a letter to themselves  – “Dear Self” – in which they outline their key learnings and how they plan to apply what they learned.
  • In the instructions, stress that they are the only ones who will see their letters – they will seal them before they leave the class.
  • Ask them not to pay attention to or be concerned about grammar or spelling.
  • Encourage participants to include self-affirming comments that are relevant for them.  Provide examples.
  • As they finish, pass out blank mailing envelopes and ask them to write their full mailing address (interoffice, home address, etc.).
  • Allow approximately 15 minutes for this activity.  At the end of the time, collect the letters.
  • Store them safely (remember, they’re confidential) in your office and set a tickler in your calendar to mail them in 3 weeks.  (The experiment used a 5 week interval but I’ve found that 3 weeks is better in today’s fast-paced work environments.)
  • Mail them at the appointed time.

This activity can be adapted for live virtual or elearning in the following way:

  • Ask participants to open their email system and type an email to themselves.  Use the same instructions as above.
  • Then ask them to save this email as a draft.
  • Mark your calendar, and 3 weeks later get in touch with each participant (email, text, etc.) and ask them to open their drafts folder and read their letter to themselves.

Click here to download a copy of the article that describes the details of this study.

 

P.S. I hope to see many of you at Training Mag’s Online Learning Conference Sept 22-25!