|When someone finishes a training class they are usually prepared to begin using the skills they have acquired. They have learned what they needed to learn. But will they begin using the skills back on the job, especially if the new skill requires unlearning former behavior patterns in favor of new ones?
Edgar Schein, an author and consultant on change management, points out that when a person is confronted with the prospect of learning something new, a “learning anxiety” is created. This anxiety interferes with applying the newly learned skill. The list below contains some of the more common sources of learning anxiety, each with a suggestion or two for learning professionals to reduce or eliminate them (which Schein calls creating psychological safety).
It is up to those of us in the learning and development arena to help learners and their managers capitalize on survival anxiety and reduce learning anxiety!
Until next time…
Many end-of-training evaluations ask participants to respond to a statement like this: “I plan to (or will be able to) apply this training to my job”. Have you ever wondered if this is an accurate prediction that the training will stick? Technically this is referred to as intention to transfer.
Several research studies have compared intent to transfer with actual use of skills on the job. In each case there was a fairly strong correlation between intention to transfer and actual transfer. But —
- Don’t people often tell us what we want to hear (or what they think we want to hear)?
- How many people have kept their new year’s resolutions? (How many of us remember what they were?)
- Aren’t people’s perceptions of their own behavior often different from what other people see?
The answer is yes, people often tell us what they think we want to hear, but this “socially desirable response” (SDR) bias has really only been studied for personal habits such as healthy food choices and substance abuse. There is no evidence that SDR plays a part in assessing workplace learning or the intention to use it. And yes, people’s perceptions of their own behavior are sometimes different from what others see, but just because learning doesn’t show up in observable behavior doesn’t mean it hasn’t stuck, particularly with leadership and soft skills training, where behavior changes may be subtle and observed by only one or two individuals.
So – is it useful at the end of the class to ask your trainees how they intend to use what they have just learned in training? Definitely. While a few people may not accurately report how they intend to apply what they have learned, multiple research studies have found that for the majority of trainees, particularly in soft skills training, those who report their intention to transfer specific skills actually do it.
In addition to asking about intent to use in end-of-class evaluations, here are some more ideas:
- Incorporate it into an action planning activity. (See this prior Sticky Note for a closer look at action planning and Making Training Stick®.) Have participants develop their action plans, then follow with a reflection activity on their intent to carry them out.
- Email participants 2-3 weeks after training and ask: “What have you done as a consequence of the training?” and “…if you have not started yet, what do you intend to do?” Note: In one study, the trainer sent the follow-up email to each participant’s managing director, who then sent it out. They got a very high response rate.
- Repeat the above email 6 weeks–3 months post-training. Ask the same questions and compare the responses.
- Large number of trainees? Develop a short questionnaire with multiple-choice responses – no more than 5-8 questions. Each question would cover a key learning point from the training, with response choices of 1-5, and would have two parts: To what extent are you using this skill/learning point? If you haven’t used it yet, to what extent do you intend to use it?
- If it’s not possible to “boil down” to 5-8 specific questions, send more than one questionnaire. Just because the learning content is grouped into one learning event doesn’t mean the feedback and evaluation on it has to be.
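As a minimal sketch of how the two-part responses described above might be tabulated, here is one way to summarize them in Python. The learning points and ratings below are invented for illustration, and this assumes each answer pair records a usage rating (1-5) plus an intent rating (1-5, or none if the skill is already in use):

```python
from statistics import mean

# Hypothetical two-part responses for each key learning point:
# part 1 = "To what extent are you using this skill?" (1-5),
# part 2 = "If you haven't used it yet, to what extent do you
#           intend to use it?" (1-5, or None if already in use).
responses = {
    "Active listening": [(4, None), (2, 5), (1, 3)],
    "Giving feedback": [(5, None), (3, 4), (2, 2)],
}

# Average usage across all respondents, and average intent
# among those who haven't started using the skill yet.
summary = {}
for skill, answers in responses.items():
    usage = [use for use, _ in answers]
    intent = [i for _, i in answers if i is not None]
    summary[skill] = (round(mean(usage), 1), round(mean(intent), 1))

for skill, (avg_use, avg_intent) in summary.items():
    print(f"{skill}: average usage {avg_use}, "
          f"average intent among non-users {avg_intent}")
```

Keeping usage and intent separate like this makes it easy to spot learning points that aren’t being used yet but that trainees still intend to apply, which is exactly where post-class reinforcement pays off.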
Remember, when trainees tell you they intend to transfer what they have learned, they usually do it. That’s pretty cool!
Until next time…
Donald Trump’s Trump University has been in the news lately. Allegations are that while the organization purported to be an educational institution providing “real estate secrets” from Donald Trump, in reality it was a sales organization focused on selling products and services. Several lawsuits are pending, and Trump’s political opponents point out that statements and claims by Trump himself about the “university” have proven to be false. Human resource development professionals take note: this seems to be simply a politically charged, high-profile example of level 1 vs. level 3 evaluation. They liked it, but could they use it – and did they?
Trump claims, according to a November 2015 article in Time magazine, that 98% of the students who took the $1500 real estate course gave it “rave ratings.” He has copies of the end-of-class surveys students completed to back up claims that students believed they got something of value in the classes. Attorneys for the plaintiffs, that is, students who took the class and are now suing, state that the survey results were not credible because they were not anonymous and were completed at the end of the class — level 1, in Kirkpatrick terminology. Trump’s lawyers have pointed out that many of the students filled out favorable end-of-class questionnaires and later “changed their minds”. They changed their minds about the value of the class when they tried to apply what was taught – level 3.
While most learning evaluations are not as high-profile as this one, it provides an opportunity to pause and take note:
- Do you conduct level 1 end-of-class evaluations of all classes? If you do, as 97% of workplace learning professionals do, do you use them to gauge the success of the class? How is learning success defined in your organization? Sandra Merwin, noted workplace learning consultant and now artist, did some research years ago that found a negative correlation between level 1 “likes” and level 3 “applies”. In other words, individuals who did not give high “like” marks at the end of a class were more likely to apply what they learned, and those who did give high “like” scores were less likely to apply it. The explanation might be that when prior knowledge is challenged, it sets up an uncomfortable creative tension. Just one study, but something to think about.
- Consider surveying participants and perhaps their managers three months post-training to see how they’re applying what they learned. Doing this will likely provide more value than end-of-class, level 1 evaluations. My Sticky Note Easy Level 3 Evaluation provides useful tips for doing this.
- For completed classes with level 1 evaluation data available, conduct a level 3 evaluation and compare the results. Some LMSs can provide graphic comparisons. You may be surprised, pleasantly or otherwise.
- Add post-class reinforcement to increase application of what has been learned. Recent transfer technologies make doing this less time-consuming and easier for learners. For an overview of available technologies, see my Training Transfer Technologies white paper. My books Making Learning Stick and Making Elearning Stick provide evidence-based, easy-to-use techniques for both before and after learning events to increase application of learning.
While most learning evaluations don’t capture media attention the way Trump University has, the recent publicity provides a helpful reminder that our end goal in workplace learning is not for participants to like the training but for them to use it, to make the training stick.
Until Next Time…
Multitasking is rampant today – almost everyone does it at one time or another. I attended a workshop recently and soon after it began I noticed that at least one-third of the 100ish participants were tapping away on laptops, tablets and/or phones (yes, a few were using more than one device). I got up and peeked around. No one seemed to be taking notes on the lecture. They were reading and responding to emails or texts, posting on social media, and/or surfing the internet.
Is multitasking bad? Does it interfere with learning and retention? In study after study, test scores for people who multitasked in learning environments were significantly lower than for those who did not, or if scores were comparable, the multitaskers required considerably more time to achieve the same learning outcomes. One study found that people who sat close to multitaskers in the classroom had lower levels of learning even though they themselves weren’t doing it. Even worse, in one study multitaskers who performed poorly on a learning assessment believed they performed as well as their non-multitasking peers.
What’s a trainer to do? Neuroscientists tell us that most people’s maximum attention span is about 20 minutes. As attention begins to drift the temptation to multitask increases, and the digital device(s) beckon.
A trainer can and should vary the activities every 20 minutes: small group discussions, Q&A, application exercises, a video, and inward reflection activities are popular choices.
Most trainers ask participants to turn off phones at the beginning of training and to close out all unnecessary websites. E-learning designers should consider adding a slide with this message at the beginning of each online module.
Consider also these strategies based on current neuroscience research to reduce participants’ tendency to multitask:
- Tests and quizzes. Whether printed or on a slide, response required or rhetorical, studies show that when people are tested on material, especially if they know they will be tested, they have higher levels of learning and retention. Flash a few quiz questions on the screen or pass out a written quiz. Quiz features in most e-learning software make this a snap. Be sure to make the test reasonably challenging and base it on the learning objectives.
Let participants know in advance they will be tested, or give a “pop quiz” without advance notice. People will soon figure out they are going to be tested periodically. Motivation to learn increases if testing results are recorded, which makes the participant personally accountable.
Tests and quizzes need not be confined to the learning module. Require completing an after-class test as a condition for class credit. Or use a training transfer technology tool to deliver one question at a time. The newest revision of my Training Transfer Technologies white paper has an overview of the latest tools.
- Flip the classroom. Do you really need to deliver the content via lecture? All of it? The temptation to multitask is most compelling during lectures. Consider instead: an online module, pre-reading, and/or video viewing ahead of time followed by in-class discussion, practice, and application. Of course a typical issue with pre-work is that some participants come to class without having done it. Guard against this after the first time by giving a test at the beginning of class or starting class with a small group discussion that draws on the pre-work content. Ask those who did not do well to spend their first break reviewing the material. (Have a few extra laptops and/or tablets available if needed.)
- Chunk content into smaller segments of 15-20 minutes. Obviously this will not work for every class, especially if participants travel to get there, but when possible, shorter training chunks (think stand-up meeting or short online module) mean participants will pay better attention and therefore learn more in less time.
- Tell participants about the research on multitasking and learning. Ask them to consider how they and the organization will benefit when they achieve the outcomes of the class, and how their multitasking might interfere with their learning goals.
- Use available tools on the technology platform. For virtual instructor-led classes, be sure to use the tools available such as raising hands. Prepare a question or two to ask an individual when the system notifies you that a participant is inattentive or has navigated to another screen. For self-paced e-learning, review key metrics such as time on slide and navigation and consider ways to shorten/revise the content.
These strategies probably won’t completely eliminate multitasking during training, but chances are that the more of them you use, the more you will reduce it.
Until next time…
I enjoy the TV sitcom The Big Bang Theory. Apparently a lot of other people do too, because reruns seem to appear often – on several channels in different time slots. Recently I saw a segment where Sheldon teaches – or tries to teach – Penny physics. His teaching is a great example of how not to engage a learner, and it illustrates many other mistakes some instructors make.
Click here to watch the excerpt of the episode. While you’re watching I hope you’ll think about what Sheldon should have done differently and what he could have done before and after the session with Penny that might have helped her apply her learning. Also think about your own teaching/training… I know I recognized a few things that I have to watch and keep myself from doing!
When you’ve finished viewing, read my comments below the clip about how Sheldon could’ve applied aspects of my Training Transfer Process.
Once upon a time at the North Pole, Santa had a problem. Boys and girls were not very happy with their presents from Santa. In the past his elves had done a good job making the toys for girls and boys with their hammers and other tools. But recently the children were asking for electronic games and gadgets, not the simple toys the elves made. Santa had to either re-train his elves or lay them off. It was unrealistic to turn the elves into electronics engineers, but he decided to train them to become negotiators and expediters to work with the electronics manufacturers.
Santa enlisted Rudolph to train the elves. Rudolph attended train-the-trainer sessions and educated himself on the skills the elves needed to learn. While he wasn’t an SME, he did learn enough to put together lesson plans and teach the elves what they needed to know for their new roles. The trainings went well. The elves demonstrated they had learned the material and then went back to their workshop. Three weeks later Santa and Rudolph visited the workshop… and their jaws dropped. The elves were still making dolls and trucks at their workbenches – just as they had done before the training. The headsets and phone lines Santa had set up for their new roles were untouched. The training didn’t stick!
Santa and Rudolph thought and thought. “What can we do that will help the elves remember and use what they have learned?” They pulled their copy of the book Making Learning Stick off the shelf and started using some of the Techniques to Integrate Education (TIEs, for short) to reinforce the training. In just a short time the elves started using more of what they learned, the boys and girls got the electronic toys they wanted, and Santa and Rudolph were heroes once again.
And they lived happily ever after – that is, until the next major change.
My best wishes for a wonderful Holiday Season and a happy and prosperous New Year! Until next time…
Get the new Making Training Stick Field Guide – 2nd edition