Not every mentoring program has to be formal and top-down. In fact, we often recommend that our clients set up a "do it yourself" version called Reverse Mentoring, which leverages cross-generational relationships. Reverse Mentoring expands individuals’ skill sets and opens their eyes to some positive generational differences. Devon Scheef will share how you can implement this cutting-edge program in your organization during her session, Reverse Mentoring: A DIY Approach, at this year’s California HR Conference presented by PIHRA. Visit cahrconference.org for additional information on how to register for this powerful session.
Devon Scheef · Blog · Sep 05, 2015 04:13am
Having trouble finding and hiring candidates with the right skill set? You may want to consider broadening your scope to include younger prospects. This huge—and hugely promising—demographic could hold the key to your company’s short- and long-term success. Read the full article in Insulation Outlook.
Devon Scheef · Blog · Sep 05, 2015 04:13am
Harold Stolovitch has sold more training books than anyone else with one simple message: "Telling Ain’t Training." We know it’s true. We really don’t learn much just by listening or by reading. Reading, listening, or watching a short video can work well when used in context as performance support. But for actual learning, nothing takes the place of real experience, trial and error, and self-discovery. I remembered this lesson while I was working with scuba students last weekend.

Let’s go back to the scuba instructor laboratory again. Learning to scuba dive is a pretty complex skill, and a fairly critical one. It’s a relatively safe sport, but good skills can save your life. When we begin to teach skills in the pool, we can overload students with too much to remember at one time. For example: swim horizontally, not vertically; don’t exhale through your nose; don’t bend your knees, kick from your hips; breathe all the time, never hold your breath; ascend slowly and in control. Well, you get the idea. Truthfully, the last two items in this list are the only important ones. They are the only ones that can get you hurt if you forget them.

Unfortunately, this happens all too often in corporate learning. We overload people trying to make them perfect in their first class. And we create e-learning programs that overload learners with reading and listening and details, without real experience or practice or feedback. We put a check in a box, but have we spent our organization’s training dollars wisely?

No matter how much my students struggle through their first pool session, I don’t over-correct them. I find something they did well and praise them for it. They come back the next week a little more fearless and a lot more enthusiastic. By their fifth pool session they are ready for the real world. Learning is a process, not an event.

As a learning professional, you know what works. Explain your plan to your client. Tell them you want to spend their money wisely on what works. I bet they’ll trust you.
Dick Handshaw · Blog · Sep 05, 2015 04:13am
Context is the kingdom. I wish I had come up with this clever line, but I didn’t. My friend Tony O’Driscoll, Professor of the Practice of Business Administration at Duke University, did. He is a diver too.

Tony’s line comes as a response to the often-quoted phrase, "Content is King." We all know that training requires a lot of complete, accurate content. For many of us, content has become king. But the point my fellow diver Tony is making is that content alone will not change behavior. Receiving information or performing skills in context has become the kingdom and the key to real performance.

I recently read a debate in, of all things, Scuba Diving magazine on the effectiveness of training standards. In defending the standards, H. Kelly Levendorf talks about the recent proliferation of independent study and online learning, which is completely content-centric. He says, "Rather than bemoan less time in the classroom, instructors should embrace the additional time it provides to teach where it’s most valuable: in the water." In learning scuba, water is the context.

By teaching scuba, I re-learned something I already knew: people learn best by using content in the context of how they will use that content. This can be successfully applied to any kind of learning situation, and it is really the whole basis for the concept of blended learning. Content may be delivered by one means, but application in context may need to be delivered by a different modality. What does that say about the value of stand-alone, page-turning e-learning courses?

In addition to pointing out the importance of context, Mr. Levendorf goes on to say, "Student divers need ample time for practice and mastery. Instructors must focus on performance-based—not time-based—learning." I’ll take that topic on next week.
Dick Handshaw · Blog · Sep 05, 2015 04:11am
Once you have used your data to revise your instructional strategy and your prototype, you can continue developing the rest of the course with a new sense of confidence—and that critical buy-in from your client and learners. Now that you are ready to implement the entire course, how do you know that everything is working perfectly?

If your first implementation is an instructor-led course for 20 to 50 people, the risk of a less-than-perfect implementation may not be that great. A few minor issues will be tolerated. But what if you are releasing a new compliance course to 25,000 learners who need to complete it in thirty days? Easy—we just apply the "cost vs. risk" rule again. We’re going to conduct a Field Test, according to the Robert Stake matrix of formative evaluation.

In a Field Test, we are not evaluating a prototype; we are evaluating the entire finished course. We are not validating our instructional strategy; we’ve already done that. We are looking for any other minor failures or corrections that need to be made. These can range from fixing typos, to fixing test questions that don’t work, to fixing an entire objective where learners are not achieving mastery.

We try to get a test audience of 12 to 20 learners for the Field Test. This is a more formal test where we try to simulate the real learning environment as closely as possible. If learning takes place in the learner’s work space, that’s where we conduct the test. Our data gathering is similar: we interview, we observe, and we interview again. The observation may not be one-on-one with 20 people, but we augment data gathering by asking learners to keep a list of any errors, questions, or concerns and where they occurred. With twenty people, we develop very strong trends and can easily separate opinions, with two or three people recording an issue, from strong trends, with fifteen or more people recording an issue.
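That opinion-versus-trend triage is, at bottom, a simple tally. Here is a minimal sketch of one way to do it; the function and data names are our own, and the fifteen-report threshold simply mirrors the 15-of-20 example above, not a formal standard.

```python
from collections import Counter

def triage_field_test_issues(reports, strong_trend_min=15):
    """Tally issues recorded on field-test observation sheets.

    `reports` is a list of (learner_id, issue) pairs. Issues recorded
    by at least `strong_trend_min` learners are treated as strong
    trends; the rest are individual opinions. Both the names and the
    threshold here are illustrative assumptions.
    """
    counts = Counter(issue for _, issue in reports)
    strong = {issue: n for issue, n in counts.items() if n >= strong_trend_min}
    opinions = {issue: n for issue, n in counts.items() if n < strong_trend_min}
    return strong, opinions
```

With a test audience of twenty, an issue recorded by fifteen or more learners lands in the strong-trend bucket and goes to the top of the revision list; an issue recorded by two or three stays in the opinion bucket.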
Although a Field Test often requires a little more time to conduct and may require more logistical support, the relative cost is still low, because everyone who takes part in the test receives his or her training as a result of participating in the test. The side benefit is all that data you get to identify where your problems are before the course is released to your entire learning audience. Do I dare say it again? Why guess when you can measure? 6/8 of series
Dick Handshaw · Blog · Sep 05, 2015 04:09am
First, we have to define what a successful Field Test looks like. Having everyone tell us that everything looks great might save us a lot of revisions before implementation, but if we aren’t careful and objective about how we gather our data, we may end up handling dozens or even hundreds of complaints from busy learners who don’t have time to help us correct our mistakes.

A Field Test is a more formal type of test than a Learner Try-out. In a Try-out, we sit with a learner one-on-one and record his or her stream of consciousness; we are validating our general strategy and approach. In a Field Test, we are validating everything, and our approach is entirely different. We want to simulate the actual learning environment for the learning events as closely as possible. We don’t shadow each learner, but we have coaches available to record questions or set things back on track if things break down. And remember, a break-down is not a bad thing; it is an opportunity for improvement. We ask learners to write down everything that comes to mind on an observation sheet, just like the ones our staff observers use. The kinds of things we find are typos and writing corrections, test questions that need to be re-written or thrown out, and sometimes weak lessons that need to be revised.

Field Tests are not just for self-paced or e-learning. We also use them for all of our instructor-led courses, where they are even easier to justify: someone has to be in the first class, and that first audience becomes our Field Test group. By now, I hope you are saying to yourself, "Why guess when I can measure?" 7/8 of series
Dick Handshaw · Blog · Sep 05, 2015 04:09am
Sometimes we have very big projects. One example was a large regional merger that included forty-four hours of self-paced e-learning and several days of instructor-led training for several different lines of business. It was one of those situations where, on Monday morning, twenty thousand people needed to know how to do the same job, but in many cases they had to do it quite differently for a new company and with many new tools and processes. If we go back to our cost vs. risk question, the risk was quite high. Most of the training had to take place one to two weeks before that critical Monday morning opening as a different company.

Fortunately for us, the learning programs were not the only things that needed testing before implementation, and we took advantage of those tests for our own testing purposes. Before people could test the systems and processes, they needed to know how to use them. We had no trouble finding twenty or thirty people who needed to learn the new processes before they could test them. One new problem: with so many programs needing testing at one time, we had to enlist many people from our client’s staff to do the actual field testing, including observing and recording data. That meant we had to be super organized.

The Field Test went like most Field Tests do, despite its large size and criticality. We had two lessons where an unusually large number of people scored below 80%. We looked at the test questions, and they seemed valid and reliable, clear and understandable. We looked at the lessons, and that’s where we found the problem. There were some things about the way learners used to perform those particular tasks that were throwing them off. We took that into account, revised the lessons, administered the two lessons and the tests again, and everyone scored well above 90%. We spent another two or three days making revisions and turned everything over for implementation. Then we waited.
In the first week after the big Monday morning conversion, our customer recorded fewer than fifty support calls related to merger operations and training. Whenever we see a high-risk project on our horizon, we know that a careful investment in Formative Evaluation will protect us, our learners, and our client. After all, Formative Evaluation is our "all-purpose magic". 8/8 of series
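The mastery check that surfaced those two weak lessons can be sketched as a score scan per lesson. This is our own illustrative sketch, not a tool from the project: the 80% cutoff comes from the story above, while the 90% pass-rate goal and every name below are assumptions.

```python
def flag_lessons_for_revision(scores_by_lesson, cutoff=80, pass_rate_goal=0.9):
    """Flag lessons where too few learners reach the mastery cutoff.

    `scores_by_lesson` maps a lesson name to its learners' test scores.
    The 80% cutoff mirrors the field-test example; the 90% pass-rate
    goal and the names are illustrative assumptions, not standards.
    """
    flagged = []
    for lesson, scores in scores_by_lesson.items():
        pass_rate = sum(score >= cutoff for score in scores) / len(scores)
        if pass_rate < pass_rate_goal:
            flagged.append(lesson)
    return flagged
```

A flagged lesson is only a signal: as in the merger project, the next step is to inspect the test questions first, and revise the lesson itself only if the questions hold up.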
Dick Handshaw · Blog · Sep 05, 2015 04:09am
Mobile learning has become the current "next big thing" in training, and the numbers show it. Recent enterprise technology studies show that nearly 73% of companies allow smartphones for work purposes and 50-60% have added tablets to their tech offerings. Naturally, if a company is going to invest in these resources, it wants to see them used, and training is a likely candidate for mobile application deployment.

As you think about your mobile learning philosophy, there’s one fundamental question to ask: why would a learner access training resources on a mobile device instead of a conventional desktop or laptop computer? If you own a smartphone, think about how you use it. My guess is you usually have it with you, probably within arm’s reach: in a pocket or nearby bag, or next to you on your desk right now. (Did you just check for it?) I would also guess you always keep your device powered on. Not counting the talk and text functions, how do you use the "smart" capabilities of your phone? E-mail, news, looking up directions to a destination as you leave, or looking up more information about a topic that comes up in conversation are all commonly reported uses.

Mobile devices are always available, and when we need information throughout the day they are the first resource we turn to. Your clients and employees are going to expect training on their mobile devices to work the same way, providing support that is just-in-time and just enough. Unless your learners have completely replaced their desktop or laptop computer with a tablet, it is unlikely they will access a traditional training course on their mobile device. What they will look for are job aids, short audio or video demonstrations, and interactive knowledge banks that answer a specific job-related question at the moment they need an answer.
When you are asked to create mobile-based training, keep this in mind and design mobile tools that are enhanced by the always-available, just-in-time way we use our devices. Your training will see more widespread adoption and will become a useful resource for your learners. The second part of this series will get into the nuts and bolts of designing for mobile devices.

Part one of two
Peter Engels, Instructional Designer/Developer, Handshaw, Inc.
Dick Handshaw · Blog · Sep 05, 2015 04:09am
Last time we talked about how and when to use mobile devices in a larger training strategy that leverages their strengths of accessibility and immediacy. This week, let’s look at how mobile devices impact resource design. There are a few unique characteristics you should keep in mind as you design mobile resources: small screen size, touch capabilities, and synchronization.

Small screen size. Tablets have screens comparable in size to small notebook computers, but mobile phone screen sizes vary widely, from as small as 320 x 240 pixels on some BlackBerrys to as large as 640 x 960 on the iPhone. With less real estate, every element added must be meaningful and intentional. Keep your content direct to limit extra text, and keep images and videos small enough that they can be viewed without scrolling. Many newer devices also replace the physical keyboard with a digital version that appears when needed and can take up to half the screen space on a mobile device; avoiding text-entry interactions removes this concern.

Touch capabilities. With the right planning, touch screens add a fun, interactive element to your resources. Your learners will be using one fingertip for control, so size and space elements with this in mind. Try this: move your mouse pointer to the darker border of this blog and place your finger next to it on the screen. See the size difference? That is how much bigger your clickable elements have to be. Fingertip control comes with different functionality, too. Think through how your learners will interact with the resource as you design it: a rollover is a great way to present information, but not if the screen prevents learners from accessing it. Clicks, swipes, and drag-and-drops will work the same with a mouse or a touch screen.

Synchronization. Most users sync their devices so that they can access the same resources on their phone, tablet, and desktop.
Job aids, databases, and other mobile resources should also be tested on traditional computers to ensure learners will have access to the same help in any situation. It is also common to use more than one device at once: if an employee is experiencing software trouble, a troubleshooting resource on a mobile device lets them look at both screens at the same time, and images and instructions that can be directly compared from one screen to the other make a powerful intervention.

Matching the right design with the right purpose will help your mobile learning strategy grow into a valuable, effective resource for your learners.

Part two of two
Peter Engels, Instructional Designer/Developer, Handshaw, Inc.
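As a rough, back-of-the-envelope illustration of the fingertip-versus-pointer sizing point in this post: touch-target guidelines are usually stated in physical units (something on the order of 9 mm per side is commonly cited), so the pixel size an element needs depends on the screen’s density. The function and the 9 mm default below are our own assumptions for illustration, not figures from the post.

```python
def min_touch_target_px(ppi, target_mm=9.0):
    """Convert a physical touch-target size into pixels for a screen
    of the given density (pixels per inch). The 9 mm default is an
    assumed, commonly cited guideline, not a hard standard.
    """
    inches = target_mm / 25.4  # 25.4 mm per inch
    return round(ppi * inches)
```

On a 326 ppi phone screen, for example, a 9 mm target works out to roughly 116 pixels per side, far larger than anything a mouse pointer requires.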
Dick Handshaw · Blog · Sep 05, 2015 04:08am
I recently conducted a learner try-out event of a training program for a client who is implementing a new, enterprise-wide system that will significantly change the way their employees complete their daily work tasks. The prototype I tested was a combination of web-based training and instructor-led training. After the event, the client expressed surprise at the positive feedback received from the participants about the system. There is a cultural concern that employees will not like the new system and will resist the change. However, the majority of the feedback during the event was a mildly surprised, "I like the system." Guess what the client attributed this positive feedback to? The training.

Any training intervention you design is an opportunity to sell learners on the business goals, not just teach the tasks required to meet them. Often during our analysis phase, we uncover attitudinal goals that require learners to make a choice to perform a task, and we include this performance objective as part of our training. I believe there are always attitudinal goals to consider, whether they are a documented part of our task analysis or not. As a performance partner, we want to take advantage of every opportunity our training materials have to establish our learners’ buy-in to the change.

The ideal way to address the sometimes unspoken attitudinal goal of business-case adoption is to take a step back from the training and help our clients look at the bigger picture of how to best sell and deliver the change. Developing a communication plan helps establish a clear channel between the drivers of the change and the receivers of the change. Once a plan is established, we can help deliver that communication consistently downstream through training materials and before, during, and after training events.
If that strategic level is unavailable to us for a given client or project, we can still help facilitate the change for learners solely through training materials. As an instructional designer, the better you understand the client’s business case for the change, the more you can communicate it through your writing in the materials. Additionally, building the training to be the best it can be helps adoption. For instance, web-based training with hands-on practice that functions consistently, provides practice opportunities, and offers valuable feedback helps learners accept the change by making them feel comfortable with it.

For this client, our training materials helped learners accept the change by helping them understand the reasons behind it. At the same time, the design of our web-based training courses made learners comfortable navigating the system, and they began to like working in it. One special concern in the case of training for a new system is managing expectations between system-simulated web-based training and real-life experience. By teaching learners coping strategies for when something goes wrong, along with the tasks themselves, we can bridge the buy-in we established through early change communications and during training with learners’ actual on-the-job experience.
Dick Handshaw · Blog · Sep 05, 2015 04:08am