
During the Development phase of each project, we build a prototype of the courseware and test out our instructional and assessment strategies with a small group of our target audience in a "Learner Tryout" (LTO). This process never fails to provide us with valuable and critical feedback that helps us ensure our final deliverables will be as effective as they can be. However, while the process is down to a science, the art of the LTO is a different story. You're bound to encounter at least two of the following "Murphy's Laws of LTO":

1. You'll never have enough time to build the prototype. You could estimate how long you think it'll take to build, add on a week, and you'd still be wrestling with it at the last minute before the LTO. That's because during the prototype, you're ironing out your process, hammering out the technical details of your development tools, wrestling with content for the first time, and sorting through a few hundred other details. The good news is you're preparing for a smooth ride ahead. Just keep this fact in mind during your schedule planning and try to build in any extra time you can afford.

2. If it can go wrong, it will. Even if you've tested your prototype thoroughly before the LTO, you'll show up for the LTO and something will "break." Sounds bad, but you're actually testing a piece of the final courseware in the client's environment well before implementation, so you can work out technical details now instead of on the day it's released to thousands of learners. When something does not work as expected, get creative about work-arounds, explain the situation to your participants (who are usually more than understanding), and try to keep your technical person on hand and available by phone.

3. Expect the unexpected. Something will always take you by surprise. You could go over your design and prototype with a fine-toothed comb, but the minute you place the prototype in front of an actual learner, you'll discover something new. Often it's something obvious that you observe with the first participant, like the participant tries to Tab through the fields and your simulation requires them to press Enter. Discoveries like these are LTO gold.

Depending on which of these you encounter, conducting a successful LTO can require flexibility, patience, creativity, and more, but you'll always walk away with something of value.

Beth Hughes is a Senior Instructional Designer at Handshaw, Inc. She takes projects through the entire process of instructional design and development, incorporating learning principles, instructional needs, and methodologies into the best learning solution for each client. Beth earned her M.Ed. in Instructional Systems Technology from UNCC.
Dick Handshaw . Blog . Jul 31, 2015 09:48am
June is a great time for a visit to our nation's capital and I'm really looking forward to my time there. On the 12th, I'll be visiting the Potomac ISPI - Greater Washington Area chapter and will be presenting "SMEs: It's a Marriage for Better or Worse." As the title suggests, all of us who develop training rely heavily on our Subject Matter Experts for success. We can't work without them, but many of us find it challenging to work with them. I called on Handshaw, Inc. clients and employees to find out some of the toughest issues they experience working with subject matter experts and successful strategies for resolving them. I will also draw on the experience of my audience as we discuss the Top Ten issues based on my research and their potential solutions.

On June 14th, I will be visiting the ASTD Maryland - Baltimore, MD Area chapter. I'll be presenting "Doing More with Less: Shortcuts Don't Work." In order to run a profitable learning business for the past 28 years, we've learned which shortcuts work and which ones don't. Most of them are not very effective. It seems there is no substitute for just doing things right the first time. In this presentation we will first examine the true cost of developing learning and take a surprising look at where most of the expense goes—and it's not where most organizations think it goes. Then we'll look at several shortcuts that don't work and a few that do. We'll wrap up by discussing some strategies that will help you do more with less. Don't expect any silver bullets here; just a good dose of reality.

In closing, I'd like to thank the members of ASTD Chattanooga for a great turnout and an enthusiastic reception during my visit last month. I presented "Performance Partnering." As usual, the live reframing role play was the big hit of the presentation. Special thanks to the two volunteers who participated and helped make this session a success.
Dick Handshaw . Blog . Jul 31, 2015 09:48am
I enjoyed a great trip to Washington DC on the 12th of June when I spoke to the ISPI Potomac chapter at the National Science Foundation. The best thing about this meeting was that I didn't have to do all the talking; I may not have even done most of the talking. The membership was very vocal with their experiences and opinions about working with subject matter experts. Even though not a lot of people talk about this topic (I know Daryl Sink does), it always sparks a lot of debate. All instructional designers rely on subject matter experts to do their jobs, yet SMEs may not always get enough of our attention. Thank you to the Potomac Chapter for your insights and opinions on the matter.

On Thursday the 14th, I visited the Baltimore Area ASTD Chapter at a very nice University of Maryland campus near the Baltimore Washington Airport. Our topic was something I haven't talked about in a while, "Doing More with Less." Just because I haven't talked about this topic in five years doesn't mean things have changed. We are still being asked to do more with less. If anything, the pressure to do more with less has only gotten greater. The topic must have resonated with my audience because once again, we had a spirited discussion allowing me to do a lot of listening as well as talking. My bonus for the day was getting to Baltimore in time to visit the Inner Harbor for the "Sailabration," part of the celebration of the 200th anniversary of the War of 1812. Seeing all the tall ships in the harbor was a nice unexpected benefit.

What's next? On Thursday the 28th of June, I'll be talking about "Doing More with Less" again, this time in a webinar for the Armed Forces Chapter of ISPI at 1300 hours. I'll be joined by my colleague and producer for the event, John Wyville, Program Chair with the Armed Forces Chapter. If you are a member, please join us. I would like to thank the Armed Forces Chapter for inviting me, and especially, thank you for your service.
Dick Handshaw . Blog . Jul 31, 2015 09:48am
If you look at the Events tab, you'll see that I am in Jacksonville and Orlando, Florida on the same day. It's not a typo. I've done multi-day workshops many times. I even did several two-day workshops in the same week in different cities in Nebraska once. Last month I did presentations in Baltimore and Washington in the same week. This month, we are trying something different. I'll do a 90-minute presentation on Performance Partnering for ASTD in Jacksonville during a lunch meeting from 11:45am to 1:15pm. Then I'll drive to Orlando to do an evening meeting from 6:30 until 8:00 for the newly formed Central Florida Chapter of ISPI. I'm glad to be able to share this presentation with both groups and also glad I don't travel home until the following day!

The large volume of activity I've had with the Performance Partnering workshop and presentation tells me that people are interested in becoming consulting partners to their clients, rather than being order takers. The Robinsons, who inspired me with their books, workshops and presentations for fifteen years, have retired, but there seems to be plenty of interest in their work. Although the design of my presentations and workshops is entirely my own, the methods, strategies and tactics I use are derived from their work and what I have learned from them. I consider their work to be a great legacy for all of us and I am doing my best to carry on their work while I add my own thirty years of experience.

Performance partnering is not magic and it's not difficult. Any learning professional can learn and implement the eight principles I use in both proactive and reactive consulting. By using these skills and applying the simple process of creating a "gaps map," we solve real performance problems much better than we can with learning solutions alone. Not only is this welcome to the bottom line of our organizations and to the careers of our colleagues, but it makes learning professionals far more valuable and less likely to disappear in hard economic times. As you can tell, I'm looking forward to a busy day in Florida.
Dick Handshaw . Blog . Jul 31, 2015 09:48am
Earlier this year, Chris Adams and I submitted our proposal to speak at the 3rd annual mLearnCon in June. We planned to tell the stories of how and why we developed mobile learning solutions for at least three of our clients. By the time we got to San Jose, however, we only had one complete story to tell.

What I learned at the conference is that we are not alone. Generally speaking, what I heard from the experts is, "now is the time for mobile learning, but…" The "but" is the part that got in the way of our other two success stories. I came to the conference for answers to questions. Instead, what I found were others with similar uncertainties about how to build and implement mobile learning solutions.

Like all good conferences, mLearnCon offered a wide variety of topics. I chose to attend sessions on selling mobile learning to the enterprise, timing the introduction of mobile learning, and so on. Chris, on the other hand, fed his insatiable appetite for knowledge in various development-focused sessions. The remainder of this blog will certainly not satisfy those with similar tastes, but Chris will address that topic in another post.

Heading into each session I had several questions. How do you develop mobile learning for an organization without a standard set of devices? (In my opinion BYOD, or Bring Your Own Device, doesn't answer that question.) Is there a delivery platform that is going to win out (like Flash did)? How do you satisfy a client accustomed to the rich media associated with traditional development tools and delivery methods? Tablet or smart phone? Nothing I heard convinced me that learners will be taking courses on iPads instead of computers in the next few years. I'm not nearly as convinced as one of the keynote speakers at the conference was that our kids won't be going to traditional 4-year colleges in 15 years. If a person doesn't perform his or her job on an iPad, then I don't see why he or she should take courses on one.

Two tenets of mobile learning that resonate with me are immediacy and size. We have always sought to develop bite-size learning objects that learners can get into, complete, and of course log completion of in a small amount of time. Likewise, we try to make learning easy to access. Unfortunately, learning management systems sometimes make that an impossible goal. What's faster and more digestible than content delivered on your tablet or phone?

The story Chris and I were able to tell is a great one. We developed a mobile solution for the Coast Guard to enable maintenance technicians on an 87' cutter to access and complete Maintenance Procedure Cards (MPCs) on a hand-held device. Each morning the techs "check out" a tablet and use it to view the videos, drawings, and so on that enable them to service assets on the ship. They can also log notes and progress. At the end of the day, they check the tablet back in to record data in a management system. It is not without its kinks, but it is a mobile solution that closes a performance gap.

While I didn't walk away from mLearnCon 2012 with solutions to the challenges posed in the other two opportunities that we were to discuss, I at least learned that we're asking the right questions. It seems as though we're catching this wave pretty early and we just have to figure out how best to ride it out.

David Carmichael, Vice President, Operations at Handshaw, Inc.
Dick Handshaw . Blog . Jul 31, 2015 09:48am
Most moderate to large organizations now have a learning management system (LMS) to track learning and development. LMSs may even track certifications, skills, or competencies. Some are even starting to apply gamification strategies and include achievement "badges" and leader boards. But two recent experiences have me thinking that perhaps the LMS should be replaced by a new category of application: the Accomplishment Management System, or AMS.

The first experience is my recent re-read of Thomas Gilbert's Human Competence. This foundational book on performance improvement challenges us to measure and engineer accomplishments rather than activities. Gilbert's "First Leisurely Theorem" describes worthy performance as a ratio of valuable accomplishments to costly behaviors.

Noted performance improvement practitioner Carol Panza, speaking at a recent ISPI Charlotte chapter meeting, gave an excellent framework for distinguishing accomplishment from behavior. Just think about any task you perform that has been automated - like creating a business document. Thirty years ago, the behavior may have involved handwriting a draft with a pad and pencil, then loading paper into a typewriter, then typing out the final version. More than one resource may have been involved in the effort! Now, the behavior almost certainly includes opening a word processing application - maybe on a mobile tablet. But the accomplishment - creating a business document - has remained constant. Just from this one example, it becomes clear that if we want to store meaningful information about learning and development, we should look to accomplishments and not behaviors.

So, where's my AMS? More on that next week in part 2 of this blog series…

Chris Adams, Learning Technology Consultant at Handshaw, Inc.
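For readers without the book at hand, a minimal statement of the ratio Gilbert describes, as it is commonly rendered in the performance improvement literature (the symbols are the conventional ones, not quoted from this post):

    W = A / B

where W is worthy performance, A is the value of the accomplishment, and B is the cost of the behavior that produces it. Performance grows more worthy as accomplishment value rises or as behavior cost falls.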
Dick Handshaw . Blog . Jul 31, 2015 09:48am
So, where's my AMS? A quick Google search returns no results for the system I'm after. There are a few entries for "achievement management systems" targeted at K-12 education, but these are focused on student achievement - on enhancing the traditional grade book using technology. Unless enlightened teachers are allowed by school systems to base grades on meaningful performance, these systems won't meet the need.

The second experience shaping my thinking around the AMS is the eLearning Guild's recent mLearnCon 2012 conference. I was there to co-present a case study in mobile performance support with David Carmichael, and within 5 minutes, we both noted that the conference was overrun with information about a new technology: Tin Can. The Tin Can API is the first part of the intended replacement for the ubiquitous but aging SCORM standard. SCORM is the standard by which e-learning content and LMSs communicate. Tin Can seeks to update the standard to use current and near-future web technologies.

One of the most exciting things about Tin Can is that it uses a simple subject-verb-object or, "I did this," format to track learner actions. You could as easily say, "Sally attended a mentoring session," as you could, "Charles completed a compliance course." This greatly extends what LMSs can track today and COULD be a basis for accomplishment tracking. A look at the list of proposed standard verbs (quoted here from www.scorm.com) shows a few rays of hope for accomplishment management:

    Verb -> Result
    experienced, read, watched, studied, reviewed, learned -> Completion
    attempted, performed, played, simulated, answered -> completion, success, score, (interaction details)
    completed, passed, failed -> Special
    interacted -> (interaction details)
    achieved -> completion, success, score, (interaction details)
    attended -> Completion
    taught (by), mentored (by) -> Completion
    commented -> Comment
    asked -> Question
    created, authored, wrote, edited, blogged, shared, posted -> completion, success, score

As you can see, a few of those verbs are well suited for tracking accomplishments. Created fits the needs of our earlier example, while authored, wrote, edited, blogged, shared, posted, performed, and even answered fill similar roles. But a great number of the remaining verbs belong to what Gilbert calls the "Cult of Behavior." I'd like to see some additional accomplishment-related verbs become part of the standard, like built, assembled, generated, designed, decided, and approved.

In the meantime, how could Tin Can help LMSs become AMSs? I think the key is for learning and development areas to transition into performance improvement roles - first by employing good, performance-based instructional design. If we first seek to identify the real accomplishments required of the audiences we serve, we can align our own learning and development accomplishments - including the tracking, measurement, and management of accomplishments rather than just activities.

What about you? Do you have an AMS? Have you seen one? What accomplishments are you tracking? What verbs could help your LMS become an AMS?

Chris Adams, Learning Technology Consultant at Handshaw, Inc.
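To make the subject-verb-object format concrete, here is a minimal sketch of a Tin Can-style statement expressed as a Python dictionary. The learner, mailbox, and activity URL are hypothetical placeholders, and the verb URI follows the pattern the early Tin Can materials used; treat the exact values as illustrative assumptions rather than a definitive rendering of the spec.

    import json

    # A minimal, illustrative "I did this" statement: actor-verb-object.
    # All names, addresses, and URLs below are hypothetical.
    statement = {
        "actor": {
            "name": "Sally",
            "mbox": "mailto:sally@example.com",
        },
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/attended",  # illustrative verb URI
            "display": {"en-US": "attended"},
        },
        "object": {
            "id": "http://example.com/activities/mentoring-session",
            "definition": {"name": {"en-US": "Mentoring session"}},
        },
    }

    # A learning record store would receive the statement as JSON.
    print(json.dumps(statement, indent=2))

Swapping the verb for "created" or "authored" and pointing the object at a produced artifact is what would move a record like this from behavior tracking toward accomplishment tracking.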
Dick Handshaw . Blog . Jul 31, 2015 09:48am
I conducted the first two days of our four-day Results-Based Instructional Design Series for a new client last week and it was a great experience for me and, hopefully, all the participants. As usual, my clients taught me at least as much as I taught them. This was a small group, but it certainly was one of the most energetic groups I have worked with. The manager of this group had been a training manager with another organization and has a background in pharmaceuticals. The other members of the team are new to training and all are accomplished subject matter experts in pharmaceuticals and manufacturing.

I thought SMEs whose backgrounds are industry-specific rather than training-related might struggle a bit with the detail and rigor involved in the instructional design process. However, this was not the case, and this is the part where I began learning from them. My expectations were proven wrong. Their ability to approach the instructional design process without preconceived ideas of their own made them more open and receptive to the principles I have been using and teaching for decades. This group probably produced twice the work output during the class that I normally see.

While I still think it is more difficult for a subject matter expert working alone to design good learning programs, I do think that when they are able to separate their roles of subject matter expert and instructional designer, they can be very successful. As we produced actual work samples of task analysis and performance objectives, one person served in the role of the subject matter expert and the rest of the team served as instructional designers. As we switched to different projects, a different person assumed the role of the subject matter expert. I was amazed at how well this worked.

So what I learned was: you can be a subject matter expert at heart, but when called upon to play the role of instructional designer, with a good process and good skills, you can design perfectly good instructional materials…
Dick Handshaw . Blog . Jul 31, 2015 09:48am
In the last guest blog post, Chris Adams described how Thomas Gilbert's Human Competence is challenging him to measure and engineer accomplishments rather than activities. I'm reading the book as well, and I have already noticed how the "accomplishment" mentality has expanded my view of goal identification.

During an initial analysis conversation with key stakeholders on a project, we began talking about project goals. The client had a long list of tasks that they needed their learners to be able to do in a new system. It was pretty clear that we'd have no trouble identifying the project's instructional goals. However, I wanted to understand more about the real accomplishments the client expects from their learners. To learn more, I asked questions about the gap that led them to select and adopt a new system, what they were hoping to accomplish by implementing the system, and who their best performer currently is. Then I brought the conversation around to measurement - what would indicate that their learners had met this accomplishment, and how were they planning to measure success? The stakeholders didn't have all the answers, but asking these types of questions definitely got the conversation going, and I'm pretty sure they were thinking about their answers long after the call ended.

As instructional designers, we need to identify the instructional goals, or behaviors that the learners need to exhibit on the job, in order to develop effective training. Bringing a performance perspective to this work, where we look at accomplishments in addition to behaviors, improves the outputs of our instructional design process and our client relationships, allowing us to recommend solutions that more comprehensively meet our clients' goals. This perspective also provides us with the information needed to more accurately and completely measure the success of the training solutions we provide to our clients.

Beth Hughes is a Senior Instructional Designer at Handshaw, Inc. She takes projects through the entire process of instructional design and development, incorporating learning principles, instructional needs, and methodologies into the best learning solution for each client. Beth earned her M.Ed. in Instructional Systems Technology from UNCC.
Dick Handshaw . Blog . Jul 31, 2015 09:48am
While I have been spending a lot of energy on Performance Improvement lately, the discipline most of us in the learning profession use most of the time is Instructional Design. This week I will be finishing the Results-Based Instructional Design Series with a client. They have already completed the first two days of in-class instruction and team-based practice. They took about a month to do a great deal of homework based on the first two days. The payoff will be that they will have four completely designed training programs by the time we finish their final two days of in-class instruction and practice. Not bad—four days of class and four completely designed training programs.

On the 21st of this month, I will be headed to the Midlands ASTD Chapter in Columbia, SC for a breakfast meeting. This group also chose an instructional design-related topic. The presentation is called "Learner Validation: Why Guess When You Can Measure?" This is a practice that I have been using since my graduate school days at Indiana University.

In my very first big corporate project for the former First Union National Bank back in 1979, I had the opportunity to design and develop my first eLearning course, even though it wasn't called eLearning at the time. I had an idea of how and why my instructional strategy would work in the delivery option, but I really didn't know for sure if it would work, since I had never seen a program like the one I was building. So I selected ten learners from my sample audience and asked them to help me validate my design strategy. I asked them to take about 45 minutes of instruction from an early microcomputer attached to a video cassette player while I observed them and asked questions.

On the first try, my strategy failed miserably. "Not my fault," I said. "I've never seen this done before." So, I tried another approach based on what I observed and asked the same ten people to help me again. This time, they figured out what I was trying to do and gave me lots of direction on what to do to make it really work for them. I followed their advice and made the changes they recommended. They asked to see the final product, so I tested the 45 minutes of instruction with them once more. It worked and they loved it.

The success of this one program built the foundation for the rest of my career. Were it not for what I learned from my learners about designing eLearning back then, I probably wouldn't be part of this company and I wouldn't have the honor of working with some of the best instructional designers I know. So, that's why I say, "Why guess when you can measure?"
Dick Handshaw . Blog . Jul 31, 2015 09:48am
I volunteered to write a guest blog this week while Dick divides his time between presenting to the South Carolina Midlands Chapter of ASTD and serving on the board of an NC community bank. Usually, when I volunteer, I have an idea of what it is I want to discuss. This time, I had no clue, but since Dick said that September's focus is on Instructional Design, I'll follow suit by describing what we're doing at Handshaw to evolve our methodology.

A couple weeks ago, Dick and I had the pleasure of meeting with a graduate of Florida State's Instructional Systems program who now lives and works in the Charlotte area. Because most of our designers were schooled on the Dick and Carey instructional design model, and it's the basis for our methodology today, Dick couldn't help but ask for some inside stories on Walter Dick, a former professor at Florida State who co-authored the model. His former student described Walter Dick as a great professor and a regular guy, with whom she actually recalled playing on a softball team. I can only assume Lou Carey is similarly down to earth. Carey was in fact a graduate student when she began to collaborate with Walter Dick on The Systematic Design of Instruction. So, I walked away from the meeting with not only a new colleague, but also a reminder of the origins of the Dick and Carey model and a sense that neither author would take issue with our tweaking of their model.

For years we have inserted what we call the "Blueprint" phase into our methodology, where we work collaboratively with our clients to arrive at agreed-upon design recommendations. This phase is a departure from the traditional Dick and Carey model. Now, as our company continues to evolve, we're looking for further opportunities to expand our methodology in a manner that's representative of the types of projects on which we work. Whereas the Dick and Carey model focuses on and begins appropriately with the identification of an instructional goal, Handshaw's starting point is slightly different. There are certainly instances where clients approach us with a clearly defined instructional objective (derived from a business goal and performance goal) on which we can begin our Analysis process. However, more often, we are tasked with first identifying the business goal, either because we need to justify the project to stakeholders or because we want to ensure alignment of our efforts for the success of the project and/or future relationship.

As a result, lately we have debated the inclusion of the tasks required to define the business goal (and associated performance goals) in our standard methodology. Our Production team will meet later this week in an effort to (among other things) standardize our Intake phase and determine how (or if) it should be represented in our methodology. It's important to our practice that we continue to evolve and refine our methodology, since instructional models are not set in stone. Regardless of the outcome, my new colleague's description of her softball-playing instructional design professor leads me to believe Walter Dick will be just fine with whatever updates we make to his model.

David Carmichael, Vice President, Operations at Handshaw, Inc.
Dick Handshaw . Blog . Jul 31, 2015 09:47am
My friend George Piskurich, who has written textbooks on instructional design, once said that "instructional design is a terrible, wonderful thing." I think we can probably say the same thing about the use of PowerPoint in the e-learning world. PowerPoint is easy, it's ubiquitous, and in e-learning it can be insidious. In my opinion, the quality of e-learning is worse today than at any point in my career. And PowerPoint may be to blame—or is it?

PowerPoint slides are a great way to distribute information. We can all create them and they don't require a lot of time. If they are used for what they are good for—conveying information—they are indeed a wonderful thing. Most of us are familiar with Harold Stolovitch's famous quote, "Telling Ain't Training." With that in mind, it is safe to say that conveying information, no matter how easily and well it's done, is not training.

And that brings me to my point about the "teachable moment." I have asked many e-learning developers if their learners actually read or listen to all the information they pack into their courses. Without fail the answer is "no." So it seems Mr. Stolovitch is right; people don't learn from information alone. People learn through practice and feedback. The "teachable moment" in e-learning comes from that magic moment when a learner tries something in a simulation, or picks the wrong answer to a challenging question, and is presented with a hint or feedback. This is when learners stop everything and pay attention. How you manage this magic moment of attention and interest is the most important part of your e-learning course.

There are two basic ways you can use practice and feedback to improve learning from your courses. First, you must create those "teachable moments" by giving your learners the opportunity to do something and make a mistake. Second, you must make the most of those moments by providing an opportunity for the learner to try again, perhaps with a hint or some new knowledge, which enables him or her to succeed. Achieving success by following guided feedback from an incorrect response is the best form of feedback.

So, maybe PowerPoint doesn't get the blame for all bad e-learning. It's PowerPoint alone, without some form of practice and feedback, that can cause our e-learning to fail. In future blog entries I'll address some specific strategies that create opportunities to achieve the "teachable moment" in e-learning. In October, I'll be sharing my ideas about e-learning design with the ASTD Hawkeye chapter in Cedar Rapids, Iowa.

Part one of three in a series
Dick Handshaw . Blog . Jul 31, 2015 09:47am
In my last blog I talked about the importance of the "teachable moment." Now, let's get down to the practicalities of making the teachable moment work for you.

I can't stress enough the critical links from task analysis to performance objectives to measurement strategy and instructional strategy. It is very difficult to begin designing a learning solution when you don't know the output that is required from your learner. It's that old "Begin with the end in mind" thing that Stephen Covey always talked about. A simple flowchart showing what people have to do and what they have to know in order to complete a task is all that is required. This type of task analysis is usually easy for your clients and subject matter experts to read and understand.

Performance objectives serve as a written agreement among designers, facilitators, SMEs and clients describing exactly what the learner outcome will be in performance terms. Perhaps their most important purpose is to help us design our measurement strategy. Apparently we all love multiple choice questions, because I see them used a lot, likely because they are easy and fun to create in e-learning. But a multiple choice question can't measure everything. Only a well-written performance objective, based on a task analysis, can tell you what type of measurement instrument to use. Remember, in the instructional design process, the output of one step is the input to the next.

Instructional strategies don't just happen because something sounds cool to do. They happen for a reason. It stands to reason that once you know how to measure the successful completion of a task, you will know how to teach that task. If you look at your testing instrument, it will become obvious how to teach to that test. And in the training world, teaching to the test is usually a good thing.

If you have done all this, you will be focusing on tasks, results and outcomes more than content. Content may be king, but as my friend Tony O'Driscoll points out, "Context is the kingdom." Only real experience through job-like simulations and decision-point questions can put your content and learning in context. Learners quickly forget what they read or hear. They remember and master what they do in the context of what they have to perform on the job. To get them to the teachable moment, give them some practice, give them some feedback, and they will gain the confidence to learn the rest on the job.

I will be teaching all of this in a new workshop entitled "E-learning Starts with Instructional Design" in Cedar Rapids, Iowa for the ASTD Hawkeye Chapter on October 11th.

Part two of three in a series
Dick Handshaw . Blog . Jul 31, 2015 09:47am
We have defined the "Teachable Moment" as the point at which the learner has been asked to do something or answer a question and has failed, or maybe succeeded by guessing correctly. It is at this point that we have that individual's attention, and it is what we do next that will help that person discover and learn.

Since multiple choice questions are so popular, let's start with one. Suppose I select a wrong answer to a multiple choice question, and the only feedback I get is that my answer is incorrect. My incorrect response is scored and I go on without really knowing why I was wrong. But suppose instead I get a well-written hint that makes me think. I try again, and this time I get the correct answer. Even if I only get the correct answer because I guessed, I still get a message telling me why my answer is correct. Right or wrong, either way I learn something. And if I get the wrong answer on my second try, I still get a message telling me the correct answer with an explanation of why it is correct.

We have been writing prescriptive feedback at Handshaw since I started the company in 1985. I know this practice isn't new, but I see a lot of e-learning with no prescriptive feedback. I can guess that the reason people don't use prescriptive feedback is that it takes time and effort. I once had a developer tell me that he was so busy he "had to take the path of least resistance and just get it out there." I can understand his predicament. But what about all that content that we pack into all those text screens with a few decorative graphics that every developer I've ever asked says people don't read anyway?

You might find it easier to present simple concepts with a question or simulation exercise and let the feedback do the instruction. If learners get things right the first time, they don't have to read or listen to all that content. If they guessed the correct answer, they have the option to listen or read. If they have done something wrong, you can be assured you have their attention. They will listen, read, and learn. The approach I just described also makes e-learning that is self-paced and personalized—something e-learning is supposed to offer our learners.

Sure, we're all going to have to make those informational programs from time to time. But when you are designing something that will require learners to be able to perform important tasks and use knowledge to solve problems, prescriptive feedback used during the teachable moment is the most efficient and effective way to get the job done.

Part three of three in a series
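As a rough sketch of the two-try pattern described above (the question text, hint, and feedback messages are hypothetical placeholders, not drawn from any Handshaw course), the logic might look like this in Python:

    # Illustrative two-try prescriptive feedback pattern.
    # Question, hint, and explanations are hypothetical placeholders.
    def ask_with_prescriptive_feedback(question, correct, hint, explanation):
        answer = input(question + " ").strip().lower()
        if answer == correct:
            print("Correct. " + explanation)  # explain even a lucky guess
            return True
        print("Not quite. Hint: " + hint)     # the teachable moment
        answer = input(question + " ").strip().lower()
        if answer == correct:
            print("Correct. " + explanation)
            return True
        # Second miss: give the answer plus the why, not just "incorrect."
        print("The correct answer is '" + correct + "'. " + explanation)
        return False

    ask_with_prescriptive_feedback(
        "In which phase is the Learner Tryout conducted (design/development)?",
        "development",
        "Think about when the prototype gets built.",
        "The LTO takes place during the Development phase, using the prototype.",
    )

The key design choice is that every branch ends with an explanation, so the learner gets instruction at the moment of highest attention rather than a bare right/wrong score.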
Dick Handshaw . Blog . Jul 31, 2015 09:47am
Last week I had the pleasure of trying out a new workshop for the ASTD Hawkeye Chapter in Cedar Rapids, Iowa. In fact, the workshop was requested and designed by their VP of Programs, Marcie Van Note. Thank you, Marcie! Based on the level of activity the group kept up all day, this one is going into our regular offerings.

Marcie wanted a program on e-learning that could make new designers and developers more successful. We settled on the title "E-learning Starts with Instructional Design." I wanted her ASTD members to understand the links from task analysis to performance objectives to measurement and instructional strategies. We spent the morning doing real task analyses with five different groups. By mid-afternoon, we had written performance objectives, which we used to create measurement and instructional strategies. We kept a focus on e-learning, although at least one group could see that e-learning alone would not do everything they needed for their particular learning solution around sales training. Around 3:30, Chris Adams and Beth Hughes from our staff took over via a live webinar to show examples of some e-learning strategies we have used, along with discussions of tools and other tips.

When I asked people to be honest and tell me if they could really see themselves using these instructional design techniques in their work, they expressed concerns about having enough time to complete the tasks and about whether management would support the new practices. My best advice to them was to simply try some of what they learned and see if it works and is well received. The best way to prove a point is by good example.

I have to compliment all 25 participants for being engaged and very dedicated throughout the day. I really enjoyed your enthusiasm and willingness to try new things. I enjoyed being with you all and welcome the opportunity to come back to Cedar Rapids in the future. Here are a couple of pictures of the group hard at work.
Dick Handshaw . Blog . Jul 31, 2015 09:47am
On the sliding scale of too much versus not enough content, many training developers and subject matter experts err on the side of too much. Chris Adams, Handshaw's Learning Technology Consultant, often says, "Our clients think the hardest part of developing training is deciding what to include; instructional designers know that the hardest part is deciding what to leave out." Omitting unnecessary training objectives benefits our clients and their learners: our clients do not have to pay for extraneous training to be developed, and they save the lost-productivity costs of the extra time their employees would spend in training. Their learners also spend less time in training, and with more relevant content, motivation is higher because the content is applicable and retention is increased because cognitive load is more effectively managed.

So, how do we decide what to leave out? This is what we seek to answer during our task analysis work, and two factors in particular help us make this determination:

1. Performance need - Must the learner complete this task to meet the identified accomplishment?
2. Learner deficiency - Does the learner not know how to do it?

The key to figuring out whether to include the task in training is to realize it's not whether either factor is true, it's whether both are true that makes the determination. For example, let's say we are conducting a task analysis for a course on making coffee, and one task is selecting the right filter size for the coffee machine. During our analysis, we may discover that the correct size filters are delivered with the rest of the office supplies, so this isn't actually a selection that learners need to make. If we use either factor alone to determine whether this task should be included, we could say, "It's true that learners don't know how to select the right size filters, so we'll include it in the course." However, if we insist that both factors must be true in order to include the task, we could say, "Well, it's true that learners don't know how to select the right size filters, but they don't actually need to do so in order to make coffee, so let's leave it out."

If you require that both questions be true about a task in order for it to make the cut into your course content, instead of including it if either question is true, the content of your course will more accurately and more efficiently help your learners accomplish their performance goal.

Beth Hughes, Senior Instructional Designer, Handshaw, Inc.
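As a minimal sketch of the both-factors rule above (the task list and flag values are hypothetical, standing in for the coffee example):

    # Illustrative sketch: a task makes the cut only when BOTH factors are true.
    # Task names and factor values are hypothetical examples.
    tasks = [
        {"name": "Select the right filter size", "performance_need": False, "learner_deficiency": True},
        {"name": "Measure the coffee grounds",   "performance_need": True,  "learner_deficiency": True},
        {"name": "Plug in the machine",          "performance_need": True,  "learner_deficiency": False},
    ]

    # Include a task only if the learner must perform it AND doesn't yet know how.
    included = [t["name"] for t in tasks if t["performance_need"] and t["learner_deficiency"]]
    print(included)  # -> ['Measure the coffee grounds']

Using `or` instead of `and` in that filter is exactly the "either factor" trap the post warns against: it would pull the filter-size task back into the course.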
Dick Handshaw . Blog . Jul 31, 2015 09:47am
Ever notice that when you tell someone you're an instructional designer, they have a hard time figuring out what you do? Try telling people you're a performance consultant. It's even worse. This got me thinking about the differences between the two disciplines and how to draw a line between them. I often tell people, "Instructional design is doing things right, and performance improvement is doing the right things." But is it right to separate the two in actual practice?

I think practicing instructional design without performance improvement does your client a disservice and can potentially waste a lot of time and money. Even if a designer does a great job of designing instruction that enables performers to achieve an instructional goal, if that instructional goal isn't linked to a business goal, your client may not achieve the business goal.

When I bring this concept up to a room full of instructional designers (which I have), I get a lot of pushback. People tell me it's not their job to suggest other solutions to a problem if they do not involve learning. Some tell me it's not even their place to talk to the "true client" when they are given their direction from a training manager. And even when they do get to talk to the actual owner of the line of business, they have a difficult time asking for the desired business goal.

I want to know what you think. Do you distinguish between instructional design and performance improvement, or do you think they should be practiced together?
Dick Handshaw . Blog . Jul 31, 2015 09:47am
In our industry, a lot of time is spent talking about performance goals and accomplishments. They are the driving force in the design process that ensures training is focused on the right tasks. But what drives the performance goals? What "thing" is management so concerned about that employee accomplishments need to be addressed? Whether or not your stakeholders explicitly say so, there is a business goal at risk. Or, as Dick would say, "the thing that keeps managers awake at night."

When we know the business goal, we can use it to direct the design and assessment strategies. Time and resources will be saved when all of the effort has been focused on addressing the issue at the root of the problem. The project stakeholders will appreciate receiving an effective solution the first time and will start to see you as a trusted consultant with a valuable opinion.

So how do you recognize the business goal when speaking with stakeholders? Jim and Dana Robinson identify two types of business goals:

1. Business Problems - a gap impacting the business that needs to be fixed
2. Business Opportunities - a process that is working fine now, but could be optimized to work better

Use this formula to identify the elements of a business goal during your conversations:

What needs to change + By what amount (from X to Y) + In what time frame + To what effect

Let's look at an example: "Reduce production costs due to error from $3,000 per 100 units to $1,000 per 100 units in the next three months to meet Q2 financial goals." You can clearly see what is keeping management up at night: they are worried about profits and have discovered that there are costly errors during the production process. You now have a concrete accomplishment for learners (reduce production errors), the time frame you have to improve it (three months), a metric for assessment (cost of errors per unit), and the impact your training will have on the business (improving the bottom line).

Remember, it could take several conversations to reach this point. You may not always be able to get a business goal, and that's okay. Bringing up the topic can be enough to get everyone thinking about the right solution. When you take the time to build stakeholders' trust and can talk about business needs alongside training needs, you can be sure your work is effective and valuable.

Peter Engels, Instructional Designer/Developer, Handshaw, Inc.
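A tiny sketch of the four-part formula, populated with the example goal above (the field names are hypothetical conveniences, not a standard schema):

    # Illustrative sketch: assembling a business goal statement from the
    # four elements of the formula. Field names are hypothetical.
    goal = {
        "what_changes": "production costs due to error",
        "from_amount": "$3,000 per 100 units",
        "to_amount": "$1,000 per 100 units",
        "time_frame": "the next three months",
        "effect": "meet Q2 financial goals",
    }

    statement = (
        f"Reduce {goal['what_changes']} from {goal['from_amount']} "
        f"to {goal['to_amount']} in {goal['time_frame']} to {goal['effect']}."
    )
    print(statement)

If any of the four fields can't be filled in after a stakeholder conversation, that's a signal the business goal still needs probing.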
Dick Handshaw . Blog . Jul 31, 2015 09:47am
…or for a few days at least. In December, I'll be visiting two of my favorite southern cities: Birmingham, AL and Memphis, TN. Two things they have in common are great hospitality and great barbecue, but that's not really why I'm going.

On December 3rd, I'll be visiting the Greater Birmingham Chapter of ASTD to deliver one of my most requested presentations, "Instructional Design: Demonstrating Value through Results." The industry and many of our clients have seen a resurgence of interest in instructional design in the last five years or so. This program will focus on three low-cost, high-value components of instructional design that I don't see in use that often. The primary focus of the presentation is increasing the use of best practices in instructional design and getting positive results.

On December 7th, I'll be attending and speaking at the annual Employee Learning Week event sponsored by the ASTD Memphis, TN chapter. This year the focus is on performance improvement, with a one-and-a-half-hour session called "Performance Partnering - Proactive and Reactive Performance Consulting." This session is based on the work of Jim and Dana Robinson. We all know we would like to have earlier input with our clients and we would like to be valued more as consultants and business partners. This session uses video and live role plays to help you master the skills of proactive and reactive performance consulting.

If you live near Birmingham or Memphis, please take the opportunity to attend.
Dick Handshaw . Blog . Jul 31, 2015 09:47am
"Instructional Design: Demonstrating Value through Results"
Dick Handshaw . Blog . Jul 31, 2015 09:46am
Annual Employee Learning Week Event "Performance Partnering - Proactive and Reactive Performance Consulting"
Dick Handshaw . Blog . Jul 31, 2015 09:46am
"Leading the Learning Organization"  Join us at 1:00 PM EST…
Dick Handshaw . Blog . Jul 31, 2015 09:46am
"Performance Partnering - Proactive and Reactive Performance Consulting" - 3-hour Clinic
Dick Handshaw . Blog . Jul 31, 2015 09:45am
This has been a challenging year for internal training departments. Our company has seen many organizations go through major re-organizations in 2012. And with every re-organization we are familiar with, we have seen layoffs of trainers and instructional designers. Where is the role of performance consulting in these organizations? In many cases, it has been difficult to find evidence of a performance improvement effort.

It's clear that many organizations are trying to cut costs by moving people around and reducing headcount. This is neither a new nor an innovative strategy. A better idea might be to reduce the number of "Band-Aid" type training programs that are developed and only develop training for training goals that can be linked to business goals. This is the role of performance improvement: to link training goals to strategic business goals. This would dramatically reduce the amount of time productive employees spend away from work attending training classes or sitting through e-learning programs, and it would increase the quality of training programs that contribute directly to the achievement of strategic business goals.

Perhaps the answer to why this isn't happening is that there is some personal risk involved on the part of those who choose to practice performance improvement instead of being an "order-taker" for any training request that comes their way. We are definitely sticking our necks out when we offer to be responsible for affecting outcomes rather than completing activities. This might also have something to do with the fact that many organizations rarely follow up training initiatives with any kind of data gathering to see if any of the desired results were achieved. Measurement and evaluation go hand in hand with performance improvement. It seems to me that our resources would be better spent finding out why certain business results are not achieved, identifying all the causes for the shortfall—not just those related to performance—and implementing a variety of solutions designed to mitigate the causes of failure.

Launching a successful performance improvement initiative in your organization will take time, but it is not difficult. There are risks, as I mentioned, but the rewards will outweigh them. One of the most important factors in creating a successful performance improvement initiative is to find or cultivate leadership with a vision. Having committed leadership is the only way these initiatives can achieve sustained results. Next, you must build internal skills in a consistent manner. Attending training and reading books is not enough to build skills as a performance consultant. You need to practice your skills with a colleague who can coach you and give you feedback on your performance. You need to seek champions in your organization and make them successful. Once you accomplish a successful solution or two, you only need to leverage your results to sustain the effort. Become a true Performance Partner to your organization…

Next week's blog will provide more thoughts on creating a performance improvement organization.

Part one of three
Dick Handshaw . Blog . Jul 31, 2015 09:45am