It is better to know some of the questions than all of the answers. — James Thurber

When you work independently, sometimes the project you accept wouldn’t be your first choice, but you don’t have much in the way of other choices. Some of my work last year fit into that category: for quite a while I was writing or rewriting standard operating procedures in a manufacturing facility, which is not as exciting as it sounds. Lately, opportunities pique my interest more, either because of the client’s business itself or because of how that business connects to the outside world. That’s the "work worth doing" part: when what you’re working on seems to have a larger value than the compensation you receive for it.

On a different track, I’ve been finding out more about government and social software tools. On Twitter, I follow Jeffrey Levy (@levyj413), director of web communications for the Environmental Protection Agency, and someone who seems to enjoy his work. I’m very interested in how the government will make use of new tools, and Levy’s approach makes a lot of sense to me. Remember: mission first, choose the right tool, measure and evaluate, and then teach the rest of us.

Thanks to Levy’s messages, I find useful ideas and smart people like Gwynne Kostin, whose blog, Gwynne on dot-gov, asks, "How do we use technology and communications tools to make government more useful, more efficient, and more transparent?" Take her post "Open data: compare and contrast," which highlights some of the constraints government works under, for example, the implications of the Privacy Act, which bars agencies from releasing records without a request from, or the consent of, the individual to whom the record pertains. No answers, but good questions to think about before everyone’s entire data history is available online to everyone else.
Dave Ferguson · Blog · Aug 19, 2015 05:20pm
Until this morning, I didn’t know about Ada Lovelace Day, which seems to have resulted from a pledge/challenge by Suw Charman-Anderson. The idea was to highlight women in technology. Looking at my feedreader, my Twitter stream, and my Facebook page, I see quite a few women whose work hinges in some way on technology. But the first name that came to mind is someone not all that well known any more. In her later years, she was known as "Mother COBOL." Typing that reminds me that COBOL isn’t all that well known any more, either.

Grace Murray Hopper taught mathematics at Vassar in the 1930s while earning her PhD at Yale. She resigned her position late in 1943 to join the Navy Reserve WAVES. As a lieutenant with the Bureau of Ordnance, she was assigned to work on the Mark I computing machine, for which she eventually produced a 500-page manual of operations. The Navy used the Mark I for gunnery and ballistics calculations. It was 55 feet long, 8 feet high, and contained over 750,000 parts. It was the predecessor to several other early computers, such as UNIVAC.

Hopper invented the compiler, the program that translates computer programs into machine language. She claimed that she did so because she was lazy; the compiler did the grunt work and allowed her to focus more on mathematics. Her FLOW-MATIC compiler so greatly influenced COBOL that she’s known as the mother of COBOL. In 1997, the Gartner Group estimated that 80% of the world’s business ran on COBOL.

After World War II, Hopper tried to transfer to the regular Navy but was turned down because of her age (she was 38). She remained in the Navy Reserve until 1966, retiring as a commander. She was recalled to duty "for a six-month period" that lasted four years, and after retiring again was asked to return once more. When she finally retired for good, Rear Admiral Grace Hopper, at age 79, was the oldest officer in the Navy. The ceremony was held on board the Constitution, the oldest ship in the Navy.
Hopper died in 1992, at 85, and is buried in Arlington National Cemetery. In 1969, Hopper won the Data Processing Management Association’s first "man of the year" award. The Association for Computing Machinery gives an annual Grace Murray Hopper Award for young computing professionals. The USS Hopper, a guided missile destroyer, is only the second U.S. Navy warship named for a woman.

(I added the following very late in the day.) The Naval Historical Center in Washington, DC, has information about Hopper, including this image: a page from the log book for the Mark II Aiken Relay Calculator used at Harvard University. The entry, for Sept. 9, 1947, explains that a moth was found at Relay #70, Panel F. "First actual case of bug being found." Reportedly, the moth was taped into the log book, and the entry made, by Grace Hopper.
Dave Ferguson · Blog · Aug 19, 2015 05:20pm
Series: Ten Steps to Complex Learning • Next post in this series »

I’ve been (slowly) reading Ten Steps to Complex Learning, by Jeroen J. G. van Merriënboer and Paul A. Kirschner. The subtitle explains why: A Systematic Approach to Four-Component Instructional Design.

I read a lot about the death of instructional design, the end of training, and the New Jerusalem of learning that’s due any day. Certainly a lot of superstition and nonsense gets daubed with the label "instructional design," like a kind of cognitive Clearasil. Still, I can’t help thinking that few people are going to learn to manage power-generation stations, conduct clinical trials, sell aircraft engines, or produce FEMA-acceptable flood elevation certificates solely through self-guided learning.

So I decided to plow through this book, which I’ve described with a bit of humor as being written in a language very much like English: the prose is dense and very academic. So far it’s worth the effort, and I’m going to summarize key parts here. (Key part: something I pay enough attention to that I make a note on paper as I’m reading. This is an ancient custom among my people.)

Van Merriënboer and Kirschner aren’t shy: the fundamental problem facing the field of instructional design these days is the inability of education and training to achieve transfer of learning. Which is something like AIG not being able to actually insure anything, isn’t it?

One point the authors make is that most complex skills require the learner to coordinate a range of "qualitatively different constituent skills." That last phrase is important to them: not only is the whole of a complex task more than its parts, but the constituent skills are not parts of the larger task but aspects of it. They’re not sub-skills that you add together to make up the Big Skill. Which, they argue, makes the analytic approach of many traditional instructional design methods counter-productive.
For example, what they call the transfer paradox comes into play: the instructional methods that work best for isolated objectives often work poorly for integrated objectives. To make that plainer: we spend too much time fiddling around with nice, clear, low-level objectives. Then we lack time and money (and, perhaps, the will) to develop integrated learning. Then we wonder why the training/learning function has such a dismal reputation. Those isolated objectives are what we tend to grab onto, because it’s easier to design around them, easier to create test items, and easier to cram them into an LMS ("Lessons Mean Simplicity").

Learning to use the Amtrak reservation system is a complicated task, but maybe not all that complex. Learning to act on travelers’ questions is also complicated. Developing training for either set of skills is inherently less difficult than developing holistic training for an effective Amtrak reservation agent, but that’s what Amtrak’s really looking for. The usual answer to the problem often seems to be "watchful waiting": the performers go from training to the job, and we hope that their random encounters with reality end up filling the gaps.

Van Merriënboer and Kirschner want to grapple directly with such complex learning problems. The model they advocate sees four main components in a learning blueprint:

The learning tasks that someone needs to master. (Strictly speaking, I’d say these are the on-the-job tasks which the person currently doesn’t know how to do, but it’s not my model.)

The supportive information that comes into play when you’re working with skills that are performed differently from problem to problem. These skills, which they call schema-based, benefit from things like mental models of the overall domain (e.g., pharma research) and cognitive strategies for working in that domain.

The procedural information that guides those skills that are performed the same way from problem to problem. This is the how-to knowledge (e.g., using the clinical trials database) that’s a routine part of the overall task.

Part-task practice to strengthen and automate certain "recurrent constituent skills."

Van Merriënboer and Kirschner argue that people can only perform certain constituent skills (which are aspects of the larger task, remember) if those people have a certain level of knowledge about the larger domain. "Select an appropriate database," as they point out, doesn’t make any sense if you don’t know what makes a database appropriate to the search you’d like to perform.

To foster integration and avoid compartmentalization, their model emphasizes inductive learning: you as the learner work with specific problems so that you build and improve mental models for the principles behind those specific problems.

…all learning tasks [should] differ from each other on all dimensions that also differ in the real world, such as the context…in which the task is performed, the way in which the task is presented, the saliency of the defining characteristics, and so forth. This allows the learners to abstract more general information from the details of each single task.

That’s how we learn a great deal of what we know. And, yes, a good deal of that happens informally, though I don’t see that as an argument against trying to create learning situations where the informal can happen more predictably and more rapidly.

Related to this idea, the authors advocate always having learners work with whole tasks. That might mean starting with simple cases or examples. Other approaches include providing support (say, a process overview for the clinical-trial system) and guidance (a job aid for forming database queries). They also make use of task classes, by which they mean categories of tasks.
In their ongoing database-query example, one task class covers searches where the concepts are clear, the keywords are in a specific database, the search involves few terms, and the result includes only a limited number of articles. I’d call that the "clear, simple search" class. You can imagine the other extreme: a poorly phrased request involving unclear concepts, little knowledge of the appropriate databases, complex search queries, and large numbers of relevant articles that require further analysis. How many task classes do you need? That seems to depend on the range of variation between the Clear Simple Search class and the Nightmare Search class.

There’s a lot more going on; without intending to, I guess I’m starting another series.

The posts in this series: Complex learning, step by step (that’s this post) • Complex learning (coffee on the side) • Ten little steps, and how one grew
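One way to see what a task class is doing is to treat it as a function of the difficulty dimensions vM&K vary. Here’s a minimal sketch, assuming hypothetical thresholds and class names (the "clear, simple search" and "nightmare search" labels come from this post; everything else is mine, not the book’s):

```python
from dataclasses import dataclass

@dataclass
class SearchTask:
    """One literature-search learning task, scored on the dimensions vM&K vary."""
    concepts_clear: bool      # are the concepts behind the request well defined?
    database_known: bool      # do we know which database holds the keywords?
    term_count: int           # how many terms the search query needs
    expected_results: int     # rough number of articles the search returns

def task_class(task: SearchTask) -> str:
    """Assign a task to a difficulty class by counting complicating factors."""
    factors = sum([
        not task.concepts_clear,
        not task.database_known,
        task.term_count > 3,          # arbitrary illustrative thresholds
        task.expected_results > 50,
    ])
    if factors == 0:
        return "clear, simple search"
    if factors <= 2:
        return "intermediate search"
    return "nightmare search"

print(task_class(SearchTask(True, True, 2, 10)))     # clear, simple search
print(task_class(SearchTask(False, False, 6, 500)))  # nightmare search
```

The point of the sketch isn’t the scoring rule (which is invented); it’s that the number of classes you need falls out of how many distinct combinations of those dimensions your learners will actually face.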
Dave Ferguson · Blog · Aug 19, 2015 05:19pm
Series: Ten Steps to Complex Learning « Previous post in this series • Next post in this series »

Ten Steps to Complex Learning says that plans for such learning should always include the learning tasks, supportive information (for skills you apply differently from problem to problem), procedural information (for skills you apply the same way each time), and part-task practice for skills that demand a high level of automaticity.

Components: how you put complex learning together

The components work to integrate rather than compartmentalize skills, to coordinate the application of skills with associated knowledge and attitudes, and to differentiate between the methods that help people learn different kinds of tasks. In his comment on the previous post in this series, Dave Wilkins talked about call center workers who could either use the software well or connect well with customers, but not both, because their training never integrated the two.

That doesn’t mean you try to teach everything at once. You can, and should, present realistic problems that start with a simplified version of everything at once: a whole task with a great deal of scaffolding (as Van Merriënboer and Kirschner call it). "Whole task," to me, doesn’t mean the entire job (Amtrak ticket agent, trauma center nurse, Starbucks store manager). It’s a flexible term, like "relatives," and it makes sense in context.

Even if, like me, you’ve never worked at a Starbucks, you can imagine some high-level whole tasks for the store manager: keep the store equipped and supplied; keep the store staffed; comply with company policies and procedures; serve customers.

Where to begin? In one sense, it doesn’t matter. The Ten Steps model is systematic (there are inputs, processes, and outputs; the outputs from one area become inputs to another) and systemic (activity in one part influences another).
Manage learning so they learn to manage

Here’s one way I see this in action: let’s say you begin with "keep the store staffed" tasks for that Starbucks manager-to-be. Constituent tasks might include hiring, scheduling, training, and coaching. (You’ve already noticed that training and coaching have connections to the "serve customers" cluster of tasks, and probably to the "comply with policies and procedures" one, haven’t you?) I’m not going to do a whole store-manager analysis (unless Starbucks is dazzled by my insight); I’ve just chosen this as a complex learning problem.

You can picture specifics in the "staff the store" area: identify staffing needs, recruit candidates, interview candidates, hire employees, train employees, schedule employees. (Task analysis, by the way, is a prerequisite but lies outside the Ten Steps framework. You can’t teach tasks if you don’t know what they are.)

How do staffing problems vary? A simple learning problem comes with lots of scaffolding: "Carla is sick; she won’t be here at 2." "Okay, let me check her shift and see who’s off today." Carla: 2 - 6. Roster: Tomas, Junelle, and Van are off; Ben came in at 10 and leaves at 3; Paula comes in at 4. That could be the beginning of a full case study (on paper, in video, whatever) about which the learner would answer questions or make judgments. You add richness by pointing out that Carla is a barista, but Paula hasn’t yet learned to make all the drinks, so Paula can’t fill in for Carla. A variation might include information about the skills of other people scheduled to work: "Irene can make the drinks, and I’ll put Paula at the register."

The most difficult case is what vM&K call a conventional task: a situation and an outcome to reach, period. Like, "The new store opens on the 15th. Staff it."

An example of supportive information: principles for asking (or telling) people to work overtime; guidance for offering extra hours.
You’d make these appropriate to what vM&K call the task class (a group of equivalent learning tasks with roughly the same level of difficulty). You’d also present the supportive information ahead of time, because it interferes with on-the-job performance. By contrast, procedural information is helpful when it’s just-in-time. One definition of a job aid is "an on-the-job guide that tells you what to do and when to do it." Imagine a scheduling tool that displays hours per day and per week for each employee, so the manager could see at a glance that Jeff’s got too many hours to be a substitute for the ailing Carla.

As for part-task practice, you might want to strengthen the new manager’s ability to quickly and accurately track hours in the store’s scheduling and payroll systems. That’s because successfully applying rule-based procedures strengthens the use of those procedures. (Please don’t mistake these speculations for actual advice on training managers of coffee shops. They’re hypothetical examples: like metaphors, but you can charge more.)

Meanwhile, back on the job…

Van Merriënboer and Kirschner say that the fundamental problem of instructional design is the failure to transfer learning to job performance. The Ten Steps approach tries to avoid that failure in several ways:

Whole-task learning helps integrate skill, knowledge, and attitude, which means you’re more likely to connect a new situation to things you already know.

A progression from easy to difficult tasks builds your ability to coordinate the various skills, knowledge, and attitudes in context. In the olden days, we talked about "increasing approximations of on-the-job behavior."

Combining rule-based information (for skills you perform the same way each time) and schema-based information (for skills you apply in different ways each time) has a double effect: automaticity frees up cognitive resources.
You’re not analyzing individual letters or grammar structure as you read this post, because your reading-text skills are automated. A mental model (a schema) helps you interpret new situations in terms of the structures you already know.

Another benefit of using a schema: you learn to monitor and adjust your own performance. When I was trying to learn CSS, time and again I’d try one of the problems in Head First HTML (which, in case I haven’t said so this month, is a fantastic example of complex learning in action) only to have my solution fail. But I’d made enough progress that I could study the problem and my code. Eventually, I’d say "Ohh!" with a combination of insight and exasperation. I saw what the mistake was, and I connected the situation (what I wanted to do) with the solution.

Next time, I promise, I’ll start talking about the ten steps. Be warned, though: vM&K fibbed in the title. Though there are, theoretically, ten steps that could be followed in a specific order, in real-life instructional design projects, switches between those activities are common, yielding zigzag design behaviors. To compensate, you’ll get a bonus learning theory at no extra cost.

CC-licensed "coffee learner" photo by Earl - What I Saw 2.0. CC-licensed "self-awareness" photo by jasoneppink.

The posts in this series: Complex learning, step by step • Complex learning (coffee on the side) (that’s this post) • Ten little steps, and how one grew • Problem solving, scaffolding, and varied practice
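The scheduling job aid imagined above (spot who can cover Carla’s shift, and why Jeff can’t) can be sketched in a few lines. Everything here is hypothetical: the roster data, the skill names, and the 40-hour weekly limit are mine, invented to match this post’s scenario, not any real Starbucks policy:

```python
# Hypothetical roster: name -> (skills, hours already scheduled this week, in today?)
WEEKLY_LIMIT = 40  # assumed cap; a real policy would come from the organization

employees = {
    "Paula": ({"register"}, 20, True),            # hasn't learned all the drinks yet
    "Irene": ({"register", "barista"}, 25, True),
    "Jeff":  ({"register", "barista"}, 39, True), # already near the weekly limit
    "Tomas": ({"register", "barista"}, 18, False) # off today
}

def substitutes(needed_skill: str, shift_hours: int) -> list[str]:
    """List employees who are in today, have the skill, and stay under the limit."""
    return [
        name
        for name, (skills, hours, in_today) in employees.items()
        if in_today
        and needed_skill in skills
        and hours + shift_hours <= WEEKLY_LIMIT
    ]

# Carla (a barista) is out for a four-hour shift: who can cover?
print(substitutes("barista", 4))  # ['Irene'] -- Jeff would exceed 40; Tomas is off
```

That’s the "at a glance" part of the job aid: the tool does the rule-based filtering (procedural information), leaving the manager to exercise the judgment the rules can’t capture.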
Dave Ferguson · Blog · Aug 19, 2015 05:19pm
Series: Ten Steps to Complex Learning « Previous post in this series • Next post in this series »

You’ve probably said to yourself, "Is obair-latha tòiseachadh." No? Maybe you’ll agree that getting started is a day’s work. Part of what I’ve worked on (or fretted about) the past few days was how to move from the overview chapters in Ten Steps to Complex Learning to those dealing with the actual steps. So here’s a chart I made, showing the four components for complex learning (boxes on the left) and the steps aligning with each component. Remember, the application of the steps is not necessarily linear, but the diagram is, and the discussion will probably be. Whew. On to Step 1…

Admit it: sometimes you think it’s the content.

Van Merriënboer and Kirschner claim, maybe a bit simplistically, that "traditional" instructional design starts with the subject matter and adds practice items. The Ten Steps, in contrast, start with "whole-task practice tasks," which become the backbone for everything else. "Whole-task practice tasks" isn’t a felicitous phrase, but by now the idea is clear: meaningful tasks that look like a complete element within the overall complex job. "Interview a job candidate" is a meaningful task; "ask open and closed questions," in my opinion, is not. It’s one of those constituent skills that make sense to the learner in the context of the whole task (interviewing).

We’re talking about complex learning here. In real life, it’s hard to figure out a complex problem; it’s hard to be sure the solution will work. In fact, recognized experts will disagree about the best solution. (How do you prevent traffic jams? How do you unravel an existing jam?)
Using real-life tasks as the basis for learning tasks…[confronts] learners with…the constituent skills that make up complex task performance….it [engages] the learners in activities that directly involve them with the constituent skills…as opposed to activities in which they have to study general information about or related to the skills.

So the French were right: en forgeant, on devient forgeron. By working at smithing, you become a blacksmith. You sometimes see this translated as "practice makes perfect," but that’s not the case. Practice can reinforce what you’re doing, but on its own it doesn’t guarantee you’re doing the right things, let alone doing them right.

vM&K advocate that the learning tasks you design put learning before performance. By that they mean: have learners focus their attention on the cognitive processes for learning, rather than simply on performing the tasks. I think that may be one of the biggest challenges in this model. We’re not accustomed to thinking about how we learn. We’re not used to stepping back and watching ourselves as we perform. This isn’t going to please the training-as-sheepdip crowd.

For people who actually want to encourage complex learning, though, the fact that it’s not simple isn’t a deterrent. They want to know ways to make whole-task learning effective, like changing the environment in which learners perform the tasks, and providing support and guidance.

Environments: getting real about simulation

You can learn many complex skills in the real work environment. But often, the on-the-job setting hinders learning. You can’t provide the necessary support (no expert available; no place to put her; she has no time). You can’t present the right task at the right time (because useful Problem X doesn’t occur predictably). It’s expensive, inefficient, or dangerous for someone to learn on the job. Finally, the amount of detail in the real-life setting can overwhelm the novice. Thus, simulation.
And I’ll bet you’re thinking of high-tech machinery or immersive online environments. Not bad, but not the only way to go. vM&K argue that simulations can differ from real life in two important ways: physically (looks like the real job) and psychologically (feels like the real job). And it’s the psychological fidelity that’s more important. A setting with too much physical realism often provides "seductive detail" that distracts the learner. That explains why you hate those fatheaded computer simulations where you have to walk through doorways and press elevator buttons and open doors. You already know how to do that stuff; what you want to learn is how to do a real-life complex task, like managing a project or negotiating with a vendor.

Support with the problem, guidance with the process

To design effective learning tasks, you need not only the real-life problem with its situation and details, but also an acceptable solution and, ideally, the problem-solving process that generated the solution. This means you’ve got a worked example (which provides the task support) and the process-related information (the guidance).

vM&K give an example task: controlling air traffic. Worked examples might include radar and voice information about a particular problem situation (the given), similar information showing a safe resolution of the problem (the goal), and actions necessary to reach or maintain safety (the acceptable solutions). Guidance helps the learner approach the problem through useful approaches and heuristics, for example, strategies to help reach or maintain safe air traffic situations.

Setting aside the process stuff for a bit, what you have is a kind of high-level recipe for learning tasks. Each task has a given state, a goal, and one or more acceptable solutions. Varying those elements gives you different types of tasks:

A reverse task presents a goal and an acceptable solution. The learner has to figure out the given.
(The images on your web page, which looked fine yesterday [example A], are messed up and look like this [example B]. How come?)

An imitation task gives you a case study (given, goal, solution) and a "conventional task" (given and goal); you work from the case study to solve the new problem. (If you need to sell this to clients, it’s "case-based reasoning.")

Tasks with non-specific goals push learners to explore the problem area, to move past solving the immediate problem and think about how problems get solved. In vM&K’s ongoing research example, the learner receives a research question but, rather than a highly specific goal, this task: come up with as many research queries as possible that could be relevant, and make those queries.

Completion tasks provide the givens, criteria for a goal state, and partial solutions. These require learners to study the partial solutions. Completion tasks, the authors claim, are especially useful in design-oriented task areas.

All these approaches encourage the learner to think about the problem, the solution, and useful steps. That means they’re also abstracting like crazy, mining the solutions and using induction to build cognitive frameworks. For those complex, non-recurrent skills, this means more than lots of the same type of practice. "The bottom line is that having students solve many problems on their own is often not the best thing for teaching them problem solving." Or: extra work doesn’t make you better at math; it just makes for extra work.

Speaking of work, there’s enough of it in this post. I’m not quite done with Step One, so the next post in this series will touch on tools for problem solving, guidance, and induction.

CC-licensed mockup of computer app by striatic.

The posts in this series: Complex learning, step by step • Complex learning (coffee on the side) • Ten little steps, and how one grew (that’s this post) • Problem solving, scaffolding, and varied practice
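The recipe of given/goal/solution can be made concrete: each task type corresponds to which elements the learner receives. Here’s a toy classifier, assuming my own encoding (the type names come from the post; the idea of mapping element-presence to a type label is mine, not a formalism from the book, and the non-specific-goal and imitation variants don’t fit this simple presence check):

```python
def learning_task_type(has_given: bool, has_goal: bool, has_solution) -> str:
    """Name a learning-task type from the elements the learner receives.

    has_solution may be True (full solution), "partial", or False (none).
    """
    if has_given and has_goal and has_solution is True:
        return "worked example (case study)"    # everything supplied: study it
    if has_given and has_goal and has_solution == "partial":
        return "completion task"                # finish the partial solution
    if has_given and has_goal:
        return "conventional task"              # situation and outcome, period
    if has_goal and has_solution is True:
        return "reverse task"                   # figure out the given
    return "unclassified"

print(learning_task_type(True, True, "partial"))  # completion task
print(learning_task_type(False, True, True))      # reverse task
```

The value of writing it out this way is the inverse reading: whichever element you withhold is the element the learner has to think hardest about.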
Dave Ferguson · Blog · Aug 19, 2015 05:19pm
The quick-and-easy definition of a job aid: something you use on the job to tell you what to do and when to do it, so you don’t have to memorize the information.

Job aids often struggle against what Tom Gilbert called the great cult of knowledge. How many times have you heard performance dismissed with "He had to look it up"? A senior executive at Amtrak resisted the use of job aids for the reservation system because people "are supposed to know this stuff." Fortunately for the ticket and reservation agents, that view didn’t prevail. For what was then the new reservation system, we produced 137 separate job aids. One of those, for the "availability" command (used to check schedules), had seven optional parts and 288 possible ways to combine them.

So the question’s not "How does a person learn this entry?" It’s "How does a person do the job?" Job aids offload some of the alleged learning (memorization) so people can accomplish useful results.

Learning by doing

Yes, some on-the-job performance should be virtually automatic. If you were an Amtrak reservation agent, you used the availability entry a lot, but not all 288 forms. As you worked, you came to memorize the five or ten combinations that suited the requests you handled most often. You relied on the job aid for the oddball requests. Or you used the standard entry because you’d learned there are only two trains on the route in question, and they’d appear in response to any of the 288 combinations. (This knowledge, by the way, is a heuristic.)

So one function of a job aid is to serve as training wheels. Job aids guide the novice so that he produces results similar to those of an expert without having to internalize all the knowledge the expert has. Repeated successful use of the job aid is reinforcing on two levels. First, you come to trust the job aid; later, you tend to incorporate the job aid’s guidance into your own repertoire of skill.
You don’t need the job aid any more, because you’ve learned the task through on-the-job performance.

What not to learn

In some cases, though, the organization doesn’t want you to learn the task. Usually, that means there are high consequences for incorrect performance: we really don’t want you making a mistake because you relied on your memory. Another reason to avoid memorization: the task changes frequently. Instead of trying to teach you the new way once a month, the organization wants you to rely on the job aid.

Job aids used like this (think of an airline’s preflight checklist) are a kind of guard rail. The job aid protects you from incorrect or unsafe performance. (In addition, the organization needs to foster reliance on the job aid, in part to overcome the I-know-this-stuff attitude.)

In the photo above, the bicyclist has training wheels to help her learn the basics of riding. The bridge she’s crossing is wide enough for people to cross without railings, but the risk of someone falling far outweighs the cost of having those railings. The railings are like performance support built into the overall system.

Long ago at Amtrak, if someone wanted to travel from Detroit to San Diego, you had to know that the trip required a change of trains in Chicago and in Los Angeles. The computer system couldn’t figure that out for you, so lots of training time went into "route structure." Today, while it’s helpful for an Amtrak agent to have a mental model of the routes, she can enter a request with just the origin and destination cities. Route structure and sensible connections are now built into the reservation system. If the passenger wants to go by way of San Francisco, the agent can modify the entry (possibly with the help of a job aid) to get the system to figure out this alternate route.

Guard rails, training wheels: they both help you get where you want to go.

CC-licensed training wheel photo by Magalie L’Abbé.
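"Route structure built into the system" is, at bottom, a graph search: the reservation system finds the chain of connections so the agent doesn’t have to memorize it. A minimal sketch, assuming a toy route graph (the city pairs below are illustrative, not Amtrak’s actual network, and a real system would weigh schedules and times, not just hops):

```python
from collections import deque

# Toy route graph: city -> cities reachable by a single train.
routes = {
    "Detroit": ["Chicago"],
    "Chicago": ["Detroit", "Los Angeles", "San Francisco"],
    "Los Angeles": ["Chicago", "San Diego"],
    "San Francisco": ["Chicago", "Los Angeles"],
    "San Diego": ["Los Angeles"],
}

def itinerary(origin: str, destination: str) -> list[str]:
    """Breadth-first search: the fewest-train itinerary, changes included."""
    queue = deque([[origin]])
    seen = {origin}
    while queue:
        path = queue.popleft()
        if path[-1] == destination:
            return path
        for nxt in routes.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return []  # no connection exists in this network

print(itinerary("Detroit", "San Diego"))
# ['Detroit', 'Chicago', 'Los Angeles', 'San Diego'] -- changes in Chicago and LA
```

The agent’s remaining job is the judgment call the guard rail can’t make: noticing that the passenger wants the scenic detour via San Francisco, and overriding the default.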
Dave Ferguson · Blog · Aug 19, 2015 05:19pm
Old men forget; yet all shall be forgot, / But he’ll remember, with advantages, / What feats he did that day. (Henry V, Act IV, scene 3)

April 1st is my dad’s birthday (he turned 96 today). When we were kids, we had lots of fun with the April Fool’s aspect. My youngest brother, who had a more easygoing relationship with Dad than any of the older kids, used to pull elaborate pranks. He once slit the glue from the bottom of a paper lunch bag, knowing that in his ready-for-work morning routine, Dad would take that bag and casually toss in his deviled-ham-on-homemade-bread sandwich.

Dad’s world has gotten much smaller in the past two years. His hearing has deteriorated, his vision is much poorer, and his memory? Well, it fades here, and it’s missing over there, and in this other place it stops and dwells for a while.

He doesn’t have dementia, but I thought about him this weekend as I heard a radio program on Alzheimer’s, Memory, and Being. I learned of a writing workshop for people in the early stages of Alzheimer’s. Psychologist Alan Dienstag was urged by novelist Don DeLillo to encourage such people to write their stories while they could. Writing, DeLillo said, is a form of memory. Through writing, people in the workshop changed how they saw what was left of their conscious lives. Instead of losing their memories, they were giving them away.

My dad’s always been a great teller of stories. They’ve tapered off, but one or two still emerge, pieces of his life that he gives freely. Those of us who work in training, learning, and communications have tools of vastly more power and reach than the kitchen tables of Nova Scotia and Detroit, Calgary and Boston, where Dad talked while downing vats of tea. But that power’s not infinite.

I’m glad that, even without realizing it, Dad’s given away so many of his memories, and I’m sorry not to have kept more of them than I have.
Dave Ferguson . Blog . Aug 19, 2015 05:19pm
Series: Ten Steps to Complex Learning
« Previous post in this series • Next post in this series »

In the Ten Steps to Complex Learning, I'm still working through Step 1: Design Learning Tasks. The previous post focused on support related directly to the learning tasks: the givens, the goal, and the solution(s). A central idea for van Merriënboer and Kirschner is the primacy of working with whole tasks; by varying these factors (e.g., having learners work with a partial solution), you can provide richer learning experiences.

Complex skills are, in vM&K's terms, non-recurrent: you apply the skills differently to new problems. To do so effectively, you need to build cognitive strategies. You're creating for yourself mental models for grappling with problems in this field. Step 1 gives three examples of such guidance: modeling, process worksheets, and performance constraints.

Modeling: learning by example

One common form of on-the-job learning is seelou training ("See Lou? Do what Lou does"). The intention is good; Lou is (theoretically) a skilled performer, and if you could do what Lou does, you'd be one too. As Ten Steps points out, modeling as a form of learning guidance is more complex. You need a skilled professional to perform the task and also to explain or think out loud as she does. Becoming this kind of model, like becoming a mentor, isn't something that happens because the boss dubs someone an expert. Successful modeling requires not only a skilled and credible role model, but explicit attention to the process of solving problems. I think it's critical to prepare that skilled performer, both in terms of highlighting whole tasks and in terms of focusing on the process. (You can, of course, use virtual models, such as a recorded performance or even, in theory, a simulation of an actual performance, though that's easier to say than to do.) One form might be what's often called shadowing: following the skilled performer through the workday.
Without the explicit focus on how the skilled person approaches problem-solving, though, shadowing isn't likely to teach the novice performer any complex skills.

Process worksheets: cookbooks for chefs

Ten Steps describes a process worksheet as a way of guiding learners through problem-solving. Here's an oversimplified example to guide a paralegal in writing an internal memorandum of law. The left column lists the task steps; a right column deals with the processes involved. vM&K favor using questions when the process worksheet is a training tool: "learners are provoked to think about the rules-of-thumb." You could provide this type of guidance at a much more elaborate and detailed level. vM&K refer to SAPs (systematic approaches to problem-solving), of which the process worksheet is an example, and note that some SAPs include branching so that the learner goes to different subparts for different types of problems.

And you're right, that looks a lot like a performance-support system. One difference I see is that we're talking about learning, whereas the performance support system is about… well, performance. So it tends not to deal explicitly with the cognitive processes behind the performance.

Rolling right along: performance constraints

A third way to guide learners is by constraining what they can do. Ten Steps calls this the training wheels approach. When a child's learning to ride a bicycle, there's a lot of skill to focus on: pedaling, steering, maintaining balance. Training wheels remove balance from the equation; they constrain the performance. The authors believe that performance constraints are particularly useful for early phases in the learning process.

There's a certain amount of relativism here. It's like the rhetorical question, "How long is a rope?" If the overarching skill is riding a bicycle, what constitutes a whole task?
"Moving the bike along a path" might be one; the kid's still moving along even when the bike has training wheels attached. Removing the wheels creates a whole task (moving) with less support (moving and balancing on your own). Which makes a good link to the final point in Step 1: scaffolding.

Scaffolding: support that fades

"Scaffolding" is a technique that combines providing support to the learner (like modeling, process worksheets, examples) with gradually reducing or fading that support until the learner faces what vM&K call a conventional task ("here's the problem; solve it").

Research on expertise reversal indicated that highly effective instructional methods for novice learners can lose their effectiveness and even have negative effects when used with more experienced learners…. There is overwhelming evidence that conventional tasks [meaning, those with no support] force novice learners to use weak problem-solving methods that bear little relation to schema-construction processes…

Or: if you just have novices work at solving complex problems but don't provide them support while they're learning, they won't know what they're doing, and they won't learn how to do better, at least not in any efficient way. I think this explains why so much alleged on-the-job training does so poorly with complex skills: novices focus on externals, on what looks important (or on what Lou thinks is important).

A corollary is that when someone's already skilled, the support and guidance can actually hinder performance. The skilled performer has built her own mental models, and they work (or else she wouldn't be skilled). When you start providing other models to skilled performers, you're struggling against what's already there. And you're probably losing.

Scaffolding (support that gradually fades as you move through the training program) is a tool to replace external guidance with self-guidance.
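Although the book is about human learning rather than software, the fading pattern is concrete enough to sketch. Here's a minimal illustration of my own (not from vM&K; the task names and support labels are invented) of support fading across the learning tasks in one task class:

```python
# A sketch of scaffolding: within one task class, early learning tasks
# carry heavy support, which fades until the learner faces a
# conventional task with no support at all.
# (Illustrative only: the labels below are my invention, not vM&K's.)

SUPPORT_LEVELS = ["worked example", "completion task", "conventional task"]

def scaffolded_sequence(tasks):
    """Pair each learning task with a support level that fades as the
    learner moves through the task class."""
    sequence = []
    n = len(tasks)
    for i, task in enumerate(tasks):
        # Early positions map to high support, later ones to less.
        level_index = min(i * len(SUPPORT_LEVELS) // n, len(SUPPORT_LEVELS) - 1)
        sequence.append((task, SUPPORT_LEVELS[level_index]))
    return sequence

for task, support in scaffolded_sequence(["task 1", "task 2", "task 3", "task 4"]):
    print(f"{task}: presented as a {support}")
```

The point of the sketch is simply that support is a monotone function of position in the sequence: it can stay level or drop, but never climbs back up within a task class.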
Variation: the road from concrete to abstract

vM&K talk about "variability of practice," though I think "variation" is a better word (not that they asked). I've discussed this in earlier posts; mentioning it again underscores the emphasis that Ten Steps puts on inductive learning.

Students construct general cognitive schemas of how to approach problems in the domain, and of how the domain is organized, based on their concrete experiences offered by the tasks.

Variation in the tasks makes for richer learning experiences, which in turn makes for stronger schemas. Generalization and discrimination are obvious ways that people build their cognitive maps. So is mindful abstraction: deliberate efforts to generate alternative concepts or solutions.

New to me was the concept of implicit learning. Some tasks lack straightforward decision guides but have a lot of information. Giving learners a wide range of positive and negative examples can help them derive useful schemas. The situation that vM&K talk about involves showing air traffic controllers thousands of examples of control situations, some dangerous, some safe.

A final recommendation: learning tasks adjacent to each other should have learners practice different versions of the constituent skills. The higher "contextual interference" will actually help learners solve new problems better.

Let's say you're working on effective writing. Imagine three elements like using parallel structure, using personal pronouns, and avoiding passive verbs. Low interference would come from a cluster of problems, each of which involves parallelism in email, followed by another cluster involving personal pronouns in email. Eventually you get to a second batch: parallelism in reports, personal pronouns in reports, and so on.
Higher interference would come from practice tasks that intersperse both the elements and the situation: parallelism in an email, then passive voice in a report, then personal pronouns in the comments section of a form.

Since vM&K end the chapter with a warning, I'll repeat its essence here: Novices don't learn well from conventional tasks ("here's the problem, what's the answer?"). A randomized set of conventional tasks is even worse. High variability with ample support and guidance helps novices learn. High interference (how the variability is presented) also helps them learn.

So, for novices learning complex skills: without support, your learning won't hold up.

CC-licensed dubbing photo by The Other Dan. CC-licensed "I know this already" photo by orionoir.

The posts in this series: Complex learning, step by step • Complex learning (coffee on the side) • Ten little steps, and how One grew • Problem solving, scaffolding, and varied practice (that's this post) • Step 2: sequencing tasks, or, what next? • Clusters, chains, and part-task sequencing
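The blocked-versus-interleaved contrast can be sketched in a few lines. This is my own illustration, not from the book; the element and situation names echo the writing example above:

```python
# Contrast low-interference (blocked) and high-interference (interleaved)
# practice schedules for three writing elements across three situations.
# (My own sketch of the idea; the names come from the post's example.)

elements = ["parallel structure", "personal pronouns", "passive voice"]
situations = ["email", "report", "form comments"]

def blocked_schedule():
    # Low interference: a cluster of tasks on each element within one
    # situation before moving on to the next situation.
    return [(e, s) for s in situations for e in elements]

def interleaved_schedule():
    # High interference: consecutive tasks change both the element and
    # the situation, so no two neighbors share either one.
    n = len(elements)
    return [(elements[i % n], situations[(i // n + i) % n])
            for i in range(n * n)]
```

In the interleaved schedule, every pair of adjacent tasks differs in both the element and the situation, which is the "higher contextual interference" that vM&K say helps learners solve new problems.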
Dave Ferguson . Blog . Aug 19, 2015 05:19pm
One more chore when I'm galactic emperor: suburban office buildings (you know the kind, with medical group practices and title companies and other purveyors of services) would have to have visible house numbers. Visible as in, you can read them from a moving car on the opposite side of the street. Before you pass the building. I'm opposed to capital punishment, so if you violated this edict, I'd handcuff you to a desk for a month. The desk would be inside It's a Small World.

This is really a rant (or a lament) on the gulf between theory and practice when it comes to how "services" deal on a practical basis with those they serve. I was designated driver for someone who had to go to one of those stone-and-glass Skinner boxes. I found myself thinking that some stakeholders (the designers, the builders, the landlords, or the tenants) have never visited the place as customers. Four floors of imaging centers, gastroenterology practices, OB/GYN partnerships, and so on, thus likely attracting the ill, the concerned, and those concerned for them, and yet…

One "house number," about eight feet off the ground, and about the size of the number on my own house. You can barely see it from the curb, let alone the road. Nearby: eight or ten other interchangeable buildings, all members of "Who Needs Addresses?"

Nothing to distinguish one side of the building from another; until you try the doors, you don't realize that the east and west entrances are to individual businesses, not to the building lobby.

Not a single public-area bench or seat, meaning that patients waiting near the door for rides have to stand.

Something of the same lack of awareness at the office level: the practice's receptionists did not greet patients so much as shove paperwork on clipboards at them. Overheard phone call: "This is Dr. Whoozi's office. The doctor needs to have you come in 30 minutes early."
I was reminded of a customer-service job aid I saw taped to the staff side of a cash register at a Wisconsin doughnut shop: Look at me. Listen to me. Smile at me. Thank me.

An open invitation: Last month's edition of the Working/Learning blog carnival was the largest and most-read yet. Host for the next one (April 20th) is Dave Wilkins at The Social Learner. If you blog about anything that relates to how people work at learning, or how learning happens at work, I hope you'll consider taking part. Then let Dave know. (Details on how to participate are here.)

CC-licensed to-do photo by Great Beyond.
Dave Ferguson . Blog . Aug 19, 2015 05:18pm
Series: Ten Steps to Complex Learning
« Previous post in this series • Next post in this series »

Step 2 in Ten Steps to Complex Learning is "sequence task classes." (The image on the right shows that this step is part of the "learning tasks" component in van Merriënboer and Kirschner's model; click to enlarge.) You may recall that task classes (which vM&K say are also known as equivalence classes, problem sets, or case types) are categories of whole-task problems ("learning tasks") with the same degree of difficulty.

A very short training program might have only one task class, though I'm having trouble picturing complex learning that you could accomplish that way. Maybe specialized training for skilled people to increase their ability in a new but related area, like training computer technicians to troubleshoot a new type of equipment. This chapter of the book deals with sequencing task classes, learner support in task classes, and (gasp) part-task sequencing.

The whole is more than the sum of the tasks

The Ten Steps strongly advocate a whole-task approach to sequencing… on the premise that learners should quickly acquire a complete view of the whole skill that is gradually embellished during training. Ideally, even the first task class refers to the easiest version of whole tasks that professionals encounter in the real world… This provides learners the best opportunities to pay attention to the integration of all skills, knowledge, and attitudes involved, and to the necessary coordination of constituent skills.

vM&K mention "global before local skills," a principle of cognitive apprenticeship, as well as Reigeluth's metaphor of a zoom lens. The idea is that you begin with the wide-angle view (the whole task), zoom in for detail (an early task class), zoom out again to see the part in relation to the whole, zoom in on other details, and so on.

Sequencing through simplification

One way to sequence task classes is to simplify conditions.
The learner works with all the constituent skills, starting with the easiest version of the whole task that professionals might encounter. Let's say the whole task is for an Amtrak reservation agent to answer customer questions about schedules. (As the subject-matter expert here, I'm deciding "schedule questions" is one task class and "fare questions" is another.) Factors affecting the difficulty of the whole task might be the complexity of the route, the firmness of the dates, the available accommodations, and the number of trains satisfying the request.

Those factors or dimensions act as a kind of recipe for building task classes. In the easiest class, for example, the learner works with schedule requests where the routes aren't complex, the dates are firm, there are few accommodation options, and there aren't many trains to choose from. "Can I go from Memphis to Chicago next Friday?" (There's one train per day; no connection required; either a seat or a roomette.)

In a complex-request class, you might get a request like: "I'm thinking of taking my husband and our grandson from Cleveland to Portland, Oregon, some time this summer. I wonder if we can visit my sister in Denver and also get to see Glacier National Park?" (Lots of routes, many possible accommodations, complex scheduling.)

Sequencing through "emphasis manipulation"

Another way to sequence task classes is to focus on different sets of the constituent skills in different classes. How's that different from just teaching parts of the overall task? Because although you're shifting (or "manipulating") the emphasis between various constituent skills, you're still working on the whole task. vM&K give an example of student teachers learning to teach lessons ("teaching a lesson" being the whole task in this example, no matter what you think teachers ought to be doing or not doing). A first task class might emphasize presenting the subject matter.
The next task class might involve teaching with an emphasis on questioning; another class, teaching with a focus on initiating and encouraging group discussion. The final class would involve teaching (the whole task), with the idea that the student teachers integrate the skills of presentation, questioning, and discussion.

Training for a supervisor in a pharmaceutical plant might have a whole task for preparing an employee to operate the cartoning machine. One task class might emphasize safety; another, standard operation; a third, troubleshooting; a fourth, changeover for different products; a fifth, different levels of cleaning and maintenance.

One drawback to "emphasis manipulation" is that it presents the learner with complex situations from the beginning. Remember, you're not simplifying the situation, only emphasizing certain constituent skills in the context of the whole. This approach probably isn't a good one for the early task classes of a highly complex task.

Sequencing through knowledge progression

A third way to build task classes is via the underlying body of knowledge. If that sounds confusing to you, I'm glad to have the company. More seriously, vM&K suggest thinking about this as first determining the bodies of knowledge needed, and using those to determine task classes. In other words: analyze any progressions of cognitive strategies (approaches to problems), and analyze any progressions of mental models (pictures of how the domain is organized). Those are Steps 5 and 6 in the model, which means I haven't read those chapters yet. We'll get there.

In the meantime, you can, of course, combine the types of sequencing. You'll likely use simplification first because it's… less complex than emphasis manipulation. Speaking of simplicity and complexity, we've still got a straightforward topic (learner support) and a confusing one (part-task sequencing) to deal with in Step 2. We'll knock off the easy stuff and save the hard part for next time.
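One way to picture the "recipe" of difficulty factors is as a set of dimensions, each pinned to a level; sequencing by simplification then moves from the class where every dimension is simple toward the class where all are complex. A small sketch of my own framing (the factors come from the Amtrak example; the level names are invented):

```python
# Task classes built by "simplifying conditions": each difficulty factor
# from the Amtrak schedule-question example is a dimension, and a task
# class pins every dimension to a level.
# (Illustrative sketch only; the level names are my invention.)

from dataclasses import dataclass

@dataclass(frozen=True)
class TaskClass:
    route: str           # "direct" or "connections"
    dates: str           # "firm" or "open-ended"
    accommodations: str  # "few options" or "many options"
    trains: str          # "one per day" or "many alternatives"

easiest = TaskClass("direct", "firm", "few options", "one per day")
hardest = TaskClass("connections", "open-ended", "many options", "many alternatives")

def difficulty(tc: TaskClass) -> int:
    """Count how many dimensions sit at their complex level."""
    return sum([tc.route == "connections",
                tc.dates == "open-ended",
                tc.accommodations == "many options",
                tc.trains == "many alternatives"])
```

A sequence of task classes ordered by this count would run from the "Memphis to Chicago next Friday" class (difficulty 0) up to the "Cleveland to Portland via Denver and Glacier" class (difficulty 4), with intermediate classes raising one dimension at a time.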
Support as sawtooth

You've probably figured already that scaffolding (diminishing support and guidance) applies within a task class. So if your learning involves four task classes, you've got four sequences of scaffolding, each with high support and guidance in early examples, fading to none by the end of the class.

For the task class "simple Amtrak schedule requests," a first learning task might involve a mini case study of someone handling a simple request. The learner would explain what happened to make the response effective. The next learning task removes some scaffolding, for instance by giving a partial response for which the learner must provide the missing pieces. As a last learning task in the "simple schedule" class, the learner would work on what vM&K call a conventional task: given a passenger request, reach the goal.

* * *

The last part of this chapter deals with "part-task sequencing of learning tasks." It's complicated enough that (a) it would make this post too long and (b) I don't yet know how to explain it. That's reason enough to leave it till next time.

CC-licensed simplified camera photo by John Kratz. CC-licensed lens photo by Anders Ljungberg. My sawtooth image adapted from a CC-licensed leaf-edge photo by P/\UL.

The posts in this series: Complex learning, step by step • Complex learning (coffee on the side) • Ten little steps, and how One grew • Problem solving, scaffolding, and varied practice • Step 2: sequencing tasks, or, what next? (that's this post) • Clusters, chains, and part-task sequencing • Step 3: performance objectives (the how of the what)
Dave Ferguson . Blog . Aug 19, 2015 05:18pm