Blogs
If you're like many modern companies, you probably use PowerPoint for a range of purposes. One of the best and most effective is as a learning tool for your employees. Perhaps you use it in a group setting, where a leader lectures to listeners with the aid of slides, or maybe you employ it to enable self-directed learning. Whatever the case, upping the engagement level of your PowerPoint presentations can significantly increase the amount of learning that takes place.
That said, most people assume that making PowerPoint more interesting involves super-fancy graphics or crazy slide structures that would take a genius to unravel … not to mention put together. Luckily, that's not the case! Below we offer 10 simple strategies for making presentations more interesting, more accessible, and more learning-oriented.
Read more »
iSpring Solutions | Blog | Jul 15, 2015 03:00pm
Maggie Webster is a Senior Lecturer in Religious Studies from the Faculty of Education. Find out how iSpring "motivated [her] to be more creative in [her] teaching" with its familiar PowerPoint interface and versatile toolkit:
A: My name is Maggie Webster, I'm a senior lecturer in the Faculty of Education at Edge Hill University, and I teach a variety of programs, but mainly religious education and religious studies. I'm also a year leader for third-year part-timers; a lot of those trainees tend to be mature and part-time, so they do a lot of distance stuff, and they also come into university once a week. So, I have a selection of students, some of them mature, some of them who are undergraduates who are traditionally 18, etc., and they all have varying competencies with technology, and also various interests with it. So, I'm trying to reach a broad range of people when I'm teaching, if I can.
Read more »
iSpring Solutions | Blog | Jul 15, 2015 02:59pm
According to Carnegie Mellon Professor Jeanne M. VanBriesen, self-directed learning is when individuals "take initiative and responsibility for learning" and "select, manage, and assess their own learning activities." She goes on to say that motivation and volition are critical, that students should experience independence in goal setting and determining what to learn, and that the role of teachers or trainers is to provide scaffolding and support.
If your mouth just fell open in a giant Whaaaa? … don't worry. It might sound like a lot of steps, but really all VanBriesen is saying is that students learn best (i.e., they stick with the material and retain it) when they've got a dog in the race. You can crib all these benefits for your own e-Learning materials by putting self-directed learning to work for you. The steps below will help you do it.
Read more »
iSpring Solutions | Blog | Jul 15, 2015 02:59pm
Another intensive period of research has turned up a positive result: iSpring engineers confirm that SCORM 1.2 and 2004 courses published with iSpring authoring tools work beautifully with TOPYX LMS, "The Learning Management System That People Love." So now all you TOPYX fans out there can confidently create stimulating, media-rich courses and quizzes with iSpring and upload them to the LMS, knowing that all statistics will be properly reported.
Read more »
iSpring Solutions | Blog | Jul 15, 2015 02:59pm
Carl Simmons is the Lead Partnership Quality Officer in the Faculty of Education. He used iSpring to deliver an "online mentor training package" for their instructors in the field. Here he shares his thoughts on why they chose iSpring to solve their accessibility issues:
Q: Tell us a little about yourself and what motivated you to consider iSpring?
A: My name's Carl Simmons, I work in the Faculty of Education. My focus is initial teacher training; I'm a senior lecturer. But for this project, what I was focused on is we have an online mentor training package for our teachers out in school, which was currently hosted on Moodle, and then Blackboard. That caused us some issues in terms of it was difficult to update, and also you needed passwords to access that system.
Read more »
iSpring Solutions | Blog | Jul 15, 2015 02:59pm
Good teaching is a mysterious recipe: one part good content, one part excellent presentation, and one part fairy dust that brings the whole thing together. Especially with e-Learning, where students are on their own for much of the process and their education depends on engagement and entertainment, it’s very important to keep that magical element in mind.
We submit that myth, legend, and tall tales have a lot to offer the world of e-Learning. Since learners love stories, why not follow the formula of fantasy to make your e-Learning courses a huge success? Below we offer 8 fairy tale takeaways that you can and should apply to your next e-course or PowerPoint presentation.
Read more »
iSpring Solutions | Blog | Jul 15, 2015 02:59pm
It has been a long time since the very first release of Adobe Flash Player, which painted the Internet in blazing colors and brought motion to the pre-video web. It has served us well and given content and game developers great tools.
Then it began to decline, like the Roman Empire. The turning point was its incompatibility with mobile devices, driven by iOS. It now seems the Flash project could soon be shut down and wiped off the face of the Internet.
Even though we don't want Flash to be killed off, several recent events have prompted us to read the writing on the wall.
Read more »
Flash gets blocked on Mozilla Firefox
This week (July 14, 2015), Mozilla started blocking the Adobe Flash plugin by default in all versions of the Firefox browser. Firefox now blocks all .swf and .flv content because of vulnerabilities in Flash Player that attackers have been exploiting; cyber-thieves can use these security holes to install malicious software and steal data.
Updating Flash Player to the most recent version (18.0.0.209) fixes this issue in Firefox (39.0). The situation is constantly changing, however, and this version may be blocked soon as well.
Facebook calls for Flash's termination
Facebook's Chief Security Officer, Alex Stamos, has unambiguously called for the Flash project to be shut down. Despite Adobe's efforts and the bug fixes it plans to release, Facebook users are advised not to rely on this vulnerable technology.
YouTube uses HTML5
Earlier this year (January 2015), the biggest Flash video (.flv) provider stopped serving videos through the Flash plugin by default. YouTube now uses the HTML5 video player in all modern browsers.
Read more on the YouTube Engineering and Developers blog.
What does this mean for iSpring users and other content authors?
Fortunately, with iSpring 7's various publishing options, you don't have to worry about browsers blocking Adobe Flash. You can always use HTML5 output, which gives you the same experience as Flash.
How interactive rich-media content is played back
Flash content is played by Adobe's proprietary plugin, Flash Player. An HTML5 web presentation is played natively by your web browser.
When you publish to Desktop (Flash), Mozilla Firefox or Chrome may display a plugin-blocked warning message.
Solution
Use the Mobile (HTML5) publishing option or, at minimum, the Combined (Flash + HTML5) option.
That way, your web presentation will work on all mobile devices and on desktops with modern web browsers, without depending on a third-party plugin that may be blocked. The combined mode also provides compatibility with older browsers such as IE8.
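Conceptually, a combined Flash + HTML5 output works by checking whether a usable Flash plugin is present and falling back to the HTML5 player when it is not. Here is a simplified TypeScript sketch of that detection logic; it is an assumption-laden illustration rather than iSpring's actual player code, and the two loader functions are hypothetical placeholders.

```typescript
// Simplified sketch of Flash-vs-HTML5 player selection; not iSpring's actual code.
// loadFlashPlayer and loadHtml5Player are hypothetical placeholders.

function loadFlashPlayer(containerId: string): void {
  console.log(`Loading the Flash (.swf) presentation into #${containerId}`);
}

function loadHtml5Player(containerId: string): void {
  console.log(`Loading the HTML5 presentation into #${containerId}`);
}

// Returns true if the browser reports a usable (non-blocked) Flash plugin.
function hasFlashPlugin(): boolean {
  const flashMime = navigator.mimeTypes.namedItem("application/x-shockwave-flash");
  if (flashMime && flashMime.enabledPlugin) {
    return true;
  }
  // Older Internet Explorer exposes Flash only through ActiveX.
  try {
    return Boolean(new (window as any).ActiveXObject("ShockwaveFlash.ShockwaveFlash"));
  } catch {
    return false;
  }
}

// Prefer HTML5 on mobile devices or when Flash is missing or blocked.
function startPresentation(containerId: string): void {
  const isMobile = /Android|iPhone|iPad|iPod/i.test(navigator.userAgent);
  if (isMobile || !hasFlashPlugin()) {
    loadHtml5Player(containerId);
  } else {
    loadFlashPlayer(containerId);
  }
}

startPresentation("presentation-container");
```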
Read more about this Flash drama in some of the world's largest news outlets:
Wired: Flash. Must. Die.
BBC News: Mozilla blocks Flash by default on Firefox browser
Daily Mail: Google and Mozilla pull the plug on Adobe Flash
Inquisitr: Mozilla Firefox Bans Adobe Flash Player
Macworld: You don't have to be a villain to say Flash must die
The Guardian: Flash is dead, and YouTube dealt the killing blow
The Verge: YouTube drops Flash for HTML5 video as default
iSpring Solutions | Blog | Jul 15, 2015 02:59pm
In today's New York Times, columnist and economist Paul Krugman details new data showing that Americans are literally losing stature. Here's a quote from the article:
The data show that Americans, who in the words of a recent paper by the economic historian John Komlos and Benjamin Lauderdale in Social Science Quarterly, were "tallest in the world between colonial times and the middle of the 20th century," have now "become shorter (and fatter) than Western and Northern Europeans. In fact, the U.S. population is currently at the bottom end of the height distribution in advanced industrial countries."
This is not a trivial matter. As the paper says, "height is indicative of how well the human organism thrives in its socioeconomic environment." The link to the article is here, but you have to be a subscriber to read it.
How might this relate to those of us in the United States' learning-and-performance field? Well, mostly this is an interesting tidbit that we have little control over. On the other hand, it might give us pause. After all, if we create learning programs of equal effectiveness to our overseas competitors, but their learners are healthier than our learners, their learners will learn more and perform better in their work. Their companies will have a competitive advantage. We will all die penniless and alone. (Exaggeration.)
Krugman reflects on the argument that Americans' unhealthy ways might be related to the fact that we work too much, and thus don't have time to exercise and eat right. Is that a hook into our responsibility as learning professionals? Is there anything we can do to lower the average time our workers are swimming in the ocean of work responsibilities?
Don't just think content here. Preaching and information are not likely to help that much.
Well, I'm brainstorming here (as I have no idea): we can encourage e-learning to be done on work time, maybe by utilizing more synchronous and more social interactions. We can recognize that learners will forget a large chunk of what we teach, and either cut the forgettable crap out of our courses or demand from our organizations and vendors that it be put into performance support. Maybe we can design m-learning interactions that are especially appropriate for use during exercise. I don't know how to do this, and it may not be doable, but maybe you can be the one to figure it out. Maybe we can provide truly healthy and delicious food for our training participants.
What else? I don't know. Do you?
Or do you think it's outside our influence?
Will Thalheimer | Blog | Jul 15, 2015 02:59pm
Elliott Masie asked a question last week in his blog/newsletter. It's a fun question and, because it is accompanied by the promise of food and public spectacle at his upcoming conference, a clever marketing device as well. If nothing else, Elliott's got a strong stomach for this type of distraction, and he got me thinking.
Here was his question: Cooking and Learning. Are They Similar? Here's what it made me think of:
How are cooking and learning similar?
Today, most of us don't have time to do either of them right. We don't have time to shop for the best ingredients or blend them properly. We take prepackaged crap and call it nutritious. We fall for false advertising, pretty packages, and recommendations from the well-coiffed and well-spoken. We're suckers for celebrity chefs, even if our neighbors cook a better meal. Most of the food on the store shelves is filled with harmful ingredients. We reach for the latest concoction, not the greatest value. We measure the immediate pleasure and forget the long-term impact. Because we hunger so much to get smiles and kind words at the end of the meal, we're willing to add butter and salt and whatever else it takes. We definitely wouldn't think of challenging our guests with broccoli rabe, ostrich patties, or sorbet. We're fat and happy, and when the meal is done, we think we've succeeded in grand fashion. Our guests leave satisfied into the darkness of the slow-moving night. They live under threshold. They die young.
There are lots of celebrity chefs, hash slingers, and short-order cooks. There are very few who can blend nutrition, taste, and world-class quality into a meal.
Come to think of it, learning and cooking have a lot in common.
Postscript: My wife and I once went to celebrate Valentine's Day at Chef Ming's Blue Ginger restaurant. We don't have cable so we didn't resonate with his celebrity, but we'd heard good things about the restaurant from trusted friends and colleagues.
The result? One Valentine's Day ruined by food poisoning.
We now refer to Celebrity Chef Ming's restaurant as the Blue Vomit.
Is this the way it has to happen? Are we in the learning-and-performance field immune?
Will Thalheimer | Blog | Jul 15, 2015 02:59pm
Suppose you need to hire someone on a contract basis in the learning-and-performance field. Or maybe you're on the other end of the transaction: you have your own business and need to find work.
Do you have to rely on your personal network alone? Can the big job boards help? Do our trade organizations' job boards have enough focus on contractors? Is there any way to know which people and organizations are good?
Recently, I've come across two organizations that may help, Learning Gurus and Clarity Consultants. This is how they described themselves to me:
Learning Gurus, Inc.
Learning Gurus connects Workplace Learning Professionals with companies that need them. We provide short-term contractors and full-time employees who design and develop training and performance solutions for corporate, government and educational institutions.
Our learning gurus are skilled in areas such as:
* Instructional Design & Development
* Facilitation & Instructor-Led Training (ILT)
* Performance Analysis & Needs Assessment
* Project Management
* eLearning, Web-Based Training, Online Learning
* Multimedia/CBT Development
* Graphic Arts
* Technical Writing & Documentation
* Quality Assurance & Evaluation
There is a trend among workplace learning professionals to move around quite a bit! Many in our field change jobs every 2-3 years, and many more are seeking the flexibility and variety that contract/consulting work offers. In the past, we've had to rely on our local professional networks, chance meetings with potential employers, and recruiters who, quite frankly, don't really "get" what we do. On the flip side, very few companies know where to find quality workplace learning professionals, and it's not uncommon for them to have an open requisition for 3-6 months.
Enter Learning Gurus! Learning Gurus connects workplace learning professionals with the companies that are searching for them. Learning Gurus has a nationwide network of workplace learning professionals who are seeking additional contract and employment opportunities. There are no fees to join the network - your hourly rate or salary is marked up by a small percentage, which is paid by the client.
Clients love it because they save valuable time by having Learning Gurus find their resource and they know they're getting a solid, pre-screened professional. Workplace learning pros love it because they have their own personal sales and marketing department to find leads, negotiate rates, and handle contracts and payroll. Now that's a Win-Win!
Karen J. Boyle, President
Learning Gurus, Inc. - The Source for Workplace Learning Professionals
www.learninggurus.com
Office: 619.236.0308
karenb@learninggurus.com
Clarity Consultants
With Clarity Consultants, you can reap the expertise of Big 4 consulting without paying the Big 4 price. Clarity Consultants represents hourly consultants with expertise in Instructional Design, Project Management, SAP software implementation, training facilitation and specialized consulting. Our consultants have proven expertise in Software Implementation, New Hire Training, Sales, Customer Service, Leadership, Technical Training, Business Process and Compliance and other areas of organizational development. If your company has new software to implement, new products to sell or a new process to roll out to employees, Clarity Consultants can help you. For over 14 years, we've provided Fortune 1000 companies with contract training professionals. For more info, please visit www.clarityconsultants.com
James Lee, Marketing Associate
jlee@clarityconsultants.com
(p) 408.369.6558
Are there other organizations that I'm missing? Let me know.
Will Thalheimer | Blog | Jul 15, 2015 02:59pm
The carbon offset idea works like this: we all pollute, but we can help limit the damaging effects either by (1) offsetting our damage by doing good in other ways (for example, if we have to drive a large car, we can replace all our light bulbs with energy-saving fluorescents), or by (2) donating money to projects that support renewable energy, energy efficiency, and reforestation. For example, check out the not-for-profit organizations CarbonFund.org and The Clean Air Conservancy.
Here are some ideas for those of us in the training and development field:
Encourage the use of e-learning, which limits the carbon footprint of travel. And, make sure you build e-learning that is effective and engaging, so more folks will want to use e-learning.
When calculating the "cost" of training, calculate carbon footprint costs as well. See, for example, the Carbon Fund's calculators or the Clean Air Conservancy's calculators. Make these costs evident.
Encourage your company to buy carbon offsets when utilizing training. It's not just a good thing to do; it may also help your company attract business and recruit highly educated employees.
In your e-learning courses, provide an option for learners to calculate how many tons of carbon dioxide they would have emitted had they traveled from their location to headquarters; a rough sketch of such a calculator appears below.
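As a rough illustration, here is a minimal TypeScript sketch of such a calculator. The emission factors are assumed, ballpark figures chosen only for the example; for anything you plan to report, substitute the factors your organization or a public calculator such as CarbonFund.org's uses.

```typescript
// Rough estimate of CO2 avoided by not traveling to a training event.
// The emission factors are illustrative assumptions only; swap in the figures
// your organization (or a public calculator) prefers before real reporting.

const EMISSION_FACTORS_KG_PER_KM = {
  car: 0.19,             // kg CO2 per passenger-kilometre, assumed
  shortHaulFlight: 0.25, // assumed
  longHaulFlight: 0.15,  // assumed
} as const;

type TravelMode = keyof typeof EMISSION_FACTORS_KG_PER_KM;

// Metric tons of CO2 for the round trip the learner did not have to make.
function avoidedCo2Tons(oneWayDistanceKm: number, mode: TravelMode): number {
  const roundTripKm = oneWayDistanceKm * 2;
  const kilograms = roundTripKm * EMISSION_FACTORS_KG_PER_KM[mode];
  return kilograms / 1000; // convert kilograms to metric tons
}

// Example: a learner 1,200 km from headquarters takes the course online instead of flying in.
const tonsAvoided = avoidedCo2Tons(1200, "shortHaulFlight");
console.log(`Estimated CO2 avoided: ${tonsAvoided.toFixed(2)} metric tons`);
```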
What other ideas can you think of?
Will Thalheimer | Blog | Jul 15, 2015 02:59pm
The most important question that instructional designers can ask is:
"What do learners need to be able to do, and in what situations do they need to do those things?"
While we might discount such a simple question as insignificant, the question brilliantly forces us to focus on our ultimate goals and helps us to align our learning interventions with the human learning system.
Too many of us design with a focus on topics, content, knowledge. This tendency pushes us, almost unconsciously, to create learning that is too boring, filled with too much information, and bereft of practice in realistic situations.
The Magic Question requires us to be relevant. For workplace learning, it focuses our thinking toward learners' future job situations. For education learning, it focuses our thinking toward real-world relevance of our academic topics.
The Magic Question in Practice
In practice, the Magic Question forces us to begin our instructional-design efforts by not only creating a list of instructional objectives, but also by creating a list of performance situations. For example, if we're creating leadership training, we not only need to compile objectives like, "For most decisions, it can be helpful to bring your direct reports into decision-making, so as to increase the likelihood that they will bring energy and passion in implementing decisions." We also need to compile a list of situations where this objective is relevant: for example, in weekly staff meetings, project meetings, one-on-one face-to-face conversations, phone conversations, and so on. It applies to general decision making, but not to situations where time is urgent, safety is an issue, or legal ramifications are evident.
By framing our instructional-design projects in this way, we get to think about our learning designs in ways that are much more action-oriented, relevant, and practical. The framing makes it more likely that we will align our learning and performance contexts, making it more likely that our learners, in their future situations, will spontaneously remember what we've taught them. The framing makes it more likely that we will focus on practice instead of overloading our learners with information. The framing also makes it more likely that we will utilize relevant scenarios that more fully engage our learners. Finally, using the Magic Question forces our SMEs (subject-matter experts) to reformulate their expertise into potent, practical packages of relevant material. It's not always easy to bend SMEs to this discipline, but after the pain, they'll thank you profusely as together you push their content to a much higher level.
Obviously, there is more to be said about how the Magic Question can be integrated into learning-design efforts. On the other hand, as my clients have reported, the Magic Question has within it a simple power to (1) change the way we think about instructional design, and (2) transform the learning interventions we build.
Will Thalheimer | Blog | Jul 15, 2015 02:59pm
I just came across another sighting of the mythological numbers of memory retention, this time on the webpage of HRDQ.
Take a look:
They claim that, "Research shows people remember only 5% of what they hear in a lecture. But they retain 75% when they learn by doing." Bulloney!!
If you want to read my full debunking, click here.
If you want to see many bogus sightings, click here and scroll down.
Will Thalheimer | Blog | Jul 15, 2015 02:59pm
In today's New York Times there is a great article, Who's Minding the Mind?, by Benedict Carey, which sums up a large number of research studies on human cognition showing that human beings are more reactive than we might think. We tend to believe that we, as human beings, are very proactive and consciously in control of our thoughts and actions; but these studies show that much of what we do and think is due to hard-wired, often unconscious processes.
For example, the article cites how sitting near a briefcase (as opposed to a backpack) can make people more competitive. Or as Carey writes:
New studies have found that people tidy up more thoroughly when there's a faint tang of cleaning liquid in the air; they become more competitive if there's a briefcase in sight, or more cooperative if they glimpse words like "dependable" and "support" — all without being aware of the change, or what prompted it.
This basic fact about human behavior is relevant to the learning and performance field, of course. One of the things I've talked about for years is the notion of "spontaneous remembering." If we create learning right, we're more likely to help our learners—when they're on the job at a later time—by helping them spontaneously trigger memories of what they've learned. We can do this best by requiring our learners to utilize realistic cues in the learning context in making real-world decisions and taking real-world actions. This is why simulations are so effective (if they are well designed).
When learners process learning objectives or prequestions before encountering learning material, the learners are primed to pay attention to relevant learning material. It's not necessarily a conscious process, but it works.
There are many examples available, but here's another point: the learner-centric movement of the 1990s and 2000s has relied too heavily on the notion that learners always know best, that they are in conscious control of their learning, and that we just need to let them make the best decisions.
When we realize that our learners are more deterministically driven than we want to believe (it's about free will a little, isn't it?), we have more work to do if we really want to drive maximum performance. Even when our clients consciously want to do something, we may be able to help them reach their goals by setting up learning and performance situations that unconsciously trigger the behavior they want to achieve.
Will Thalheimer | Blog | Jul 15, 2015 02:59pm
It has been exactly one year since I offered $1,000 to anyone who could demonstrate that utilizing learning styles improved learning outcomes. Click here for the original challenge.
So far, no one has even come close.
For all the talk about learning styles over the last 15 years, we might expect that I was at risk of quickly losing my money.
Let me be clear, my argument is not that people don't have different learning styles, learning preferences, or learning skills. My argument is that for real-world instructional-development situations, learning styles is an ineffective and inefficient waste of resources that is unlikely to produce meaningful results.
Let me leave you with the original challenge:
"Can an e-learning program that utilizes learning-style information outperform an e-learning program that doesn't utilize such information by 10% or more on a realistic test of learning, even it is allowed to cost up to twice as much to build?"
The challenge is still on.
Will Thalheimer | Blog | Jul 15, 2015 02:58pm
I've just spent 3 wonderful days in San Jose at the eLearning Guild's DevLearn 2007 conference. Here were some of the highlights for me:
Hanging out with Ruth Clark a few times during lunches, keynotes, etc. We had a blast discussing research, the state of the profession, and the joy and challenge of doing the research-to-practice thing.
Seeing Ruth Clark and Silke Fleischer (of Adobe Systems) present the research AND practice of Richard Mayer's work on multimedia learning. Silke did a nice job of demonstrating e-learning examples in Adobe's Captivate. Ruth did a wonderful job discussing the research, framing it in terms of practical application, and describing its limitations. Adobe deserves a ton of credit for supporting the dissemination of Ruth's work and helping to distribute it to a wide audience. Three cheers for enlightened companies like Adobe and Questionmark who support research dissemination.
Having Google Maps change my behavior by including a public transportation option when I did a search in San Jose. I had actually made a reservation to rent a car so I could drive from my hotel to the downtown hotel where the conference was held. When I went to Google Maps to search for the best route, I was offered a public transportation search. I found out I could get downtown in 12 minutes for only $1.75. I cancelled my rental car and happily commuted, saving money (Google even showed my gas savings), parking fees, and aggravation. Awesome!! Technology changes everything.
Seeing a great keynote by Paul Saffo, who talked about technology innovation and reminded me how many failures are required before success is rewarded. It was one of those rare keynotes that was both well delivered and superbly relevant to the conference. Way to go, Heidi Fisk (of the eLearning Guild), for a great keynote selection.
Wonderful food. Yes. At a conference!! Healthy, fresh, tasty. Way to go Fairmont Hotel.
The now-famous DevLearn DemoFest where dozens of e-learning developers show off their wares. It's a great way to take a snapshot of the state of the e-learning industry.
Playing tennis with a Wii remote. My wife and I are not TV people, so I never pay attention to all the new remotes, Xboxes, etc. At the conference, I played tennis for about 2 minutes and had some fun. The cool thing about the Wii is that it tracks the remote's movement and simulates it on the screen. Actually, with the Wii my first-serve percentage was about 100%, much better than in real life.
My Wednesday Breakfast Bytes session on the intersection of e-Technology and Informal Learning. We had a great conversation and I learned some things.
My Thursday Breakfast Bytes session on Situation-Based Instructional Design. The basic nugget is that people behave by (1) Being in a situation, (2) Evaluating that situation to make sense of it, (3) Deciding what to do, and (4) taking Action. So, we ought to give our learners practice in doing the whole SEDA process (Situation-Evaluation-Decision-Action). And, we can benefit from asking the Magic Question. Yes, there is more to it than that. The bottom line for me is that my clients have found the concept very helpful in helping them design learning that goes beyond the typical topic-based designs.
The DevLearn Breakfast Bytes sessions do a great job in getting conversations going. As always, Guild members come to the conference with experience and are ready to share their insights and wisdom. I love Breakfast Bytes.
My regular session on Learning Measurement. Another great discussion, with what seemed to me like lots of light bulbs going off. Fun, even after several days of conferencing.
I met a guy at the Demo Fest—John D’Amours—who had actually tried to do a control-group experiment comparing his e-learning design to a traditional design. Yes. Yes. Yes. We need more folks taking this kind of initiative.
Learning that Windows Vista NORMALLY runs slow with 2GB of memory. Glad it wasn't just my machine.
AND so many other great conversations and sessions. Sorry if I failed to mention you!! Hugs to all. I really learn a lot at eLearning Guild events. And I have to say, I feel that my contribution is especially appreciated. Thanks, Guild members and staff!
Will Thalheimer | Blog | Jul 15, 2015 02:58pm
In what has become an eternal vigil against the myth that "People remember 10% of what they..." I just hit the jackpot, with the help of Jay Banks, who just sent me an email.
The Wikipedia entry for Edgar Dale had two incorrect references to the bogus numbers that I talk about so often (see my blog category Myths and Worse). I fixed it today, hopefully for good.
Here's what it looked like:
And here's what it looked like in Wikipedia:
For those who are shocked that information on the internet might be wrong—or that Wikipedia might be wrong—see my previous entries about Wikipedia (1st Most-Recent).
Will Thalheimer | Blog | Jul 15, 2015 02:58pm
Starting in 2006, Work-Learning Research offered the learning-and-performance community the Neon Elephant award to celebrate and honor an effort of extraordinary importance to our industry. Last year's honor went to Cal Wick of the Fort Hill Company for his work pushing the field toward true training transfer, leading the development of a tool that supports transfer (Friday 5's), and writing a book (The Six Disciplines of Breakthrough Learning...) to highlight these ideas and insights.
This year's award will once again be announced on the day of the Winter Solstice, to honor someone whose truly extraordinary work has helped bring light to our field.
Stay tuned...
Will Thalheimer | Blog | Jul 15, 2015 02:58pm
First of a Series
This is the first of a long series of blog entries devoted to the topic of learning measurement that I will offer over the next two weeks.
This series draws from my recent thinking on learning measurement and from my 2007 publication, Measuring Learning Results… It also introduces the findings from a remarkable research study that I participated in with the eLearning Guild and several other illustrious authors.
For the last year, I have spent many weeks devoted to rethinking the topic of learning measurement from the standpoint of the learning research. My research-to-practice report, Measuring learning results: Creating fair and valid assessments by considering findings from fundamental learning research, highlights the flaws in the current methods we use to measure learning results—and offers recommendations for how to improve our measurement practices. This report is available on my catalog. See below.
Why does Will Thalheimer Care about Measurement?
Why do I—a person who has spent the last 10 years attempting to bring fundamental learning research into focus—want to spend my "research time" on learning measurement?
Here’s why:
The performance of the learning-and-performance field is severely deficient—often creating learning that is not remembered and/or not utilized on the job.
Of the forces that control and influence our industry and the practices we use, measurement is one of the most critical.
Currently, our measurement practices provide us with poor and biased feedback about our performance as learning-and-performance professionals.
Because we do poor measurement, we don’t get good feedback (nor do our stakeholders), and so we have very little motivation to critically examine our practices—and improve them as valid feedback would suggest.
To put it simply, if we don’t measure better, we will continue to underperform—and we’ll continue to underserve our learners and organizations.
The eLearning Guild Report
The eLearning Guild report, "Measuring Success," is FREE to Guild members and to those who complete the research survey, even if not a member.
Also available, at $1,895 ($1,950 if you are not a member), is Direct Data Access (DDA) to the database of research results, including the ability to filter the results based on a variety of factors, including the survey respondents' experience, industry, country, job title, etc. These Direct Data Access reports will be invaluable for vendors who want to know how well their products are rated on a number of dimensions (more on this later in this series), and valuable for those who want to benchmark their efforts against other organizations similar to theirs. If you want to make a case for improving your measurement practices, you absolutely have to buy direct data access.
Disclaimer: I led the surveying and content efforts on the research report and was paid a small stipend for contributing my time, however, I will receive nothing from sales of the report. I recommend the report because it offers unique and valuable information, including wisdom from such stars as Allison Rossett (the Allison Rossett), Sharon Shrock, Bill Coscarelli, (both of Criterion-Referenced Testing fame) James Ong (at Stettler Henke where he leads in efforts of measuring learning results through comprehensive simulations), Roy Pollock (Chief Learning Officer at Fort Hill Company, which is providing innovative software and industry-leading ideas to support training transfer), Maggie Martinez (CEO of The Training Place, specializing in learning assessment and design), Brent Schenkler (a learning-technology guru at the eLearning Guild), and the incomparable Steve Wexler (The eLearning Guild’s Research Director, research-database wizard, publishing magnate, and tireless calico cat herder).
How to Get the Reports
1. eLearning Guild Measuring Success (Free to Most Guild Members)
If Member (Member+ or Premium): Just Click Here
If Associate Member, Take measurement survey, then access report.
If Non-member, Become associate member, take measurement survey, then access report.
2. My Report, Measuring Learning Results: Click through to My Catalog
The Series Continues Tomorrow...
Will Thalheimer | Blog | Jul 15, 2015 02:58pm
Let me start out by saying that I don’t know everything about learning measurement. It’s a topic that is too deep and complicated for easy answers.
My value-add is that I look at measurement from the standpoint of how learning works. As far as I can tell, this is a perspective that is new to most of our field’s discussions of measurement. This is ironic of course, because it’s learning measurement we’re talking about. So for example, when we know that learning begets forgetting, why is it that we measure learning before forgetting might even have an effect—thereby biasing the results ridiculously in our favor?
The second unique perspective I’m adding to the conversation is the importance of predicting future retrieval. I argue that we must validly predict future retrieval to give us good feedback about how well our learning programs are working. We do an absolutely terrible job of this now.
Finally, I’d like to think that I am pushing us beyond the conceptual prison engendered by our old models and methods. It’s not that these models and methods are bad. It’s that most of us—me included—have had a seriously difficult time thinking outside the boundaries of the models’ constraints. Let me use the Donald Kirkpatrick model as an example. Others may beat up on it, expand it, or expound on it for pleasure or treasure, but it’s a great model. It helps us make sense of the world by simplifying the confusion. But the model, by itself, doesn’t tell us everything we need to know about learning measurement. It certainly shouldn’t be seen as a prescription for how we measure. It is simply too crude for that.
Three Biases in Measuring Learning at Level 2
Measuring Immediately After Learning. A very high percentage of learning measurement is done immediately at the end of our learning events (In the eLearning Guild research, we found about 90% of e-learning measurement was done right at the end of learning). Immediate tests of learning only make sense if you don’t care about the learner’s future ability to retrieve information. When we measure "learning" what we really want to do is measure "retrieval." Moreover, what we really care about is future retrieval. Isn’t that the goal of learning after all? We want our learners to learn something so they can retrieve it and use the information later. I detail this point in much greater depth in the Guild research report and in even more depth in my report, Measuring Learning Results… By the way, the Guild report is free to Guild members and to those who complete the survey.
Measuring in the Same Context Where Learning Took Place. A high percentage of training is measured in the learning context (about 92% in the same or similar context in the Guild research). Unfortunately, context influences retrieval, and so when we measure in the learning context we bias our measurement results. Oh, and we bias them in our favor, again. So for example, in the classic research study, Smith, Glenberg, and Bjork (1978) found that when learners were tested in the same room in which they learned, they were able to recall more than 50% more than when they were tested in a different room from where they learned. This amount of bias is well within the bounds of the Barry Bonds level of cheating!! Would you vote yourself into the Hall of Fame?
Measuring Using Inauthentic Assessment Items (like Memorization). Most assessment items purporting to measure learning use memorization questions. Asking learners to simply retrieve what they have learned is bad assessment design because memorization is generally unrelated to future retrieval. So, if we test memorization, we know nothing (or very little) regarding whether our learners will be able to retrieve what is truly important. Sharon Shrock and Bill Coscarelli, two of my co-authors in the eLearning Guild Research Report, highlight the problems of memorization in the third edition of their excellent book, Criterion-Referenced Testing… One of the goals of criterion-referenced testing is to determine whether a learner can be "certified" as competent or knowledgeable about a particular topic area. Shrock and Coscarelli argue that only assessments done on (a) real-world tasks, (b) simulations, and (c) scenarios can be validly used for certification decisions, whereas memorization cannot be used. This is a change from the second edition of their book and provides a paradigmatic shift in our field. In future posts in this series, I will highlight my taxonomy of authenticity for assessment questions that follows up on Shrock and Coscarelli's thoughtful certification ideas.
The Series Continues Tomorrow...
References:
Shrock, S. A., & Coscarelli, W. C. (2007). Criterion-Referenced Test Development: Technical and Legal Guidelines for Corporate Training (Third Edition). San Francisco: Pfeiffer.
Smith, S. M., Glenberg, A., & Bjork, R. A. (1978). Environmental context and human memory. Memory & Cognition, 6, 342-353.
Will Thalheimer | Blog | Jul 15, 2015 02:58pm
To lead into the weekend, let me hot-wax poetic:
Measurement is like a magnetically alluring supermodel we might see across the room at a party. We want to stare and absorb every curve of muscle, every glowing inch of skin. Yet, our primordial core forces our eyes away, perhaps ashamed of our own imperfections, perhaps following some failed inner calculus of future possibilities. The apparition seems impossible to grasp, so we turn away. With another opportunity lost, learning measurement keeps its mystery, its danger, and its transcendent ability to lift our practices to their highest potential.
Add your comments to analyze the paragraph above. What do you think I'm trying to say about the state of learning measurement in our industry? About our reasons for failing in this regard? What am I missing? What am I saying about supermodels? Add your own purple prose, poetry, etc., using the comments function below.
Note: Both men and women can be supermodels.
This quote comes directly out of the eLearning Guild Report.
Measurement Series Continues on Monday...
Will Thalheimer | Blog | Jul 15, 2015 02:58pm
What can e-learning add to measurement?
What can e-learning add to measurement? Does e-learning have unique capabilities that enable it to improve learning measurement? I think it does. Here’s a short list:
E-learning can capture more data more easily than classroom training.
E-learning can capture data during the learning program—not just at the end of the learning event—in a manner that the learners feel is a seamless and natural part of the event.
E-learning can track incoming proficiency through the use of pretests to determine whether the learning program actually meets a need, or determine who it meets a need for.
E-learning can collect data in a manner that can give learners comparison data while they complete an assessment.
E-learning can collect data on learner behaviors during the learning (for example, the click journey, time per screen, etc.)
E-learning can track pretest to posttest changes.
E-learning can randomly assign learners to program versions, making methodological comparisons possible. For example, a program version that uses immediate feedback can be compared to a program version using delayed feedback to determine which method is more effective (a simple sketch of this kind of assignment appears after this list).
E-learning can capture on-the-job performance data, including learners’ self-ratings, manager ratings, direct-report ratings, etc. This capability puts the focus on on-the-job performance, where benefits can accrue from management oversight and coaching, self-initiated development, and peer learning.
E-learning, because it can access and track learners at more than one point in time, can measure how well the learning intervention has performed in creating long-term remembering.
E-learning can capture data even when learners don’t know the learning program is being assessed. For example, the learning program can capture data when the learners think they are simply getting practice on the learning material.
E-learning can track learners as they move from the training event to the workplace. For example, e-learning programs can track learners’ goals to implement what they have learned to see how successful they have been in transferring the learning to the job.
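To make the randomization idea concrete, here is a minimal TypeScript sketch of assigning learners to two course versions and comparing their delayed-test scores. It is a hypothetical illustration under simplified assumptions, not a feature of any particular LMS or authoring tool.

```typescript
// Illustrative sketch of assigning learners to two course versions and comparing
// delayed-test scores. The version names and helpers are hypothetical; this is
// not tied to any particular LMS or authoring tool.

type CourseVersion = "A-immediate-feedback" | "B-delayed-feedback";

interface AssessmentRecord {
  learnerId: string;
  version: CourseVersion;
  score: number;           // e.g., percentage correct on a delayed, scenario-based test
  daysAfterCourse: number;  // how long after the course the assessment was taken
}

const records: AssessmentRecord[] = [];

// Hash-based assignment: effectively random across learners, but the same learner
// always sees the same version on every visit.
function assignVersion(learnerId: string): CourseVersion {
  let hash = 0;
  for (const ch of learnerId) {
    hash = (hash * 31 + ch.charCodeAt(0)) % 1000;
  }
  return hash % 2 === 0 ? "A-immediate-feedback" : "B-delayed-feedback";
}

function recordAssessment(learnerId: string, score: number, daysAfterCourse: number): void {
  records.push({ learnerId, version: assignVersion(learnerId), score, daysAfterCourse });
}

// Average delayed-test score for one version, for a simple A/B comparison.
function averageScore(version: CourseVersion): number {
  const group = records.filter(r => r.version === version);
  return group.reduce((sum, r) => sum + r.score, 0) / Math.max(group.length, 1);
}

recordAssessment("learner-001", 78, 14);
recordAssessment("learner-002", 85, 14);
console.log("A:", averageScore("A-immediate-feedback"), "B:", averageScore("B-delayed-feedback"));
```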
With this power comes responsibility, and a damn fun challenge. You can read my call-to-action later in this series and in the eLearning Guild Research Report as well.
The Measurement Series Continues Tomorrow...
Will Thalheimer | Blog | Jul 15, 2015 02:58pm
There are basically four types of software tools that can be used for developing instruments to measure learning. There are dedicated assessment-development tools, for example Questionmark's Perception. There are e-learning authoring tools that offer an assessment-development capability, for example Adobe's Captivate. There are learning content management systems, for example Blackboard's Academic Suite. And finally, there are general-purpose software-development tools, for example Adobe Flash Professional. To put it in list form:
Dedicated Assessment-Development Tools
E-Learning Authoring Tools
Learning Content Management Systems
General-Purpose Software-Development Tools
When we asked e-learning professionals from the eLearning Guild membership about their use of tools in developing learning-measurement instruments, they told us an interesting story.
Specifically, we asked them, "What PRIMARY tool do you use to develop your measurement instruments?"
The two most popular answers were (1) we didn't use a tool at all, and (2) we used a tool developed in-house. See the graph below.
When we broke this down by corporate and education audiences, and looked at product market shares, other interesting findings appear.
Take a look at the corporate results, excluding all education and government respondents:
Adobe's Captivate dominates with over 50% of the market share—that is, over 50% of respondents said they used Captivate to develop their measurement instruments (they may have used other tools). Even more telling is that six of the top seven items are authoring tools or part of authoring tool suites. You read that right. Authoring tools are by far, by far, by far the way people develop assessment items in the corporate e-learning space. Only Questionmark's Perception and Adobe's Flash Professional sneak into the top nine responses before "Other" takes the tenth spot.
This makes sense if we assume, like a dismal economist, that people do what is easy to do. Our authoring tools remind us to add questions, so we add questions. It also tells me that maybe our field puts very little value on measuring learning if our behavior is so controlled by our surroundings that we don't look further than our authoring tools. Or, could it be that our authoring tools provide us with all we need?
Let's take a look at the education (with some government) results. (Note to those using the eLearning Guild's Direct Data Access capability: I filtered only for students, interns, academics, and practitioners).
The education results are interesting as well, especially as compared with the corporate results. Note how many dedicated assessment tools appear in the top ten: there are three (Respondus, StudyMate, and Questionmark's Perception). So perhaps educators care a little bit more about testing. Okay, that makes sense too. Still, there are a lot of e-learning authoring tools at the top, with Captivate dominating again.
The Leverage Point
The clearest conclusion I will draw from this data is that to improve our e-learning assessment practices, we need to do it at the one clear leverage point—at the one point that we seem to think about measurement the most—in our authoring tools. How might this work:
Okay, we could just train people to create better measurement instruments with the idea that they'll use that information the next time they boot up their authoring tool.
Better would be to train them to create better measurement instruments while they are using their authoring tool. And give them practice as well, with feedback, etc. You learning researchers will be chanting "encoding specificity" and "transfer-appropriate processing" and those of you who have ever had one of my workshops on the learning research will be thinking of "aligning the learning and performance contexts" to "create spontaneous remembering."
Better would be to develop job aids indexed to different screen shots of the authoring tool.
Better would be for the authoring tools to be seeded with performance-support tools that encouraged people to utilize better measurement practices.
Oh crap. The best way to do this is to get the authoring-tool developers to take responsibility for better measurement and better product design. Entrepreneurially minded readers will be thinking about all kinds of business opportunities. Hey Silke, how about giving me a call? SMILE.
Not much of this is going to happen anytime soon, is my guess. So, besides engaging someone like me to train your folks in how to create more authentic assessments, you're pretty much on your own.
And we know that's not going to happen either. At least that's what the data shows. Hardly anybody brings in outside experts to help with learning measurement.
I guess somebody thinks it's just not that important.
More on this as the series continues…
The data above was generated by a group of folks working through the eLearning Guild. The report we created is available by clicking here.
Here's some more detail:
The eLearning Guild Report
The eLearning Guild report, "Measuring Success," is FREE to Guild members and to those who complete the research survey, even if not a member.
Disclaimer: I led the surveying and content efforts on the research report and was paid a small stipend for contributing my time, however, I will receive nothing from sales of the report. I recommend the report because it offers unique and valuable information, including wisdom from such stars as Allison Rossett (the Allison Rossett), Sharon Shrock, Bill Coscarelli, (both of Criterion-Referenced Testing fame) James Ong (at Stettler Henke where he leads in efforts of measuring learning results through comprehensive simulations), Roy Pollock (Chief Learning Officer at Fort Hill Company, which is providing innovative software and industry-leading ideas to support training transfer), Maggie Martinez (CEO of The Training Place, specializing in learning assessment and design), Brent Schenkler (a learning-technology guru at the eLearning Guild), and the incomparable Steve Wexler (The eLearning Guild's Research Director, research-database wizard, publishing magnate, and tireless calico cat herder).
How to Get the Reports
1. eLearning Guild Measuring Success (Free to Most Guild Members)
If Member (Member+ or Premium): Just Click Here
If Associate Member, Take measurement survey, then access report.
If Non-member, Become associate member, take measurement survey, then access report.
2. My Report, Measuring Learning Results: Click through to My Catalog
More tomorrow in the Learning Measurement Series...
Will Thalheimer | Blog | Jul 15, 2015 02:58pm
In the eLearning Guild report I worked on with several other brilliant authors (SMILE), we asked e-learning professionals whether they were happy with the learning measurement they were able to do. Here's what they said (All the data reported in this blog post is for respondents who create e-learning for workers in their own organizations. The Guild's powerful database technology makes it possible to split the data in different ways).
In general, are people able to do the learning measurement they want to? See the graph below.
Only about 17 percent were happy with their current measurement practices. About 73 percent wanted to be able to do MORE or BETTER measurement. Clearly there is a lot of frustration.
In fact, one of the top reasons people say they can't do the measurement they want to is they don't have the knowledge or expertise to do it. It's in a virtual dead heat for the third most important reason given. See the diagram below.
The question then becomes: if people don't have the expertise to do measurement the way they want to, do they hire expertise from outside their organizations? A full 88.8% said they did all their measurement themselves. Wow! The graph below could not be more striking.
When we asked people what kind of expertise they do utilize—whether it was in house or contracted—they told us the following (I added a color-coded legend at the top):
Most of the folks doing learning measurement are instructional designers and developers with no particular expertise in measuring learning. A full 84% of respondents indicated that non-expert instructional designers are doing measurement at their organizations. Only 51.7% of respondents said their organization uses instructional developers with some advanced education. Only 20% of organizations have staff with master's-level degrees in measurement, and only 6.7% utilize doctoral-level experts.
Note that when we look only at organizations that claim to be getting "high value" from doing learning measurement versus all the others, the results are intriguing.
Respondents reporting the level of value they got from their measurement efforts (percentage of respondents saying they utilize each type of expertise):

                                            Less Than High Value    High Value for Measurement
MASTERS DEGREES ON STAFF                            16.4%                     31.4%
DOCTORATES ON STAFF                                  5.6%                     10.1%
MASTERS DEGREES HIRED FROM OUTSIDE                   3.7%                     11.8%
DOCTORATES HIRED FROM OUTSIDE                        3.4%                      6.1%
Wow! Respondents who report getting high value from their measurement efforts are far more likely to have people with master's degrees on staff than those reporting less than high value: 31.4% versus 16.4%, a relative difference of more than 91%. The high-value group also reports 80% more doctorates on staff, 219% more master's degrees hired from outside, and 79% more doctorates hired from outside. While this is correlational, self-report data, it suggests some sort of relationship between the measurement expertise employed and the level of value an organization gets from its learning measurement.
Summary
To recap, in a survey of over 900 e-learning professionals, many are frustrated because they want to do more/better measurement. A significant portion of their frustration results from not having the expertise to do measurement correctly. They have very few high-level measurement experts on staff, and they hire almost nobody from the outside to help them.
What the hell is wrong with this picture?
It confirms for me that learning measurement is just not given the importance it deserves.
More to come tomorrow in the Learning Measurement series...
Will Thalheimer | Blog | Jul 15, 2015 02:57pm