Blogs
I urge you to support Elliott Masie's fight against frivolous LMS patents. Here's what Elliott wrote in his newsletter:
Help! Defend the Learning Field from a Patent Suit!
The learning "patent trolls" are at it again. There is a company called IPLearn that has, unfortunately, successfully sued over a dozen learning management system companies, claiming they invented many of the core elements of technology-delivered learning. Sure, they wrote several clever patents that claim to have created much of the field, and, with an understaffed patent office, the patents were approved, even though there were many prior learning-and-technology implementations. And their strategy, which has worked, is to sue a company, drive the discovery and legal costs up and up, and finally settle for a fee and stock shares. All without having invented, produced, or created anything other than a few patent applications.
I have worked, without fee, against their efforts several times, and now they are at it again. They have brought suits against several major LMS companies, and I am asking Learning TRENDS readers to help gather any manuals, documents, or other experiences you have had with these earlier corporate learning systems:
"Registrar," by Silton-Bookman Systems
"Learning Organization Information System (LOIS)," by KnowledgeSoft, Inc.
"Etude," by Gerald Hollingsworth and GPU, Inc.
"Continuous Learning System (CLS)," by AT&T Global Information Solutions International, Inc.
The lawyers defending against the IPLearn suit would love to see samples of anything that describes the operation or public availability of these systems, for example: user manuals, help files, demonstration videos, brochures, press releases, and actual program disks/CDs.
If you can help, would you send a note to my office at emasie@masie.com? We will contact you back. Many thanks!
While I'm not a patent expert, and I support people and organizations who truly do create something new and unique, patents that aren't deserved hurt our industry and our learners.
Will Thalheimer
Blog · Jul 15, 2015 01:59pm
As Quincy Jones once remarked, "I’ve always thought that a big laugh is a really loud noise from the soul saying, ‘Ain’t that the truth.’"
That said, Edu-fun Friday is a series devoted to adding some humor to the lives of teachers who visit this blog. After all, there’s nothing better than ending the week on a positive note! Plus, do we have the best topics to provide us with some comic relief or what?
As for these centers, are there really any other options at this time of the year?
Thanks to artist Mark Anderson for the TGIF laugh!
Edutech for Teachers team
Blog · Jul 15, 2015 01:59pm
Clark Quinn (blog, website, Twitter) recently cited some of my thinking about instructional objectives in the instructional technology forum of AECT (ITFORUM). I wrote a long email to Clark in response, thanking him, and going into more detail. I am reprising my response to Clark here:
In a recent post to this list, Clark Quinn rightly notes that objectives for learners and objectives for instructional designers need not be identical. Indeed, as both Clark and I have previously noted, they probably shouldn’t be identical.
Here’s the thinking: Objectives are designed to guide behavior. So, how
can it be that identically-worded objectives can adequately guide the behavior
of two disparate groups of individuals (learners and instructional designers)?
It just doesn’t make any sense!!
And indeed, Hamilton (1985) found that presenting learners with learning objectives in the way Mager suggested PRODUCES NO BENEFITS AND MAY BE HARMFUL. Here’s what Hamilton wrote:
"[An instructional] objective that generally
identifies the information to be learned in the text will produce robust effects.
Including other information (per Mager’s, 1962, definition) will not
significantly help and it may hinder the effects of the objectives"
(Hamilton, 1985, p. 78).
Objectives are not only designed to change behavior for a particular set
of individuals, but they are also designed with particular purposes in mind—or they
should be.
So, when we talk of instructional objectives, we also need to think
about what purpose we have for them.
The quote above from Hamilton is focused on how well learning objectives
focus the attention of learners. Interestingly, this is the only area in which
extensive research has been done on learning objectives. You might be surprised
to know that learning objectives help learners focus on the information
targeted by learning objectives, but actually diminish their attention on
information in the learning materials not targeted by learning objectives. For
example, in two experiments using specific objectives, Rothkopf and Billington
(1979) found that when focusing objectives were provided to learners, performance
on material related to the objectives improved by 49% and 47% over situations
when focusing objectives were not used. However, the material not related to
the learning objectives was learned 39% and 33% WORSE than it would have been
if no learning objectives were used!
These types of instructional objectives—presented to learners prior to
subsequent learning—I call "focusing objectives" because they are designed for
the purpose of focusing learner attention on critical learning material. As the
Hamilton (1985) review pointed out, it does NOT help to add Mager’s criterion
information to focusing objectives, because it doesn’t help learners focus on
the critical material.
NOW, here’s an important point (I say to focus your attention): We don’t necessarily need to use focusing
objectives with learners if we have other means to focus their attention!! We
can use a relevant, gripping story. We can do a shout-out (for example, "Here’s an important point…"). We can have them attempt to answer a relevant scenario-based question and struggle with it. Etcetera.
Here’s another important point: Focusing objectives are only one type of
objective we might want to utilize. I have a whole list, and I’m sure you can
think of more of them.
Instructional Objectives for Learners:
Table-of-Contents Objectives: To give learners a big-picture sense of what will be taught.
Performance Objectives: To let learners know what performance will be expected of them.
Motivation Objective: To ensure learners know why they might be motivated to engage in the learning or in applying the learning.
Focusing Objective: To guide learner attention to the most critical information in the learning material.
Instructional Objectives for Developers:
Instructional-Design Objective: To guide developers toward the ultimate goal of the learning intervention.
Evaluation Objective: To guide developers (and other stakeholders) to the ultimate measurable outcomes by which the learning intervention will be measured.
Situation Objectives: To guide developers to the situations that learners must be prepared for.
Organization Objective: To guide developers to the organizational effects targeted by the instruction.
Questions:
So, here are some questions for you:
Is it okay to use the word "understand" in an "instructional-design objective"?
How about in a "focusing objective"?
Answer: It’s okay to use the word "understand" in a focusing objective, because it does not hurt learners; it still sets them up to focus attention on critical concepts. But it is NOT okay to use the word "understand" in an instructional-design objective, because "understand" doesn’t have enough specificity to guide instructional design.
My point in asking these questions is to show that over-simplistic
notions about instructional objectives are likely to be harmful to your
instructional designs.
As usual, the research helps us see things we wouldn’t otherwise have
seen.
Hope this helps!!
= Will
References
Hamilton, R. J. (1985). A framework for the evaluation of the effectiveness of adjunct questions and objectives. Review of Educational Research, 55, 47-85.
Mager, R. (1962). Preparing Instructional Objectives. Palo Alto, CA: Fearon Publishers.
Rothkopf, E. Z., & Billington, M. J. (1979). Goal-guided learning from text: Inferring a descriptive processing model from inspection times and eye movements. Journal of Educational Psychology, 71(3), 310-327.
Will Thalheimer
Blog · Jul 15, 2015 01:59pm
We, the members of the workplace learning-and-performance field, think about our field from time to time. Of course we do.
Here's what I wonder. How much have our mental models changed in the last 50 years?
That's too big a question for me to answer right now, but it does raise an interesting point. Today, as I was reading a scientific article (citation below) on how people have insights, the authors reported that, in their study of naturalistic insights, most sudden insights occurred when people looked at new data--after having already spent a great deal of time thinking about the issue.
Here's the thing: We, in our field, haven't really created new data sets or methodologies very often. Yes, we have Jack Phillips's ROI methodology and Robert Brinkerhoff's Success Case Method--both of which got many of us to rethink what we're doing--but these methods sit at the results end of the causal chain from learning to performance to results. Important stuff--there is no doubt--but not enough.
When it comes to getting new data about learning engagement, remembering, and on-the-job application, we haven't seen much innovation in our field.
If the research on insight is right, then without new data (or new methods to gather data in the case of an industry-wide perspective) we will not have breakthrough insights about how to improve our training and other learning interventions.
We need to continue working toward better data-gathering methods.
I think the Performance-Focused Smile Sheet offers a glimmer of hope, but the recent infatuation with benchmarking does NOT.
What have you done recently to help your company get better data about your learning-and-performance initiatives?
Citation for article that triggered this insight:
Klein, G., & Jarosz, A. (2011). A naturalistic study of insight. Journal of Cognitive Engineering and Decision Making, 5(4), 335-351.
Will Thalheimer
Blog · Jul 15, 2015 01:59pm
Teaching with Digital Technologies Infographic
Technology has had a huge impact on educating students around the world, so it’s no surprise that it is being heavily incorporated into classrooms. From computers to tablets and, pretty soon, virtual reality field trips, technology has opened the door to possibilities we never could have imagined. The Teaching with Digital Technologies Infographic shows what teachers and students think about the potential of digital technologies in educational settings.
Facts and Stats
74% of teachers believe that technology enables them to reinforce and expand their content.
60% of high school seniors and college students believe that technology helps them study more efficiently and perform better in class.
93% of teachers agree that digital resources help their students' academic achievement.
95% of teachers agree that digital resources engage their students in learning.
Both students and teachers are in agreement that technology helps improve the classroom and overall learning environment. At the current rate of expansion, technology may soon revolutionize the educational system as we know it.
View also:
Teaching with Technology Infographic
How Educational Technology is Being Used in the Classroom Infographic
Via: online.annamaria.edu
eLearning Infographics
Blog · Jul 15, 2015 01:58pm
As Quincy Jones once remarked, "I’ve always thought that a big laugh is a really loud noise from the soul saying, ‘Ain’t that the truth.’"
That said, Edu-fun Friday is a series devoted to adding some humor to the lives of teachers who visit this blog. After all, there’s nothing better than ending the week on a positive note! Plus, do we have the best topics to provide us with some comic relief or what?
Amen! Amen! A-men!
Thanks to the You Can’t Scare Me, I’m a Teacher Facebook page for sharing this end-of-the-year laugh!
Edutech for Teachers team
Blog · Jul 15, 2015 01:58pm
The Northern New Jersey ASTD Learning Leaders Forum has invited me to speak at their October 15th meeting. To whet the appetite, we created the following video interview, divided into separate videos:
What research have you found interesting? What's new?
Lessons learned from learning leaders
Five failures of workplace learning professionals
Advice for learning leaders (CLOs, training directors, etc.)
What are you going to speak about on October 15th?
I'd like to send special thanks to Tony Irace--a long-time colleague and a great learning leader--and his co-conspirator at the Northern New Jersey ASTD Learning Leaders Forum, Meg Paradise. Thanks for your interest in my work and for organizing this great event!!
Click here to get the details about the event if you're in the area.
Will Thalheimer
Blog · Jul 15, 2015 01:58pm
Teens and Media Over the Years Infographic
Being a teenager in 2015 is very different than it was in 1995. Technology has had a hugely transformative impact on teens’ behaviour and, subsequently, their lifestyles. One example of this transformation is social networking. Before the massive uptake of the Internet and web 2.0 technologies, kids socialized with friends from school or the neighbourhood and spent time together in the "real world". Now virtuality takes over, and platforms such as Facebook, Snapchat, and online gaming sites have redefined the notion of social networking, freeing it from the spatio-temporal boundaries that once kept people from getting to know and talking to one another.
Technology has opened all kinds of new things to teens, some good and some bad. So just how has being a teenager changed since the 90s? Are things better or worse? Take a look at the Teens and Media Over the Years Infographic above, which presents facts about teens and media, and decide for yourself. In particular, the infographic focuses on the social aspect of teens’ lives and reveals some really astounding stats about the increasing dependence of teenagers on social media as the primary means of socialization.
True Facts About Teens and Media, Now & Then
The teens of 1995 were on the forefront of learning to use the internet as part of their everyday lives; today’s teens don’t know how to live without it. They’re focused on finding the latest-and-greatest app that will help them communicate better with their peers. Whereas 60% of teens in 1995 talked to their friends on the phone daily, now only 39% of teens make or receive voice calls at all, and only 35% of teens socialize with other teens outside of school on a daily basis.
This was partially due to a movement in the mid-2000s in which parents encouraged children to stay inside due to fears of neighborhood safety — not to mention the rapid expansion of the internet. Between 1995 and 2005, the internet grew from 23,500 to 64.8 MILLION websites.
It’s amazing to see how far teens and media have come — and to imagine where they’ll go next.
Via: www.teensafe.com
eLearning Infographics
Blog · Jul 15, 2015 01:58pm
Wow. What a firestorm! You'd think the U.S. Government was going to go into default or something. Popular Science decides to get rid of its comments--fearing that good science was being misperceived because of online comments.
Here are just a few of the many articles/blogs on the controversy:
NPR Radio Story (Audio)
NPR Online Story
The Guardian article on how some are trying to fix comments
Washington Post article damning comments (read the comments at the end)
The Hindu article praising comments
Slate article on the side of comments
A few quick comments (before I've read the actual science that Popular Science cites and other related research, which, I must add, I find a fascinating topic):
1. If Popular Science is using only one or two studies to draw conclusions, they don't understand social science. Specifically, they don't understand that social science research generally requires--at a minimum--dozens of studies to draw firm conclusions, fence off boundaries, and discover contingencies. Not always, but usually.
2. I agree that society today is getting more and more anti-science, anti-evidence, and anti-wisdom.
3. I agree that there is justification for worrying about the effect of social-media pollution. As a guy who reads over 200 articles from refereed scientific journals on learning, memory, and instruction each year--and who thus probably knows more than the average bear in my field--I've seen lots of categorically wrong information floated in social-media comments in our field. Of course, I am not infallible, all-knowing, or omniscient. Anyone who reads the research knows how little of the whole he or she can possibly know. But still, I do know enough to know when some notions of learning are fundamentally flawed. AND in the workplace learning-and-performance field, there is much that is foolhardy, misinformed, and harmful--and social media has not stopped this from happening.
4. There are some victories, however, even if they are not complete. For example, social media and the internet have made it less likely that people in our field are spouting off about people learning 10% of what they see, etc. Maybe I have made a difference.
5. I know that comments have been helpful to me personally in other contexts. For example, the New York Times comments have been very helpful to me in seeing the strengths and weaknesses of the original article.
6. I got rid of unmoderated comments on my blog purely because of the large amount of spam that was being posted. Most commenters here have helpful things to say.
7. Popular Science argued specifically that people in general are misinformed about science, lending credence to the idea that comments on vetted scientific articles for a popular audience may be a special case.
8. Popular Science also argued (see the NPR interview) that they made the decision because they would rather put their resources into creating good articles in the first place than into moderating their comment sections.
9. I wish someone who studies this issue intensively would create a rubric, helping us understand when and how comments can be valuable--and when they cause more harm than good. It can't be black and white--comments are good, comments are bad. Most things in human nature don't work like that.
10. For the workplace learning-and-performance field, my recommendation to you is: Don't assume that comments are good or that comments are bad. Do assume, however, that you may need a way to regulate, monitor, or control comments to make them helpful. I'll never forget the time I was arguing with a social-media evangelist who claimed that social media is always corrective in time. A member of the audience interrupted with the story of how social media got a couple of soldiers killed when they used information from social media to attempt to deal with an improvised explosive device.
Some previous posts on social media:
Plusses and Minuses of Social Media
Measuring Social Media
Will Thalheimer
Blog · Jul 15, 2015 01:58pm
When asked for a simple heuristic on how to use the spacing effect--the finding that repetitions spaced in time are more effective in supporting remembering than repetitions squished narrowly in time--I've often told people that the ideal spacing interval is one that equals the retention interval one desires. If you want your learners to remember for a month, give them repetitions spaced one month apart. If you can't do that, longer is better, and there seems to be something magical about repeating something overnight.
But new research suggests that even short spacings of only
half-a-minute or so can have lasting benefits over non-spaced
repetitions.
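To make the heuristic above concrete, here is a minimal sketch. It is my own illustration, not anything from the research cited; the function name, the day-based units, and the even spacing are all assumptions.

```python
def spaced_schedule(retention_days, repetitions):
    """Days (counting from initial learning) on which to repeat the material,
    spacing each repetition by the desired retention interval."""
    return [rep * retention_days for rep in range(1, repetitions + 1)]

# To support remembering for roughly a month, space repetitions a month apart:
print(spaced_schedule(30, 3))  # [30, 60, 90]
```

If month-long gaps aren't practical, the heuristic says to stretch the gaps as long as you can rather than cramming repetitions together.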
Citation:
Rawson, K. A., & Dunlosky, J. (2012). Relearning Attenuates the Benefits and Costs of Spacing. Journal of Experimental Psychology: General. Advance online publication.
Will Thalheimer
Blog · Jul 15, 2015 01:58pm
I know I'm going completely against most training-industry practice in saying this, but it's the truth. Likert-like scales create poor data on smile sheets.
If you're using questions on your smile sheets with answer choices such as:
Strongly Agree
Agree
Neither Agree Nor Disagree
Disagree
Strongly Disagree
then you're getting data that isn't very useful. Such questions will create data that your stakeholders--and you too--won't be able to decipher very well. What does it mean if we average a 4.2 rating? It may sound good, but it doesn't give your learners, your stakeholders, or your team much information for deciding what to do.
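To see why a 4.2 average is so hard to interpret, consider this small illustration (the response counts are invented): two very different distributions on a 1-to-5 agreement scale produce exactly the same average.

```python
from statistics import mean

# Hypothetical smile-sheet responses on a 1-5 scale (5 = Strongly Agree)
broad_agreement = [4] * 8 + [5] * 2          # nearly everyone mildly positive
polarized = [5] * 7 + [3] * 1 + [2] * 2      # enthusiasts plus unhappy learners

print(mean(broad_agreement))  # 4.2
print(mean(polarized))        # 4.2 -- same average, very different stories
```

The second group contains learners the average completely hides, which is exactly the decision-making information a smile sheet ought to surface.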
Moreover, let's remember that our learners are making decisions with every smile-sheet question they answer. It's a lot tougher to decide between "Strongly Agree" and "Agree" than between two more-concrete answer choices.
Sharon Shrock and Bill Coscarelli, authors of the classic text Criterion-Referenced Test Development, now in its third edition, offer the following wisdom on Likert-type descriptive scales (the kind that use response words such as "Agree," "Strongly Agree," etc.):
"…the
resulting scale is deficient in that the [response words] are open to many
interpretations." (p. 188)
So why do so many surveys use Likert-like scales? Answer: it's easy, it's tradition, and such scales often give surveys psychometric advantages, because surveys repeat the same concepts across multiple items and aim to compare one category of response with another.
Smile sheets are different. On our smile sheets, we want the learners to be able to make good decisions, and we want to send clear messages about what they have decided. Anything that fuzzes that up hurts the validity of the smile-sheet data.
Will Thalheimer
Blog · Jul 15, 2015 01:57pm
Looking for some interesting K-12 edtech data? Then look no further than the New Digital Learning Playbook, Project Tomorrow’s annual Speak Up National Research Project which provides districts nationwide and throughout the globe with new insights into how today’s students want to leverage digital tools for learning.
In fall 2013, over 403,000 online surveys from K-12 students, parents, and educators representing over 9,005 U.S. schools were used to establish a more comprehensive understanding of the various ways students are currently tapping into a wide range of mobile devices to enhance learning anytime, anywhere.
Check out the infographic shown below to see how mobile devices enable new and customized learning that is un-tethered and digitally-rich.
A full page version of this infographic can be found here.
Classroom Connection:
Educators are always looking for data to support the value of digital tools related to the learning process. The findings provided in this infographic (and corresponding report) provide additional rationale to support the integration of technology into classrooms and beyond.
A shout out to my colleague Jim Gates for sharing this valuable edtech info!
Edutech for Teachers team
Blog · Jul 15, 2015 01:57pm
What To Look For When Choosing an LMS Infographic
Choosing a Learning Management System is a complicated process that can be very time-consuming and overwhelming for organizations, since there are many different aspects to consider, not to mention that the LMS is usually the most expensive component of the online learning ecosystem. The What To Look For When Choosing an LMS Infographic was developed to show organizations the main features and most important functionalities they should look for during their search for the perfect LMS.
Functionality
SCORM / AICC
Tin Can API
Analytics
Scalable
Features
Course Design
Instructional Design Tools
Course Creation Tools
Content Development
Course Feedback
Assessment
Realtime Learner Participation Tracking
Customizable Reports
Printable Certificates
Group Reporting
External Training
Event Tracking
Collaboration & Communication
Course Notes
File Exchanges
Discussion Groups
Collaboration Features
Resource Management
Accessibility
24/7/365
Web-Based
Tablet Accessible
Assignable Privileges
Easy to Use
How-to-Guides
Customizable
Easy Navigation
Onboarding
Customer Support
Help Desk
Vendor Support
Online Support Hub
User Groups
Security
Privacy Controls
Server Locations
Automatic Backup System
Read also:
Checking Under the Hood: Choosing a Learning Management System
How To Choose The Best Learning Management System
Can Your LMS Do This? 8 Questions You Need To Answer
11 Tips for Choosing The Best Learning Management System
Via: cypherworx.com
eLearning Infographics
Blog · Jul 15, 2015 01:57pm
This article first appeared in my Newsletter in the October 2013 issue. You can sign up for my newsletter by clicking here.
--------------------------------------------------------------------------
Onboarding is ubiquitous. Every organization does it. Some do it with great fanfare. Some make a substantial investment. Some just let supervisors get their new hires up to speed. Unfortunately, most organizations make critical mistakes in onboarding—mistakes that increase turnover, raise costs, weaken employee loyalty, and lower productivity.
Fortunately, recent research highlights onboarding best practices. If organizations would just use the wisdom from the research, they’d save themselves money, time, and resources—and employees in those companies would have to deal with many fewer headaches.
Key Outcomes
Recent reviews of the research suggest that there are four key outcomes that enable onboarding success:
New hires have to quickly and effectively learn their new job role.
New hires have to feel a sense of self-efficacy in doing their job.
New hires have to learn the organizational culture.
New hires have to gain acceptance and feel accepted by their coworkers.
Enabling Factors
Recent research suggests that the following factors are helpful in ensuring onboarding success:
What New Hires Can Do
Be proactive in learning and networking
Be open to new ways of thinking and acting
Be active in seeking information and getting feedback
Be active in building relationships
What the Organization Can Do
Ensure that managers take a very active and effective role
Provide formal orientations that go beyond information dissemination
Provide realistic previews of the organization and the job
Proactively enable new hires to connect with long-tenured employees
Five Biggest Mistakes (In Reverse Order of Importance)
5—Providing an Information Dump during Orientation
The research shows that employee orientations can facilitate onboarding. However, too many organizations think their orientations should just cram tons of information down the throats of employees. Even worse are orientations that have employees sit and listen to presentation after presentation. Oh the horror. New employees are excited to get going. Putting them into the prison of listening—even to great content—is a rudeness that shouldn’t be tolerated. The best orientations help build relationships. They get employees involved. They prepare new hires for how to learn and grow and network on their own. They help new hires learn the organization culture—both the good and the bad. They share the organization’s vision, passions, and its strategic concerns.
4—Thinking that Training is Sufficient
Training can be essential to help get new employees competent in their new roles, but it is NEVER sufficient on its own. Training should be supported by prompting mechanisms (like job aids), structures and support for learning on the job, reinforcement and follow-through, and coaching to provide feedback, set goals, and lend emotional support.
3—Forgetting the Human Side of Onboarding
New hires are human beings, and, just like the rest of us, they too are influenced by the dynamics of social interaction. They don’t just learn to do a job. They also learn to love and trust a company, a work unit, or a group of coworkers—or they don’t. In return, new hires are either trusted and respected by their coworkers or they’re not. The research is very clear about this. One of the keys to successful onboarding is the strength of the relationships that are built in the first year of a person’s tenure. The stronger the bonds, the more likely it is that a person will stay and bring value to the organization.
2—Considering Onboarding as Something that Can Be Done Quickly
Some companies offer a one week orientation and then cut loose their new hires to sink or swim. Enlightened companies, on the other hand, realize that onboarding is like relationship-building—it takes time. It takes time to really learn one’s job well. It takes time to integrate into the organizational culture. It takes time to connect with people. Realistic estimates suggest that onboarding can take 6 months, 12 months, or even 18 months to fully integrate a person into a new organization.
1—Not Preparing Supervisors
Supervisors are the single most important leverage point for onboarding success. You’ve probably heard it said that people don’t quit their companies, they quit their supervisors. Well, the flip side can also be said. People don’t join a company, they join a supervisor and his/her workgroup. Unfortunately, most supervisors just have no idea about the importance of onboarding and how to do it correctly. Where best practices give supervisors training and an onboarding checklist, too many supervisors just wing it. The real tragedy is that the investment in onboarding training and a checklist for supervisors is quite small in the greater scheme of things.
Final Thoughts on Onboarding
As a workplace learning-and-performance consultant, when I’ve been called in to advise companies on their onboarding programs, I often see incredibly dedicated professionals who are passionate about welcoming new people into their organizations. Unfortunately, too many times, I see organizations that have the wrong mental models about what makes onboarding successful. It’s a shame that our old mental models keep us from effectiveness—when the research on onboarding now gives us sound prescriptions for making onboarding successful.
Will Thalheimer
Blog · Jul 15, 2015 01:57pm
As Quincy Jones once remarked, "I’ve always thought that a big laugh is a really loud noise from the soul saying, ‘Ain’t that the truth.’"
That said, Edu-fun Friday is a series devoted to adding some humor to the lives of teachers who visit this blog. Even though it’s summer, there’s still nothing better than ending the week on a positive note! Plus, do we have some of the best topics to provide us with some comic relief or what?
Wouldn’t Maslow be so proud of the addition to his hierarchy of basic human needs! I know I can’t live without my "we-fee" (as my mother calls it). How ’bout you?
Thanks to the Live and Breathe blog for sharing this image!
Edutech for Teachers team
.
Blog
.
<span class='date ' tip=''><i class='icon-time'></i> Jul 15, 2015 01:57pm</span>
|
Enrollments are open for my first Workout-Workplace Workshop on the World Wide Web (w6).
Fittingly, it will cover the most significant improvement in smile-sheet design in a generation--The Performance-Focused Smile Sheet.
Click for details...
Will Thalheimer
.
Blog
.
<span class='date ' tip=''><i class='icon-time'></i> Jul 15, 2015 01:57pm</span>
|
How Classroom Design Impacts Learning and Engagement Infographic
Two complementary studies conducted by The University of Salford have shown that classroom design has a profound influence on learning and engagement. The How Classroom Design Impacts Learning and Engagement Infographic presents the key areas in which the built environment can influence student progress.
The studies, which were conducted at a number of schools across the UK, recorded the variations in student progress when exposed to different classroom designs and layouts. The How Classroom Design Impacts Learning and Engagement Infographic above covers classroom lighting, design, colour and personalisation - with a number of useful tips teachers and schools can use to reduce the negative consequences of poor classroom design and improve the learning environment for their students.
Interesting Findings
Researchers found that 75% of the variation in pupil performance can be explained by the built environment - with lighting, air quality, colour and noise disruption cited as the primary factors affecting student engagement.
Did you know that a classroom’s environment can affect student progress by as much as 25% throughout the academic year? Or that fluorescent lighting has been linked to hyperactivity and a lack of attention among some pupils? Even the way that we furnish our learning spaces can have an impact - certain layouts of a classroom can enhance learning, improve concentration, and encourage better behaviour.
Via: www.innova-solutions.co.uk
The post How Classroom Design Impacts Learning and Engagement Infographic appeared first on e-Learning Infographics.
eLearning Infographics
.
Blog
.
<span class='date ' tip=''><i class='icon-time'></i> Jul 15, 2015 01:56pm</span>
|
More and more training departments are considering the use of the Net Promoter Score as a question--or the central question--on their smile sheets.
This is one of the stupidest ideas yet for smile sheets, but I understand the impetus--traditional smile sheets provide poor information. In this blog post I am going to try to put a finely-honed dagger through the heart of this idea.
What is the Net Promoter Score?
Here's what the folks who wrote the book on the Net Promoter Score say it is:
The Net Promoter Score, or NPS®, is based on the fundamental perspective that every company’s customers can be divided into three categories: Promoters, Passives, and Detractors.
By asking one simple question — How likely is it that you would recommend [your company] to a friend or colleague? — you can track these groups and get a clear measure of your company’s performance through your customers’ eyes. Customers respond on a 0-to-10 point rating scale and are categorized as follows:
Promoters (score 9-10) are loyal enthusiasts who will keep buying and refer others, fueling growth.
Passives (score 7-8) are satisfied but unenthusiastic customers who are vulnerable to competitive offerings.
Detractors (score 0-6) are unhappy customers who can damage your brand and impede growth through negative word-of-mouth.
To calculate your company’s NPS, take the percentage of customers who are Promoters and subtract the percentage who are Detractors.
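The arithmetic behind that definition is simple enough to sketch in a few lines of Python. This is my own illustration, not code from the NPS authors; the function name and sample ratings are hypothetical:

```python
def net_promoter_score(ratings):
    """Compute NPS from a list of 0-10 ratings.

    NPS = percentage of Promoters (9-10) minus percentage of Detractors (0-6).
    Passives (7-8) count toward the total but cancel out of the score.
    """
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# 5 promoters, 3 passives, 2 detractors out of 10 responses:
# (50% - 20%) gives an NPS of 30.
print(net_promoter_score([10, 9, 9, 10, 9, 8, 7, 7, 5, 3]))
```

Note that the score ranges from -100 (all Detractors) to +100 (all Promoters), and that the Passives' ratings carry no information at all--one of several reasons critics question the metric even for its intended purpose.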
So, the NPS is about Customer Perceptions, Right?
Yes, its intended purpose is to measure customer loyalty. It was designed as a marketing tool. It was specifically NOT designed to measure training outcomes. Therefore, we might want to be skeptical before using it.
It kind of makes sense for marketing, right? Marketing is all about customer perceptions of a given product, brand, or company. Also, there is evidence--yes, actual evidence--that customers are influenced by others in their purchasing decisions. So again, asking whether someone might recommend a company or product to another person seems like a reasonable thing to ask.
Of course, just because something seems reasonable doesn't mean it is. Even for its intended purpose, the Net Promoter Score has a substantial number of critics. See Wikipedia for details.
But Why Not for Training?
To measure training with a Net-Promoter approach, we would ask a question like, "How likely is it that you would recommend this training course to a friend or colleague?"
Some reasonable arguments for why the NPS is stupid as a training metric:
First we should ask: what is the causal pathway that would explain how the Net Promoter Score is a good measure of training effectiveness? We shouldn't willy-nilly take a construct from another field and apply it to our field without having some "theory-of-causality" that supports its likely effectiveness. Specifically, we should ask whether it is reasonable to assume that a learner's recommendation about a training program tells us SOMETHING important about the effectiveness of that training program. And, for those using the NPS as the central measure of training effectiveness--which sends shivers down my spine--the query then becomes: is it reasonable to assume that a learner's recommendation about a training program tells us EVERYTHING important about the effectiveness of that training program?
Those who would use the Net Promoter Score for training must hold one of the following beliefs:
Learners know whether or not training has been effective.
Learners know whether their friends/colleagues are likely to have the same beliefs about the effectiveness of training as they themselves have.
The second belief is not worth much, but it is probably what really happens. It is the first belief that is critical, so we should examine that belief in more depth. Are learners likely to be good judges of training effectiveness?
Scientific evidence demonstrates that learners are not very good at judging their own learning. They have been shown to have many difficulties adequately judging how much they know and how much they’ll be able to remember. For example, learners fail to utilize retrieval practice to support long-term remembering, even though we know this is one of the most powerful learning methods (e.g., Karpicke, Butler, & Roediger, 2009). Learners don’t always overcome their incorrect prior knowledge when reading (Kendeou & van den Broek, 2005). Learners often fail to utilize examples in ways that would foster deeper learning (Renkl, 1997). These are just a few examples of many.
Similarly, two meta-analyses on the potency of traditional smile sheets, which tend to measure the same kind of beliefs as NPS measures, have shown almost no correlation between learner responses and actual learning results (Alliger, Tannenbaum, Bennett, Traver, & Shotland, 1997; Sitzmann, Brown, Casper, Ely, & Zimmerman, 2008).
Similarly, when we assess learning in the training context at the end of learning, several cognitive biases creep in to make learners perform much better than they would perform if they were in a more realistic situation back on the job at a later time (Thalheimer, 2007).
Even if we did somehow prove that NPS was a good measure for training, is there evidence that it is the best measure? Obviously not!
Should it be used as the most important measure? No! As stated in the Science of Training review article from last year: "The researchers [in talking about learning measurement] noted that researchers, authors, and practitioners are increasingly cognizant of the need to adopt a multidimensional perspective on learning [when designing learning measurement approaches]" (Salas, Tannenbaum, Kraiger, & Smith-Jentsch, 2012).
Finally, we might ask: are there better types of questions to ask on our smile sheets? The answer to that is an emphatic YES! Performance-Focused Smile Sheets provide a whole new approach to smile-sheet questions. You can learn more by attending my workshop on how to create and deploy these more powerful questions.
The Bottom Line
The Net Promoter Score was designed to measure customer loyalty and is not relevant for training. Indeed, it is likely to give us dangerously misguided information.
When we design courses solely so that learners like the courses, we create learning that doesn't stick, that fails to create long-term remembering, that fails to push for on-the-job application, etc.
Seriously, this is one of the stupidest ideas to come along for learning measurement in a long time. Buyers beware!! Please!
Will Thalheimer
.
Blog
.
<span class='date ' tip=''><i class='icon-time'></i> Jul 15, 2015 01:56pm</span>
|
I just read the following research article, and found a great mini-review of some essential research.
Hagemans, M. G., van der Meij, H., & de Jong, T. (2013). The effects of a concept map-based support tool on simulation-based inquiry learning. Journal of Educational Psychology, 105(1), 1-24. doi:10.1037/a0029433
Experiment-Specific Findings:
The article shows that simulations—the kind that ask learners to navigate through the simulation on their own—are more beneficial when learners are supported in their simulation playing. Specifically, they found that learners given the optimal learning route did better than those supplied with a sub-optimal learning route. They also found that concept maps helped the learners by supporting their comprehension. They also found that learners who got feedback on the correctness of their practice attempts were motivated to correct their errors and thus provided themselves with additional practice.
Researchers’ Review of Learners’ Poor Learning Strategies
The research Hagemans, van der Meij, and de Jong did is good, but what struck me as even more relevant for you as a learning professional is their mini-review of research showing that learners are NOT very good stewards of their own learning. Here is what their mini-review said (from Hagemans, van der Meij, and de Jong, 2013, p. 2):
Despite the importance of planning for learning, few students engage spontaneously in planning activities (Manlove & Lazonder, 2004).
Novices are especially prone to failure to engage in planning prior to their efforts to learn (Zimmerman, 2002).
When students do engage in planning their learning, they often experience difficulty in adequately performing the activities involved (de Jong & Van Joolingen, 1998; Quintana et al., 2004). For example, they do not thoroughly analyze the task or problem they need to solve (Chi, Feltovich, & Glaser, 1981; Veenman, Elshout, & Meijer, 1997) and tend to act immediately (Ge & Land, 2003; Veenman et al., 1997), even when a more thorough analysis would actually help them to build a detailed plan for learning (Veenman, Elshout, & Busato, 1994).
The learning goals they set are often of low quality, tending to be nonspecific and distal (Zimmerman, 1998).
In addition, many students fail to set up a detailed plan for learning, whereas if they do create a plan, it is often poorly constructed (Manlove et al., 2007). That is, students often plan their learning in a nonsystematic way, which may cause them to start floundering (de Jong & Van Joolingen, 1998), or they plan on the basis of what they must do next as they proceed, which leads to the creation of ad hoc plans in which they respond to the realization of a current need (Manlove & Lazonder, 2004).
The lack of proper planning for learning may cause students to miss out on experiencing critical moments of inquiry, and their investigations may lack systematicity.
Many students also have problems with monitoring their progress, in that they have difficulty in reflecting on what has already been done (de Jong & Van Joolingen, 1998).
Regarding monitoring of understanding, students often do not know when they have comprehended the subject matter material adequately (Ertmer & Newby, 1996; Thiede, Anderson, & Therriault, 2003) and have difficulty recognizing breakdowns in their understanding (Ertmer & Newby, 1996).
If students do recognize deficits in their understanding, they have difficulty in expressing explicitly what they do not understand (Manlove & Lazonder, 2004).
One consequence is that students tend to overestimate their level of success, which may result in "misplaced optimism, substantial understudying, and, ultimately, low test scores" (Zimmerman, 1998, p. 9).
The research article is available by clicking here.
Final Thoughts
This research, and other research I have studied over the years, shows that we CANNOT ALWAYS TRUST THAT OUR LEARNERS WILL KNOW HOW TO LEARN. We as instructional designers have to design learning environments that support learners in learning. We need to know the kinds of learning situations where our learners are likely to succeed and those where they are likely to fail without additional scaffolding.
The research also shows, more specifically, that inquiry-based simulation environments can be powerful learning tools, but ONLY if we provide the learners with guidance and/or scaffolding that enables them to be successful. Certainly, a few may succeed without support, but most will act suboptimally.
We have a responsibility to help our learners. We can't always put it on them...
Will Thalheimer
.
Blog
.
<span class='date ' tip=''><i class='icon-time'></i> Jul 15, 2015 01:55pm</span>
|
As I’ve mentioned before—it is not only important for educators to encourage students to push the technological envelope, but it’s equally essential that we teach them to navigate the digital world in a responsible manner. But in order to do so, we must first become familiar with copyright rules and fair use guidelines ourselves—a task that can seem very daunting and frustrating at times.
If you’ve ever used online materials for teaching and learning, then I’m sure you have wondered one or more of the following: Do I need permission to use this image? Can I share this video on my classroom web site? Would it be best for me and/or my students to create our own media?
Well, it may not answer all of your burning questions, but the copyright flowchart shown below will surely help with some of the confusion surrounding this very relevant and significant topic.
Thanks to Silvia Tolisano, author of the Langwitches blog, for designing and sharing this useful resource! For more information about this infographic, check out the original blog post here.
Classroom Connection:
Unfortunately, some educators and students (as well as people in general) have the tendency to ignore the fact that media is regulated and requires compliance with copyright rules and fair use guidelines. Both teachers and students need to learn that just because media is accessible, downloadable, and free does not necessarily mean it’s acceptable to use or reuse without restrictions.
Fortunately, this very cool flowchart can serve as a great resource for properly locating and/or using media obtained from the Internet as well as creating your own.
Edutech for Teachers team
.
Blog
.
<span class='date ' tip=''><i class='icon-time'></i> Jul 15, 2015 01:55pm</span>
|
As a learning consultant, I've been called into workplaces to do work-learning audits specifically focused on safety. Unfortunately, what I've seen too often are poor safety-learning practices. People often talk a good game of safety, but their practices are just not effective. Let me give you one example. I was at a manufacturing plant and was told that all team meetings talked about safety. However, what I saw at actual team meetings was a perfunctory exhalation about safety that was likely to have zero effect on actual safety outcomes. Seriously, many team leaders would say something pithy like "10 fingers, 10 toes" and that would be it!!
To be truly effective, safety messages have to follow the principles of all good learning design. Specifically, safety messages have to be context-based. They have to refer to actual workplace situations, and get employees to visualize and anticipate safety-critical situations and the actions that are needed in those situations. Safety messages also have to prompt employees to retrieve these situation-action links and do that in a manner that is repeated in various ways over time.
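To make the "repeated in various ways over time" idea concrete, here is a toy sketch of how one might schedule spaced retrieval prompts for a safety situation-action link. The function name, the expanding intervals, and the dates are all my own illustrative assumptions, not a prescription from the research:

```python
from datetime import date, timedelta

def spaced_prompt_dates(first_briefing, intervals_days=(2, 7, 21, 60)):
    """Return follow-up dates for re-prompting a safety situation-action link.

    Each prompt should ask the employee to retrieve the correct action for a
    specific workplace situation; the expanding gaps between prompts are meant
    to support long-term remembering rather than one-time exposure.
    """
    prompts, current = [], first_briefing
    for gap in intervals_days:
        current = current + timedelta(days=gap)
        prompts.append(current)
    return prompts

# A crane-lifting safety briefing held June 1 would be followed by
# retrieval prompts on Jun 3, Jun 10, Jul 1, and Aug 30.
schedule = spaced_prompt_dates(date(2015, 6, 1))
```

The exact intervals matter less than the structure: each prompt is tied to a concrete situation and spaced further out than the last, rather than a one-off "10 fingers, 10 toes" slogan.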
Recently, while teaching a workshop, one of the participants told a great story about how General Electric has built a set of cultural expectations that propel safety. The author--who wants to remain anonymous--wrote up the following overview of what he/she observed at GE.
I have had the pleasure to conduct training for the field service organization at GE. One key aspect of the field service organization is safety. A seemingly simple task of lifting a heavy object with a crane can easily result in a fatality if a shift in the chain causes the object to swing out of control. During my work I was impressed with the relentless focus on safety, which was not just in words, but in action. I thought it would be useful to share an example of how safety is built into their culture.
Each day of a training session, or any meeting for that matter, always started with a safety moment. This discussion focused on the potential safety issues that could come up, and the precautions that need to be followed. I would start the training by having the hotel facility manager come in and cover the emergency procedures. If I failed to start any training session in this manner, a participant would, without exception, come to me during the first break indicating that we had forgotten the safety briefing. Unlike other organizations, where I would be asked to show a safety video and people would count sheep until it ended, this safety briefing was seen as important by all the participants.
At the start of each training day, and after lunch, a participant would be assigned to share a safety moment in their work that enabled someone to avoid a potential injury. There was never a problem getting participants to accept responsibility for conducting one of these safety moments. In fact, after sharing their experience, there was always a round of applause from the other participants. This consistent practice, and positive reception by individuals of all levels helps to foster a strong safety culture within the organization.
In talking with the author of this observation, I was amazed at how deeply ingrained a culture of safety was in this GE environment. From this example, here are lessons learned--many of which will be relevant even to those who are not dealing with safety, but who are focused on performance-improvement in general.
They focused on specific safety issues and situations.
They focused on safety ubiquitously, not just in training and not just when it was "safety time."
People bought into the importance of safety--they didn't just go through the motions.
There was an expectation that safety discussions be scheduled into everything.
Many people wanted to volunteer to lead safety discussions--not just people designated as safety officers.
People really appreciated the safety discussions--and they showed their appreciation.
Management was not the only driver of safety.
Safety messages were repeated, and spaced over time.
Special thanks to the anonymous author and to GE for demonstrating that safety can be inculcated into workplace practice.
Will Thalheimer
.
Blog
.
<span class='date ' tip=''><i class='icon-time'></i> Jul 15, 2015 01:55pm</span>
|
MOOCs don't have to suck. The 4% to 10% completion rates may be the most obvious problem, but too many MOOCs simply don't use good learning design. They don't give learners enough realistic practice, they don't set work in realistic contexts, and they don't space repetitions over time.
But after reading this article from Thomas Friedman in the New York Times, you will see that there is one thing that MOOCs do really well: they get learning content to learners.
Really, go ahead. Read the article...
Why is "Exposure" one of the Decisive Dozen learning factors?
Many people have wondered why I included "Exposure" as one of the most important learning factors. Why would exposing learners to learning content rank as so important? Friedman's article makes it clear in one example, but there are billions of learners just waiting for the advantage of learning.
I got the idea of the importance of exposing learners to valid content by noticing in many a scientific experiment that learners in the control group often improved tremendously--even though they were almost always outclassed by those who were in the treatment groups.
By formalizing Exposure as one of the top 12 learning factors, we send the message that while learning design matters, giving learners valid content probably matters more.
And yes, that last sentence is as epically important as it sounds...
It also should give us learning experts a big dose of humility...
MOOCs will get better...
Most MOOCs aren't very well designed, but over time, they'll get better.
Will Thalheimer
.
Blog
.
<span class='date ' tip=''><i class='icon-time'></i> Jul 15, 2015 01:54pm</span>
|
10 Awesome Totara LMS Features Infographic
Totara LMS is an open source distribution of Moodle developed for the corporate and government market. Putting emphasis on the person not the course, Totara LMS allows you to better manage performance right across your organisation. This is a solution for aligning talent development with organisational strategies to meet the challenge of moving towards a performance management culture. The 10 Awesome Totara LMS Features Infographic showcases the top 10 features of Totara Learning Management System.
1. Learning and Performance Management
Give your learners constructive feedback so that they can improve the quality of their work. Company goals and/or individual goals are defined and may be assigned to learners based on the position they hold, the organisation they work in or other factors.
Personal and organisational goals feed into the learner’s development plan, where both can be reviewed via the online appraisal or 360 tools.
2. Competency Management
Assign courses to competencies, and competencies to a particular job role or part of the organisation. A user’s learning plan can then automatically pull in all competencies and courses assigned to their organisation and job role. The Competency hierarchy allows you to set up one or multiple frameworks within Competency structures. These then define the skills, knowledge and behaviour used to assess staff performance.
3. Audiences
Totara LMS gives you more than just position and organisational hierarchies. With the Audiences feature, you can create and target custom groups via dynamic rules and events.
Define, edit, and manage user groups using rules defined in other organisational data sources. Define audiences based on key events e.g. logins or completions. You can even define audiences based on other audiences!
4. Program Management
The Totara Program Management feature allows you to create learning paths by defining sets of courses and their dependencies. These act as stepping stones for your learners and give order and flow to course completion. Program Management also includes the ability to set up recurring courses, ensuring compliance and removing manual admin tasks. This feature also links with certification.
5. Certification
Site administrators create a new certification with two paths: the original certification path and a recurring recertification path. The original path can be automatically reused as the recertification path, or a separate path can be defined. To ensure compliance, automatic alerts and emails can also be configured to inform learners about refresher training.
6. Reporting and Dashboards
Totara LMS allows you to manage your reporting requirements with the inbuilt Report Builder, a powerful and flexible tool that lets you build custom reports on users, course progress and competency achievement across your business.
Graphical reports can be built by the system administrator, with access controlled by system role and by the user’s place in the organisation.
7. Visibility Manager
Visibility Manager allows you to limit the catalogue so only specific learners can see courses, programs and certifications applicable to them based upon the audiences they are enrolled in.
8. Integration with HR and other business systems
To better manage and develop your talent, it’s essential your LMS connects with key business information systems. Maximize your return on investment and minimize your data entry tasks with HR Sync.
Totara LMS also integrates with enterprise Single Sign-On (SSO) systems and Shibboleth.
9. Highly customizable
Customising the look and feel of your Totara LMS is easy, and it can be branded exactly the way you need it to look. Improve learner engagement by having your LMS look like an extension of your website, other internal systems or intranet sites. Learning Pool also offers a fully responsive theme, meaning learners can access the system on any device, any time, any place.
10. Growing roadmap of enhancements
Totara LMS released version 2.7 in March 2015. With a growing roadmap, Totara typically releases once a year, adding more great features each time, with a heavy focus on usability and accessibility. As a Platinum Partner, Learning Pool offers free upgrades to our Totara LMS customers. We also work in partnership with Totara LMS to develop further enhancements and features.
That’s not all…
Learning Pool offers Totara LMS as a complete solution with total support for admins and end users throughout the length of your subscription. There are also four innovative add-ons exclusively available for Learning Pool customers:
Classroom Connect: Synchronise classroom training with online training and give your users full control of their face-to-face training as well as reducing the administration around organising classroom events.
Encore: Encore is our mobile learning app that delivers content to learners on a time delay after they take the initial learning. Based on the concept of spaced practice, Encore significantly improves learner retention.
Shopping Basket: Shopping Basket seamlessly integrates with Totara LMS giving learners a self-contained experience using valid payment gateways like PayPal or Sage Pay.
Knowledge Bank: Deliver policies, best practice and frequently asked questions to thousands of users, reducing the demand on HR teams.
Via: www.learningpool.com
The post 10 Awesome Totara LMS Features Infographic appeared first on e-Learning Infographics.
eLearning Infographics
.
Blog
.
<span class='date ' tip=''><i class='icon-time'></i> Jul 15, 2015 01:54pm</span>
|
As Quincy Jones once remarked, "I’ve always thought that a big laugh is a really loud noise from the soul saying, 'Ain’t that the truth.'"
That said, Edu-fun Friday is a series devoted to adding some humor to the lives of teachers who visit this blog. Even though it’s summer, there’s still nothing better than ending the week on a positive note! Plus, do we have some of the best topics to provide us with some comic relief or what?
I don’t know about you, but I love summer. And not having to write lesson plans is very liberating!
Thanks to the UClass blog for sharing this meme! BTW—UClass is a really useful lesson plan exchange and resource center for educators. Check it out here!
Edutech for Teachers team
.
Blog
.
<span class='date ' tip=''><i class='icon-time'></i> Jul 15, 2015 01:54pm</span>
|