
. . . shouldn’t stay in Vegas!! I just returned from the SHRM Annual Conference (SHRM15) in Las Vegas, Nevada. It was a great event from start to finish. I enjoyed the Bloggers Lounge, the Smart Stage, the SHRM Store, the vendor hall, the keynote speakers, Jennifer Hudson, the concurrent sessions, being on TChat with Kevin Grossman, Callie Zipple and Chanel Jackson, the No Kid Hungry Poetry Slam and especially meeting the great attendees throughout the week!! Tomorrow, I return to work and the great folks I get to be with most of the time...
SHRM   .   Blog   .   Jul 27, 2015 11:50am
I’m Mary Crave. Among the many roles I play as an extension education specialist in Wisconsin, Africa, and elsewhere are those of facilitator of and advocate for participatory evaluation practices that engage vulnerable persons - those who often don’t have a voice in their community. At the University of Wisconsin I focus on programs that empower youth and women in international agriculture and also teach students who plan to work in community non-profit organizations. It wasn’t until the past few years that I consistently and consciously applied a social justice lens to my work in each of these roles. It was enlightening to see at MESI how others frame their work in social justice.

Hot Tip: Is social justice a goal of the programs you evaluate? While we may not think so at first, it may be. Is the goal of a program to make a change? Improve the common good? Transform people or an organization? Provide access to resources? Do the programs consider who is not involved? Social justice is about power - who has it, who doesn’t. Therefore, as evaluators we need to understand our own sense of power and privilege as we enter into a community or program.

Rad Resource: Back Pocket Questions. One intriguing breakout session at MESI was led by Leah Hakkola, University of Minnesota. Hakkola uses what she calls "Back Pocket Questions" for culturally sensitive evaluation. A back pocket question is: a strategy we can use to examine privilege and its systemic consequences; a question we can ask ourselves to better understand dominant cultural norms, power dynamics, discrimination and expectations; and a tool we can use as an entry point into the difficult conversations we may need to have as evaluators. Some examples of back pocket questions from Hakkola: Who would be disappointed if they were left out of the evaluation? The design? Evaluation funding? Program participation? How much risk is associated with each group participating in this evaluation?
What power am I willing to concede to my stakeholders? How might my actions be supporting systemic oppression or discrimination? How might my actions be colluding with my privilege in this evaluation? What are the unique consequences of my decisions with regard to each stakeholder group? Whose stories and what methods am I privileging in this context? These were adapted from an AEA 2014 session, "How can we identify, talk effectively about and address privilege and power in the practice of evaluation?" facilitated by Andrea Anderson-Hamilton and Sally Leiderman. What questions are in your back pocket? The American Evaluation Association is celebrating MESI Spring Training Week. The contributions all this week to aea365 come from evaluators who presented at or attended the Minnesota Evaluation Studies Institute Spring Training. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
AEA365   .   Blog   .   Jul 27, 2015 11:50am
  Several years ago, long-time family and friends got together over the holidays for a good meal.  Grown children were back in the nest, and we enjoyed the repast reminiscing about days gone by. The conversation, however, drifted into a discussion regarding what we remember as the most upsetting or painful interaction we had with our parent/child.  For example, one son mentioned his father yelling at him across a blackjack table, "If you're not going to play RIGHT, don't play at all." What emerged was that every child had a painful...
SHRM   .   Blog   .   Jul 27, 2015 11:49am
My name is Charmagne Campbell-Patton and I am an independent evaluator. This was my first year at MESI, and I really appreciated the focus on social justice. One of the main themes that ran across most of the presentations was the power that comes with the role of evaluator and our responsibility to use that power to promote social justice. Here are just a few of the salient points I took away with me:

Hot Tip: Speak truth to power. In his workshop with Nora Murphy on wicked problems in Developmental Evaluation, Michael Patton likened the evaluator to the Medieval Jester, whose role was to deliver bad news to the Monarch in a way that could be heard. To drive home the point, Michael donned a jester mask, which he wears when beginning a new evaluation assignment to lighten the mood. In addition to providing some levity to tense situations, the mask is a good reminder that evaluators have a responsibility to tell the truth, especially when no one else can or will and in particular, to those who may not want to hear it.

Cool Trick: Empower others to speak for themselves. Michael’s presentation certainly made an impression, but he is a white man with a PhD and 45 years of experience as an evaluator, all of which make it much easier for him to speak truth to power than for some. So how do you get people to listen if you’re not in a position of power? One option is to empower those who are traditionally marginalized to speak for themselves by providing them with data that gives them a voice. In his keynote, Eric Moore, Director of Research and Evaluation at Minneapolis Public Schools, emphasized the importance of paying attention to the theoretical frameworks that inform evaluation work because the design of an evaluation determines who has a voice and who is at the table to make meaning of the data.

Hot Tip: Hold yourself accountable.
When asked what he did when those in power didn’t listen to him or those to whom he was trying to give voice, MQP shared that sometimes he had to walk away. Indeed, in his closing words, Eric Moore offered this piece of advice: Be accountable on a daily basis to yourself and why you are doing the work.

Lesson Learned: Evaluation in a social justice context takes courage. Perhaps that should be added to Jean King and Laurie Stevahn’s essential competencies for evaluators?

Rad Resources: Classic analyses:
Hill, A. (1997). Speaking truth to power. New York: Doubleday.
Hoppe, R. (1999). Policy analysis, science and politics: from ‘speaking truth to power’ to ‘making sense together.’ Science and Public Policy, 26(3): 201-210.
Spiegel, A., Watson, V., & Wilkinson, P. (1999). Speaking truth to power? The Anthropology of Power.
Wildavsky, A.B. (1979). Speaking truth to power: The art and craft of policy analysis. New Brunswick, NJ: Transaction Books.

The American Evaluation Association is celebrating MESI Spring Training Week. The contributions all this week to aea365 come from evaluators who presented at or attended the Minnesota Evaluation Studies Institute Spring Training.
AEA365   .   Blog   .   Jul 27, 2015 11:49am
Hey aea365ers, Sheila B. Robinson here, aea365’s Lead Curator and sometimes Saturday contributor with an update on what YOU are reading. Each year, we look at the top posts from the past year. We have to wait a bit since some posts get significant readership weeks or even months later!

Lessons Learned: Here are the top posts of 2014, as measured by Feedburner:

1. Sheila B. Robinson on Delightful Diagrams from Design Diva Duarte (April 12, 2014)
2. Susan Kistler on Innovative Reporting Part I: The Data Diva’s Chocolate Box (April 3, 2014)
3. Jordan Slice on LucidChart for Freeflow Chart Construction (June 8, 2014)
4. DVR TIG Week: Ann K. Emery and Stephanie Evergreen on the Data Visualization Checklist (June 27, 2014)
5. Elissa Schloesser on How to Make Your Digital PDF Report Interactive and Accessible (January 28, 2014)
6. Jayne Corso on Creating Images That Will Make Your Presentations Stand Out (December 6, 2014)
7. Molly Ryan on Using Icon Array to Visualize Data (August 13, 2014)
8. William Faulkner and João Martinho on Visual Communication with a Poster: A Quick, Dirty, and Biased Case Study (November 17, 2014)
9. Literature Search Strategy Week: Molly Higgins on the Best Databases for Everything (November 3, 2014)
10. Clare Strawn on Evaluating Communities of Practice (October 31, 2014)

Hot Tip: Definitely some great authors and great reading in that group. And hey, I’m in there too (it was a surprise to me!). It’s clear that our readers like to learn about new resources and innovative ways to use them in evaluation work. All of those post links are still live of course, so seize the moment and catch up on anything that you may have missed!

Cool Trick: Want a chance to see your name in this list a year from now? It could happen! Send me a note of interest at aea365@eval.org. I’d love to hear from you. Congratulations to our Top 10 authors and many thanks to them for their wonderful contributions.
*A very special thank you to Susan Kistler, AEA’s Executive Director Emeritus, for helping extract the data for this post!
AEA365   .   Blog   .   Jul 27, 2015 11:49am
On July 8, @SHRMnextchat chatted with John Dooney (@SHRMAnalytics) about "HR's Roller Coaster Ride From the Great Recession." In case you missed it, here are all the great tweets from the chat:     [View the story "#Nextchat RECAP: HR's Rollercoaster Ride From the Great Recession" on Storify]  ...
SHRM   .   Blog   .   Jul 27, 2015 11:49am
Hi! We are Nathan Anderson, Data Management Specialist for Mid-Dakota Education Cooperative, and Amy Engelhard, Data Steward for the state of North Dakota. We strive to help PK-12 educators move from a "data frustration" mindset to a "data utopia" mindset. More specifically, we collaborate with teachers and administrators to optimize the use of student achievement data for purposes of informing individual student instruction, identifying strengths and weaknesses in a classroom, and illuminating trends and gaps across a school district. We often embed the Statewide Longitudinal Data System (SLDS) Data Use Standards and the A+ Inquiry framework into our presentations and work sessions as strategies for teaching and facilitating effective data use processes. The SLDS Data Use Standards resource introduces essential knowledge, skills, and professional behaviors. A+ Inquiry is an effective data use framework centering on the premise of awareness, which connects a wheel-like series of stages that make up a thorough inquiry cycle. Stages include absorbing the correct context; asking an essential question; accumulating, accessing, and analyzing the right data; answering the question; announcing the findings; and applying decisions and actions.

Graphic adapted from "Disciplined inquiry: Using the A+ Inquiry framework as a tool for eliminating data hoarding, mindless decision-making, and other barriers to effective ESA programming," by N. C. Anderson, M. R. Brockel, and T. E. Kana, 2014, Perspectives: A Journal of Research and Opinion About Educational Service Agencies, 20(3).

Lesson Learned: Many teachers and administrators have access to seemingly endless amounts of data they don’t know how to use. When educators access data after first asking strong questions, they are better equipped to find valuable answers and put the data to effective use.
Accessing data without first asking a strong question puts many educators at risk of wasting time and other resources on a purposeless data pursuit. Frequently, this is where the "data frustration" mindset takes hold.

Hot Tip: Use A+ Inquiry as a framework to examine how the SLDS Data Use Standards can look in action and to visually put yourself in the "data utopia" mindset.

Hot Tip: Align your inquiry processes with the SLDS Data Use Standards and A+ Inquiry to reduce mindless decision making, data hoarding, and frustrations affiliated with using data.

Rad Resource: Take a look at the SLDS Data Use Standards resource. The SLDS Data Use Standards Workgroup is in the process of creating new enhancements to the original resource.

Rad Resource: A+ Inquiry was introduced in an article (scroll to No. 4) explaining how an Education Service Agency could apply the framework. We have now adapted the framework for use with student achievement data.

Rad Resource: Check out these A+ Inquiry resources, including a one-page handout, a presentation with A+ Inquiry slides, and effective data use scenarios written through an A+ Inquiry lens.

The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PK12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members.
AEA365   .   Blog   .   Jul 27, 2015 11:49am
Joe Gerstandt is one of a group of people you can find here in the HR/Social Media space who call Omaha, NE home. Joe is a frequent keynote speaker on general HR topics and also in his own area of expertise, Diversity and Inclusion (D & I). Joe is also one half of the duo called Talent Anarchy. Under that banner, Joe works with his partner Jason Lauritsen. Together they speak, host conference events and write books. Joe has been working the speaking circuit for more than...
SHRM   .   Blog   .   Jul 27, 2015 11:49am
We are Krista Collins and Eugenia Gwynn, experienced evaluation consultants for Early Childhood Education (ECE) programs in Atlanta, GA. Over the past decade, ECE programs have become a major funding priority, with topics such as family engagement, pre-kindergarten, school transitions, and quality rated standards receiving a lot of attention. We have worked closely with programs taking the lead to define these innovative practices, and have supported their efforts to develop, measure, and validate best practices in ECE. We recently collaborated on an evaluation of a local ECE community of practice (COP) convened to develop a family engagement framework to guide quality rated standards and practices throughout the state. To support these efforts, our evaluation focused on understanding how the COP stakeholders defined family engagement practices. With such a wide variety in how stakeholders were invested in family engagement, we had to capture this information using many different techniques. Below are a few successful strategies we employed in an effort to identify a singular definition of family engagement that summarizes the multiple perspectives of an ECE COP.

Lesson Learned: Learn the Language. Reviewing the existing literature is a great place to start, but learning how your stakeholders talk about the topic and identifying the resources they use can provide valuable information about the local context. Make time to become familiar with the organizations and policies that govern how stakeholders operate. Speaking the same language will not only help stakeholders participate in the evaluation more effectively, but it will also help you understand the data and form conclusions more efficiently!

Hot Tip: Just Ask! The most informative data we received came from a simple brainstorming session with parents. We invited parents to tell us what they did to be engaged in their child’s education, and what centers did to encourage their involvement.
Instead of conducting a formal focus group with a prepared plan and list of questions, we facilitated a brainstorming session and had parents work together to define what family engagement in their community looks like. The best part - we were able to quickly see how family engagement practices differ across communities!

Rad Resource: The Family Involvement Network of Educators (FINE) is comprised of educators, practitioners, policymakers, and researchers dedicated to strengthening family-school-community partnerships. The FINE Newsletter, which is published monthly by the Harvard Family Research Project (HFRP), shares the newest and best family engagement research and resources from HFRP and other leaders in the field. The newsletter provides useful information about family engagement, including research reports, teaching tools, and training materials.

The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PK12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members.
AEA365   .   Blog   .   Jul 27, 2015 11:49am
Hiring conditions are perhaps in their strongest stretch since the Great Recession ended six years ago. Job openings have reached a level not seen in nearly 15 years, and more workers are quitting their current positions in search of something better, thus creating churn and opportunities for other job seekers. Problems do remain: One-third of HR professionals in the manufacturing sector (33.7 percent) said they experienced increased recruiting difficulty in June, as did 27.5 percent of service-sector HR professionals, according to the Society for Human Resource Management’s Leading Indicators of National Employment (LINE) report. June marked the 16th straight month...
SHRM   .   Blog   .   Jul 27, 2015 11:48am
Greetings! We are Kelly Murphy, Program Chair of the PreK-12 TIG, and Selma Caal, Research Scientists from Child Trends, a non-profit, nonpartisan research center that works to improve the lives and prospects of children and youth. As developmental psychologists and program evaluators, we spend a lot of time thinking about how best to measure program impacts on youth and strive to develop measures that are rigorous, developmentally appropriate and feasible for program contexts. While we are pleased to see a shift from a narrow focus on academic performance and problem reduction to a broader "whole child" perspective that encompasses outcomes such as social-emotional learning (SEL) and "soft" skills, we realize this shift has brought new challenges for educational evaluators:

Defining SEL outcomes, and distinguishing them from other positive youth development outcomes;
Identifying which SEL outcomes are relevant to the population served; and
Identifying reliable, valid and brief measures of SEL outcomes.

Today, we’d like to share some rad resources you can use to assess SEL outcomes in your own evaluations. What is social-emotional learning? Broadly, SEL encompasses a number of skills that promote positive relationships, ethical and conscientious work, and productivity. Child Trends recently identified five key SEL skills that help students excel in school over time: self-control, persistence, mastery orientation, academic self-efficacy, and social competence. If you’re interested in learning more about SEL, check out these Rad Resources: The newly released Handbook of SEL Research and Practice; The Collaborative for Academic, Social and Emotional Learning (CASEL) has numerous resources on defining and measuring SEL; and StriveTogether has an overview of social and emotional learning competencies and has published a report that provides a detailed review of SEL outcomes and their relationship to academic achievement.
How do you measure social-emotional learning outcomes? Rad Resources: If you’re interested in finding measures of SEL that are relevant across various developmental periods:

Child Trends, in collaboration with the Tauck Foundation, has published a report on measuring elementary school students’ social and emotional skills, which includes teacher-report and student-report measures of social and emotional skills that are free to use;
Additionally, Child Trends, in collaboration with the Templeton Foundation, published a book on measures of flourishing children, which includes free measures that have been tested rigorously with adolescents. These measures can also be accessed online;
StriveTogether has also published reports that include a summary and a compendium of SEL measures; and
Performwell, a collaborative effort initiated by Urban Institute, Child Trends, and Social Solutions, is a searchable database of measures.

The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PK12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members.
AEA365   .   Blog   .   Jul 27, 2015 11:48am
Hello! We are Sheila A. Arens and Mariana Enriquez. Evaluators frequently propose psychometrically sound instruments or low-inference observation protocols used by trained observers for data collection. Such instruments and methods are sometimes the most appropriate to address a question of interest. However, we believe it is important for evaluators to stretch beyond traditional data collection methods when it makes sense to do so and when traditional, quantitative approaches will not be able to reach the target population, capture their experiences, or produce meaningful, reliable data. It is important to give careful consideration to how and whether respondents will engage in data collection, and even when traditional methods may be desirable, to think about whether there are more appropriate ways of delivering the data collection request to the target population. In other words, data collection cannot be a one-size-fits-all approach. We frequently work with populations for whom completing surveys or participating in focus groups would be difficult. In fact, these data collection methods may even further disenfranchise these individuals. For instance, some may feel immense discomfort discussing any issues in a group, or may quickly identify whose lead they need to follow in the group of participants to remain in a "safe place" … yet, the evaluator may persist in using focus groups as the means of data collection.

Lesson Learned: Recognizing that data collection cannot be a one-size-fits-all approach, we have sought alternative ways to engage individuals. We must be nimble and creative in our data collection approaches. And, although alternative methods may not work for everyone, how the target population is informed and engaged may make a difference in their participation.
Including "participants" in the process to decide the best ways to reach them—and being humble about what we know, what works, what’s best, etc.—not only seems prudent but also seems like a culturally competent approach. Alternative data collection approaches might include approaches that are similar to focus groups but provide additional opportunities for sharing not circumscribed by a single facilitator or necessarily hampered by group dynamics, the use of photographs or images produced and annotated by participants (photovoice, for instance: http://steps-centre.org/methods/pathways-methods/vignettes/photovoice/ ), or online social networks such as Twitter or Google Hangouts.

A word of caution: Be creative, but be sure that your data collection method is appropriate for the evaluation questions of interest, is sensitive to participants’ needs and existing resources, and that the evaluation budget can support the additional burden of a potentially more time-consuming analytic method.

Rad Resources:
PhotoVoice, an app for iPhones (respondents can upload images and record their thoughts about the image)
WorldCafe design principles

The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PK12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members.
AEA365   .   Blog   .   Jul 27, 2015 11:48am
Government action may have produced a roadblock on the trip to the future of work. In the past several years there has been a great deal written about the changing world of work. Tom Peters talked about it when he discussed "You, Inc."; Dan Pink wrote about it in Drive; and I have written about it numerous times in my Future Friday posts. We have seen the reference to the "gig" economy or the "on-demand" economy and the increasing use of independent contractors. We have seen numerous discussions of, and...
SHRM   .   Blog   .   Jul 27, 2015 11:48am
Hi! We are Courtney Howell and Shelly Engelman, evaluators at SageFox Consulting Group in Atlanta, GA. We are social psychologists who happen to be education evaluators. As such, we regularly employ our social psychology backgrounds to inform our work.

Image credit: reihayashi via Flickr

Persuasion is part of every aspect of life. This reality is no different for evaluation. Persuasion also plays a role in evaluation reporting, ensuring that clients, users, and funders will engage with your report/proposal in the way you intended, leading to a positive outcome that benefits the program. Recently, we identified three social psychological concepts that can be readily employed to enhance the ability of reports to communicate findings. Here are a few effective techniques that we have adopted in our evaluation reporting practices:

Primacy and Recency Effect: People tend to recall the first and last things in a series best and the middle items worst. A Recency Effect happens when people encounter unsalient, non-controversial, uninteresting, and unfamiliar material. Salient, interesting, controversial material produces a Primacy Effect. Lesson Learned: When composing a report, we want to put the most interesting and familiar things first and the non-controversial and unfamiliar things last. Rad Resource: Structuring and Ordering Persuasive Messages

Mere Exposure Effect: People tend to develop a preference for things simply because they are familiar with them. Mere Exposure Effect enhances perceptual fluency, which is the ease with which new ideas can be processed and internalized. Remember, familiarity breeds liking. Lesson Learned: Clients may be more likely to accept feedback if it is repeatedly highlighted across reports. Rad Resource: Attitudinal Effects of Mere Exposure

Confirmation Bias: The tendency to interpret information in a way that confirms one’s beliefs or hypotheses. In fact, people tend to stick to a position even after the evidence has shown it was false.
Lesson Learned: To avoid confirmation bias, call attention to information that may go against expectations by using a visual marker (like an exclamation mark) to point to messages that are inconsistent with the rest of the report. Play devil’s advocate to suggest alternative ways of interpreting the data/findings. Rad Resource: Motivated Numeracy and Enlightened Self-Government

Rad Resource: To further explore the intersection of social psychology and evaluation, see Melvin Mark, Stewart Donaldson and Bernadette Campbell’s text on Social Psychology and Evaluation.

The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PK12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members.
AEA365   .   Blog   .   Jul 27, 2015 11:48am
This article was originally posted on Fox Business.  Last week I attended the annual Society for Human Resource Management (SHRM) conference in Las Vegas, the largest gathering of human resources professionals in the world. While there I had the opportunity to sit down with Marcus Buckingham, author of the newly released Stand Out 2.0. Buckingham spoke about how leaders at every level can develop their employees by focusing on strengths.     Build Self-awareness One of the greatest challenges in our workforce is that "we have people coming into the workplace who are fundamentally inarticulate...
SHRM   .   Blog   .   Jul 27, 2015 11:47am
Greetings! I’m Sara Vaca, independent consultant at EvalQuality.com and Creative Advisor of this blog. Creativity is another powerful tool evaluators can potentially use at certain points of the evaluation process to improve engagement or to solve eventual dead-ends or conflicts. So I have started posting about this subject and today I’m going to share my ideas about how to foster creativity. Let’s run this little improvised test (that is almost a rubric!): Do you consider yourself a creative person?

a. No, I’m a serious evaluator/researcher and creativity has nothing to do with my job.
b. Not really, I try to be creative, but nothing "happens."
c. It is not my major virtue but I have some creative moments here and there.
d. Yes! I’m always overflowing with new ideas of how to do things.

Ok, if your answer is (d), don’t read on. You don’t need any tips for further developing your creativity. If your answer was (c), your creativity is already released, but you could encourage it to make an appearance more often. In that case, or if you don’t consider yourself a creative person (b) but would like to be one, here are some ideas:

Let your mind fly free. Don’t censor any crazy thought that may come out, no matter how "crazy", "undoable", or "impossible" it may seem. In fact, at the beginning, you should "force" yourself to go wild and think of the most absurd, bizarre things to set your creativity free.

Often use the questions "Why not?" and "What if?" as a way to challenge what you know, what happens, what you think you know, or why it happens. Always within rational limits (before you go too far and become annoying), challenge everything.

Get inspired by others: check for related material that can be inspirational. Of course the internet is a great place for researching.

Talk to others: discussing things out loud and hearing others’ points of view often helps you get out of a blockage.
Find something repetitive to do during which you can meditate on everything, and do it periodically (daily if possible). Observe where you are and what you are doing when you have an idea. Often these are daily routines like walking, showering, or driving. Finally, if your answer was (a), you may be perfectly right, but being creative at some moments may mean doing work that is more enjoyable and fun, if that appeals to you. Do you have other tips for being creative? Want to share the places or moments where you often come up with new ideas? Please comment or write to Sara.vaca@EvalQuality.com. Remember: the crazier, the better! Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
AEA365   .   Blog   .   Jul 27, 2015 11:47am
  Culture Savvy: Working and Collaborating Across the Globe Globalization is a business reality. As companies try to succeed in the international arena, questions of culture, learned behaviors, and implicit (mis)understandings are part of the formula. Typical "Culture 101" training consists of a list of do’s and don’ts that tend to reinforce negative stereotypes—and increase anxiety about overseas collaboration. This book presents a new approach that addresses culture directly and is based on research from the field of neuroscience. Recognizing why we may feel anxiety and discomfort in different cultures...
SHRM   .   Blog   .   Jul 27, 2015 11:47am
I’m James W. Altschuld. When Ryan Watkins asked me to write an NA entry for this year’s aea365 sponsored week, he playfully dubbed me the ‘Patron Saint’ of Needs Assessment. While his perception is not accurate, allow me in that spirit to offer some commandments for Needs Assessors. Thou:
1. art a facilitator of a process and never a savior, which thou wilt never be;
2. shalt always explore the organization or group thou art working with for existing information before collecting any new data;
3. shalt have an advisory or guiding group built into the needs assessment process, for eventually its members will probably be part of the needs resolution strategy;
4. must assume that many voices will enter the fray, increasing its difficulty;
5. must expect frustration when doing needs assessments (art thou surprised? It comes with the territory, no elaboration is required, and it fits with commandments 3 and 4);
6. wilt consult the literature for ideas and input, as painful as that might be;
7. shalt honor the needs assessment gurus (a nutty commandment if there ever was one);
8. wilt have to allot more time and lucre for the process than thou originally planned (so what’s new?);
9. must recognize that thou art a facilitator or a catalyst, but that others, not thou, have responsibility for solutions and their implementation;
10. wilt absolutely adhere to the prior 9 commandments.
Author’s Note: After reading the commandments it should be apparent that Dr. Watkins was completely incorrect in applying the Patron Saint designation to this author. Nothing could be further from the truth. The American Evaluation Association is celebrating Needs Assessment (NA) TIG Week with our colleagues in the Needs Assessment Topical Interest Group. The contributions all this week to aea365 come from our NA TIG members.
AEA365   .   Blog   .   Jul 27, 2015 11:47am
SHRM, in collaboration with U.S. Travel Association, conducted a survey to examine HR professionals’ opinions about the importance of vacation. A large majority of HR professionals believe that when employees take vacation, it makes a positive impact on performance, morale, wellness, culture, productivity and retention. This study also looks at the amount of vacation offered to employees, rollover policies, and the number of unused vacation days.    Shrm us...
SHRM   .   Blog   .   Jul 27, 2015 11:46am
Ever thought about doing research on needs assessment (or evaluation, for that matter)? My name is Ryan Watkins, and what follows is a short description of a basic framework that can help you consider the types of research that could be useful. Based on Briggs (1982) and Driscoll (1995), the following research paradigms can offer choices for your research design:
Experimental and Quasi-Experimental: Experimental research designs can provide educational researchers with the "most effective means of establishing causal influences on a phenomenon of interest" (Driscoll, 1995, p. 323). While quasi-experimental research designs may be less effective in providing evidence of causal relationships, they offer pragmatic alternatives when meeting the requirements of experimental designs is not practical.
Meta-Analysis: The meta-analysis research design is "a widely used method for combining results from different quantitative research studies" (p. 71) on the same phenomenon (Gall, Gall, and Borg, 1999).
Case Study/Ethnography: According to Trochim (2001), "A case study is an intensive study of a specific individual or specific context… There is no single way to conduct a case study, and a combination of methods (such as unstructured interviewing and direct observation) can be used" (p. 161).
Technology Development and Evaluation: As a field of research focused on useful application, needs assessment research should be active in the development, application, evaluation, and continuous improvement of assessments using technologies (e.g., online surveys).
Cost Analysis: Decision-makers’ requirements for data on the cost-effectiveness of decisions present researchers with a pragmatic research paradigm with obvious application benefits.
Model Development and Evaluation: Models offer useful tools for conceptualizing the relationships and complexities among the components of a system. 
Novel Technique Development and Evaluation: Techniques are the processes used to accomplish results (i.e., produce products, obtain outputs, and/or achieve consequences). The development and validation of novel techniques for accomplishing results within needs assessment (or evaluation) is an essential role of the researcher.
Theory Development: Theories help us explain or predict a phenomenon. In other words, a theory is the answer to why something (such as a behavior) occurs or does not occur within a context. Making and testing theories helps advance research and practice.
Rad Resources:
Briggs, L. (1982). A comment on the training of students in instructional design. Educational Technology, 22(8), 25-27.
Driscoll, M. (1995). Paradigms for research in instructional systems. In Anglin, G. (Ed.), Instructional Technology: Past, Present, and Future. Englewood, CO: Libraries Unlimited.
Gall, J., Gall, M., and Borg, W. (1999). Applying Educational Research: A Practical Guide (4th ed.). New York: Addison Wesley Longman.
Trochim, W. (2001). The Research Methods Knowledge Base (2nd ed.). Cincinnati, OH: Atomicdog.com Publishing.
AEA365   .   Blog   .   Jul 27, 2015 11:46am
A series of corporate relocations, steady growth in real estate development and a jobless rate that is consistently below the national average have all made the Dallas region’s economy a top performer among major metro areas in the United States. The most significant development as of late is Toyota North America’s pending move to the Dallas suburb of Plano. Work started in January 2015 on a new headquarters complex for the automaker, which will bring an estimated 4,000 jobs to the region sometime in 2017 when the project is complete. The 12-county...
SHRM   .   Blog   .   Jul 27, 2015 11:46am
Hello, my name is Sue Hamann, and I work at the National Institutes of Health. One part of my job is to solicit and review proposals from evaluation contractors for various types of evaluation projects, including needs assessment and program planning. Today I will share some tips for improving your proposed needs assessment. Hot Tips: Show that you are knowledgeable about needs assessment. Don’t even think about submitting a proposal that does not define needs assessment or mention its rich history and development. This sounds obvious, but you would be surprised how often I review proposals that lack evidence that the proposers know the field. Be sure to cite the Altschuld and Watkins volume (New Directions for Evaluation, #144, Winter 2014). Justify your proposed methodology based on existing literature. Assemble and describe a team with all the required skills. Michael Scriven, in his 1991 Evaluation Thesaurus, listed the following as content areas in which evaluators should be skilled: statistics, cost analysis, ethical analysis, management theory and practice, pedagogy, social psychology, contract law, interviewing skills, professional politics, presentation graphics, dissemination, and synthesis. Altschuld and Watkins (2014) stated that needs assessment involves the following methods, in addition to the qualitative and quantitative methods employed in other evaluation activities: gap analysis, causal analysis, prioritization strategies, and comparison of solutions. Needs assessment is usually a team effort. Make sure that you document and budget for the skills that you already have available and the skills that you will add to your team. Be alert to the culture of the organization to which you are applying. I read a lot of proposals that are obviously boilerplate. It is generally not worth your time to submit a vague needs assessment proposal, that is, one that is not tailored explicitly to the organization or solicitation. 
Sometimes you have to do some online searching about the history and status of an organization to determine the kind of needs assessment that will be consistent with its culture and useful to it. Be sure to read the article from Maurya West Meiers and colleagues (New Directions for Evaluation, #144, Winter 2014); it has great tips about planning international needs assessments, but the tips are applicable to any new environment. Document your membership in AEA. If you are reading this blog, you are probably a member of the largest group of evaluators in the world. Be sure to mention this when you state your qualifications. If you belong to the Needs Assessment TIG, say so. If you do not belong and you are interested in being paid to do needs assessments, you should join and become active in the TIG.
AEA365   .   Blog   .   Jul 27, 2015 11:45am
I’m Maurya West Meiers, Senior Evaluation Officer at the World Bank and coauthor of A Guide to Assessing Needs: Essential Tools for Collecting Information, Making Decisions, and Achieving Development Results. I often work with groups in carrying out needs assessments, collecting data, training, facilitating retreats, etc.  So I’m always looking for facilitation tips and resources.  Today I’m sharing some favorites. Lessons learned:  If your end-goal in your meeting with a group is to gather data or make decisions (through focus groups, multi-criteria analysis, etc.), you’ll want to do some early rapport building to get people comfortable with one another and talking. Make sure the right people are in the room. It seems obvious, but take the time to define your targets in advance and make sure that those participating are those targeted.  Be prepared to gently remove people who don’t fit your pre-defined needs.  Have another coordinator with you to help in this process.  And have your room comfortably furnished and arranged. Learn the names of participants in advance and give a warm greeting when they enter.  These are common networking techniques because they work and put people at ease. Use name badges and table tents. Have these items ready.  You may wish to let participants write their own names instead of pre-printing them.  Perhaps Jennifer prefers to have everyone call her Jen - so give her the chance to write her name as she wishes. Get people talking early.  As people enter the room, introduce them to others - and have ideas listed on a flip chart or card that they can discuss with one another.  Keep people moving and mixing.  Use a chime or bell to signal a move. Use icebreakers.  An easy icebreaker involves giving participants name badges and asking them to write two or three things they feel comfortable discussing with others.  Example: Energizers and games. 
If your group work - such as in a retreat - covers a lengthy period of time, use energizers (usually involving some movement) or games to keep people alert and engaged. If you search for energizers on YouTube, you’ll find many ideas you can adopt and adapt for your purposes, and you’ll see how they work ‘in action’ and not just on paper. This quick and easy energizer is one such example. Rad Resources: Here are some of my "go to" books and websites on facilitation techniques and tools.
The Ten-Minute Trainer: 150 Ways to Teach It Quick and Make It Stick! by Sharon Bowman
Liberating Structures website and book by Henri Lipmanowicz and Keith McCandless. Check out their one-page Liberating Structures Menu.
Thiagi’s 100 Favorite Games by Sivasailam Thiagarajan
AEA365   .   Blog   .   Jul 27, 2015 11:44am
Here’s my common sense response to the FLSA proposed rule. Pay your people for the work they are doing. Seriously, that’s all. Ideally, pay them what they are worth. If you don’t think they are worth what you’re paying them, it might be time to part ways.   Let’s look at this together. You’re paying people $455 a week, and you’re lobbying that they won’t want to be hourly because being classified as "exempt" is an honor to them? Do you realize that $455 a week is $11.38/hour? That’s all! $11.38 an...
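The arithmetic in that last paragraph checks out; here is a quick sketch of it (assuming the standard 40-hour work week the post implies, and the 52-week annualization commonly used for the FLSA salary threshold):

```python
# FLSA salary threshold cited in the post
weekly_salary = 455.00
hours_per_week = 40  # assumption: standard full-time week

hourly_rate = weekly_salary / hours_per_week
annual_salary = weekly_salary * 52

print(f"${hourly_rate:.2f}/hour")      # prints $11.38/hour
print(f"${annual_salary:,.0f}/year")   # prints $23,660/year
```

In other words, the "exempt" threshold works out to $11.38 an hour, exactly the figure the post is questioning.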
SHRM   .   Blog   .   Jul 27, 2015 11:44am