
I am still pumped up after watching @IAMJHUD’s performance last night. She is absolutely an amazing artist. Her music touches your soul and will bring you to tears. Her connection and interaction with the crowd were so natural and appeared effortless. Everyone who knows me understands that I have a heart for Talent Management. If @IAMJHUD had asked me to join her team after the concert, I would have hopped in her limo and never turned back. Not even for my luggage! I would pour everything I had...
SHRM . Blog . Jul 27, 2015 11:59am
Greetings! I am Linda Cabral from the University of Massachusetts Medical School’s Center for Health Policy and Research. A big part of my job uses qualitative methods to evaluate different health and human services programs. Our data collection can include one-on-one or group interviews as well as focus groups. With this type of narrative (i.e., first-person) data collection, decisions must be made up front about the ultimate format of your data set. One decision is whether or not to audio record these data collection events; another is whether those audio files will be transcribed. Transcribing data can be a tedious process requiring several hours for each recorded interview. A general rule is that a 30-40 minute interview takes about 1-2 hours to type and results in about 15-20 pages of text (a rough planning sketch follows this post). Recently, we have been faced with a myriad of projects requiring decisions on how formal our transcription process should be. Let me offer you some of our thinking and lessons learned!

Lessons Learned:
- Decide on the level of detail needed from each qualitative data collection event, which can range from a verbatim transcript to a less formal write-up of notes.
- While transcribing your own data can have significant analytic benefits (getting close and personal with the material), it may not be practical for everyone, particularly if you’re time-strapped.
- Transcription allows each evaluation team member to go through the transcript carefully, providing an easily readable document from the study. Having a transcript can facilitate working together in a team where tasks have to be shared. Agreement about data interpretation is key.

When considering outsourcing transcription:
- Realize that a fully transcribed interview will result in pages and pages of data to sift through. There will be a lot of "noise" in there that could potentially be eliminated if the transcription were done in-house by evaluators familiar with the project’s evaluation aims.
- You have choices as to the type of transcript that would be most helpful to you: word-by-word verbatim; clean verbatim (removing ‘hmm’ and ‘you know’); or one with improved grammar and readability.
- You have options ranging from independent contractors to large firms that specialize in transcription services. Transcribers can be paid by the word, the hour, or the length of the recording.

Hot Tips:
- Always let your evaluation aims drive your decision about whether to transcribe or not.
- Plan ahead for how notes, audio recordings, and transcripts will be stored and how personal identifiers will be used in order to maintain data integrity.
- Budget the necessary time and resources up front, whatever your decision is!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
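To make the rule of thumb above concrete, here is a minimal Python sketch (an added illustration, not part of the original post) that turns interview length into rough typing-time and page estimates. The per-minute ratios are assumptions derived from the "30-40 minutes of audio takes 1-2 hours and yields 15-20 pages" rule, using 35 minutes as a midpoint.

```python
def transcription_estimate(interview_minutes: float) -> dict:
    """Return (low, high) estimates of typing hours and pages of text.

    Ratios are assumptions from the rule of thumb: ~35 min of audio
    takes 1-2 hours to type and yields roughly 15-20 pages.
    """
    hours = (interview_minutes * 1.0 / 35.0, interview_minutes * 2.0 / 35.0)
    pages = (interview_minutes * 15.0 / 35.0, interview_minutes * 20.0 / 35.0)
    return {
        "typing_hours": tuple(round(h, 1) for h in hours),
        "pages": tuple(round(p) for p in pages),
    }

# Example: budgeting a (hypothetical) study of twelve 45-minute interviews.
per_interview = transcription_estimate(45)
print(per_interview)  # {'typing_hours': (1.3, 2.6), 'pages': (19, 26)}
low, high = (12 * h for h in per_interview["typing_hours"])
print(f"Whole study: roughly {low:.0f}-{high:.0f} hours of transcription")
```

Even a crude estimate like this makes the in-house versus outsourcing decision easier to discuss with a project budget in hand.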
AEA365 . Blog . Jul 27, 2015 11:59am
As an American living abroad, I often get asked, 'Why did you choose to build a company in the UK?' My answer is simple -- I believe that finding top talent is the number one priority for building a successful company, and being based in the UK allows me to access the best from all over the world. Let me expand on this further. Today, most executives would agree that the ability to attract and retain top talent is a key competitive differentiator for their business, but more than 63% of CEOs are concerned about the availability of key...
SHRM . Blog . Jul 27, 2015 11:59am
Greetings! I’m Galen Ellis, President of Ellis Planning Associates Inc., which has long specialized in participatory planning and evaluation services. In online meeting spaces, we’ve learned to facilitate group participation that, in the right circumstances, can be even more meaningful than in person. But we had to adapt. Although I knew deep inside that our clients would benefit from online options, I couldn’t yet imagine creating the magic of a well-designed group process in the virtual environment. Indeed, we stepped carefully through various minefields before reaching gold. As one pioneer observes: "Just because you’re adept at facilitating face-to-face meetings, don’t assume your skills are easily transportable. The absence of visual cues and the inability to discern the relative level of engagement makes leading great virtual meetings infinitely more complex and challenging. Assume that much of what you know about leading great meetings is actually quite irrelevant, and look for ways to learn and practice needed skills" (see Settle-Murphy below). We can now engage groups online in facilitation best practices such as ToP methods and Appreciative Inquiry, and in group engagement processes such as logic model development, focus groups, consensus building, and other collaborative planning and evaluation methods (see our video demonstration).

Lessons Learned:
- Everyone participates. Skillfully designed and executed virtual engagement methods can be more effective in engaging the full group than in-person ones. Some may actually prefer this mode: one client noted that a virtual meeting drew out participants who had typically been silent in face-to-face meetings.
- Software platforms come with their own sets of strengths and weaknesses. The simpler ones often lack interactive tools, while those that do allow interaction tend to be more costly and complex.
- Tame the technical gremlins. Participants without suitable internet speed, technological experience, or hardware (such as microphoned headsets) will require additional preparation. Meeting hosts need to know ahead of time what sorts of devices and internet access participants will be using. Participants should always be invited into the meeting space early for technical troubleshooting.
- Don’t host it alone. One host can produce the meeting (manage layouts, video, etc.) while another facilitates.
- Plan and script it. Virtual meetings require a far more detailed script than a simple agenda. Indicate who will do and say what, and when.
- Practice, practice, practice. Run through successive drafts of the script with the producing team.

Rad Resources:
- Settle-Murphy, N. (2015). Making Virtual Meetings Come Alive: It’s Everyone’s Job! Some great tips from the Virtual Facilitation Collaborative (we have taken their eight-week course).
- Tessie Catsambas and Hallie Preskill’s (AEA members) amazing book "Reframing Evaluation through Appreciative Inquiry" (I refer to this all the time.)

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
AEA365 . Blog . Jul 27, 2015 11:59am
Ahh, Vegas - how we love to hate you! You're hot, you're sleazy, you're crowded, you're expensive, and you cheat... of course we hate you! The annual SHRM conference was held in Las Vegas this year, and over 15,000 attendees gathered to learn more about their profession. I have been to Vegas for conferences before, but this was the first time I gambled... and by gambled I mean I played PaiGow. The PaiGow table is a generous, social, and helpful table. The game itself is easy to learn, and even if you're a bit slow on the uptake, the dealer and all the others offer their...
SHRM . Blog . Jul 27, 2015 11:58am
Hello, we are Marissa Szabo and Humberto Reynoso-Vallejo, members of the Chapter 224 research team evaluating the Health Care Cost Containment Law from the Office of the State Auditor in Massachusetts. This is another in a series of posts on lessons learned from the ongoing evaluation of Chapter 224 published during the March 8th week.

Given the wide scope of the Chapter 224 evaluation project, the research team needed to tap into the healthcare community for its advice, perspectives, and feedback. To this end, we assembled an advisory committee composed of members from the health insurance industry, health care advocates, academia, labor, and professional organizations to gain diverse input. At the first meeting of this advisory committee, the research team presented a draft of the evaluation plan for their consideration. Members of the board provided suggestions related to methodology and potential data sources, and provided valuable information on the overall logistics of the evaluation. Since that meeting, the team has been in regular contact with members of the committee for specific feedback on different aspects of the evaluation, including reviewing sections of the evaluation plan and providing additional support as needed.

In addition, stakeholders will be included as research participants in qualitative interviews complementing the quantitative component of the study. Stakeholders who belong to task forces, councils, commissions, agencies, boards, and other groups associated with the enactment of Chapter 224 will be asked to participate in a brief online survey to assess initial observations and concerns related to the rollout of Chapter 224 legislation. Two follow-up interviews will be conducted with selected participants to explore their reactions to our quantitative findings. As the Chapter 224 evaluation project continues, the research team will continue to consult with the advisory committee and to engage key stakeholders to ensure the quality of this project.

Hot Tip: After an advisory committee meeting, follow up with members who provided specific feedback on aspects of the evaluation to keep them engaged.

Hot Tip: Include key stakeholders as participants in the research, including taking part in surveys and in in-depth structured or semi-structured qualitative interviews.

Hot Tip: Ask for their guidance in identifying data sources and other resources valuable to the evaluation process.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
AEA365 . Blog . Jul 27, 2015 11:58am
Hi there, Liz Zadnik here, bringing you another Saturday post focused on practitioner experiences and approaches. Today I’m going to focus on my personal journey to stay up-to-date and relevant in all things evaluation. I was not formally trained as an evaluator; everything I know has been learned through on-the-job, hands-on experience and mentorship (I’m very lucky to have been able to work with a few brilliant evaluators and researchers!). Self-study, reading, and ongoing training have been intentionally incorporated into my personal and professional schedule.

Rad Resource: Coursera is an excellent resource for online learning. You can even earn certifications in concentrations after completing a set of courses in sequence. They have a number of courses around data analysis and data science!

Rad Resource: iVersity coordinates and archives some really interesting and innovative massive open online courses (MOOCs). The "Future of Storytelling" course gave me a number of ideas and skills for crafting accessible and engaging trainings and resources, as well as some insights for capturing stories for program evaluation. Recent and future courses focus on idea generation methods and gamification theory.

Lesson Learned: Follow your gut! At first I thought I needed to select courses, books, and resources that were explicitly "evaluation-y," but I found it was the courses that made me say "Oooh! That looks interesting!" that helped me think creatively and find ways to enhance my evaluation and program development skills.

Rad Resource: MIT OpenCourseWare is much more structured and academic, as these are courses held at MIT. These require - for me - a bit more organization and scheduling.

Rad Resource: edX is another great chance to engage in online courses and MOOCs. Right now they have two courses on my "to-take" list: Evaluating Social Problems and The Science of Everyday Thinking.

Are there other online course providers or resources you rely on to stay current? How do you stay up-to-date and innovative as you balance other obligations and projects?

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
AEA365 . Blog . Jul 27, 2015 11:58am
I wrote my first conference recap post - then called "Rants and Raves" - in March of 2010. This has become my favorite blogging experience of every conference I attend, because it really causes me to think about what is happening around me and whether I love it or hate it - or whether I am just momentarily in a mood. This is why I always wait a day or two after the conference ends. And by focusing...
SHRM . Blog . Jul 27, 2015 11:58am
I’m Kylie Hutchinson, independent evaluation consultant and trainer with Community Solutions Planning & Evaluation. I also tweet regularly at @EvaluationMaven. Have you ever wondered what evaluation recommendations and SpongeBob SquarePants have in common? Well, in my opinion, a lot. Think about why we make recommendations. We want stakeholders to take action on our evaluation findings. But we all know this doesn’t happen by magic. And it doesn’t occur as soon as we submit our final report either. In fact, it can be months or years before managers and policy-makers are actually in a position to make decisions based on our findings. In order for utilization to happen, I think recommendations need to be three things:
- easily absorbed (at the time of first reading)
- sticky (so they stay in the minds of decision-makers)
- equipped with ‘legs’ (so they prompt action)
Hmmm… now think… what has good absorption, is sticky, and has legs? Exactly! SpongeBob SquarePants!

Rad Resource: Here’s a tip sheet on Recommendations That Rock!

Hot Tip: Well-written recommendations don’t have to tick every box, but they do deserve significant attention. Don’t leave them to the end or the last minute. Instead, keep a running list of your initial ideas as soon as they occur, even if it’s at the beginning of the evaluation. And always run them by your stakeholders to increase ownership and the chances of implementation. Better yet, develop them collaboratively during a data party.

Rad Resource: You can find a Pinterest page with other resources for writing better recommendations here.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
AEA365 . Blog . Jul 27, 2015 11:58am
I attended a great event at #SHRM15. A few social media influencers - Rayanne Thorn, Dwane Lay, and Jason Lauritsen - organized a fundraiser for "No Kid Hungry". Here’s the deal: They pulled it off in about 3 weeks. Found a venue (shout out Public House at the Luxor, Las Vegas) that worked with the budget. Got the companies they work for to sponsor (Quantum Workforce/Dovetail - maybe some others?). Actually, Quantum chose to skip the #shrm15 vendor booth to sponsor this social...
SHRM . Blog . Jul 27, 2015 11:58am
Hi! I’m Madhawa "Mads" Palihapitiya, Associate Director at the Massachusetts Office of Public Collaboration at UMass Boston. We recently concluded the first phase of a statewide municipal conflict resolution needs assessment study commissioned by the state Legislature.

Hot Tip: The term "need" can mean many things to many people. For needs assessment purposes, needs are defined as gaps in results. Organizations don’t often think of aligning their institutional needs with the societal bottom line, but working towards this alignment is crucial. We needed to investigate whether our institutional mission, which is to help government and other entities address public conflict, was perfectly aligned with the needs of Massachusetts municipalities and their constituents. This alignment was particularly important to us as a statutory state agency and would add measurable societal value. Over time, this alignment can also increase the institutional bottom line.

Rad Resource: Roger Kaufman’s Needs Assessment for Organizational Success. See also Bethany Pearsons’ talk at Evaluation 2014 on the Triple Bottom Line.

People don’t usually talk about societal results when they talk about organizational needs. How do we define societal results? We first developed an Ideal Vision that contained a series of societal results and indicators to measure them.

Lesson Learned: We had to resist the impulse to focus on immediate institutional needs like organizational inputs and processes. Imagining an ideal future or vision can tell us where the journey should end.

Rad Resource: Kaufman’s Ideal Vision.

Cool Trick: To help the organization and others being engaged understand the difference between different results, consider developing a visualization such as a DoView chart.

Assessing societal results while assessing the institutional bottom line requires access to valuable data both within and outside of your organization. A Needs Assessment Committee (NAC) was established as the ‘public face’ of the process and to provide advice and guidance on assessment design, participant selection, etc.

Cool Trick: Set up a website to communicate the purpose of your needs assessment. Use social media whenever possible.

There are 351 cities and towns in Massachusetts. Multiple organizations were involved. We had limited resources to collect the data we needed. We had to get creative! A series of regional focus groups and telephone interviews were held. To reach the rest, we launched an online survey.

Hot Tip: Online surveys are a great way to involve more people. Keep survey questions closed-ended and completion time to 10-15 minutes. Plan ahead so that you can keep the survey open for as long as possible. (A simple sample-size sketch follows this post.)

Cool Trick: Get creative with survey dissemination by using contact databases, newsletters, list servers, Facebook, and Twitter. Ask people you know to invite others to take the survey.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
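The post does not prescribe any particular sample size, but a standard simplified formula (Yamane, 1967) is one way to gauge how many of the 351 municipalities would need to respond for a given margin of error. The Python sketch below is an added illustration under that assumption, not part of the original study design.

```python
import math

def yamane_sample_size(population: int, margin_of_error: float = 0.05) -> int:
    """Yamane's simplified sample-size formula: n = N / (1 + N * e^2)."""
    return math.ceil(population / (1 + population * margin_of_error ** 2))

# Example: responses needed from Massachusetts' 351 cities and towns
# for roughly a +/-5% margin of error (assumes simple random sampling).
print(yamane_sample_size(351))  # -> 187
```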
AEA365 . Blog . Jul 27, 2015 11:57am
#SHRM15 the conference is a wrap! We had a lot of fun, it was really hot, I met great people, hung out with old friends, and had a classy time! Now that we’ve learned lots of great tips and tools and received some wonderful advice, it’s time to get to work! Where should we start? Well, let’s talk about engagement. Engagement is something we learned a LOT about this week. Let me recap:
- Throw employee engagement surveys away
- Use these tools to increase employee engagement
- ...
SHRM . Blog . Jul 27, 2015 11:57am
Greetings! My name is Catherine Cooper, and I am the Faculty Director of the Educational Partnership Center and Professor of Psychology at the University of California, Santa Cruz. I invite you to explore and use the resources from the Bridging Multiple Worlds Alliance (BMWA). The BMWA is a growing network of researchers, educators, and policy makers - including evaluators - in the U.S. and other nations who work with P-20 (preschool through graduate school) partnerships to support low-income, immigrant, and ethnic minority youth. These partnerships support youth in building pathways from childhood to college and careers without giving up ties to their families and cultural communities. We work in collaboration with alliance partners, including youth themselves and evaluators of programs and partnerships.

Rad Resource: In the BMWA, we offer three resources that evaluators tell us are especially useful:
- Aligning models and measures to build a common language among partners
- Tools for research, policies, and practice, including formative and summative evaluation
- Longitudinal data tools for qualitative and quantitative evaluation and research

The Bridging Multiple Worlds (BMW) Model taps five dimensions for opening pathways:
- Demographics: students’ age, gender, national origins, race/ethnicities, languages, and parents’ education and occupation
- Students’ aspirations and identity pathways in college, careers, and cultural domains
- Students’ math and language academic pathways through school
- Resources and challenges across students’ cultural worlds of families, peers, schools, community programs, sports, and religious activities, among others
- Partnerships that reach across nations, ethnicities, social class, and gender to open pathways from preschool through graduate school (P-20)

Rad Resource: Bridging Multiple Worlds Tools include:
- Survey measures of these five dimensions for middle/high school and college students
- Activities for middle and high school students for building pathways to college and careers, with pre- and post-activity surveys (in English and Spanish)
- Logic model template for programs and alliances among programs
- Longitudinal case study templates

Rad Resource: I invite you to join BMWA partners (students, families, schools, community programs, and universities) in using these tools to ask your own questions and build common ground among evaluators, researchers, educators, and policymakers. The tools and other resources are available at www.bridgingworlds.org.

Rad Resource: Bridging Multiple Worlds: Cultures, Identities, and Pathways to College (Cooper, 2011) describes BMW and related models, supporting evidence, tools, and applications in P-20 research, practice, and policy work.

Hot Tip: Healthy partnerships are learning communities where "everyone gets to be smart". Focus on questions and indicators partners are interested in, and display data in clear and meaningful formats. This increases enthusiasm, engagement, and cooperation. Examples of such questions, indicators, and formats are on our website.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
AEA365 . Blog . Jul 27, 2015 11:57am
Hello, I am Rupu Gupta, Analyst at New Knowledge Organization Ltd. and Co-Chair of AEA’s Environmental Program Evaluation Topical Interest Group. My evaluation work focuses on learning about the environment and conservation in informal settings. As we celebrate Earth Day, I would like to share some reflections on evaluating these experiences.

Lessons Learned:

Informal learning settings are critical places to learn about the environment and actions to protect it. They offer opportunities for "free-choice" learning, where learners choose and control what they learn. They are typically institutions such as zoos, botanic gardens, aquariums, and museums, distinct from formal educational settings like schools. With hundreds of millions of visits to these institutions annually, they are prime settings for engaging the public in thinking about the environment. Conservation education is often a key aspect of these institutions’ programming: visitors can learn about different forms of nature (e.g., animals, natural habitats), threats they face (e.g., climate change), and actions to address them (e.g., reducing energy use). Educational experiences here are often referred to as informal science learning for their connection with understanding natural systems.

Learning about the environment in informal settings can happen through a variety of experiences. Informal learning is socially constructed, through a complex process that includes oneself, close others (friends, family), and more distant others (institution staff). Specific experiences, like animal encounters, hands-on interactions with flora in botanic gardens, or media-based elements (e.g., touch screens), enable visitors to engage with information about nature and the environment. Docents play an important role in helping visitors ‘interpret’ the message embedded in the experiences and exhibits. Evaluators assessing the impact of the different experiences in informal settings need to be mindful of the multiple pathways for visitors to engage with the environmental information.

Informal learning manifests broadly. Learning experiences in informal settings encompass outcomes beyond the learning traditionally associated with school-based education. In the process of making meaning of the various experiences, learning is tied to multiple aspects of the human experience. Outcomes can be cognitive (e.g., gaining knowledge about climate change impacts), attitudinal (e.g., appreciating native landscapes), emotional (e.g., fostering empathy towards animals), or behavioral (e.g., signing a petition for an environmental cause). A mix of qualitative and quantitative methods is best to capture these complex learning experiences. By considering the range of learning possibilities, evaluators can design and conduct effective evaluations to understand how people engage with the multi-faceted topic of the environment.

Rad Resources: The following are great ways to get acquainted with evaluation in informal learning settings:
- "Free-choice Learning and the Environment" by John H. Falk, Joe E. Heimlich, and Susan Foutz
- "Framework for Evaluating Impacts of Informal Science Education Projects" edited by Alan J. Friedman

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
AEA365 . Blog . Jul 27, 2015 11:57am
Hi, my name is Catherine Nameth, and I’m the Education Coordinator for an NSF- and EPA-funded research center at the University of California, Los Angeles. As Education Coordinator, my primary job is not evaluation, so I have to act creatively in order to integrate evaluation into my work and balance the need for internal evaluation with my other administrative and research responsibilities.

Hot Tip: Be an active learner and an active listener. Get to know your colleagues and their areas of expertise. Go to meetings, listen, and be open to learning about your colleagues and what they do. Your understanding of them and their work will inform your understanding of your organization as well as its people and programs/research. This understanding can then inform how you design surveys and collect evaluation data. People who know you are more likely to respond to your surveys and other "official" evaluation requests, and when they respond, you get the information you need!

Rad Resource: Map it out! Use Community Solutions’ map for "How Traditional Planning and Evaluation Interact." This map displays how an evaluation logic model (inputs-activities-outputs-outcomes), situated horizontally, interacts with program planning (goals-objectives-activities-time frame & budget), which is modeled vertically. In using this map, you’ll see that the "activities" of each model intersect, and this cohesive visual aid also serves as a reminder that program planning goals and evaluation outcomes should, and can, inform one another (a toy encoding of this intersection follows this post). Use this map to keep yourself focused, which is really important when your primary responsibilities include many aspects other than evaluation, and to help you show your organization’s leadership what you are doing and why you are doing it.

Hot Tip: Have an elevator pitch at the ready. When your work includes evaluation but is not entirely about evaluation, you need to be able to explain quickly and concisely what you are evaluating, why you are evaluating it, what information you need, and how your colleagues can help you by providing this needed information... which they will be more willing to do if they know you!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
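Since the map image itself is not reproduced here, the structure it describes can be sketched in a few lines: two sequences that intersect at "activities." This is a hypothetical encoding for illustration only; the element names come from the post's description of the map.

```python
# The horizontal evaluation logic model and the vertical program-planning
# sequence, as described in the post; they share exactly one element.
evaluation_logic_model = ["inputs", "activities", "outputs", "outcomes"]
program_planning = ["goals", "objectives", "activities", "time frame & budget"]

shared = set(evaluation_logic_model) & set(program_planning)
print(shared)  # {'activities'} - where planning and evaluation meet
```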
AEA365 . Blog . Jul 27, 2015 11:56am
Dr. Michael Woodward, a member of the #SHRM15 Blogger Team, chats with Ernie Anastos about what employees fear most at work. The three fears discussed: not knowing where they stand, automation, and being too digitally connected. ...
SHRM . Blog . Jul 27, 2015 11:56am
Hi, my name is Martha Meacham. I am a librarian at the Lamar Soutter Library, University of Massachusetts Medical School. We are always happy to help answer the many questions we receive about copyright. While this can be a complicated issue, it shouldn’t be scary. A little background understanding and due diligence will help guide you while navigating copyright.

Copyright is a set of exclusive legal rights granted to the creators of works that allows them to control the copying, reuse, redistribution, creation of derivatives, and performance of their works. While copyright allows creators to benefit from their works, particularly financially, it also has some important limitations that benefit the public. Just because something is copyrighted doesn’t mean it can’t be used; the proper steps just need to be taken.

Hot Tips: It can be a challenge to determine if copyright needs to be taken into consideration. A copyright flow chart can guide you through some questions to ask when considering whether copyright is applicable (a hypothetical sketch of such a decision flow follows this post).

Lessons Learned: You may need to do further investigation in areas like Creative Commons licenses or fair use. Remember, you can always ask for permission. However, don’t wait until the last minute to start thinking about copyright. Finding answers and seeking permissions can take time. Avoid the temptation to ignore the issue, or to use something questionable because time has run out to take the proper steps.

Rad Resources: There are many great ways to find materials where copyright is not an issue or has been explicitly addressed. Anything produced for or by the U.S. government exists in the public domain (something belonging or available to the public as a whole, and therefore not subject to copyright). For example, the NIH Photo Galleries, the CDC Public Health Image Library, and a database of U.S. Government Photos all provide materials in the public domain. Other resources contain images that have Creative Commons licenses. Sites like Flickr allow you to search by usage rights (specific licenses) or restrict results to Creative Commons galleries. Additionally, almost all images found in WikiCommons have some sort of license that allows for their use. Regardless of the resource, it is wise to double-check the specific license for a specific image, and always give credit to the source. Finally, the Lamar Soutter Library offers some great resources about copyright here. Also check out the Columbia University Libraries’ Copyright Advisory Office and Fair Use checklist and the Copyright and Fair Use page from Stanford University. When in doubt, ask for help. Copyright can be tricky, but there are many guides. With a little practice, your copyright journey can be smooth sailing.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
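The flow chart itself is not reproduced here, so the sketch below is a hypothetical rendering of the kinds of questions the post raises (public domain, open licenses, fair use, permission), written as a simple decision sequence. It is an illustration only, not legal advice and not the Library's actual chart.

```python
def copyright_next_step(is_us_government_work: bool,
                        has_open_license: bool,
                        fair_use_likely: bool) -> str:
    """Walk the decision questions in order; return a suggested next step."""
    if is_us_government_work:
        return "Public domain: free to use, but still credit the source."
    if has_open_license:
        return "Check the specific license terms (attribution, share-alike, etc.)."
    if fair_use_likely:
        return "Document a fair-use analysis (purpose, nature, amount, market effect)."
    return "Ask the rights holder for permission, and start early: it takes time."

# Example: an image found on Flickr under a Creative Commons license.
print(copyright_next_step(False, True, False))
```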
AEA365 . Blog . Jul 27, 2015 11:56am
Venturing into social media can be a daunting task since the various platforms are growing so quickly. Developing a checklist can be an easy way to get started in social media and organize your social strategy and routine. I have outlined a few ways you can start developing your social media checklist.

Define your audience. Identifying your target audience on social media is important. It’s easy to say that you want to target anyone or everyone who is willing to give you a like or retweet, but is this really aiding your social media goals or purpose, and is your content being used effectively? By identifying who you want to target - whether that group is students, evaluation professionals, non-profit workers, or those focused on data - you can create targeted content that will be more valuable for your followers and result in a higher return on investment for your social strategy. You can start with the basic demographic questions: age, occupation, and education. Then you can identify their interests.

Develop a content strategy. It’s important to develop some sort of content strategy when venturing into social media so you can stay relevant with your audience. This helps you stay on track and keeps you from sharing anything and everything. Once you have identified what your audience is looking for, you can develop posts that match their needs. Important questions to ask yourself when developing content are: What is important to your audience? What are their questions or concerns? What do they want to learn more about?

Set up your checklist for each channel. Once you are ready to start posting, you can set up your personal checklist and scheduling guide, which will help you reach your activity goals. Below are a few examples:

Facebook
- Publish 1 post each day
- Dedicate two days each week to blog content from evaluation sources
- Monitor and respond to comments once a week
- Review insights at the end of every month

Twitter
- Publish twice daily
- Retweet relevant content to your followers twice a week
- Follow 15 new and relevant users or organizations each week
- Follow industry hashtags once a week

These are just a few examples (a small sketch encoding them follows this post). You can create a checklist that works with your schedule and social goals.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
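One way to keep a checklist like this actionable is to encode it as a small data structure that can be printed as a weekly plan or extended with new channels. The sketch below simply records the example cadences from the post; the structure itself is an illustrative choice, not a prescribed tool.

```python
# The post's example checklist, one entry per (task, cadence) pair.
CHECKLIST = {
    "Facebook": [
        ("Publish a post", "daily"),
        ("Share blog content from evaluation sources", "2x per week"),
        ("Monitor and respond to comments", "weekly"),
        ("Review insights", "monthly"),
    ],
    "Twitter": [
        ("Publish", "2x daily"),
        ("Retweet relevant content to your followers", "2x per week"),
        ("Follow 15 new, relevant users or organizations", "weekly"),
        ("Follow industry hashtags", "weekly"),
    ],
}

for channel, tasks in CHECKLIST.items():
    print(channel)
    for task, cadence in tasks:
        print(f"  [ ] {task} ({cadence})")
```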
AEA365 . Blog . Jul 27, 2015 11:54am
From the report entitled The HR Roller Coaster Ride: Are Key HR Metrics Back to Prerecession Levels? by John Dooney All HR professionals have their own stories to tell about how their organization faced the Great Recession. After it began in December 2007, revenues fell more than 50 percent as the demand for goods and services dried up. Organizations slashed operating expenses, including staff and benefits costs, to stay afloat, leaving HR professionals wondering when the HR roller coaster ride—with its ups, downs, twists and turns—would end. The HR roller coaster...
SHRM . Blog . Jul 27, 2015 11:54am
My name is Melissa (Chapman) Haynes from the University of Minnesota’s Minnesota Evaluation Studies Institute (MESI). At MESI we have a strong interest in building evaluation capacity within the university and the community through university-community partnerships. We are trying to build this capacity in a sustainable manner, in a way that builds upon the practice of professional evaluators and creates scholarship on the teaching and professionalization of program evaluation. One of our signature activities is a spring evaluation training that MESI has hosted for the past 20 years. This week of posts will highlight a bit of the key learning, resources, and tools presented at our 2015 event!

Lesson Learned: Creating an inclusive community of evaluators is essential, but we are an incredibly diverse field - what brings us together? Throughout the week of MESI, the Program Evaluation Standards (Yarbrough et al., 2010) and the AEA Guiding Principles were used in various contexts: as a frame of reference as we decide which evaluation projects we will engage in, as a guide to navigating and negotiating situations where ethics are in question, and as a way to elevate the profession of evaluation. We can and should continue to use and explore how these guiding documents can further the professionalization of our field.

Hot Tip: Donna Mertens provided some wonderful examples of the art and power of questioning during her workshop and keynote address. During her workshop she gave some examples of how she uses questioning to negotiate with clients. For example, if a potential client asked you to frame an evaluation in a manner that did not jibe with the Program Evaluation Standards or AEA Guiding Principles, one might tell the client something like "I will not do X, but let’s talk about how we might frame an evaluation that will continue to serve the population of interest."

Rad Resource: Some of the presenters have opted to share the information they presented on our website: http://www.cehd.umn.edu/OLPD/MESI/spring/2015/default.html

Rad Resource: A fun highlight of MESI is the annual "Top Ten" competition. For those new to MESI, Jean King develops a Top Ten statement - this year it was "How is program evaluation like interstellar space travel?" We had over 50 entries from MESI attendees; the Top Ten is located here. My favorite is #2: "You’ve got to remember that YOU are the alien here."

The American Evaluation Association is celebrating MESI Spring Training Week. The contributions all this week to aea365 come from evaluators who presented at or attended the Minnesota Evaluation Studies Institute Spring Training. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
AEA365 . Blog . Jul 27, 2015 11:54am
Stop talking about the gender confidence gap! I keep hearing a lot about the "confidence gap." Premise: women are less confident than men. I believe those raising the issue are well intended, but I worry about the constant drumbeat on this issue for three reasons:
1. Are we not measuring confidence by what are more traditional "male" standards?
2. Are we not mis-measuring the confidence of women as a result?
3. Is not raising the issue reinforcing the stereotype in terms of confidence?
Rather than talking about the confidence gap, I recommend we talk about the gap in perceptions on how we...
SHRM . Blog . Jul 27, 2015 11:54am
My name is Donna M. Mertens and I am an independent consultant based in Washington, DC; my work is both domestic and international. I had the honor of being the keynote speaker at the Minnesota Evaluation Studies Institute (MESI) in March 2015. The MESI theme was Social Justice amidst Standards and Accountability: The Challenge for Evaluation. The concept of social justice in the context of evaluation implies that evaluators can play a role in addressing those wicked problems that persist in society, such as violence, lack of access to quality education for all, poverty, substance abuse, and environmental pollution.

Lesson Learned: Wicked problems and social justice. Evaluators are concerned with and involved in contributing to the solution of wicked problems. They also recognize the importance of bringing a social justice lens to this work. Michael Harnar conducted a survey of 1,187 evaluators and reported that 69% (n=819) either strongly or somewhat agreed with this statement: "Evaluation should focus on bringing about social justice."

Rad Resources:
- Mertens, D. M. (2015). Mixed methods and wicked problems [Editorial]. Journal of Mixed Methods Research, 9, 3-6. Abstract: http://mmr.sagepub.com/content/9/1/3.extract
- Harnar, M. (2014). Developing criteria to identify transformative participatory evaluators. JMDE. http://journals.sfu.ca/jmde/index.php/jmde_1/article/view/383

Lesson Learned: A social justice lens leads to different evaluation questions. Evaluators who work with a social justice lens are concerned with the question of program effectiveness and answering the impact question: Did "it" work? They are also interested in asking other types of questions:
- Was "it" the right thing?
- Was "it" chosen and/or developed and implemented in culturally responsive ways?
- Were contextual issues of culture, race/ethnicity, gender, disability, deafness, religion, language, immigrant or refugee status, age, or other dimensions of diversity that are used as a basis for discrimination and oppression addressed?
- How were issues of power addressed?
- Do we want to continue to spend money on things that don’t work?

Rad Resource: The Native American Center for Excellence published Steps for Conducting Research and Evaluation in Native Communities, which provides a specific context in which a social justice lens is applied in evaluation.

Lessons Learned: Social justice criteria for evaluators. Evaluators who work with a social justice lens consider the following criteria to be indicators of the quality of the evaluation:
- Emphasizes human rights and social justice
- Analyzes asymmetric power relations
- Advocates culturally competent relations between the evaluator and community members
- Employs culturally appropriate mixed methods tied to social action
- Applies critical theory, queer theory, disability and deafness rights theories, feminist theory, critical race theory, and/or postcolonial and indigenous theories

Rad Resource: Reyes, J., Kelcey, J., & Diaz Varela, A. (2014). Transformative resilience guide: Gender, violence and education. Washington, DC: World Bank.

The American Evaluation Association is celebrating MESI Spring Training Week. The contributions all this week to aea365 come from evaluators who presented at or attended the Minnesota Evaluation Studies Institute Spring Training. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
AEA365 . Blog . Jul 27, 2015 11:53am
This is Jean King, professor of Evaluation Studies at the University of Minnesota and mother of the Minnesota Evaluation Studies Institute (MESI - pronounced "messy" because evaluation is that way). MESI began 20 years ago to provide high-quality evaluation training to all comers: evaluation practitioners, students, accidental evaluators, and program staff and administrators. We are fortunate to have had Minnesotans Michael Quinn Patton and Dick Krueger as regular MESI trainers from the beginning and, with funding from Professor Emerita Mary Corcoran, guest sessions from many of our field’s luminaries. Over the years MESI has taught me a great deal. This entry details three learnings.

Lesson Learned: Structured reflection is helpful during evaluation training. Experiential educators remind us that merely having an experience does not necessarily lead to change; reflection is the key to taking that experience and learning from it. At MESI plenaries, when the speaker finishes, we regularly build in time for people to "turn to a neighbor" (groups of 2 to 4, no larger) and talk about what they took as the main ideas and any confusions or questions they have. The reflection is easy to structure, and people engage actively. If appropriate, the facilitator can ask people to jot down their questions, which can become the basis of Q&A.

Hot Tip: I never ask an entire large group, "Are there any questions?" At the end of sessions in large conferences/training sessions, the facilitator/presenter will frequently ask the entire group if there are any questions. In these situations there is often an awkward pause, sometimes lasting long enough that people start glancing nervously at each other or at the door, and then someone who can’t stand the silence thinks of a question, raises a hand, and is instantly called on. Everyone breathes a sigh of relief. When I facilitate a session, I instead use the "turn to a neighbor" strategy (briefly - just a couple of minutes) so that everyone can start talking and generate potential questions. You can even call on people and ask what they were discussing in their small group.

Cool Trick: Create Top Ten lists as part of a meeting or training session. Since MESI’s inception, attendees have participated in an annual tongue-in-cheek Top Ten competition where they submit creative answers to a simile that describes how evaluation is like something else (e.g., the state fair, baseball, Obamacare). We provide prizes for the top three responses, and I am continually impressed with people’s cleverness. This year’s topic compared evaluation to interstellar space travel, and the final list is posted at www.evaluation.umn.edu. The Top Ten is a useful activity because it spurs creativity and helps a group come together around a common, low-key cause.

The American Evaluation Association is celebrating MESI Spring Training Week. The contributions all this week to aea365 come from evaluators who presented at or attended the Minnesota Evaluation Studies Institute Spring Training. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
AEA365 . Blog . Jul 27, 2015 11:52am
I am Ravan Charles, an evaluation newbie from Omaha, Nebraska. I’m writing about how my personal experience at the 2015 Minnesota Evaluation Studies Institute (MESI) Spring Training will influence the way I do evaluation for the rest of my life.

I have to admit, I was skeptical when I saw the conference theme - ‘Social Justice Amidst Standards and Accountability: The Challenge for Evaluation’ - with an emphasis on cultural competence. It is probably important to note that I am a black woman. I grew up in a world where cultural competence was for white people. Cultural competence meant the white lady facilitating my all-black tween girls’ group was able to code-switch fluently, or my sociology professor making sure to call on me whenever we talked about race in class (every day; it was a long semester). My perception was that cultural competency was something that white people were trained to be good at (by other white people).

When I walked into the conference, I instinctively scanned the room for other black and brown faces. I saw one… two… seven…. I lost count. Of the many trainings, conferences, and college classes that I have attended in my life, this was the very first where I felt that there was adequate people-of-color representation. I had never realized how important racial diversity is to me until my need for it was satisfied. Once it was, I was able to notice and more deeply appreciate other types of diversity in the room. Furthermore, I was able to become more actively engaged and take ownership of the training.

Lesson Learned: When I was able to see myself across the room, in the lunch line, and even at the podium, I no longer felt like a spectator. I realized that being culturally competent is about continuously learning from, sharing, and honoring our differences and using that knowledge to create things together. That is just as valuable to me as to anyone else. Seeing myself in the room allowed me to see myself as an evaluator, and as someone who wants to be a good one. I plan to use this experience to inform my future work. For me, being culturally competent will mean that I not only strive to interact effectively with clients; I will also work at ensuring that every individual who participates will be able to see themselves in every part of the work. My hope is that if every participant feels well represented throughout, they will feel the same sense of ownership and engagement I felt at MESI.

Rad Resources:
- Cultural Competence in Evaluation: An Overview by Saumitra SenGupta, Rodney Hopson, and Melva Thompson-Robinson
- Humberto Reynoso-Vallejo on Cultural Competence and Cultural Humility in Evaluation

The American Evaluation Association is celebrating MESI Spring Training Week. The contributions all this week to aea365 come from evaluators who presented at or attended the Minnesota Evaluation Studies Institute Spring Training. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
AEA365 . Blog . Jul 27, 2015 11:51am