Call me Daniel. "Some years ago—never mind how long precisely—having little or no money in my purse, and nothing particular to interest me on shore, I thought I would sail about a little and see the watery part of the world." (Herman Melville, 1851)

Rad Resource - Comprehensive Evaluation: a site where you can find a variety of information and resources on evaluation, public policy, and the evaluation of public policies. Despite the limited time that I have, I started posting in October 2014 with the intention of publishing at least one post a week.

Hot Tips — Favorite posts: Comprehensive Evaluation is bilingual: it is written in Spanish and English. I have only a few posts so far, but here are the ones I think are most interesting, along with links to sections of the blog that I believe may interest you. Evidence, Evaluation, and Effective Government (by Caroline Heider): an interesting opinion piece by Caroline Heider, Director General of the Independent Evaluation Group of the World Bank Group, originally published in The Diplomatic Courier. ENGAGE: Open Government Data: ENGAGE is a door that leads researchers to the world of Open Government Data. Using the ENGAGE platform, researchers and citizens can submit, acquire, search, and visualize diverse, distributed, and derived public sector datasets from all the countries of the European Union. Authors: links to relevant information on authors and researchers. Training: links to undergraduate and postgraduate degrees in the design, management, analysis, and evaluation of public policies around the world.

Lessons Learned — Why I blog: Comprehensive Evaluation grew out of the research for the final project of my degree in Public Administration and Management, "The evaluation of public policies in the Valencian Region: situation analysis and proposal of institutionalization": many materials collected, pages visited, and sources consulted. These data and information will be gradually incorporated into the site; I hope someone finds them useful.

Lessons Learned — What I’ve learned: In addition to managing Comprehensive Evaluation, I have another personal blog. I am studying for a master’s degree and also working at the Polytechnic University of Valencia, so I do not have much time for posting on my two blogs, but I try never to forget them… Posting is an escape, a way to get away from the daily routine and show the rest of the world who you are, what interests you, what worries you, and what you can offer that someone may be interested in receiving… It is also a great way to make friends; if not… what would I do here? Thank you very much!!

This winter, we’re continuing our occasional series highlighting evaluators who blog. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: WE Week: Nick Hart on the Value of Affiliate Connections with the Academic Community; Ayesha Tillman on Finding an Entry to Mid-Level Evaluation Position within a Government Agency; Susan Kistler on the Democratization of Data
Hello! My name is Chithra Adams. I work as an evaluator at the Human Development Institute, University of Kentucky. I blog about design thinking and evaluation.

Rad Resource - Evaluation and Design Thinking: The blog explores the application of design thinking principles to the practice of evaluation. Each post includes a review of design thinking and design literature and ends with a discussion of possible applications to evaluation. I usually post once every two weeks. I view the blog as a community learning space, and visitors are encouraged to provide their interpretation of the concepts discussed in the posts.

Hot Tips: Favorite posts: The blog is fairly new, so there are only a few posts. Most of the current posts focus on understanding the definition of design thinking. Here are two posts that will give you a sense of the blog: The Start of a Preoccupation: This post talks about how I got into design thinking and about the questions I had after reading through many definitions of design thinking. The post includes some great introductory resources on design thinking. Definition Deconstruction Design Sensibility Part 1 of 3: This post describes design sensibility and what it means to evaluators. The post includes an article by design consultants. It is a pretty easy read and gives a glimpse of how designers view problems.

Lessons Learned: Why I blog: There are a lot of web resources on design thinking. I found these resources quite helpful in getting me excited and interested in the concept. However, they were less helpful concerning how design thinking could be used in the practice of evaluation. At a cursory glance, design thinking can appear to be a strategy for making products more user-centered. As a discipline, evaluation is rich with theories and practices that encourage being user-centered (utilization-focused evaluation, empowerment evaluation, etc.). Evaluation has a strong tradition of implementing practices and developing products that are user-centered. So what does design thinking offer evaluation? Does design thinking provide any added value to evaluation? What does it look like to practice design thinking in evaluation? The blog is the record of my journey to answer these questions.

Lessons Learned: What I’ve learned: Blogging to me is like exercising! Once I get started, I have so much fun.

This winter, we’re continuing our occasional series highlighting evaluators who blog. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Bloggers Series: John Nash on Contributing to Multiple Blogs; Emily Lauer and Courtney Dutra on Person-Centered Evaluation: Aging and Disability Services; Bloggers Week: Cameron Norman on Censemaking
Hi there! I am Deborah Levy, Principal of SuccessLinks, LLC and the new Chair of the Independent Consulting TIG. I have been evaluating for 15 years and have been independent for eight years. I have published two posts since I started my blog "Why Not Blog?" after attending the annual conference in October. As you can see, my start has been slow, and the holidays didn’t help much. I started my blog because I wanted to insert my voice into the blogging world, specifically around evaluation and my work as an independent consultant. I really enjoy reading friends’ and colleagues’ blogs (evaluation-related and those that aren’t). They inspire me, and I really like that a blog helps you see a different side of people. I wanted to produce the same effect for other readers and present a part of myself that people who know me personally or professionally don’t know.

Favorite Post - (not many to choose from) My First Post, I Hope it’s a Good One: This post flowed just as I would want it to. It felt natural writing it, and it conveyed the excitement I was feeling as I put the words down. The response was positive and energized me to continue. It was a strong entry to the blogging world.

Lesson Learned - After my first two posts, I decided that a monthly blog was going to be more my speed. Many blogging experts suggest writing weekly or even multiple times a week, but for me that isn’t possible or even desired at this point. Everyone has a different writing schedule, and that is okay. Just because you don’t post once a week does not mean your blog isn’t worthy or that you should stop writing altogether. I have also learned that graphics go a long way. Some of my favorite blogs use photos, charts, or cartoons. It makes reading them more fun. Lastly, don’t decide that your blog is going to serve only one purpose (e.g., an evaluation blog), because you probably have many more stories to tell and thoughts to share that aren’t about that subject area. It would be a disservice to yourself and your readers not to write something you want to share because it doesn’t fit into the box you originally created for yourself.

This winter, we’re continuing our occasional series highlighting evaluators who blog. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Bloggers Week: Glenn O’Neil on Intelligent Measurement; Bloggers Week: Karen Anderson on On Top Of the Box Evaluation; Bloggers Series: Gail Barrington on the Barrington Research Group Blog
"Creativity is intelligence having fun", Albert Einstein. Greetings! I’m Sara Vaca, independent consultant (EvalQuality.com) and recently appointed Creative Advisor of this blog. To start contributing I thought of writing some posts about how creativity intertwines with evaluation. This is Part I of a two-part post. Lesson Learned: Evaluation is a rigorous, systematic transdiscipline. However, evaluators can (and already) use creativity to improve their practice in many different moments and levels. Here are many examples, just digging in our aea365’s archives: Hot Tips: 1. Advocating for evaluation Evaluation is not as well-known as it should be for many citizens and politicians. Many of us find ourselves exploring ways to make evaluation more attractive, interesting and remarkable, at least at our local environment. Examples: Kylie Hutchinson on "O Collective Impact," an Evaluator’s Carol Michael Quinn Patton on Using Children’s Stories to Open up Evaluation Dialogues 2. Making stakeholders engage A demonstrated key factor in an evaluation, our fellows have already encountered this potential problem and shared tips to overcome it: Alicia McCoy on Using Humor and Creativity to Engage Staff in Evaluation Marybeth Neal on Using a Wall to Engage Stakeholders Julie Poncelet, Catherine Borgman-Arboleda, and Jorge Arboleda on Using Participatory Video to Engage Youth in Evaluation in a Creative and Empowering Way Jeanne Hubelbank on Assessing Audience or Client Knowledge in a Sweet Way Jessica Foster on Maximizing Survey Response Rates 3. New ways of using data Evaluation has always relied on data, but other sectors are catching up. Now evaluators have realized that and we are learning new ways in dealing with and using data: Kimberly Kay Lopez on Getting Creative With the Data You Collect and Use for Evaluations! Patti Patrizi on Using Existing Data in New Ways Laura Pryor and Nichole Stewart on Data Science for Evaluators Cameron Norman on The Evaluator-as-Designer   We would love to hear how YOU are using creativity in your evaluation work. Please consider contributing your own aea365 post! (sara.vaca@EvalQuality.com) Look for Part II with more examples of aea365 posts on creativity and evaluation! And even more about creativity and evaluation coming your way soon! Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. Related posts: Sheila B Robinson on Introducing our Newest aea365 Team Members Innovative #Eval Week: Jean King and Laura Pejsa on What We Learned from "Learning by Doing" Cultural Competence Week: Dominica McBride on AEA 2013 and Spreading the Word on Cultural Competence
My name is Ann Zukoski, and I am a Senior Research Associate at Rainbow Research, Inc. in Minneapolis, Minnesota. Founded as a nonprofit organization in 1974, Rainbow Research has a mission to improve the effectiveness of socially-concerned organizations through capacity building, research, and evaluation. Projects range in scale from one-time program evaluations to multi-year, multi-site research studies, with designs that explicitly include participatory approaches intended to lead to program improvement. Through my work, I am always looking for creative ways to capture evaluation data. Here is one rad resource and a hot tip on a participatory tool to add to your toolbox.

Rad Resource: Participatory evaluation approaches are used extensively by international development organizations. This web page is a great resource for exploring different rapid appraisal methods that can be adapted to the US context: ELDIS - http://www.eldis.org/go/topics/resource-guides/participation/participatory-methodology#.UwwFaf1z8ds. ELDIS provides descriptions of and links to a variety of information sources on participatory evaluation approaches, including online documents, organizations’ websites, databases, library catalogues, bibliographies, email discussion lists, research project information, and map and newspaper collections. ELDIS is hosted by the Institute of Development Studies in Sussex, U.K.

Hot Tip: Evaluators are often asked to identify program impacts and measure key outcomes of community-based projects. Impact and outcome measures are often externally determined by the funder. Many times, however, collaborative projects lead to unanticipated outcomes that are seen as highly valuable by program participants but are overlooked by formal evaluation designs. One participatory technique, Most Significant Change (MSC), offers an alternative approach to address this issue and can be used to surface promising practices.

Most Significant Change Technique (MSC) - MSC is a participatory qualitative data collection process that uses stories to identify the impact of the program. This approach involves a series of steps in which stakeholders search for significant program outcomes and deliberate on the value of these outcomes in a systematic and transparent manner. Stakeholders are asked to write stories of what they see as "significant change" and then dialogue with others to select the stories of most importance. The goal of the process is to make explicit what stakeholders (program staff, program beneficiaries, and others) value as significant change. The process allows participants to gain a clearer understanding of what is and is not being achieved. It can be used for program improvement and identifying promising practices, as well as to uncover key outcomes by helping evaluators identify areas of change that warrant additional description and measurement. Where to go for more information: http://www.mande.co.uk/docs/MSCGuide.pdf. Have you used this tool? Let us all know your thoughts!

The American Evaluation Association is celebrating Best of aea365 week. The contributions all this week are reposts of great aea365 blogs from our earlier years. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org.
aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. Related posts: Ann Zukoski on Participatory Evaluation Approaches; Susan Kistler on Free Guides to Participatory Video and the Most Significant Change Technique; Veena Pankaj and Myia Welsh on Participatory Analysis: Expanding Stakeholder Involvement in Evaluation
My name is Lija Greenseid. I am a Senior Evaluator with Professional Data Analysts, Inc. in Minneapolis, MN. We conduct evaluations of stop-smoking programs. Smokers generally have lower education levels than the general population. Therefore, we want to make sure the materials we develop are understandable to smokers.

Rad Resource: Use a "readability calculator" to check the reading level of your written materials. I have used this with program registration forms, survey instruments, consent statements, and other materials. Not surprisingly, the first drafts of my materials are often written at a level only grad students (and evaluators) can understand. With a critical eye and a few tweaks, I can often rewrite my materials so that they are at an eighth-grade reading level, much more accessible to the people with whom I want to communicate. A good readability calculator can be found here: http://www.editcentral.com/gwt1/EditCentral.html. It provides you with both a reading ease score and a number of different measures of the US school grade level of the text (a sketch of the kind of formula behind these measures appears after this post). This blog posting is rated at a high-school reading level. Do you agree?

The American Evaluation Association is celebrating Best of aea365 week. The contributions all this week are reposts of great aea365 blogs from our earlier years. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Lija Greenseid on Using a Readability Calculator; DOVP Week: June Gothberg on Keeping it Simple and Intuitive; Susan Kistler on Welcoming the Data Visualization & Reporting TIG and DVR Resources
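For readers curious what such calculators compute: most combine average sentence length with average syllables per word. Below is a minimal, illustrative Python sketch of the standard Flesch-Kincaid grade-level formula with a naive syllable-counting heuristic. This is not the Edit Central tool's implementation; real calculators use more careful tokenization and syllable rules, so treat the exact scores as approximate.

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count vowel groups; a trailing 'e' is usually silent."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text: str) -> float:
    """US school grade level per the standard Flesch-Kincaid formula."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        raise ValueError("text must contain at least one sentence")
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)

sample = ("With a critical eye and a few tweaks I can often rewrite my "
          "materials so that they are at an eighth-grade reading level.")
print(f"Estimated grade level: {flesch_kincaid_grade(sample):.1f}")
```

Long sentences and polysyllabic words both push the score up, which is why trimming sentence length is usually the fastest way to lower the grade level of a draft.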
My name is Amy A. Germuth, and I am Founder and President of EvalWorks, LLC in Durham, NC, and I blog at EvalThoughts.com. Over the last year I have worked on improving my reporting of findings to better meet my clients’ needs, and I have a few great resources to help you do the same.

Rad Resource: "Unlearning Some of our Social Scientist Habits" by Jane Davidson (independent consultant and evaluator extraordinaire, as well as AEA member and TIG leader). She added some additional thoughts to this work and presented them at AEA’s 2009 annual conference in Orlando. Her PowerPoint slides for this presentation can be found at http://bit.ly/7RcDso. Frankly, I think this great article has been overlooked for its valuable contributions. Among other great advice for evaluators (including not using models or theories evaluatively and leaping to measurement too quickly), she addresses these common pitfalls in reporting evaluation findings: (1) not answering (and in some cases not even identifying!) the evaluation questions that guided the methodology, (2) reporting results separately by data type or source, and (3) ordering evaluation report sections like a Master’s thesis. This entertaining article and the additional PowerPoint slides really make a case for using the questions that guide the evaluation to guide the report as well.

2015 Update Rad Resource: Data visualization can help make reporting more accessible and visually captivating. There is a great post on "What is data visualization?" and many posts from other aea365 authors.

Rad Resource: Why assume all findings have to be reported as a paper? Try reporting using PowerPoint, and heed the advice Garr Reynolds provides in his great book "Presentation Zen Design" to ensure that you do not subject your clients to DBP (death by PowerPoint).

This post is a modified version of a previously published aea365 post in an occasional series, "Best of aea365." Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Amy Germuth on Reporting Findings; Amy Germuth on reports and checklists; p2i Week: Laura Beals on Applying p2i to Presentations at Work
My name is Summer N. Jackson, Project Director at Bay Area Blacks in Philanthropy, a regional nonprofit membership organization that focuses on advancing the interests of African Americans in philanthropy and addressing the impact of racial disparity within philanthropic institutions and African American communities in the San Francisco Bay Area. I had the opportunity to serve as a session scribe at Evaluation 2010, and one of the sessions I attended was session 304: Insights into Foundation Evaluation. I chose this session because I am interested in strategies to develop an organizational learning culture in a foundation.

Lessons Learned - Evaluation should be shared throughout the organization: Traditionally, programs are responsible for engaging in the activities for a given evaluation. In the Pan-Canadian system, new federal requirements dictate an organizational-level approach, which creates an institutional culture of learning rather than the top-down approach produced when programs hold the responsibility.

Lessons Learned - Create shared values around evaluation: When planning to introduce evaluation into an organization, one should try to create opportunities for discussion through facilitated workgroups rather than as a mandate. An approach that seeks to develop shared values around the benefit of inquiry will increase buy-in from participants.

Hot Tips - Implementing a New Tool:
- Start with a prototype
- Identify challenges and work to enhance the quality of data received year by year
- Complete an informal feasibility study to gradually introduce processes that are more rigorous
- Develop an actionable plan
- Work with senior management to increase buy-in and to provide directives to staff
- Consider using a facilitator to provide evaluation education and training
- Be explicit about organizational goals and try to help staff understand how their work fits into them

Hot Tip - Internal Champions, Open Doors, and Meet & Adjust: When implementing a new evaluative tool or framework in an organization, identify an internal champion who will help promote the tool. Maintain an open-door policy after the initial training and offer additional Technical Assistance (TA) opportunities that are intimate in nature. Lastly, schedule a quarterly or monthly meeting to review data and challenges and readjust when necessary. This will enhance trust and communication among program staff as well as enhance the quality of the data you receive in the end.

This post is a modified version of a previously published aea365 post in an occasional series, "Best of aea365." Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Scribing: Summer Jackson on Insights Into Foundation Evaluation; Jeff Sheldon on the Readiness for Organizational Learning and Evaluation instrument; OL-ECB Week: Jeff Sheldon on using ROLE to Determine an Organization’s Support of Evaluative Inquiry
My name is Judah Viola, and I am a Community Psychologist at National Louis University in Chicago. I also manage an independent consulting practice specializing in program evaluation and collaborative community research. Whether in a boom or bust economy, it is never a bad time to build your own evaluation skills and capacity to provide value for clients. Fortunately, there is a plethora of information available about the art and science of evaluation. However, there isn’t much published or taught in universities about how to build your evaluation consulting practice. Knowing what analyses are appropriate for a particular project and how to conduct them is necessary, but you also need to be able to translate statistics or methodology into language that all stakeholders can understand and use for decision making. In addition, you need to know a whole host of other things, such as how long the project will take and how much the evaluation is worth to the client. Thus, for the majority of evaluation consultants, academic training alone is not sufficient preparation, and real-world evaluation experience working under others is the preferred option.

Hot Tip: Start with your existing network. Who introduced you to the field of evaluation? Many faculty do evaluation projects and look for students or early-career evaluators to support their work. In addition, many independent consultants subcontract parts of their larger projects out to colleagues and are willing to supervise and train folks with little experience. If you don’t currently know many evaluation consultants, there are simple ways to build your network.

Hot Tip: Getting involved in local professional development organizations promotes learning, networking, and opportunities for referrals or collaborations. My involvement in the Chicagoland Evaluation Association (CEA), a local AEA affiliate, has enabled me to connect with evaluators with whom I have worked on evaluation projects, presented at conferences, and published book chapters.

Hot Tip: Contact and listen to experts on the phone, online, or at conferences. Initially, I thought that experienced independent evaluation consultants might be too busy or hesitant to talk to a newbie. However, I found that they (especially AEA members) are quite approachable and generous with their time and are quick to share lessons learned so you don’t have to make the same mistakes they did when getting started. I’ve gotten a host of great advice about everything from getting an accountant to coming up with a pricing strategy.

Rad Resources: Consulting and Evaluation with Nonprofit and Community Based Organizations by Judah Viola & Susan McMahon (2010); Independent Evaluation Consulting: New Directions for Evaluation, No. 111 (2006); Evalbusiness: The Independent Consulting TIG listserv.

This post is a modified version of a previously published aea365 post in an occasional series, "Best of aea365." Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Judah Viola on Building Capacity to Succeed as an Independent Consultant; Heidi Gegax on Creating a Consultants Collaborative; IC Week: Jan Upton on The Independent Consultant Life Cycle
My name is Käri Greene, and I’m a Senior Research Analyst at Program Design & Evaluation Services, an intergovernmental agency of the Oregon Public Health Division and Multnomah County Health Department, as well as a co-Chair of the LGBT Issues TIG. Hold on…what was that jumble of letters at the end of the sentence? Well, our TIG explores areas of sexuality, gender, and identity as they relate to evaluation theory, practice, and use, specifically focusing on issues related to lesbian, gay, bisexual, and transgender people. Many evaluations might not deal explicitly with LGBT issues; however, gender and sexuality are concepts present in much of our evaluation practice. Gender or ‘sex’ is a standard demographic variable collected in nearly all evaluation studies, and sexual orientation is being included more frequently in evaluations. But the concepts of sexual orientation, sexual behavior, and gender identity can be dynamic and complex. In public health evaluations, someone served by a program might identify as a lesbian woman, but she may have been born and raised as a boy and not identify as transgender. A man served at the local public health clinic might be having sex with other men, but not identify as gay or bisexual. Being clear about what we need to know about the clients served in our programs is essential to answering our evaluation questions.

2015 Update: The key thing to keep in mind when dealing with issues of sexuality and gender is to question assumptions and ask the right questions for your evaluation and those served by the program. Sexual orientation does not automatically define a person’s sexual behavior, and gender identity does not always fit neatly into a two-by-two table. Feeling even more confused about how to deal with gender and sexuality? That’s good - that means you’re questioning assumptions! But it can be frustrating. The field is evolving, and even after a century of research on sexuality and gender, few researchers agree on terminology, dimensions, and categorical classifications of sexuality. But fear not, we’ll have more to say on this subject throughout the week…

Hot Tip: Consider how you currently assess gender. It might be important to ask multiple items to get at gender - one that asks current gender identity ("Do you consider yourself to be male, female, transgender, or something else?") and one that asks birth gender ("What sex were you assigned at birth - male, female, or intersex?").

Hot Tip: Consider expanding your existing response categories for sexual identity. Younger clients might consider themselves "queer" as opposed to the more traditional categories of lesbian, gay, or bisexual.

2015 Updates
Rad Resource: The Williams Institute’s "Best Practices for Asking Questions to Identify Transgender & Other Gender Minority Respondents on Population-Based Surveys."
Rad Resource: The Gay & Lesbian Alliance Against Defamation (GLAAD) has resources, including a media reference guide, that can be helpful when communicating and reporting about issues of sexuality and gender.
Rad Resource: "Do Ask, Do Tell," an article by Cahill et al. on the acceptability of asking patients about sexual orientation and gender identity in clinical settings.

This post is a modified version of a previously published aea365 post in an occasional series, "Best of aea365." Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice.
Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. Related posts: LGBT Week: Käri Greene on Issues of Gender and Sexuality in Evaluation; LGBT Week: Joseph Kosciw on Making Schools Safe and Affirming Places to Learn for LGBT Students; LGBT Week: Denice Cassaro on Using Open-Ended Demographic Categories to Learn About Race, Ethnicity, Sex, and Gender
Hi there, Liz Zadnik here, new(ish) member of the aea365 curating team and sometimes Saturday poster. Last year Sheila posed the question: What is it that YOU would like to read about on this blog? One of the responses resonated with me, as it represented my relationship with evaluation as a professional: "I would love to see a post, or series of posts, about evaluation from the perspective of practitioners for whom their primary job is not evaluation. Perhaps tips on how to best integrate evaluation into the myriad of other, seemingly more pressing, tasks without pushing it to the back burner." I work in the anti-sexual violence movement at a state coalition, focusing on prevention strategies, training, and making community-based rape crisis centers accessible to people with disabilities. These three areas are my priorities - there are deliverables and activities that don’t always include evaluation and assessment. Many times - given my love of evaluation - I am the sole voice at the table asking about an evaluation plan. Most of the time we can weave evaluation in from the ground floor; other times it happens a little late(r).

Hot Tip: Ask this (or a similar) question: "How will we know we’ve been successful?" This is the most effective way I have found to help get people thinking about evaluation. It has started some of the most engaging and enlightening conversations I’ve ever had, both about a project and about the work of the movement.

Lesson Learned: Sometimes, evaluation takes a backseat to program implementation and grant deliverables. This can be disappointing (to say the least), but I do see a change. Funders are more frequently asking for research, "evidence," or assessment findings, giving evaluation enthusiasts (like myself) opportunities to engage our colleagues in this work.

Lesson Learned: Practice and challenge yourself, even if no one is ever going to see it. One of the ways I "integrate evaluation into the myriad of other, seemingly more pressing, tasks" is by evaluating myself and my own performance. I regularly incorporate evaluative questions into training feedback forms, look for ways to assess the effectiveness of my technical assistance provision, and record my professional progress throughout the year. I sit in on as many AEA Coffee Break webinars and other learning opportunities as I can, always practicing the skills discussed and looking for ways to apply them to my work.

I would so appreciate hearing from other practitioners (and evaluators!) about their experiences infusing evaluation into their work. I’d also be happy to answer any questions you might have or write about specific projects in the future. Let me know - the aea365 team is here to please!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Sheila B Robinson on A Call for Blog Posts!; Jara Dean-Coffey on Visual Facilitation and Graphic Recorders; IC Week: Norma Martinez-Rubin on Working Solo vs. Subcontracting
Hi. My name is Lacy Fabian, and I’m a healthcare quality researcher with MITRE in Woodlawn, MD. In addition to my work in healthcare quality, I am a preceptor for advanced pharmacy students. As a board member of the Eastern Evaluation Research Society (EERS), my local AEA affiliate, it’s my pleasure to welcome you to EERS week on aea365. This week, some of our members will be sharing lessons learned, hot tips, and rad resources in the evaluation field. As tends to be the case for many, I came to evaluation via an indirect path, and professional societies like EERS and AEA have been mainstays in my continuing professional development, as well as outlets for giving back to the field. This has been particularly true in learning how to talk about evaluation with others, from colleagues to clients. Answering "What is evaluation?" continues to be a challenge, as many people are familiar with only a small branch of evaluation, if any at all.

Lesson Learned: In the last few years the dialogue around evaluation has improved. People are increasingly aware of the presence of evaluation in their daily lives, from education to healthcare to spending on government programs. That type of shift doesn’t occur only at the programmatic and policy level, but also through personal interactions. In meetings or workgroups there are often opportunities for an evaluator to employ their methods to solve challenges at hand—think process maps or logic models.

Lesson Learned: Interacting with the field via a niche or sweet spot that you’ve identified helps connect evaluation to new areas and builds understanding of what it can offer. For example, you may present your evaluation findings at subject matter conferences, mentor students, give a guest lecture on the importance of evaluation to a particular discipline, or volunteer your expertise to a community effort that resonates with you.

Hot Tip: Take advantage of daily evaluation resources, whether through AEA blogs, journal article alerts, or LinkedIn groups, to see the field as a whole and to notice when shifts or challenges are occurring.

Rad Resource: Registration is open for the 38th Annual EERS Conference, which will be held April 26-28, 2015. The theme is "Let’s Get Real: Evaluation Challenges and Solutions," and we hope to see you there to exchange ideas about your evaluation lessons learned.

The American Evaluation Association is celebrating Eastern Evaluation Research Society (EERS) Affiliate Week. The contributions all this week to aea365 come from EERS members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Leigh M. Tolley on Student Involvement in Local Affiliates; Susan Kistler on Local Affiliates and Managing Professional Obligations; MNEA Week: Danielle Hegseth on A New Kind of Education at the AEA Conference
Hi! We are Jill Feldman and John Kelley, board members of the Eastern Evaluation Research Society (EERS), AEA’s oldest affiliate, writing to tell you about the Eleanor Chelimsky Forum, a hallmark of our annual EERS conference. Eleanor Chelimsky is one of the most insightful, influential, and respected evaluators of our era. At the 2012 EERS Conference, Eleanor proposed the idea of a forum to spark debate, using common practitioner experiences that challenge evaluation theory as a way to improve both theory and practice. When Eleanor speaks, EERS and the Robert Wood Johnson Foundation (RWJF) listen! The inaugural Eleanor Chelimsky Forum was held in 2013. The Forums are funded through a generous grant from RWJF.

Lesson Learned: Attracting high-caliber speakers to the Forum was a pleasure, not a problem. Michael Quinn Patton and Tom Schwandt, Deb Rog, and Abe Wandersman readily agreed to kick off the EERS conferences in 2013, 2014, and 2015, respectively, by serving as featured Chelimsky Forum speakers.

Rad Resource: Manuscripts written by Chelimsky Forum speakers are published each year in the American Journal of Evaluation (AJE). The 2013 inaugural articles by Michael Quinn Patton and Tom Schwandt and the discussant response by Laura Leviton of RWJF are available on AJE’s website. Stay tuned for Deb Rog’s soon-to-be-published article based on her 2014 address.

Rad Resources: Posting videos of the Chelimsky Forum on the EERS YouTube channel helped spread the word, as did joining forces with AEA to create a webinar with Deb Rog that reached evaluators across the globe.

Rad Resource: Read Eleanor Chelimsky’s 2013 article, "Balancing Evaluation Theory and Practice in the Real World."

Hot Tip: Don’t miss the 2015 Chelimsky Forum, when Abe Wandersman delivers his plenary address, "Achieving Outcomes in a Specific Project: Why Evidence Based Interventions are not Sufficient and Why Evaluation is Key," with remarks by discussant Mary Dixon Woods. The 2015 EERS conference will be held April 26-28 in Galloway, NJ - just outside of Atlantic City. Watch for the YouTube video of the Chelimsky Forum to be posted shortly thereafter.

The American Evaluation Association is celebrating Eastern Evaluation Research Society (EERS) Affiliate Week. The contributions all this week to aea365 come from EERS members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Susan Kistler on the AEA Thought Leaders Forum; Leigh M. Tolley on Student Involvement in Local Affiliates; EERS Week: Ann K. Emery on Overhauling Your Organization’s Data Visualizations? Three Edits Guaranteed to Give You the Biggest Bang for Your Buck
By Christine Schaefer

Who are the folks who judge applications for the Malcolm Baldrige National Quality Award? In an ongoing blog series, we have been interviewing members of the 2015 Judges’ Panel of the Malcolm Baldrige National Quality Award. In the interviews, they share their insights and perspectives on the award process, on their experiences, and on the Baldrige framework and approach to organizational improvement. Following is an interview with Fonda Vera, a second-year judge. Vera is executive dean of planning, research, effectiveness, and development at Richland College, the first and only community college to date to receive a Baldrige Award (in 2005).

What experiences led you to the role of Baldrige judge?
I began my involvement with the Baldrige Program as an applicant in 1999. I was one of the writers of [Richland College’s] numerous state and national applications. Just writing the application and studying the Criteria from that point of view was a wonderful learning experience, not only in learning about the Criteria but also about my own organization. After we were [named] a Baldrige Award recipient in 2005, I became a national examiner. I was fortunate enough to go on a site visit the very first year and also for the next two years in a row. In total, I have participated in four site visits, and I have led one. It was an incredible honor to be chosen to sit on the panel of judges.

You have a great deal of experience in the education sector. How do you see the Baldrige Excellence Framework as valuable to education organizations?
The Baldrige Excellence Framework is invaluable to education. The education sector is buffeted about by ongoing state and national legislative changes, calls for accountability demanding more performance with less funding, and students who are less and less prepared—for a multitude of reasons—for success in education and ultimately in life. The Baldrige Excellence Framework is the rudder for the education sector in that stormy sea. It empowers organizations to address their reason for being by maintaining focus and discipline to achieve student success.

How do you apply Baldrige principles/concepts to your current work experience/employer?
Richland College has used Baldrige principles and the framework since 1999. It quickly became the way we do business. We have been fortunate to have consistency in leadership, which I believe is key. The discipline of following the Criteria is not always easy, but it is hard to argue with the good results it produces. Our college’s vision of being the best place we can be to learn, teach, and build sustainable local and world community fits perfectly with the Baldrige core values and concepts.

As a judge, what are your hopes for the judging process? In other words, as a judge what would you like to tell applicants and potential Baldrige Award applicants about the rigor of the process?
As a current judge, a previous applicant, and a previous award recipient, I would say to potential applicants that following the Baldrige Criteria and writing an application is one of the best things you can do for your organization. It will cause you to examine what you do, why you do it, and for whom you do it in a way that you’ve never done before. This Socratic-style deep dive into your organization’s business will reveal opportunities to improve and strengths you may never have known existed.
The challenge and the rigor for each organization are to have the discipline to take what is learned, act on it, and invest in that discipline for the long haul. The rewards are many if you follow this path.

What encouragement/advice would you give Baldrige examiners who are reviewing award applications now?
I encourage all Baldrige examiners who are reviewing award applications to do their absolute best to understand the applicant they are reviewing and to make comments that are relevant and will move the applicant to the next level of maturity. This is not only important to the applicant but also vital to the judging process. While the examining process is certainly very hard work, take the time to enjoy the intellectual challenge and the camaraderie that accompanies it. It is indeed a unique experience.

See other blogs on the 2015 Judges’ Panel: Laura Huston, Dr. Ken Davis, Michael Dockery, Miriam N. Kmetzo, Dr. Sharon L. Muret-Wagstaff, Dr. Mike R. Sather, Ken Schiller, Dr. Sunil K. Sinha, Dr. John C. Timmerman, and Roger M. Triplett. Greg Gibson, a candidate for the 2015 panel pending appointment, will also be interviewed for this series.
Greetings, I’m Ann K. Emery. I consult, instruct, and write on all things data visualization. One of the most common questions I receive during workshops and webinars is, "Ann, where do I start?"

Cool Tricks: As you’re overhauling your visualizations, these three edits are guaranteed to give you the biggest bang for your buck (a code sketch illustrating all three appears after this post).

Remove unnecessary ink. I immediately begin deleting or lightening everything without a purpose: the border, the grid lines, and the tick marks. Visualization guru Edward Tufte calls this strategy the data-ink ratio; we’re intentionally removing any ink that isn’t directly related to the data itself. These edits ensure our viewers will focus attention where we need it: on the actual patterns, not on the software program’s outdated and clunky lines. Read Muted Grid Lines: Small Details, Big Difference to explore before/after remakes in more detail, and watch Removing Tick Marks and Grid Lines for a how-to lesson in Excel.

Customize your color palette. Next, I swap my software program’s random color scheme for a customized palette. As a consultant, I’m typically following my client’s branding. I scroll through the organization’s website, look at their logo, and skim publicly available reports that were created with the aid of a graphic designer. In my former role as an internal evaluator, I would match my chart colors to my organization’s own logo and branding. When I create graphs through my role with AEA’s Data Visualization and Reporting Topical Interest Group, I match AEA’s exact shade of burgundy—RGB code 149:8:4—rather than sloppily choosing any old shade of red. Editing color codes is simple. Newer versions of Excel on both PCs and Macs have built-in eyedropper tools. If you’re using an older version of Excel, follow the Uganda Evaluation Capacity Development Project’s step-by-step instructions for using a free tool called Instant Eyedropper.

Write a descriptive title and subtitle. Today’s viewers want and deserve brevity, everyday language, and text that describes something about the actual finding—so that even the quickest report-skimmers will walk away having digested and retained the report’s contents. Bonus points: Select an important word or two from the title and make that word stand out (in the example graph, "chocolate" is in bold text and matches the graph’s dark brown color scheme). Then, add a one- or two-sentence subtitle ("Cookie dough was second most popular flavor").

Rad Resource: Want to master these skills and more? I’m leading a pre-conference workshop at the Eastern Evaluation Research Society’s conference in April 2015. Bring your laptop so we can build these charts and more from scratch. See you there!

The American Evaluation Association is celebrating Eastern Evaluation Research Society (EERS) Affiliate Week. The contributions all this week to aea365 come from EERS members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Lyn Paleo on Graphic-Based Reports and Graphics for Color-Impaired Readers; DVR TIG Week: Ann K. Emery and Stephanie Evergreen on the Data Visualization Checklist; Manny Straehle on the ICA Data Visualization Competition
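The post demonstrates these edits in Excel; as an illustrative companion, here is a minimal Python/matplotlib sketch applying the same three edits. The flavor data are invented for the example; only the burgundy RGB code 149:8:4 comes from the post.

```python
import matplotlib.pyplot as plt

# Illustrative data only: favorite ice cream flavors (invented numbers)
flavors = ["Strawberry", "Vanilla", "Cookie dough", "Chocolate"]
votes = [12, 18, 35, 42]
AEA_BURGUNDY = (149 / 255, 8 / 255, 4 / 255)  # RGB 149:8:4, per the post

fig, ax = plt.subplots(figsize=(6, 3))

# Edit 2: custom palette -- muted gray bars, brand color on the key finding
colors = ["lightgray"] * (len(flavors) - 1) + [AEA_BURGUNDY]
ax.barh(flavors, votes, color=colors)

# Edit 1: remove unnecessary ink (border, grid lines, tick marks)
for spine in ax.spines.values():
    spine.set_visible(False)
ax.tick_params(length=0)
ax.grid(False)

# Edit 3: descriptive title stating the finding, plus a short subtitle
fig.suptitle("Chocolate was the most popular flavor",
             x=0.01, ha="left", fontweight="bold")
ax.set_title("Cookie dough was second most popular flavor",
             loc="left", fontsize=9, color="gray")

fig.tight_layout()
plt.show()
```

The same logic carries over to Excel: gray out everything, then spend your one strong color and your title text on the single finding you want viewers to retain.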
Kirk Knestis here, CEO of Hezel Associates and EERS Communication Chair. My colleagues are very good at researching education innovations and evaluating programs. We serve as RESEARCH PARTNER or EXTERNAL EVALUATOR for research and development (R&D) projects across a variety of programs, many funded by agencies like the National Science Foundation.

Lesson Learned - I emphasize the distinction between the above roles very purposefully. The first studies an innovation being developed, to inform iterative design and assess its promise of impact; the second examines the implementation and results of those R&D efforts. Both require collection and analysis of data, but it’s easy to get them tangled up where they meet when planning a research project or writing a proposal. We’ve come to understand that the simplest way to keep things straight is often to work backward, first asking how best to evaluate R&D activities, then designing the research and development processes in a separate discussion. We frankly need to use more robust methods than the "panel review" evaluation approaches to which we’ve typically defaulted.

Hot Tip - If one party is implementing an R&D project consistent with the precepts of the Common Guidelines for Education Research and Development, and another is charged with program evaluation for that effort, make sure that everyone involved is on the same page regarding the Purpose, Justification Guidelines, and Guidelines for Evidence to be Produced detailed in the Guidelines for the type of research being designed. These attributes define the quality of the R&D, so they should guide evaluation of implementation and results.

Hot Tip - Equally, you may largely ignore the Guidelines for External Feedback Plans in that document. That list of possible structures for organizing evaluation activities provides little useful guidance beyond raising the possibility of peer review. Unfortunately, that bears practically only on published reports of Efficacy, Effectiveness, and Scale-up Research (per the Guidelines), so it’s not an answer for many—perhaps most—R&D projects.

Rad Resource - At least, I hope it’s rad. Hezel Associates has developed two conceptual models for evaluating education R&D projects—one adapted from ideas shared by Means and Harris at the 2013 AERA conference, and a second created from scratch by Hezel Associates. The latter is tightly aligned to the purpose, justification, and evidence guidelines for Design and Development Research (Type #3), where most of our partners’ projects are situated. The document linked to above is tailored to NSF Advanced Technological Education audiences, but the frameworks should be broadly applicable. If interested, take a look. If they’re not rad, or if you have ideas to make them more rad, please share them. Better yet, come to the 2015 EERS conference so we can talk more in person.

The American Evaluation Association is celebrating Eastern Evaluation Research Society (EERS) Affiliate Week. The contributions all this week to aea365 come from EERS members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Related posts: Kirk Knestis on How Evaluators can Help Clients with Proposals in an Innovation Research and Development (R&D) Paradigm; Leigh M. Tolley on Student Involvement in Local Affiliates; Kirk Knestis on Innovation Research and Development (R&D) vs. Program Evaluation
Hello, my name is Susan Jenkins, and I am Treasurer of EERS. I have evaluated several Federal Tribal Grants Programs and have learned to redesign assessment, analysis, strategies, and solutions to be appropriate for tribal governments and/or tribal programs. While American Indians and Alaska Natives are citizens of the United States, they also maintain separate and distinct citizenship, cultural values, traditions, beliefs, and identity, which provide for modes of thought and communication that may differ from those of other groups.

Lesson Learned: You do not have to be an expert. Learning what you can about the group you will be working with and being humble goes a long way.

Lesson Learned: Allocate enough time. Tribal traditions often require that tribal leaders deliberate extensively and consider the long-term consequences of their decisions.

Lesson Learned: Tribal members may speak English as a second language, and some concepts are not easily translated. Being sensitive and seeking clarification in a patient and respectful manner can bridge gaps in cross-cultural communication.

Hot Tip: Some ways to demonstrate respect include:
- Be willing to admit limited knowledge of tribal culture, and invite tribal members to educate you about specific cultural protocols. When in doubt about something, ask respectfully for guidance.
- Understand that certain objects, such as feathers and beadwork, may be sacred and should not be touched or discussed.
- Listen and observe more than you speak, and be comfortable with silences or long pauses in conversation. In tribal communities, any interruption is considered highly disrespectful and may undermine your credibility.
- Understand that Native Americans may convey truths or difficult messages through humor or by telling stories.
- Pointing your finger is interpreted as rude behavior in many tribes.
- Respect personal space, and do not take photographs without permission.

Rad Resource: On a recommendation from the head of my agency’s Tribal Grants Program, I took the training "Working Effectively with Tribal Governments," which provided basic skills and knowledge for working more effectively with tribal governments. I increased my understanding and awareness of tribal issues and concerns, as well as of important legal, historical, and cultural factors that should inform work with Tribal programs.

Rad Resource: Medicine Wheel Evaluation Framework. This guide introduces the ‘Medicine Wheel’, outlining its history and uses, and shows how it can be used as an evaluation framework. I used this guide to develop a graphic showing proposed individual-level outcomes of the Federal Tribal Grant Program.

Rad Resource: A list of citations obtained from public sources and recommended by National Indian Education Association (NIEA) staff and partners. Over 25 citations/abstracts and, where available, links to full text are provided.

The American Evaluation Association is celebrating Eastern Evaluation Research Society (EERS) Affiliate Week. The contributions all this week to aea365 come from EERS members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: EERS Week: Lacy Fabian on an Age Old Evaluation Challenge; Leigh M. Tolley on Student Involvement in Local Affiliates; EERS Week: Jill Feldman and John Kelley on the Eleanor Chelimsky Forum as a Learning Opportunity
AEA365 . Blog . Jul 28, 2015 06:05am
Hello! I'm Tony Fujs, Director of Evaluation at the Latin American Youth Center, a DC-based non-profit organization. When asked about the role of my department, I often respond that it aims to be "the GPS unit of the organization," showing decision-makers whether or not the organization is on track toward achieving its goals. I like the GPS analogy because it is simple and easy to understand. It also provides an interesting framework for thinking about performance measurement systems and how to improve them.

[Two images contrast nonprofit performance measurement systems as they are ("where we are") with where they could be ("where we want to be"). Image credits: Biblioteca de la Facultad de Derecho y Ciencias del Trabajo; Tony Fujs.]

Let's look at a few concrete examples to illustrate this point.

Lessons Learned: On data collection. Recognize that data collection is always a burden: collect only what is needed. Eliminate manual data entry whenever possible; where it cannot be eliminated, make the user interface as intuitive as possible.

On data processing. Automate, automate, automate: internal evaluators generally work with different instances of the same data sets, so data cleaning and other analytical tasks can easily be automated. Use programming tools like Excel macros or R (see the sketch at the end of this post). Modern databases can also be customized to automatically "catch" data entry errors.

Hot Tip: Want to learn more about efficient data processing? I'll be running a workshop on data management at the next EERS conference.

On providing actionable information. Make sure the information generated by the performance measurement system is useful and understandable for the end user. Make evaluation results hard to ignore: for instance, they could be displayed on a giant TV screen in the hall of the organization's building, so nobody can enter without seeing them.

Simple is beautiful. Building a culture of data is often cited as a critical step in generating buy-in for performance measurement systems. It is indeed a critical step, but partly because performance measurement systems are often perceived as complex and cumbersome by the end user. Drivers adopted the GPS because it is useful and easy to use, not because they developed a culture of data. Building useful, simple, and intuitive performance measurement systems can also be a powerful and sustainable strategy for generating buy-in.

The American Evaluation Association is celebrating Eastern Evaluation Research Society (EERS) Affiliate Week. The contributions all this week to aea365 come from EERS members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: GOVT Week: Ted Kniker on Performance Measurement and Evaluation; GOVT Week: David Bernstein on Top 10 Indicators of Performance Measurement Quality; EERS Week: Jill Feldman and John Kelley on the Eleanor Chelimsky Forum as a Learning Opportunity
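To make the "automate your data cleaning" lesson concrete, here is a minimal base-R sketch of the kind of scripted step the post describes. The file name and column names ("age", "enroll_date") are hypothetical, not from the original post; the point is that one scripted function can be re-run unchanged on every new export of the same data set.

```r
# Hypothetical example: a repeatable cleaning step for a recurring data export.
clean_enrollments <- function(path) {
  df <- read.csv(path, stringsAsFactors = FALSE)

  # "Catch" data entry errors rather than fixing them silently:
  # implausible ages are flagged as missing.
  bad_age <- !is.na(df$age) & (df$age < 0 | df$age > 120)
  df$age[bad_age] <- NA

  # Standardize dates so every export parses the same way.
  df$enroll_date <- as.Date(df$enroll_date, format = "%Y-%m-%d")

  # Report what was caught, so problems surface early.
  message(sum(bad_age), " implausible age value(s) set to NA")
  df
}

# Re-run the identical step on each new instance of the data set:
# enrollments <- clean_enrollments("enrollments_july.csv")
```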
AEA365 . Blog . Jul 28, 2015 06:04am
The Affordable Care Act (ACA) has now survived two Supreme Court lawsuits, and odds are the ACA will continue to face legislative efforts to fully repeal the law through next year's presidential election. In the meantime, employers and employees are revving their engines and gearing up for the anticipated 40% excise tax, also known as the "Cadillac tax." Starting in 2018, a provision of the ACA will impose a tax on employers whose health care plans exceed certain thresholds....
SHRM . Blog . Jul 28, 2015 05:59am
Hello, my name is Jayne Corso and I am the community manager for the American Evaluation Association and the voice behind the AEA Facebook page. Facebook can be a great tool for creating a community and promoting your organization to a larger social audience. However, with organic reach dropping on Facebook, only about 6 percent of your fan base now sees your content, so it is difficult to reach your community without paying for it. I have compiled a few tips that can help increase that reach without spending additional dollars.

Hot Tip: Solve a problem. Most people use the Internet to be entertained or to solve a problem. Evaluators use resources every day to solve problems, so share those solutions with your community. And if you can add a little bit of humor, go for it! Quick one- or two-sentence posts work best for this social channel.

Hot Tip: Experiment! Play with posting types and times. Keep your community on their toes. Post a variety of content, such as evaluation news, tips and tricks, or fun facts. You can also interject trending hashtags such as #tbt (Throwback Thursday) or #FunFactFriday to increase the reach of your post. Try different posting times: see what kind of engagement you get by posting at 7:30 a.m. or 8:00 p.m., and look outside of typical business hours for optimal engagement times. Include some gaps in your posting. Typically we are told to post to social media channels every day, but see what happens if you leave a two- or three-day gap; your community does not necessarily want to see content every day. Identify trends that your community is interested in: if you see that your data visualization posts do well, optimize accordingly. This increases relevance and, in turn, engagement.

Hot Tip: Engage with other evaluation communities on Facebook. Share stories, photos, and posts from other evaluators or evaluation businesses on Facebook. This creates diversity in your postings and connects you to the online evaluation community, where you can interact and meet new evaluators. Tag other organizations in your posts! If you are referencing an organization, check to see whether it has a Facebook page. If it does, tag it by including @[organization name]. Your post will then link to the organization's page, which helps create more visibility in the news feeds of the fans who have liked the tagged pages.

Don't leave Facebook just because organic reach is slipping; try these tips to increase your engagement!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Dan McDonnell on Evaluating your Social Media Program; Dan McDonnell on Some Changes to Facebook; Emily Warn on Using Online Tools to Measure Outcomes for Not-For-Profit Organizations
AEA365 . Blog . Jul 28, 2015 05:34am
Hello! I am Kathy Bolland, and like many of you, I have many professional hats. I am an administrator and faculty member in a school of social work, where my research focuses on adolescents living in poverty and on assessment in higher education. I am a past AEA treasurer, past chair of the Teaching of Evaluation Topical Interest Group (fondly known as TIG: TOE), and current co-chair of the Social Work TIG. Last March, the Social Work TIG provided a series of aea365 blogposts during Social Work Week. This year, we do it again, although a bit earlier. Some of our blogposts extend topics introduced last year, and some are new. In both years' blogposts we focus on how social work perspectives and methods can be used in evaluation and how evaluation can be used in social services (e.g., http://aea365.org/blog/sw-tig-week-katrina-brewsaugh-on-why-you-want-a-social-worker-on-your-evaluation-team). We also talk(ed) a bit about how to engage non-evaluators in evaluation and how to help them learn about evaluation (e.g., http://aea365.org/blog/sw-tig-week-carl-brun-on-teaching-evaluation-to-social-workhuman-service-students). Today's blogpost provides an introduction to this week of blogposts. The first and last lessons learned focus on transdisciplines; the remaining lessons learned relate to last year's or this year's blogposts, most of them relevant to the transdisciplinary nature of evaluation and of social services. A link to all of last year's posts is provided as a Rad Resource.

Lesson Learned: Both evaluation and social services perspectives and methods can be applied in both disciplines, as well as in others. Scriven (2008) [The Concept of a Discipline: And of Evaluation as a Transdiscipline] and Riverda (2001) [Multidisciplinary and Transdisciplinary Approach in Social Work Education and its Implications] have discussed how evaluation and social service disciplines can thus be characterized as transdisciplines.

Lesson Learned: Evaluation and social service professions share many guiding principles.

Lesson Learned: Evaluation perspectives and methods can help social service professionals identify evidence-based practices and implement evidence-based practice.

Lesson Learned: Both evaluators and social service professionals are invested in cultural competence and are still learning about it.

Lesson Learned: Single-system designs, often taught as part of "evaluating practice," are a way to help social service professionals embrace the idea of evaluation.

Lesson Learned: Evaluating the degree to which program goals have been met is not the only way to evaluate a program.

Lesson Learned: Evaluators can use their knowledge and skills to help their higher education colleagues in professional schools and in arts and sciences with assessment tasks useful for program improvement as well as for accreditation.

Lesson Learned: Evaluators and social service professionals can serve on multi-disciplinary or inter-disciplinary teams, they can work in other ways with colleagues from different disciplines, and they can also be transdisciplinary, using perspectives and methods from their primary disciplines to strengthen their work in other disciplines.

Rad Resource: aea365 blogposts sponsored by the Social Work TIG last year: http://aea365.org/blog/category/social-work/

The American Evaluation Association is celebrating SW TIG Week with our colleagues in the Social Work Topical Interest Group. The contributions all this week to aea365 come from our SW TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Gerri Spilka on the Robert Wood Johnson Foundation Evaluation Fellowship Program; SIOP Week: Stephen Axelrad on I-O Psychology and Evaluation; APC Week: Rhonda Schlangen on Joint Evaluation Strategies for Advocacy and Services
AEA365 . Blog . Jul 28, 2015 05:34am
I'm Tracy Wharton, Assistant Professor in the College of Health and Public Affairs at the University of Central Florida. Having been a practitioner, a program coordinator, a program evaluator, and now a faculty member, I am working on bringing relevant connections between practice and evaluation into the classroom. Evidence-Based Practice (EBP) is the process of asking a good practice-relevant question, searching for the best available evidence, determining how the available information applies to your client(s), and evaluating the results of the intervention you and your client selected. As social work continues to expand implementation of EBP across practice domains, the imperative for rigorous evaluation of what we do and how we do it becomes even more important. Two broad things are necessary for EBP to succeed: practitioners need to embrace and implement the use of evidence, and they need to embrace and implement the EBP process itself.

Lesson Learned: Help engage people in understanding "the big picture." Imagine your state senator asking you to explain why your program should be given expanded funding over a program in the next county; what would you say? In today's political climate, funding for our programs often depends on our ability to demonstrate value. While it is appealing to leave evaluation to "the experts" and focus on our corner of the practice field and our day-to-day work, program directors often find themselves faced with demands for outcomes data, return on investment, and the cost-benefit of the ways in which we serve our various populations. Like it or not, policy and public awareness often drive funding allocations and research priorities, which in turn help drive public perception of "what is important." Even as we strive to support and empower our clients, our paychecks depend on the survival of our programs! To do what we do, we need funding, and to get funding, we need data.

Lesson Learned: "Evidence" can mean many things, as long as it is collected with an eye on validity and rigor.

Hot Tip: Remember "What? So what? Now what?" When teaching program evaluation to students in professional programs, use real clinical examples or applied experiences from internships. For example, working through a logic model for a familiar practice setting can help bring the process to life and create a link to evidence-based practice. The key to getting professional students excited about evaluation is to make it RELEVANT.

Rad Resource: The Point K Learning Center has a Logic Model Builder workbook, along with dozens of other evaluation resources.

The American Evaluation Association is celebrating SW TIG Week with our colleagues in the Social Work Topical Interest Group. The contributions all this week to aea365 come from our SW TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Susan Kistler on Funding Opportunities for Evaluators; EEE Week: Nancy Franz on Public Value Story Telling; CP TIG Week: Helen Singer, Sally Thigpen, and Natalie Wilkins on Understanding Evidence: CDC's Interactive Tool to Support Evidence-Based Decision Making
AEA365 . Blog . Jul 28, 2015 05:33am
Hi, my name is Javonda Williams. I am the BSW Program Chair and an Assistant Professor at The University of Alabama School of Social Work. I have also worked as a clinical social worker for 12 years. My practice experience centers on trauma and resilience in children and adolescents. One "aha" moment came for me when an outspoken 12-year-old girl I was working with asked, "How will I know when I am better?"

Hot Tip: Just do it!! Single-system research designs are among the simplest and most cost-effective forms of evaluation. Remember, the primary intent of single-system designs is to examine the effect of an intervention on a client (or a single group of clients) over time. Get started in three easy steps: 1) clearly identify the behavior you expect to change, 2) determine which intervention you will use to address the behavior, and 3) pick a way (or an instrument) to measure progress.

Hot Tip: Get the clients involved. The results of your single-system research can be useful in providing feedback to you as a clinician, but most importantly to the client. Clients should have an idea of what is and is not working on their journey to "better." Remember all of that "person-centered" stuff you learned in Introduction to Social Work or some other class?

Hot Tip: Don't be afraid to bring the bling!!! In my work with children, some of my best examples of single-subject designs have been displayed using markers, stickers, and glitter!! Giving clients a visual display of progress can help encourage them to keep working toward it (a small charting sketch follows at the end of this post).

Rad Resources: There is plenty of information on single-system designs, with examples, available online. Here are a few to get you started: Single Subject Research; Single Subject Design; Introduction to Single Subject Designs.

The American Evaluation Association is celebrating SW TIG Week with our colleagues in the Social Work Topical Interest Group. The contributions all this week to aea365 come from our SW TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Systems Week: Morell on An Invitation to Test Integration of Traditional Eval & Agent Based Modeling; Natalia Kosheleva on Measuring The Extent of Program Intervention into Targeted Systems; Systems Week: Glenda Eoyang on Complexity Demands Simplicity
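For readers who prefer a screen to glitter, here is a minimal base-R sketch of the kind of progress chart a simple AB single-system design produces. All the weekly scores below are invented for illustration; the dashed line marks the switch from baseline (A) to intervention (B).

```r
# Illustrative AB single-system design chart (all data hypothetical).
scores <- c(8, 7, 9, 8,       # phase A: four baseline weeks
            6, 5, 4, 4, 3)    # phase B: five intervention weeks
weeks  <- seq_along(scores)

plot(weeks, scores, type = "b", pch = 19,
     xlab = "Week", ylab = "Problem-behavior rating",
     main = "AB single-system design (illustrative data)")
abline(v = 4.5, lty = 2)               # dashed line at the phase change
text(2, max(scores), "A: baseline")
text(7, max(scores), "B: intervention")
```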
AEA365 . Blog . Jul 28, 2015 05:33am
Welcome to my ramblings on evaluation. I'm Brandon W. Youker, social worker, evaluator, and professor at Grand Valley State University in Grand Rapids, Michigan. I've been thinking about the inculcation many professionals undergo during their graduate studies, where they are taught to equate program evaluation with the assessment of goal achievement. Students learn about goal-setting and then about things like theories of change and logic models. I don't deny the legitimacy of these tools for monitoring your own programs, but relying on them as the sole strategy for evaluation leads to partial stories. According to the AEA's Guiding Principles for Evaluators, evaluators have a responsibility to "consider not only immediate operations and outcomes of the evaluation, but also the broad assumptions, implications and potential side effects." Some common assumptions regarding goals, with counterpoints, follow:

The goals and objectives of the program funders, administrators, and managers are the ones that matter. What about the consumers' or other stakeholders' goals?

The official goals and objectives are clearly articulated and agreed upon. Often, however, goals and objectives are written by a group of executives and managers. Again, what about the consumers' goals?

Goals and objectives are relatively static. So what happens when conditions change? Should the evaluator simply scrap the old goals and adopt new ones, or keep irrelevant goals?

Program administrators—and evaluators—can predict outcomes. Even if they could predict outcomes, they tend to search only for positive ones. Goal-based evaluation by design gives little—if any—attention to program side effects.

Lessons Learned: Program administrators feel that funders want goal-achievement evaluation. On numerous occasions, I've been part of conversations with program administrators that sound something like the following:

Program Administrator: "Look at this but not that."
Me: "Why not examine that area?"
PA: "Because we aren't trying to do anything in that area."
Me: "But isn't that a critical area? And if you were doing poorly there, wouldn't your program suffer?"
PA: "Yes, but our funders don't give us money to do anything in that area, and therefore we don't intentionally attempt to do anything with it."

Hot Tip: Explore evaluation approaches that don't dictate goal-orientation. For example, Most Significant Change and Outcome Harvesting investigate outcomes without requiring evaluators to reference stated goals or objectives.

Rad Resources: Scriven's entry on "goal-free evaluation" in his Evaluation Thesaurus outlines some limitations of goals and objectives. Additionally, I coauthored a 2014 paper in The Foundation Review titled "Goal-Free Evaluation: An Orientation for Foundations' Evaluations," in which I urged philanthropic organizations to consider expanding their conception of evaluation and how it should be conducted. Thanks for your interest. Please contact me so we can discuss this further: youkerb@gvsu.edu.

The American Evaluation Association is celebrating SW TIG Week with our colleagues in the Social Work Topical Interest Group. The contributions all this week to aea365 come from our SW TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Linda Meyer on Helping Clients Develop Goals and Objectives; CPE Week: Linda Delaney on Performance Appraisals; PD Presenters Week: Mindy Hightower King and Courtney Brown on A Framework for Developing High Quality Performance Measurement Systems of Evaluation
AEA365 . Blog . Jul 28, 2015 05:33am