
My name is Megan Walker Grimaldi, and I work for the Research, Evaluation, and Innovation department of Communities In Schools. This year, I was happy to serve as the program chair for the Internal Evaluation TIG at the AEA conference in Denver.

Lessons Learned: One of my favorite things about the AEA conference is that it brings evaluators from all different backgrounds and fields together to share their experiences. As an internal evaluator, my favorite conversation by far was the one around the many hats that an internal evaluator wears, and how internal evaluators balance their desire to promptly assist their colleagues with their desire to focus on the evaluations for which they are responsible. The conversation started during the Internal Evaluation TIG meeting. Someone mentioned the short timelines that internal evaluators often face. Because we are internal to organizations, our colleagues, who may be a desk away, often feel comfortable coming to us and saying, "Can you get this analysis to me by close of business tomorrow?" Not only are deadlines sometimes rushed, but there are also times when we are asked to do things only tangentially related to our work. For example, many evaluators are fluent in data analysis, while some of our coworkers may not be sure how to use a spreadsheet; their specialties might be working with constituents in the field, marketing, or fundraising. With our specialized knowledge, our role as evaluator may quickly evolve into a role as teacher or tech guru.

I brought this topic up in a fantastic presentation, Engaging Stakeholders in Internal Evaluation. Kristina Moster and Erica Cooksey from the Cincinnati Children’s Hospital, and Danielle Marable and Erica Clarke from Massachusetts General Hospital, presented on ways to engage various stakeholders in conducting internal evaluation. They helped me reframe my thinking around urgent or special requests. It’s actually positive that coworkers feel comfortable approaching us. In some organizations, people do not even realize that there is an evaluator to approach! And if the task is not exactly "evaluation," we can still turn it into an opportunity to share ideas around evaluative thinking - and lay the groundwork for future evaluation projects. When you are an approachable internal evaluator, you build a rapport with your coworkers, and evaluation projects start to come your way. Communicating the parameters of your role will become easier once you have formed positive working relationships.

The American Evaluation Association is celebrating Internal Evaluation (IE) Topical Interest Group Week. The contributions all this week to aea365 come from our IE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Internal Eval Week: Kathleen Norris on the Internal Evaluation Boa Internal Eval Week: Pamela Bishop on Working as a New Internal Evaluator Sarah Gill on the Atlanta-area Evaluation Association Spring Social 2012
AEA365 . Blog . Jul 28, 2015 07:11am
Hello, my name is Jayne Corso and I am the Community Manager for the American Evaluation Association (AEA). Today, I will discuss the benefits of using Canva, a free online tool that allows you to create images and graphics. I mostly use Canva to create images for social media, but this tool is not limited to the digital medium; you can also create images that enhance your presentations and research.

Hot Tip: Create graphics to emphasize your information. Canva makes it easy to create infographics that support your report findings. Your first step is to choose your design format. This can be for social media, a presentation, a poster, a blog, or a business card. Once you have decided, choose your background. You can choose from shapes, banners, buttons, and frames. Next you can add text. Choose from the designated templates or make your own. Finally, add some images! You can use the search box to find exactly what you need. Once you know what you want to add, drag the image onto your canvas. You can then resize the image to fit within your background. Ta-da… you have just created your own infographic! These images can visually represent your ideas and emphasize points you believe to be important for your stakeholders.

Rad Resources: You can also use Canva to create topic banners, emphasize important dates, create connections through lines and arrows, or find figures to demonstrate demographic data. All of the below images are free and can add great visuals to your presentations. With Canva, the possibilities are truly endless! When you are ready to export your work, just hit download in the top right corner to export your graphic as an image or PDF.

Hot Tip: How to edit your image once exported from Canva. When you export the image from Canva, it will be in an 8.5 x 11 format. This is great if you are exporting a full page of images; however, if you simply want to export a banner or button, I suggest opening your Canva image in a basic design program such as Paint. Here you can crop the Canva image to isolate the button or banner you desire. You can also resize the image to fit your desired dimensions. This process can also be accomplished in Word or PowerPoint (a scripted alternative is sketched below).

Bonus tip: You can now create holiday cards on Canva! Just follow the steps above with the available holiday templates.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Dan McDonnell on Becoming an Amateur Graphic Designer with Canva June Gothberg on Creating Presentations Potent for All p2i Week: Meredith Haaf on Applying p2i to Conference Posters
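For readers who would rather script the crop-and-resize step than open Paint, Word, or PowerPoint, here is a minimal sketch using the Python Pillow library. The file names and pixel coordinates are placeholders for illustration, not part of the original tip.

```python
# Hypothetical sketch: cropping and resizing a Canva PNG export with Pillow.
# File names and pixel coordinates below are invented for illustration.
from PIL import Image

page = Image.open("canva_export.png")   # the full 8.5 x 11 page Canva exports

# crop((left, upper, right, lower)) keeps just the banner region, in pixels
banner = page.crop((0, 0, 1275, 300))

# resize((width, height)) scales the crop to the dimensions you actually need
banner = banner.resize((1200, 280))

banner.save("banner_only.png")
```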
AEA365 . Blog . Jul 28, 2015 07:10am
My name is Art Hernandez and I am a Professor and Dean at Texas A&M University Corpus Christi. I participated in one of the very early yearlong experiences as a Fellow and have served as the Director for several cohorts. I have worked as an evaluator and a teacher of evaluation and am very interested in the processes of cultural competence in practice.

Lesson Learned: As part of my experience and exploration, I have determined that cultural competence is a matter of process rather than product, and I have come to appreciate the dangers associated with assumptions related to determining competence.

Hot Tip (Self-Reflection Guide):
- Self-awareness and a clear understanding of the potential of personal beliefs, values, and perspectives to influence decision making, especially in regard to the current focus of evaluation. This self-assessment must begin before the evaluation enterprise is begun and extend throughout.
- Understanding of the purposes for which the evaluation is being conducted, including the implications of roles and relationships between evaluators, evaluands, the "evaluation situated" community as a whole, and the sponsors or initiators of the evaluation effort.
- Appreciation for the language of the community, recognizing that "how" is as important as "what" people say to any interpretation of meaning.
- Appropriateness of the methods and instruments to be used. This is in recognition of clear and obvious cultural differences related to the notion of norm as indicative of the "good" or the group as a whole, the adequacy and representativeness of the sample, within-culture heterogeneity, and interpretation of outcomes of instrumentation.
- Power differentials between those who initiate and implement the evaluation enterprise and those who are subject to it. Demography, history, and religiosity are potential influences on judgments of cost, benefit, advantage, and challenge.
- Recognition that, regardless of the purpose, in the end evaluation is or results in a value judgment. Evaluative protocols and methods are an effort to standardize the proceedings and so to reduce "noise." However, regardless of approach, the findings of any evaluation are "snapshots" of a dynamic process from which predictions about the current or future states are made. Resist the tendency to reify research results.
- The non-mechanical nature of human beings and human systems. As the failure of mechanistic thinking has been demonstrated in physics, astronomy, chemistry, and other "physical" sciences, it should be clear that this thinking is not likely to apply to people.
- Attention to unintended consequences.

Rad Resources: Links to Resources on Cultural Competence in Evaluation; Annotated Bibliography: Multiculturalism and Cultural Competence in Evaluation, Select References 1995-2007; Sayre, K. (2003). Guidelines and best practices for culturally competent evaluations. Colorado Trust.

The American Evaluation Association is AEA Minority Serving Institution (MSI) Fellowship Experience week. The contributions all this week to aea365 come from AEA’s MSI Fellows. For more information on the MSI fellowship, see this webpage: http://www.eval.org/p/cm/ld/fid=230

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Related posts: Cultural Competence Week: Dominica McBride on AEA 2013 and Spreading the Word on Cultural Competence Cultural Competence Week: Melanie Hwalek on the Adoption of the AEA Public Statement on Cultural Competence in Evaluation - Moving From Policy to Practice and Practice to Policy Cultural Competence Week: Karen Anderson on Cultural Competence in Evaluation Resources
AEA365 . Blog . Jul 28, 2015 07:09am
My name is Ana R. Pinilla. I am a clinical psychologist and Associate Professor of Graduate Clinical Psychology at The Pontifical Catholic University of Puerto Rico. After finishing a year in the AEA Minority Serving Institution (MSI) Fellowship in October 2014, I am happy to share the lessons I learned about teaching evaluation through my experience in the fellowship.

I have been teaching a Program Evaluation course for 15 years to students who do not have broad knowledge of investigative and evaluative methodology. These students have great difficulty visualizing the pertinence of being evaluators in addition to clinicians. As a consequence, the performance of those students in Program Evaluation (PE) classes was usually poor. For example, only 78.94% of students in 2011, and 78.75% in 2012, passed the PE course in my graduate clinical psychology program. In 2013, this percentage decreased to 44.44%. My individual project helped me focus on particular elements in need of change to make a difference in the performance of my most recent course session. A new course design, based on the introduction of an adaptation of a competency-based education model and on modifications in content and methodology (the integration of in-class practice and a reduction of the course syllabus to the basics of evaluation), proved to be effective for this course cohort, improving the approval rate from 44.44% last year to 90.9% this year. These students also significantly improved their attitudes toward Program Evaluation (PE): 90.8% of students ended with favorable or very favorable attitudes, and 100% of the group managed to articulate a theory of action and change and demonstrated capacity for the development of an Evaluation Plan. None of these achievements had been possible in previous years. The graphic below illustrates some of these changes.

Lessons Learned: Students’ performance improves when a method that instills motivation and better attitudes toward PE is used. The introduction of a Competency-Based Education approach seems a good alternative when students lack basic research tools and are facing the challenge of learning a new skill they see as unrelated to their work. Guiding them with this approach facilitates students’ acceptance of the subject matter, which contributes to better results in their academic achievement.

The American Evaluation Association is AEA Minority Serving Institution (MSI) Fellowship Experience week. The contributions all this week to aea365 come from AEA’s MSI Fellows. For more information on the MSI fellowship, see this webpage: http://www.eval.org/p/cm/ld/fid=230

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: MSI Fellowship Week: Andrea Guajardo on the Minority Serving Institution (MSI) Faculty Fellowship Program Experience as a Non-Faculty Participant MA PCMH Eval Week: Ann Lawthers, Sai Cherala, and Judy Steinberg on How You Define Success Influences Your Findings Climate Ed Eval Week: Nicole Holthuis on Lessons Learned from Measuring Intermediary Outcomes
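The graphic from the original post did not carry over into this excerpt. As a purely illustrative stand-in, the approval rates quoted above could be charted with a few lines of matplotlib; the percentages come from the post, while the label for the redesigned-course year is an assumption.

```python
# Illustrative sketch only: re-plotting the PE course approval rates quoted in the post.
# The "Redesigned course" label is an assumption, not taken from the original graphic.
import matplotlib.pyplot as plt

years = ["2011", "2012", "2013", "Redesigned course"]
approval_rates = [78.94, 78.75, 44.44, 90.9]   # percent of students passing

plt.bar(years, approval_rates)
plt.ylabel("Approval rate (%)")
plt.title("Program Evaluation course approval rates")
plt.ylim(0, 100)
plt.tight_layout()
plt.show()
```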
AEA365 . Blog . Jul 28, 2015 07:09am
My name is Tamara Bertrand Jones and I am an Assistant Professor at Florida State University. During the MSI Fellows program I evaluated an international professional development program for faculty and K-12 teachers. The program developers had designed the evaluation component, and it was subsequently approved with the rest of the proposal by the funding agency. My involvement as the evaluator came after the program’s approval. Before this experience, I had been included in the design of the evaluation prior to its approval by the funder. So, coming in after the approval taught me a few lessons.

Lesson Learned: Involvement in program development and design is a luxury not all evaluators are afforded.

Lesson Learned: Implementing innovative evaluation tools requires additional time, planning, and, most importantly, piloting. In this evaluation I used iPad applications, photos, and journaling to gather data, in addition to traditional surveys.

Lesson Learned: Utilizing multiple technology applications can ease the difficulty of virtual data collection. Throughout the month of international travel, participants had several journals to submit. Participants had access to an iPad with several apps pre-downloaded to help with data collection, including Pages, Dropbox, and Shutterfly Photo Story.

Hot Tip: Incorporate photos in your evaluation! Techniques like Photo Voice or Photo Elicitation use photos to add layers and nuances to evaluation findings. Learn more about them and ways these tools can improve your evaluation.

Rad Resource: Shutterfly Photo Story. During the evaluation, participants created visual stories of their experience using pictures and narrative they had written. The photo books not only served as evaluation data, but were also personal mementos of the participants’ trip.

The American Evaluation Association is AEA Minority Serving Institution (MSI) Fellowship Experience week. The contributions all this week to aea365 come from AEA’s MSI Fellows. For more information on the MSI fellowship, see this webpage: http://www.eval.org/p/cm/ld/fid=230

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Susan Kistler on Sites for Free Photos for Great Presentations Sherry Boyce on Using Photos in Evaluation Reports Susan Kistler on Using Photos to Illustrate Evaluation in Complex Ecologies
AEA365 . Blog . Jul 28, 2015 07:09am
We’re living in the age of exploding B2B marketing horizons. It seems like every robust business is assembling a top-notch marketing team, and even smaller businesses do their best to keep up via clever, ever-evolving web tools. Indeed, investing in a diversified arsenal of marketing tools and techniques is a must. Of course, as with any investment, your eye should be on the ROI prize: marketing needs to yield the kind of results that more than pay for themselves. Read More The post Marketing Tools & Techniques: Are You Getting Results? appeared first on renshicon.com.
Renshi . Blog . Jul 28, 2015 07:09am
Productivity = (Creating + Capturing + Implementing)(Ideas)

Many of our clients come to us for help with improving productivity. First, we help them by improving employee confidence, which allows them to create new ideas. Second, we teach clients to develop structure to capture and implement these great ideas. And finally, we show clients how to track the implementation of these new ideas and their effects on profitability. During this process, senior leaders often hear their people complaining about the added stress this puts on their day. Read More The post Increasing Productivity: Quick Guide appeared first on renshicon.com.
Renshi . Blog . Jul 28, 2015 07:08am
My name is Denise Gaither-Hardy. I am an Assistant Professor at The Lincoln University in Pennsylvania. I want to share a few thoughts about my experience as a member of the American Evaluation Association (AEA) and a Minority Serving Institution (MSI) Fellow during the past year. My five-member cohort was led by Art Hernandez, Professor and Dean of the College of Education at Texas A&M University, and my individual mentor was Kevin Favor, Professor at Lincoln University; both are former MSI fellows. This Fellowship has offered me an opportunity to acquire and enhance evaluation skills while becoming a member of an organization that continues to provide me with opportunities needed to advance my career. Some of the skills I have been fortunate to refine and acquire are as follows:
- the art of survey development, which truly is an art
- a theoretical understanding of culturally responsive evaluation and the need to increase awareness and advancement
- the utility of logic models in evaluation

Rad Resource: Susan Kistler on Lessons Learned Using Online Survey Software. Go to the AEA website and read more about Dr. Kistler and her work on survey design and development at AEA.

Rad Resource: Stafford Hood, Rodney Hopson, & Henry Frierson’s book on culturally responsive evaluation, The Role of Culture and Cultural Context - A Mandate for the Inclusion, the Discovery of Truth and Understanding in Evaluative Theory and Practice. Read more about culturally responsive evaluation at the Center for Culturally Responsive Evaluation and Assessment.

Rad Resource: Thomas Chapel is the Chief Evaluation Officer at the Centers for Disease Control and Prevention. He serves as a central resource on strategic planning and program evaluation for CDC programs and their partners. He is a frequent presenter at national meetings, a frequent contributor to edited volumes and monographs on evaluation, and has facilitated or served on numerous expert panels on public health and evaluation topics. Go to the AEA website and read more about Dr. Chapel and his work on logic models.

Lesson Learned: The profession of evaluation has increasingly become recognized as essential in both the private and public sectors. Training is critical and empowering. The MSI Initiative allows you to put into context components like culture and equality, while giving enough room for self-exploration. You will inherently develop and/or refine technical skills, which most assuredly will include theory and methodology.

Hot Tip: No matter how often you attend workshops and speak with colleagues, there is always something new to be learned. The program evaluation skills and competencies that can be learned through AEA will benefit not only you, but also students, faculty, and administration at your home institution.

The American Evaluation Association is AEA Minority Serving Institution (MSI) Fellowship Experience week. The contributions all this week to aea365 come from AEA’s MSI Fellows. For more information on the MSI fellowship, see this webpage: http://www.eval.org/p/cm/ld/fid=230

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: MSI Fellowship Week: Edilberto A. Raynes on becoming a Minority Serving Institution (MSI) Fellow: Lesson learned MSI Fellowship Week: Andrea Guajardo on the Minority Serving Institution (MSI) Faculty Fellowship Program Experience as a Non-Faculty Participant MSI Fellowship Week: Ana R. Pinilla on The importance of in class practice and course content modification when teaching evaluation
AEA365 . Blog . Jul 28, 2015 07:08am
Improve Employee Involvement Using Processes Not Goals. Over my years as a consultant, entrepreneur and employee I have created a lot of goals. When I look back on them, they were all pretty much useless. They were useless because we spent a lot of time determining what the goals should be. We reviewed the goals quarterly and yearly. But where did that get us? The goals were lagging indicators. If we did, or did not achieve them, Read More The post Do Goals Improve Employee Involvement? appeared first on renshicon.com.
Renshi . Blog . Jul 28, 2015 07:08am
My name is Edilberto A. Raynes. I am an Associate Professor at Tennessee State University. I am one of the five MSI Fellows from 2013-2014. I came into the fellowship program with minimal background in evaluation; however, I have broad experience in conducting social research in an academic institution. I have noticed that there are differences between conducting social research and purely evaluative research, including how one constructs a survey, writes a report, and delivers a presentation.

Lessons Learned:
- It is best to know the background of the stakeholders, the recipient of the report, the timeline, and the budget.
- Rubrics are utilized to facilitate the framing of survey questions and to interpret quantitative, qualitative, and mixed methods.
- Logic models include five core components: inputs, outputs, outcomes, assumptions, and external factors. These components are very valuable in mapping out the evaluative process.
- There is culture in the art of evaluation. There is a need to increase awareness of creating a culturally responsive evaluation.
- The art of writing evaluative questions, and the use of proper verbiage such as "Was it any good? To what extent?"
- The use of focus groups as a form of evaluation.

Hot Tip: Attend the workshops that you are interested in exploring because you will find new information. These workshops include, but are not limited to, AEA pre-conference workshops or professional development workshops in general. In my case, I learned a lot in the pre-conference workshops. Collaborate with colleagues because that is the only way one can establish relationships. Once relationships have been established, one is able to build tools in evaluative thinking.

Rad Resources: Actionable Evaluation: Jane Davidson is an expert with a book on this topic. She also has her own website filled with useful resources. How do you improve the writing of evaluative questions? See Improving evaluation questions and answers: Getting actionable answers for real-world decision makers, also by Jane Davidson. Steps in Effective Evaluation

The American Evaluation Association is AEA Minority Serving Institution (MSI) Fellowship Experience week. The contributions all this week to aea365 come from AEA’s MSI Fellows. For more information on the MSI fellowship, see this webpage: http://www.eval.org/p/cm/ld/fid=230

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: MSI Fellowship Week: Denise Gaither-Hardy on Experiencing a Year as a Minority Serving Institution Initiative (MSI) Fellow E. Jane Davidson on Evaluative Rubrics MSI Fellowship Week: Andrea Guajardo on the Minority Serving Institution (MSI) Faculty Fellowship Program Experience as a Non-Faculty Participant
AEA365 . Blog . Jul 28, 2015 07:07am
I am Andrea Guajardo, MPH, and I am the Director of Community Health at CHRISTUS Santa Rosa Health System in San Antonio, Texas. I am also a doctoral student seeking the PhD in Education at the University of the Incarnate Word, and I enjoyed the distinct honor of being selected to participate in the Minority Serving Institution (MSI) Faculty Fellowship Program in 2013-2014. My acceptance to the MSI Faculty Fellowship Program came as a surprise because I am not a faculty member of an institution that possesses this designation. I am not a faculty member of any institution. Despite the fact that I did not fit the typical mold of previous MSIs, I applied for the fellowship based on my experience in evaluation within the context of hospital operations, grant administration, and community-based programming.

My year as an MSI produced personal growth as an evaluator, and it also allowed me to share my perspective with my fellow MSIs - all of whom are accomplished evaluators and faculty from around the United States and Puerto Rico. Interaction within our cohort often led to discussions of evaluation theory, models, and applications that I had not considered as a hospital-based evaluator. In return, I offered my pragmatic, "real-world" approach to evaluation. My year as an MSI facilitated the rapid development of a skill set as an internal evaluator that I could not have acquired from workshops and textbooks, and it enabled me to share a more robust understanding of evaluation with my clinical colleagues in the hospital.

AEA is a dynamic organization composed of evaluators from multiple disciplines, philosophies, and theoretical perspectives. It is also home to evaluators whose backgrounds are not rooted in academia and theory. My year in the MSI program allowed me to experience evaluation from multiple viewpoints and to clearly identify my own perspective and path as an evaluator in hospital operations and in academia.

Lessons Learned: Don’t label yourself. Seek opportunities for growth in evaluation even if you do not necessarily fit into a defined category as an evaluator. Don’t pass up an opportunity at AEA because you don’t "fit the mold." Take a chance and you might find that others can learn as much from you as you learn from them.

Lesson Learned: Seek out colleagues from different disciplines and perspectives. Although the MSI program is a formal process, you don’t have to be an MSI to engage evaluators whose experiences, backgrounds, and skill sets differ from yours. Shared knowledge benefits the entire AEA community.

The American Evaluation Association is AEA Minority Serving Institution (MSI) Fellowship Experience week. The contributions all this week to aea365 come from AEA’s MSI Fellows. For more information on the MSI fellowship, see this webpage: http://www.eval.org/p/cm/ld/fid=230

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: MSI Fellowship Week: Ana R. Pinilla on The importance of in class practice and course content modification when teaching evaluation MSI Fellowship Week: Denise Gaither-Hardy on Experiencing a Year as a Minority Serving Institution Initiative (MSI) Fellow MSI Fellowship Week: Tamara Bertrand Jones on Experiencing a Year as a Minority Serving Institution Initiative (MSI) Fellow
AEA365 . Blog . Jul 28, 2015 07:06am
Hello loyal aea365 readers! I’m Sheila B Robinson, aea365’s Lead Curator and sometimes Saturday contributor. Thank you for being "here" today and reading this post! As I was searching for inspiration for this Saturday’s post, I turned to our aea365 archives and found a post from fall 2010 by Susan Kistler (thank you Susan!) that really resonated with me and her message bears repeating. Lesson Learned: Whenever you feel appreciation for people, let them know. That’s it. Just do it. Thank the person who helped you tweak that survey question to get it just right. Thank the people who gave up their time to participate in a focus group. Thank the interviewee who trusted you enough to be so forthcoming with information. Thank the people who opened their doors and gave you access to the site for your observation. Thank the professor, workshop facilitator, or  book author from whom you learned important skills that contribute to your professional success. Thank the research assistant, data analyst, or statistician who pored over the data to help you make sense of it. Thank your client for choosing to hire you for the job, or your employer for giving you the opportunity to do evaluation work. Thank the blogger whose post gave you insight, taught you something, or inspired something in your work. Thank those who freely share their resources and materials. Thank the Tweeters, Facebookers, LinkedIn-ers, Google+ers, and others on social media who offer innovative ideas, experiential wisdom, and links to great content. I want to take this opportunity to thank the hundreds of wonderful authors and volunteer curators who have so graciously contributed to make aea365 my very favorite evaluation blog (and yours too, I hope!) and a tremendous crowd-sourced treasure trove of evaluation know-how! Hot Tip: Make someone’s day today by saying "thank you." Rad Resources: Some insights on the importance of saying thank you: The Two Most Important Words Why It is Important to Say "Thank You" Why ‘Thank You’ is More Than Just Good Manners Image credit: Chris Piascik via Flickr (thank you Chris) Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. Related posts: Susan Kistler on Saying Thank You Susan Kistler on Remembering the Ides of March and Thinking About Your Message Dan McDonnell on 4 Recent Social Media Changes You May Have Missed
AEA365 . Blog . Jul 28, 2015 07:06am
My name is Andrea Guajardo and I am the Director of Community Health for CHRISTUS Santa Rosa Health System and a doctoral student at the University of the Incarnate Word in San Antonio, Texas. I have been a member of AEA for exactly one year, and in that year, I was selected for the Minority Serving Institution Fellowship program, elected as the Co-Chair of the Multi Ethnic Issues in Evaluation (MIE) Topical Interest Group (TIG), and functioned as a core planning member of La RED (Latino Responsive Evaluation Discourse) TIG. In addition to these positions within AEA, I have also presented two poster sessions and one paper at AEA2013 and AEA2014. I would like to use this platform to encourage other graduate students to seize opportunities for professional development afforded by active participation in AEA.   Your graduate education related to evaluation can be tremendously supplemented by creating relationships with experienced evaluators and by providing leadership for groups within AEA. Hot Tip 1: Become an official member of AEA, join a TIG, and volunteer your time. TIGs are an essential part of the AEA experience both at the annual conference and throughout the year. TIGs are responsible for coordinating the review of proposals in their area of interest and developing a strand of conference sessions at the AEA annual conference, so volunteer help with this process is always appreciated.   TIG membership can help you create relationships with top evaluators and might afford the opportunity to learn about emerging ideas in your field or discipline. Hot Tip 2: Don’t just attend the annual conference - participate in it. Submit your own evaluation work for a poster, paper, roundtable, or birds of a feather.   Even if your research is in progress, it is still an appropriate occasion to perfect your presentation skills. Hot Tip 3: Find a mentor. Many experienced AEA members are very willing to provide guidance about how to become more involved and to help map out a path in evaluation at AEA. Their expertise and guidance is valuable as you begin to navigate which activities will benefit you the most in your evaluation career. Rad Resources: Where do I find more information about joining a TIG or participating in the next annual conference? For more information about Topical Interest Groups: http://www.eval.org/p/cm/ld/fid=11 For more information about how to submit your evaluation work at AEA2015 in Chicago, Illinois: http://www.eval.org/p/cm/ld/fid=170 We’re celebrating Evaluation 2014 Graduate Students Reflection Week. This week’s contributions come from graduate students of Dr. Osman Ozturgut of the Dreeben School of Education at the University of the Incarnate Word, along with students from other universities. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. Related posts: Alison Mendoza and Yusuf Ransome on GEDI Intern Reflections of the AEA Annual Conference Teaching Tips Week: Jill Hendrickson Lohmeier on Using Online Discussions for Teaching Deborah Grodzicki on How Students Can Succeed as a Team
AEA365 . Blog . Jul 28, 2015 07:05am
I am Kristin Woods, 2013-2014 co-chair for the GSNE TIG; I am a PhD student at Oklahoma State University in the Research, Evaluation, Measurement, and Statistics program and a faculty member at Southwestern Oklahoma State University. I attended my first AEA conference in 2012 and was overwhelmed by the number of people and sessions as well as by trying to learn about AEA. My advisor, Dr. Katye Perry, encouraged me to attend the Graduate Student and New Evaluator’s TIG business meeting. This drastically changed my experience with AEA because I was voted in as a co-chair even though I did not really know what I was getting into. Over the past two years, this experience has granted me many opportunities that have made me a better evaluator.

Opportunity 1: AEA Involvement. As a co-chair, I have further developed my skills as an evaluator through the vast amount of resources on the AEA website. I have worked with AEA members, members of the board of directors, and staff on various tasks for the conference. For example, I served as a member and then chair of the Student Travel Awards Working Group and as a conference volunteer.

Opportunity 2: GSNE TIG Involvement. I have developed leadership skills through working with other members of the leadership team to coordinate the conference program, serve as a reviewer, run the business meetings, coordinate social outings, communicate with members, and develop a peer-mentorship program that connects novice evaluators with peers to aid in navigating AEA and to offer advice on evaluation.

Rad Resources: The GSNE TIG website has specific information geared toward novice evaluators and those new to AEA. The GSNE TIG Facebook Community Page is a place where TIG members informally network throughout the year. We share resources, ask questions, and celebrate our successes as well as commiserate over our struggles.

Opportunity 3: Networking. These opportunities have allowed me to expand my network to include evaluators, from novice to more experienced, from all over the world. I have co-authored several accepted submissions at the 2013 and 2014 AEA conferences, chaired sessions, and been asked to speak at another TIG’s business meeting. This has made the past two years’ conferences drastically different from my first conference. I speak with people I met the previous year or have engaged with through the Facebook page, e-mail, or phone. It allows me to put a face with a name, get to know them, and connect with another evaluator who has different experiences and who therefore becomes another resource in my toolbox to pull from when needed, which I do often.

We’re celebrating Evaluation 2014 Graduate Students Reflection Week. This week’s contributions come from graduate students of Dr. Osman Ozturgut of the Dreeben School of Education at the University of the Incarnate Word, along with students from other universities.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: GSNE Week: Kristin Woods on Gaining Practical Experience as a New Evaluator GSNE Week: Ayesha Tillman on Graduate Student and New Evaluator (GSNE) TIG Mentorship program GSNE Week: A Rae Clementz on Planning Your Best AEA Annual Conference Experience
AEA365 . Blog . Jul 28, 2015 07:04am
Hello! My name is Danielle Cummings. I am a member of the AEA Graduate Education Diversity Internship (GEDI) program and a graduate student at NYU Wagner School of Public Service studying public policy analysis. Attending AEA’s annual conference was inspiring and edifying. As a graduate student learning analytical methods and research design, the conference was a wonderful opportunity to see practical applications of many of the tools I learn about in the classroom. I came away from the experience with both a refined vision of what a career in evaluation might entail and a wealth of theories, frameworks, and skills to integrate into my work. I’m eager to put so many of the things I learned at AEA into practice, but the technique I’m most excited about is called solution-focused qualitative interviewing.

Dr. Emily Spence-Almaguer and Shlesma Chhetri introduced session participants to an innovative approach to qualitative inquiry that they believe has improved interview participants’ candor, thereby increasing the richness of their qualitative data. Trained as a social worker, Dr. Spence-Almaguer adapted a therapeutic technique called solution-focused therapy to enhance qualitative inquiry.

Hot Tip: There are two key elements of solution-focused dialogues: 1) people are experts on their lives, and 2) interviewees filter their responses based on expectations. In practice, this means that when we approach qualitative interviews with humility, treat interviewees as the experts, and frame our questions in a way that encourages creativity, interviewees’ responses will be more frank and dynamic and will provide sufficient context to require low levels of inference by the researcher. Here are examples of typical vs. solution-focused qualitative questions asked by Dr. Spence-Almaguer’s research team in her research on solution-focused questions, and examples of interviewee responses:

Traditional approach: What would you recommend to improve the program? "Nothing, they are doing an outstanding job."

Solution-focused approach: If I were going to give this program another $100,000 next year, what would you recommend that the program administrators do with the money? "Put more of the [initiative’s] programs together and coordinate them to make them work more effectively."

By constructing a question that placed the interviewee in a position of authority and invoked imagery, the interviewer elicited a response that not only provided a critique of the program, but also a potential solution to a programmatic problem.

Rad Resource: This post just scratches the surface of solution-focused interviewing. For more information on this approach, check out the slide deck from Dr. Spence-Almaguer’s AEA presentation, available for free to AEA members in AEA’s eLibrary. Make sure to check out the list of additional solution-focused literature and resources on Slide 24!

We’re celebrating Evaluation 2014 Graduate Students Reflection Week. This week’s contributions come from graduate students of Dr. Osman Ozturgut of the Dreeben School of Education at the University of the Incarnate Word, along with students from other universities.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org.
aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. Related posts: Emily Spence-Almaguer on Solution-Focused Therapy Teaching Tips Week: Jill Hendrickson Lohmeier on Using Online Discussions for Teaching Jacquelyn Christensen on Wordle and Survey Anchors
AEA365 . Blog . Jul 28, 2015 07:04am
Hi, my name is Erica Roberts, an AEA GEDI scholar, doctoral candidate at the University of Maryland School of Public Health, and an intern at the National Cancer Institute Office of Science Planning and Assessment. As a graduate student who is approaching the transition from student to professional in the field of public health evaluation, I would like to share with you the lessons I learned from attending the AEA conference in the hope that these lessons can be used by other graduate students planning to attend next year’s conference.

Lesson Learned: Prepare to build your professional network. The AEA conference provides an expansive and rare opportunity to meet evaluation experts, future mentors, and possible employers. Prior to attending the conference, use the Topical Interest Groups (TIG) to navigate the conference program and identify experts in your field of interest. Remember to pack business cards and update your resume or vitae. Once at the conference - be bold! Introduce yourself to presenters from organizations or fields of practice that interest you and have a few talking points or questions prepared. Once you’ve connected, add their information into an Excel spreadsheet and, after the conference, note if and when you follow up via email and the outcome of your discussion. This will help with professional networking down the road! (A sketch of a simple contact tracker follows this post.)

Lesson Learned: Prepare to be overwhelmed (but in a good way). Before arriving at the conference, figure out a way to stay organized that works best for you. I brought my iPad to each session and used the EverNote app to take notes. Most importantly (to my organization), I kept a "to-do" note where I listed everything I wanted to do when I returned home (e.g., articles to read, experts to connect with, student scholarships or job opportunities to apply for). It is likely that you will encounter a lot of information that you want to know more about but do not have the mental space to process - this is where making a "to-do" list for home comes in handy!

Lesson Learned: Prepare to be inspired. You may find at the AEA conference that the ways to approach evaluation are endless - depending on the field, the context, the purpose, etc. Do not let this discourage you; rather, let it inspire you. Take these ideas and put them in your back pocket and know that at some point you may be asked to conduct an evaluation and you will have a myriad of methods and approaches to look to. I encourage you to use the AEA conference to learn about approaches to evaluation that you are not familiar with, and to identify ways in which those methods could be adapted to your work!

We’re celebrating Evaluation 2014 Graduate Students Reflection Week. This week’s contributions come from graduate students of Dr. Osman Ozturgut of the Dreeben School of Education at the University of the Incarnate Word, along with students from other universities.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Related posts: #Eval14 Grad Students Reflections Week: Danielle Cummings on GEDI Intern Reflections on the AEA Annual Conference #Eval14 Grad Students Reflections Week: Andrea Guajardo on How Graduate Students Can Become More Involved in AEA SEA Week: Jason Lawrence on Making the Most of Graduate Education in the Evaluation Profession
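The Excel-based contact tracker described in the first lesson above could just as easily live in a small script. Here is a minimal, purely hypothetical sketch using pandas; the column names and the sample row are invented for illustration, and writing to Excel assumes the openpyxl package is installed.

```python
# Hypothetical sketch of a post-conference contact tracker using pandas.
# Column names and the sample row are invented for illustration.
import pandas as pd

contacts = pd.DataFrame(
    [
        {
            "name": "A. Evaluator",
            "organization": "Example Org",
            "met_at": "TIG business meeting",
            "followed_up_on": "2014-11-03",
            "outcome": "Shared slides; plan to reconnect at Evaluation 2015",
        }
    ]
)

contacts.to_csv("aea_contacts.csv", index=False)
# contacts.to_excel("aea_contacts.xlsx", index=False)  # requires openpyxl
```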
AEA365 . Blog . Jul 28, 2015 07:03am
Hello! We are Çigdem Meek, Bashar Ahmed, and Marissa Molina, PhD students at the University of the Incarnate Word in San Antonio, Texas. As novice evaluators, we would like to share what we have learned from our experience of attending the 28th Annual Conference of the American Evaluation Association in Denver.

Lessons Learned:
- Attending the conference as a group of PhD students from the same university eased our anxiety about being among expert evaluators. Plan with your peers to attend the next conference in Chicago!
- Stay at the conference hotel (and make your reservation as soon as possible). You will not regret the networking opportunities it provides!
- Attend pre-conference and post-conference workshops! Evaluation 101 is a great workshop for understanding the basics of evaluation.
- Join Topical Interest Group (TIG) business meetings. Meet with like-minded evaluators!
- Look for volunteer opportunities, especially if this is your first time. This helps you meet other evaluators with ease (and also helps with the registration cost).
- Participate in panel discussions. This is an excellent way to meet and learn from other evaluators. Do NOT miss the opportunities to learn from the best through panel discussions, workshops, and conference sessions! (e.g., Donna Mertens, Robert Stake, Stafford Hood, Rodney Hopson, Hazel Symonette, Jody Fitzpatrick, Michael Scriven, Michael Patton, Art Hernandez, Karen Kirkhart, and Cindy Crusto have facilitated excellent sessions and provided exceptional insights for novice evaluators.)
- Make sure you have plenty of business cards with you and exchange them! Remember to take notes on the cards you receive (I thought I could remember it all!). To stay connected, send a brief email within 10 days after the conference.
- Take notes during the sessions to review later, and reflect on what you learn. Remember, reflection is what makes learning meaningful.

Rad Resources: Check out these resources before attending the conference! The AEA Public Library. Read the American Evaluation Association Guiding Principles For Evaluators. For culturally competent evaluation, review the American Evaluation Association’s Public Statement on Cultural Competence In Evaluation.

We’re celebrating Evaluation 2014 Graduate Students Reflection Week. This week’s contributions come from graduate students of Dr. Osman Ozturgut of the Dreeben School of Education at the University of the Incarnate Word, along with students from other universities.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: MNEA Week: Leah Hakkola on Making the Most out of Networking #Eval14 Grad Students Reflections Week: Erica Roberts on Reflecting on Evaluation 2014: Perspectives of a Graduate Student GSNE Week: A Rae Clementz on Planning Your Best AEA Annual Conference Experience
AEA365 . Blog . Jul 28, 2015 07:01am
Hi! My name is Denise Ramón. I am a doctoral student in education at the University of the Incarnate Word in San Antonio, Texas, and I work at the Center for Civic Leadership, which focuses on civic engagement and leadership. More specifically, I help to connect my university to the community. I am interested in Asset Based Community Development (ABCD).

Lessons Learned: While at the AEA 2014 Denver conference, I attended a session that was of particular interest to me, Altschuld, Hung, and Lee’s Getting Started in an Asset/Capacity Building and Needs Assessment Effort. Two dichotomous philosophical approaches were presented: needs assessment and asset/capacity building (A/CB). One of the main ideas stemming from this presentation was to create a hybrid framework between needs assessment and asset mapping. If evaluation is evolving to be visionary and sustainable, mixing traditional models, such as needs assessments, with newer ideas, such as capacity building and asset mapping, seems rather logical. This way, the best of both worlds can be extracted and each can fill the other’s gaps; one can complement the other rather than being at odds. With this innovative notion, more research is needed to see if a model can really be developed and effectively implemented.

Coming to my second AEA conference enhanced my professional network. I participated in most of the social events hosted by AEA, such as the TIG social events, the poster presentation session, and the silent auction. Getting to know others in the field gives me confidence to participate in more evaluation activities because I know I can ask for help and turn to other veterans with more expertise.

Lesson Learned: Jump into AEA with confidence and an open mind. Reach out to others. Network.

Rad Resource: Using the AEA Public eLibrary to find the presentations was so very useful for me. I was able to download the presentations and can now use them as references for my research. I highly recommend using the AEA eLibrary. You can also upload your own presentations and documents. It is another way to promote your work.

As a doctoral student and novice to the evaluation field, the mere experience of attending the conferences has enhanced my overall learning and understanding of evaluation. Not only have I learned about new resources to tap into, like the eLibrary, but I have been able to relate newly learned evaluation concepts to other parts of my professional and academic life and research. This has been due in part to having made new connections.

We’re celebrating Evaluation 2014 Graduate Students Reflection Week. This week’s contributions come from graduate students of Dr. Osman Ozturgut of the Dreeben School of Education at the University of the Incarnate Word, along with students from other universities.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: NA TIG Week: James Altschuld on Hybrid Vigor - It’s not just Needs Assessment or Asset/Capacity Building Loraine Park, Carolyn Verheyen, and Eric Wat on Tips on Asset Mapping NA TIG Week: Lisle Hites on Conducting a Needs Assessment on HIV/AIDS Issues in the South
AEA365 . Blog . Jul 28, 2015 07:00am
Hello, my name is Jayne Corso. I am a Community Manager for the American Evaluation Association (AEA). Pinterest is a wonderful tool for creating shopping lists and finding great DIY projects, but did you also know that it is a useful resource for finding interesting data visuals and infographics? After all, Pinterest is a place to go to be inspired and to share ideas with others. In my initial post about Pinterest, I have listed some steps for starting your journey on the tool and finding ways to use Pinterest for motivation.

Rad Resource: How it Works. When you create a Pinterest profile, you have the ability to create boards that relate to your particular interests. Boards allow you to keep all of your related pins together and help you stay organized by subject matter. I’ve used my Pinterest profile below as an example: Use the search bar at the top to search keywords focused on your interests. I suggest searching for data visualization, presentations, research, and evaluation. These keywords will pull images, infographics, research examples, presentation tips, and much more, which have been pinned on Pinterest by other users. When you find an image you like, pin it to a board! After you select pin, the site will prompt you to choose a board or create a new board. Now all of your related pins are in one place that you can easily reference.

Rad Resource: Follow others on Pinterest. Similar to other social media sites, you can look people up by their names and follow them. When you follow someone, you get notified when they add items to their boards and their activity is shown in your news stream. Some of your favorite evaluators are already pinning on Pinterest, including Kylie Hutchinson, Ann Emery, Stephanie Evergreen, and Chris Lysy. You can also follow boards. If you come across a Pinterest board created by a user that you find particularly fascinating, you can follow that board and you will be notified when something gets added.

Rad Resource: Be Inspired. The greatest aspect of Pinterest is that you can be inspired by the work of others and keep a keen eye on trends within evaluation, research, and presentation. Pinterest encourages you to think creatively and find the best format for your evaluation or data.

AEA is interested in joining Pinterest. Tell us in the comments if this is something you would enjoy and find useful for your evaluations and projects!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts: Chi Yan Lam on Hidden Online Troves of Evaluation Resources Dan McDonnell on Making New Friends and Mastering Lesser-Known Twitter Features Without Third Party Apps Zita Unger on Assessing Board Performance: The Elephant in the Boardroom
AEA365 . Blog . Jul 28, 2015 06:59am
My name is Roxann Lamar and I work in research and evaluation at the Center for Human Development, University of Alaska Anchorage. Our local AEA chapter, the Alaska Evaluation Network (AKEN), hosted a discussion on cultural competence, particularly as it relates to Native cultures. About 19% of Alaskans have full or partial Native heritage. The AEA’s statement on cultural competence in evaluation is comprehensive, covering a multitude of issues involved in working together in a diverse world. What is presented here is a perspective to think about: how people might respond to the language we choose to use, not a claim that any language is universally right or wrong. Lesson Learned: Our event was called "Cultural Competence in Evaluation." Our panel of cross-cultural experts included persons of DegXit’an Athabascan, Gwich’in Athabascan, Navajo, and non-Native heritage. All had a lifetime of personal and professional experience with cultures indigenous to Alaska. They reminded us at the start that the words we use are important, and they informed us that they found the term "cultural competence" to be distasteful. They highly encouraged us to use the term "cultural humility" and noted that it is not a new idea. They also suggested "cultural relevance" as an acceptable alternative that makes more sense in some contexts. Our panelists explained that the problem with "competence" is that it implies we will reach a point where we can say, "We are culturally competent." That is what is implied when people go to a workshop for a certain number of hours and earn a certificate in cultural competence. Our panelists pointed out that these trainings often do more harm than good. For example, focusing on characteristics of specific cultures inadvertently encourages stereotyping. The panel’s audience was intrigued, and discussions among colleagues continued long after the event. Hot Tip: In many places or contexts, a term like "cultural humility" is a respectful choice. Without a lot of explanation, it conveys a humble posture of learning about self and others. It implies openness, equity, and flexibility in working with anyone. Rad Resource: With a little looking around, I found Cultural Humility: People, Principles, & Practices. This is a 30-minute, four-part documentary by Vivian Chávez (2012). It is focused on relationships between physicians and patients, but the principles can be applied in other settings. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. Related posts: AKEN Week: Amelia Ruerup on Understanding Indigenous Evaluation in an Alaskan Context Humberto Reynoso-Vallejo on Cultural Competence and Cultural Humility in Evaluation New Board Member At Large Corrie Whitmore on the Importance of Aim and Audience in Internal Evaluation
AEA365 . Blog . Jul 28, 2015 06:34am
My name is Sharon Wasco, and I am a community psychologist and independent consultant. I describe here a recent shift in my language that underscores, I think, important trends in evaluation: I used to pitch evaluation as a way that organizations could "get ahead of" an increasing demand for evidence-based practice (EBP); now I sell evaluation as an opportunity for organizations to use practice-based evidence (PBE) to increase impact. I’d like evaluators to seek a better understanding of EBP and PBE in order to actively span the perceived boundaries of these two approaches. Most formulations of EBP require researcher-driven activity, such as randomized controlled trials (RCTs), and clinical experts to answer questions like: "Is the right person doing the right thing, at the right time, in the right place, in the right way, with the right result?" (credit: Anne Payne) In an editorial introduction to a volume on PBE, Anne K. Swisher offers this contrast: "In the concept of practice-based evidence, the real, messy, complicated world is not controlled. Instead, real world practice is documented and measured, just as it occurs, ‘warts’ and all. It is the process of measurement and tracking that matters, not controlling how practice is delivered. This allows us to answer a different, but no less important, question than ‘does X cause Y?’ This question is: ‘how does adding X intervention alter the complex personalized system of patient Y before me?’" Advocates of PBE make a good case that "evidence supporting the utility, value, or worth of an intervention…can emerge from the practices, experiences, and expertise of family members, youth, consumers, professionals and members of the community." Further exploration should convince you that EBP and PBE are complementary, and that evaluators can be transformative in melding the two approaches. Within our field, forces driving the utilization of PBE include more internal evaluators, a shared value for culturally competent evaluation, a range of models for participatory evaluation, and interest in collaborative inquiry as a process to support professional learning. Lessons Learned: How we see "science-practice gaps," and what we do in those spaces, provide unique opportunities for evaluators to make a difference. Metaphorically, EBP is a bridge and PBE is a Midway. Further elaboration of this metaphor and more of what I’ve learned about PBE can be found in my speaker presentation materials from Penn State’s Third Annual Conference on Child Protection and Well-Being (scroll to the end of the page; I "closed" the event). Rad Resource: I have used Chris Lysy’s cartoons to encourage others to look beyond the RCT for credible evidence and useful evaluation. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. Related posts: Jane Davidson on Evaluation Reporting CP TIG Week: Helen Singer, Sally Thigpen, and Natalie Wilkins on Understanding Evidence: CDC’s Interactive Tool to Support Evidence-Based Decision Making Theresa Murphrey on Jing
AEA365 . Blog . Jul 28, 2015 06:33am
I’m Ann Martin, an evaluator working with a team of science educators and outreach professionals in Hampton, VA. I frequently employ Google Forms for evaluation and for other projects as well. Forms are free, simple, intuitive for users, and get the job done. In the past, though, Google Forms had several notable limitations. If you found that Forms didn’t meet your needs then, you might not be aware of great new features that represent significant improvements. It’s worth taking another look at this free resource! Hot Tip: Customize the visual look, feel, and branding of your survey! In September 2014, new Forms functionality allowed survey designers to add background and header images and to customize fonts and other display options. Before, theme options were limited. You can use this functionality to make your survey more readable and inviting. A custom header image with a logo may make your users feel more comfortable responding, or can make your survey a seamless part of a website in which you embed it. You can also embed images and videos within the body of the survey itself, which is handy for quizzes or assessments. Figure 1. Customization options include a header image, page and form backgrounds, and fonts. Cool Trick: Google Forms now support more complex survey design and administration options, including progress bars, data validation, logic/path branching, and randomizing the order of options in multiple choice questions. It’s also easier now to set up your survey’s questions. For instance, if you have a long list of options to include in a question, you can now copy and paste in a list from a word processor or spreadsheet table and the options will populate automatically. (I wish that option had existed a few years back, when I created a drop-down with 200 alphabetized options!) Cool Trick: New Add-ons enable even more behind-the-scenes functionality. The latest Add-ons include nifty widgets like Form Notifications, which will send automatic emails to your survey respondents, and Form Publisher, which will use survey responses to fill in a new document from a template. Figure 2. Example Add-ons for Google Forms (screen capture from the Google Drive Add-ons Store). Rad Resource: The Google Drive blog shares updates to Forms functionality so that you can always be aware of new features. I’m also more than happy to share tips if you contact me. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. Related posts: Dreolin Fleischer on Organizing Quantitative and Qualitative Data Dan McDonnell on Becoming an Amateur Graphic Designer with Canva Audrey Roerrer on Google Tools for Multi-site Evaluation
AEA365 . Blog . Jul 28, 2015 06:32am
Hi! I’m Josh Twomey, evaluation specialist at UMass Medical School’s Center for Health Policy and Research (in the Office of Clinical Affairs). As evaluators, we are often tasked with providing participants in our evaluations with individualized results. This can be challenging: results must be easy to interpret, show individual performance in the context of the entire evaluation, and, where appropriate, show change over time. My colleagues and I are concluding an evaluation of the Massachusetts Patient-Centered Medical Home Initiative, which is focused on transforming traditional primary care into a more patient-centered care delivery model. Over the course of this evaluation, we faced the challenges above and found a basic, yet sometimes overlooked, solution for effectively communicating results to our participants. Boxplots (also known as box-and-whisker plots) have the advantage of displaying a lot of valuable information within a simple, easy-to-understand graphic. A variable’s mean, median, 25th percentile, 75th percentile, and highest/lowest values can all be assessed at a glance. Extreme values (i.e., outliers) can be highlighted as well. The bottom of the ‘box’ represents the 25th percentile of all values in your data, whereas the top represents the 75th percentile. The median value falls within the box and is displayed as a straight line. Extending from the top and bottom of the box are ‘whiskers’, representing the highest and lowest values of the distribution, respectively. Hot Tip: Boxplots are ideal when you must highlight a single participant’s results among all participants’ results. Check out this example, where patient satisfaction from Dr. Smith’s office is shown in the context of other individual primary care practices. Looking at the blue boxplot, notice that Dr. Smith’s patient satisfaction score (i.e., 68) hovers just above the 25th percentile of all the offices’ scores. Another advantage of boxplots is that data can easily be tracked over time. Here, Dr. Smith’s score improves from about the 30th percentile at baseline to near the 100th percentile by the final measurement. Hot Tip: Interpreting a boxplot is easy once you’re shown how. When presenting boxplots, be sure to include some basic instructions on how to read one. Cool Trick: The ends of the whiskers do not always have to represent the data’s highest and lowest points. Whiskers can be set to represent scores that are high or low but not outliers, and outliers can then be displayed as dots falling above or below the whiskers’ ends. When constructing boxplots for numerous participants, macro variables can simplify production. Within your graphics generator, macro variables can simultaneously insert a single participant’s data and other information (e.g., the participant’s name) into the boxplot. This technique allows you to generate graphs quickly and accurately, without manually entering data (a rough sketch of this idea appears after this post). In total, boxplots can be a quick, clear, and effective way of providing participants with the information they need. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
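The macro-variable tip above boils down to parameterizing one chart template and looping it over participants. As a rough illustration of that idea (not the author’s actual workflow or tool), here is a minimal Python sketch assuming pandas and matplotlib are available and using a hypothetical input file with practice, period, and satisfaction columns; it draws the distribution of all practices’ scores for each measurement period and overlays a single practice’s own scores, saving one image per participant.

```python
# Minimal sketch: one boxplot chart per participant, with that participant's
# scores overlaid. The file name and column names (practice, period,
# satisfaction) are hypothetical placeholders.
import matplotlib.pyplot as plt
import pandas as pd

df = pd.read_csv("satisfaction_scores.csv")  # columns: practice, period, satisfaction
periods = sorted(df["period"].unique())

# All practices' scores for each period, used as the background distribution.
all_scores = [df.loc[df["period"] == p, "satisfaction"] for p in periods]

for practice, grp in df.groupby("practice"):
    fig, ax = plt.subplots(figsize=(6, 4))

    # One box per measurement period, built from every practice's scores.
    ax.boxplot(all_scores, labels=[str(p) for p in periods], showfliers=True)

    # Overlay this participant's own score for each period so readers can
    # see where they sit relative to the full distribution.
    own = [grp.loc[grp["period"] == p, "satisfaction"].mean() for p in periods]
    ax.plot(range(1, len(periods) + 1), own, "o-", label=str(practice))

    ax.set_xlabel("Measurement period")
    ax.set_ylabel("Patient satisfaction score")
    ax.set_title(f"{practice} vs. all practices")
    ax.legend()
    fig.savefig(f"boxplot_{practice}.png", dpi=150)
    plt.close(fig)
```

Matplotlib places the boxes at x positions 1 through N by default, which is why the overlay points are plotted at range(1, len(periods) + 1); the loop plays the same role as a macro variable, swapping each participant’s name and data into an otherwise identical chart.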
Related posts: Linda Cabral and Laura Sefton on So many to choose from: How to Select Organizations for a Site Visit MA PCMH Eval Week: Ann Lawthers on Triangulation Using Mixed Methods Appeals to Diverse Stakeholder Interests MA PCMH Eval Week: Linda Cabral and Laura Sefton on Participant Observation as a Data Collection Method
AEA365 . Blog . Jul 28, 2015 06:31am
My name is Kylie Hutchinson and I’m an independent evaluator and owner of Community Solutions Planning & Evaluation. Each year, I adapt a Christmas carol to share with my evaluation colleagues and not-for-profit clients. Enjoy! Hot Tip: Poems and songs are one way of communicating evaluation concepts in a fun and memorable way. Rad Resource: O Collective Impact (sung to the tune of ‘O Come All Ye Faithful’)

O Collective Impact,
Joyful we triangulate,
Collect ye, collect ye thy same data,
Come and behold them,
Common outcome measures,
O come, let us adore them,
Much better than before when,
No one thought to ask for them,
Shared measurements!

O Common Agenda,
Mutually reinforcing,
See how evaluation’s now harmonized,
Give to your funders,
Impacts of the highest,
O come, let us adore them,
Much better than before when,
No one thought to ask for them,
Shared measurements!

Strategic philanthropy!
Simple, but not easy,
Even for projects with backbone support,
Yea though we greet Thee,
The only way forward,
O come, let us adore them,
Much better than before when,
No one thought to ask for them,
Shared measurements!

Rad Resource: You can find other Christmas carols on my website here. For more information about Collective Impact, check out this short video, Tackling Complex Social Problems through Collective Impact. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. Related posts: Kylie Hutchinson on It Came Upon a Systems Lens Best of aea365 week: Cassandra O’Neill on Engagement for High Impact Collaboration David Erickson on Measuring the Social Impact of Investments
AEA365 . Blog . Jul 28, 2015 06:29am