by Elisabeth Wilson, Senior Impact Analyst
We’re calling it now: 2022 will be the year of effective and meaningful nonprofit evaluation. If you’re hoping to level up your organization’s research and evaluation this year, keep reading for our top ten mistakes to avoid. These tips will help you save time, money, and your sanity as you measure the impact of your programs and services.
Babe Ruth said it best: “Don’t let the fear of striking out keep you from playing the game.” The same is true for evaluations. You will never know the impact of your work without taking a risk and conducting an evaluation. When approaching your evaluation, think of it as practice. You wouldn’t let a mistake or bad play in practice keep you from playing in the big game, right? Practice is an opportunity to make yourself a little bit better each day. Take the same approach with evaluations, and remember that all feedback is actionable.
The fastest way to burn out your impact evaluation, your people, and yourself is to try to measure every single part of the program. Be thoughtful about which items you measure. Ask yourself, “If my program only did three things, what would they be?” Let’s say you are a food-waste nonprofit whose mission is to support food-insecure families. There are lots of ways to measure your impact: you could measure the amount of food you save, the number of partners you collect food waste from, the money saved by collecting all that wasted food, and the number of people who received food. You may think you need to measure them all, but remember your mission statement. Your nonprofit is focused on families. That means outcomes for your families are what we call your end outcomes. The other variables around the physical food are referred to as intermediate outcomes. Start by measuring the end outcomes focused on your families, and if you have extra time or capacity, move on to your intermediate outcomes.
There is no greater fear for an evaluator than to have handwritten sheets of paper, sticky notes, or emails dumped on their desk and called “data.” Before you embark on collecting this valuable information, think about where it should go. How do I store it? How can I save it for future access? And, most importantly, how do I protect this information? Databases can look quite different depending on the size and scope of the information you collect. Big organizations may use yearly subscriptions to Salesforce, Microsoft SQL Server, or Amazon Web Services. These are great options if you are collecting a lot of sensitive personal data because they have built-in security. If you are a small organization collecting customer feedback, you might use SurveyMonkey or Excel spreadsheets saved to secure, password-protected files. When it comes to security, the rule of thumb is to ask: if this were my own information, where would I want it stored, and who would I want to have access to it?
Pro Tip: At the end of the day, no matter the storage system you use, make sure you can export your work as an Excel or CSV file. These file types can be opened by virtually any software, which keeps your data readable for years to come.
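To make that concrete, here is a minimal sketch of exporting data to both formats, assuming you are working in Python with the pandas library; the file names and columns are hypothetical examples, not a prescribed schema.

```python
# Minimal sketch: exporting survey responses to universal file formats.
# Assumes Python with the pandas library; file and column names below
# are hypothetical placeholders, not a required schema.
import pandas as pd

# Hypothetical survey responses collected by your organization.
responses = pd.DataFrame({
    "participant_id": [101, 102, 103],
    "families_served": [4, 2, 6],
    "survey_date": ["2022-01-10", "2022-01-11", "2022-01-12"],
})

# CSV is plain text and readable by virtually any tool, now or years from now.
responses.to_csv("survey_responses.csv", index=False)

# Excel output works too, if your team prefers spreadsheets
# (writing .xlsx requires the openpyxl package).
responses.to_excel("survey_responses.xlsx", index=False)
```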
We’ve all been there. You embark on a long, tedious process to understand a problem, only to find that a solution has already been implemented. This is frustrating both for the staff who spent time finding evidence-based solutions and for the stakeholders. Before you evaluate a program, ask yourself, “Why am I doing an impact evaluation? Do I need more information to make my decision? Will I be open to changing program operations depending on the information I receive? Do I have time to wait for a true impact evaluation?” If you answered no to the last three questions, it might not be the right time or the right program for an impact evaluation.
No doubt your program helps many different types of people, but chances are it doesn’t help everyone in the entire world. Ask yourself, “Who is my program, as it is today, developed for? Who benefits most from participating?” These individuals are your target population, and they are the ones to focus on in your evaluation.
Evaluating your impact is an ongoing process where decisions will have to be made along the way. If you don’t write down why you asked specific questions or combined data in a certain way, you are in for a headache in the future. Have you ever been in a meeting, heard something, and thought, “Oh, I need to write that down. Nah, I’ll remember it”? Well, how many times did you actually remember? The best impact evaluations are ones that someone else in your organization can pick up and continue without the original evaluator’s assistance. These evaluations allow organizations to build on data, alleviate turnover concerns, and answer questions years after the study has been conducted. When conducting your evaluation, always think of the future: what would someone need to know a year from now to successfully understand what I did?
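One lightweight way to capture those decisions is a running decision log saved right next to your data. Here is a minimal sketch, again assuming Python; every field name and entry below is a hypothetical example, not a required format.

```python
# Minimal sketch of a decision log kept alongside your evaluation data.
# All entries below are hypothetical examples.
import csv
from datetime import date

decisions = [
    {
        "date": date(2022, 1, 15).isoformat(),
        "decision": "Combined survey items Q3 and Q4 into one food-security score",
        "rationale": "Both items measure the same end outcome for families",
    },
    {
        "date": date(2022, 2, 1).isoformat(),
        "decision": "Dropped responses with no participant ID",
        "rationale": "Cannot link responses to program records without an ID",
    },
]

# Saved as CSV so a future evaluator can open it in any tool.
with open("decision_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["date", "decision", "rationale"])
    writer.writeheader()
    writer.writerows(decisions)
```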
Evaluations are time consuming and can be scary for your participants, organization, funders, and outside stakeholders. Curb these fears at the beginning of the project by talking with each group about why you are evaluating the program, how you are going to evaluate it, and where everyone can find the results. Buy-in is key. You want to express to each of your stakeholders why this evaluation will benefit them and, where applicable, their clients. Doing an impact evaluation is not about determining whether a program is exceeding expectations or failing (remember step one); it is about learning how to better support the people you serve every single day.
Impact evaluations take time, and while you will get small pieces of data throughout the process, you can’t rush to the press to share results until you have the full story. There is nothing worse than sharing really positive early study results, only to walk back those findings when the full evaluation is completed. Be kind to yourself and your evaluator and don’t rush results. Think about your program: if it takes six weeks for participants to finish, you know you will need to wait at least six weeks to obtain all of your data. Next, think about how long it will take to analyze. For surveys or smaller data sets, this can be relatively quick; you may only need a month. For larger data sets, you may need a couple of months. Set realistic timelines for when the data will be available. Honest communication is key.
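If it helps to see the arithmetic laid out, here is a minimal back-of-the-envelope sketch in Python; the start date and durations are hypothetical examples drawn from the scenario above, not benchmarks for your program.

```python
# Back-of-the-envelope timeline for when results will realistically be ready.
# The start date and durations are hypothetical examples, not benchmarks.
from datetime import date, timedelta

program_start = date(2022, 3, 1)       # hypothetical program start
program_length = timedelta(weeks=6)    # participants take six weeks to finish
analysis_time = timedelta(weeks=4)     # roughly a month for a small data set

data_complete = program_start + program_length
results_ready = data_complete + analysis_time

print(f"All data collected by: {data_complete}")
print(f"Realistic results date: {results_ready}")
```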
We all know that the first couple of months after rolling out a new program or service may be a little messy. That is okay. Don’t make the new process harder than it needs to be by evaluating the program before it is ready. Think about how long it will take to get your team up to speed on your new program and work out all the glitches before you add on an evaluation. This does not mean you don’t think about evaluation until your program is implemented; evaluation planning should be included in program development to make sure the right data is collected. However, you don’t have to start collecting data on day one. Try a pilot program to understand how your program is running and whether your data collection tools are working. Give yourself, your team, and your program a six-month grace period before data collection begins. Think of it as a trial run. It is better to have a trial run and find errors than to find them in a full-scale evaluation.
You cannot guarantee positive results when you conduct an impact evaluation. Receiving bad news from an impact evaluation can hurt, but that doesn’t mean you burn it all down. An evaluation is meant to honestly show how your program is doing. If it isn’t where you want it to be, your evaluation sets the direction. (Remember to think of evaluations as practice.) Use your evaluation to structure your goals for the next year and signal to all stakeholders how you will improve. Transparency, honesty, and a desire for continuous improvement are hallmarks of a successful organization. The only truly unsuccessful evaluation is the one that sits on the shelf with no action behind it.
Lastly, don’t forget that even a little bit of progress is better than nothing. Keep a record of where you started so you can track how far you have come. You got this!