How Evaluation Helps Nonprofits Thrive with Dr. Annette Shtivelband

Dr. Annette Shtivelband is Founder and Principal Consultant of Research Evaluation Consulting. For more than a decade, Dr. Shtivelband has worked with dozens of organizations as a researcher, evaluator, and consultant. She works with her clients to systematically, strategically, and thoroughly measure their impact. She excels in program evaluation, scale development and validation, training, and strategies that promote positive organizational change.

Evaluation is a powerful tool for nonprofit organizations. In fact, I believe that evaluation is the “secret sauce” that differentiates organizations that thrive versus those that only survive. Nonprofits that are able to utilize and leverage evaluation will have more successful and sustainable organizations.

For more information, go HERE.

 

Read the Interview

Hugh Ballou: Greetings, it’s Hugh Ballou and Russell David Dennis again for The Nonprofit Exchange. How are things out there in mile high Colorado?

Russell Dennis: Well, it’s 50 degrees now. About this time tomorrow, not so much.

Hugh: It’s about that here in central western Virginia in Lynchburg. We have these old mountains here, the ones with trees on them. They look pretty and green. We have a special guest today, someone who you located. Why don’t you ask her to introduce herself? We are going to talk about evaluations today. Fascinating subject. I think a lot of us go too fast to think about this, but it’s time to slow down, smell the roses, and check in. Russell, take it, won’t you?

Russell: Before you people start running away from your computers at a high rate of speed, I want you to know that we have a data scientist who makes data science marvelous. If you ever get to one of her trainings, go; I am planning to go to some here. Annette Shtivelband is here. She is with Research Evaluation Consulting; she founded it. She is a doctor. Probably the youngest doctor you've ever seen. Welcome, Annette. Tell us a little bit about yourself so folks can get to know you.

Annette Shtivelband: Thank you. I own a company called Research Evaluation Consulting. We work with agencies and help them with anything related to evaluation and data. Think of us as individuals who can help a nonprofit organization, a school, a foundation, or even a government entity figure out if their programs are making the impact they think they are making. We work a lot with data, and we love numbers, but we're a lot more than that. We really believe that when organizations have good findings and reliable information, they can make the best decisions moving forward.

Russell: What is the first reaction you get when you walk into a room and say, “I am going to help you with evaluation?” The word “data” comes out. Do they look nervous? Bewildered? What is typically the first reaction a person has when you talk to them?

Annette: That depends on who the client is. I’ve had some clients who are excited because they know that with this type of information, they can be more competitive for grants. They have communication to help them tell their stakeholders how they are making a difference. They’re happy. But someone else might be earlier on in their journey and process. In those moments, I know that I personally try to simplify things and make it less daunting. I think a lot of people know the numbers a lot better than they think. Oftentimes, they have a lot of great information we can work with. We try to make things as easy as possible for them.

Russell: It’s interesting. If I were to pass you on the street and didn’t know who you were, I wouldn’t think you were a data scientist. How did you come about doing this? How did you choose this for a career? You started your own company, so what motivated you to do this work?

Annette: I've always been naturally inquisitive. Actually, when I was a child, my kindergarten teacher told my parents I asked too many questions. I always wanted to know why things were the way that they were. I think that this natural curiosity helped me go toward paths where I could ask and answer questions as part of what I did. I think also, with my focus and REC's focus on nonprofit organizations, I have been able to see how a lot of different organizations and families might struggle. My family is a first-generation family here. We're originally from Ukraine. My parents really worked hard to provide us with a good life because the same opportunities weren't necessarily there where they were from. I had some of those earlier experiences working for different nonprofits as an intern and feeling passionate about what everyone was doing and how they were trying to serve a community, and I was also on the receiving end. My family received a lot of help [technical issue]. Becoming an entrepreneur was something new; I didn't go to business school for that. I am the type of person who really wants to take risks to make sure I am living an authentic life. I really believed in what we do. I went out there, and we have been in business for over four years.

Russell: You have a great pay-it-forward story where you didn't have the best circumstances starting out, and you wanted to make sure that by working, you helped people in need. In that four-year span, you have done some remarkable projects and worked with agencies of all sizes. What are one or two projects whose results you are really pleased with?

Annette: REC worked with Habitat for Humanity in Colorado. We conducted an impact study. This was an amazing project because this organization had a strong commitment to understanding how their services across the state of Colorado were making an impact in the lives of the homeowners that they served. I was able to put together a strong team of individuals who really got it from the homeowner side, and we collected data from affiliates as well. We weren't just looking at the impact on those who were able to qualify for the program, but also on the affiliates that served across the state. We also looked at the financial data to assess the economic impact. If I recall, we were able to estimate that Habitat for Humanity in Colorado contributed over $60 million to the state. That is a huge accomplishment. Because they had this information and they invested in this project, they had their most successful year in terms of donations and additional support because they could prove they were making a difference, which was awesome.

Russell: Sometimes it looks like the word “impact” is overused. People don’t necessarily have an idea of what that might be. What is your best definition of “impact”? Is that something that varies from organization to organization?

Annette: I think that's a great question. What I would say is that impact for social sector agencies is how they're making a difference in the lives of the people that they're serving. So impact is not something that is measured in one second. It's something that happens over time. With impact, it's some kind of change. Maybe it's a change in the attitudes of the people that you're serving. Maybe it's a change in their well-being, if that's something your nonprofit organization focuses on. Or maybe it's really just trying to better understand what kind of mark your organization is leaving. I actually do find that sometimes the words "output" and "impact" get mushed together. Sometimes even funders are encouraging nonprofit organizations to measure their impact. That being said, first you need to measure your outputs: because your programming and services exist, what are you producing? For example, if you had an organization that provided after-school programming to at-risk youth, how many programs are you offering? How many classes are there? How many students are there? Those would be examples of outputs. However, the change that you are making in the lives of those at-risk youth over time, that is a better example of an impact. I think that's really what is at the heart of it.

Russell: Why is it important for nonprofits to make evaluating and measuring everything they do a priority?

Annette: I think that evaluation is this secret sauce that can help organizations not just survive, but thrive. Oftentimes, I have heard from my clients, "I wish we could get more grant funding," or "Oh my goodness, I wish we had more people sponsoring our programs," or "I wish I knew better how we were making a difference in the lives of the people we are serving. We need more information for our communications and marketing departments." At first glance, it may seem like you should focus on the grant writer or the communications expert. In some capacity, yes. However, if you really want to give them the materials they need to share how your organization is making a difference, evaluation needs to be the priority. With a good evaluation effort, you can demonstrate how you are making an impact. You have evidence.

For example, we worked with a low-income senior living community for a number of years. In partnership with another consulting firm that focused on grant writing, we were able to give them the information they needed to help them generate over $500,000 over the span of the last couple of years. Beforehand, they didn’t have all the information they needed to show funders they were making a positive impact.

Russell: One of the things we preach at SynerVision is strategy, having that solid foundation so you have something to build from. What do you find is the most common type of resistance an organization would have to the idea, especially if they haven’t done it before, of creating evaluations and measures?

Annette: I think it would depend on whether they view evaluation as something about learning and improvement or about accountability. If they view it primarily as accountability, then someone like myself, an external evaluator, might be perceived as the person who comes into the organization, points a finger, and says, "You did this wrong," or "This program isn't working." I think people may view evaluation incorrectly from that perspective.

But I think that it’s like anything. Change is difficult. For a lot of organizations, they may not have had an initial experience with evaluation. They are still trying to figure out how this is going to change how we do things. Will this change my role? Will this lead to a reduction of programs? There is a defense mechanism. There is fear.

When we work with our clients, we try to assure them we are going to leverage and use this information to support them and ultimately help them sustain what they are doing. There are some times where you discover that a program or service is not working the way you think it is. In that instance, the best thing you can do is pivot.

For example, a story I like to mention is that a number of years ago, a group of individuals developed the DARE program. It was designed to reduce negative behaviors in youth: drugs, alcohol, those sorts of things. Everyone loved the idea. They had police officers come into the classroom and communicate to the kids, “Don’t do these things.” However, years later, researchers like myself looked at schools that had the DARE program versus those that did not. They had an increased rate of those negative behaviors. It’s important to make sure your programs are working the way you would anticipate they’re working. They did remedy those programs. They are now effective and are making the impact they think they’re making. It’s a quality control. It’s a way to make sure your programs are strong.

Russell: How do we know when an organization is at that point where they are ready to evaluate what they’re doing in the way that’s impactful?

Annette: I think that if there are individuals within that organization that want to better understand how their programs and services are making a difference, that is a huge first step. Oftentimes, organizations may not have full buy-in from all stakeholders. But as you have an opportunity to work with them and demonstrate the value of the services that you provide, there is a lot more buy-in. It builds this momentum and develops a more data-centric culture and climate. Also, for some organizations, they may already have someone in-house who has that kind of evaluation expertise. Those lucky organizations are able to ask and answer different kinds of questions that will serve the mission and vision of the organization. It depends on the size of the organization and what they are hoping to learn and achieve in the next year or the next five years.

Russell: I wore a lot of hats in one of my old roles working for a tribal government. It was all over the place. One of the things that happens in governments is you have a change in leadership. Sometimes that comes with a wholesale change of direction. I would bring people in. We would have questions on the table. We would have a staff meeting, and a number of us would sit around a table and make suggestions, which proved to be unsatisfactory. We'd bring in a consultant and pay them to do a report, and the findings would be eerily similar to what we had suggested. Is that pretty common? That would be a good segue into the question of when it would be important to bring in an outside evaluator.

Annette: I think that it really does depend on the organization. I think that if evaluation is done well, it's going to highlight findings that are in alignment with what stakeholders across the organization may think. But it may also shed light on some nuances that may not have been realized. The way that I see it, thinking about that example of everyone around the table, is that there is your truth, my truth, and whatever the "truth" actually is. When you are able to get information from everyone on some level, you're able to make stronger conclusions and feel more confident in the decisions that you're making. Until you do that, you will have a scenario where someone in leadership thinks they know what the best plan of action is, and other people will just smile and nod. It may not be the best decision for your organization.

Russell: That's a huge benefit. There are some other benefits as well to using an outside evaluator. Talk about some of those.

Annette: You have someone with an objective, external perspective who is able to come in, work with your organization, and let you know what some common trends are. Sometimes, within an organization, there are a lot of moving pieces. There isn't someone who is able to look at all the different departments or all the ways your organization is working with data. It provides you with this intel on what is working well and what needs to be improved, and you might have some unintended findings that actually help you build better strategies for your organization. I really do think that evaluation, when used correctly, is something that can inform strategic plans or strategic directions. Sometimes you might need to make hard decisions, and it lets you know what would make the most sense moving forward. It's really a vital tool to make decisions at the end of the day.

Russell: I think when people run into a doctor who is a data scientist, that might be intimidating for some people. Oh no, this evaluation thing will be a lot of heavy lifting. What are some things you do to help them assuage that fear?

Annette: I will let you know I am not intimidating. It's about building that relationship at the outset with clients so that they know what to expect and what they can do to support the evaluation; we pretty much do the heavy lifting behind the scenes. I happen to think I am not intimidating in any way. As you commented on earlier, I do look relatively young. I can assure you I know my stuff. It's about that relationship. I think that it's one thing to be able to run advanced statistical analyses, but it's about communicating with your clients and making sure that everyone understands what's been found and what it means moving forward. The way we approach things is we partner with our clients and let them be the experts of their organization and what they are accomplishing. They are the subject matter experts. We are the experts when it comes to the actual technical skills. We collaborate. We work together. We may ask follow-up questions because we don't want to assume anything. Those kinds of assumptions can lead you down a rabbit hole. It's a collaborative process between ourselves and our clients.

Russell: I think the intimidation factor comes from the idea of dealing with numbers. When it came time to have your budget projections for the year, people were hard to find. Where did everybody go? This idea of making a budget is the numbers that are intimidating to people more than anything else. But that collaborative process sounds like the way to help them through that. Is there a specific point person in each organization that is usually best for you to work with? Or does that vary between organizations?

Annette: I would say it typically is a program manager. For evaluating a specific program, whoever is in charge of the program is probably the best person for us to speak with. But in other instances, for more complex projects, we might actually be working with the executive director. Whoever that person is needs to be the hub of information so that everyone is not getting swamped with dozens of emails over a 6-9-month project.

Russell: Brendon Burchard, a well-known content creation expert, talked about three things that encourage people to use the tools they have. They should be easy to access, easy to understand, and easy to use. Is there a training component that goes with the work you do as you are assembling these evaluations? Are people open to the training? Do they have apprehension?

Annette: You've touched on something that in my field is called building evaluation capacity. Essentially, it's not about simply fishing for our clients and doing that forever. It's about building a culture and providing training, professional development, tools, or resources that enable the organization to do that themselves. With one of our current clients, at first, we were a lot more involved in the process. We were diving in and figuring out what was working and what was not. As time progressed, one thing we did for them is we helped them hire our replacement: an internal evaluator who had the right skills and the right fit and who could easily go into that organization and be that internal evaluator for them. I think that it really is dependent on the client in terms of how much help and support they want. I know that I personally have led a number of trainings for different clients. I try to understand what my clients' needs are and develop tools and resources that are specific to them.

Russell: It’s that work they do with you that solidifies the value that this process brings. Is there a typical learning curve, or does that vary by organization or type of project?

Annette: I would say that it really varies. I think that for organizations that already have a staff member who feels comfortable with data and numbers, it usually happens a little quicker. Whereas with a smaller nonprofit, where one individual is tasked with everything, it might be a steeper learning curve. But there is that passion and commitment to learn and grow. I have seen that a lot.

Russell: I think that, by and large, people in the for-purpose enterprise space (another name for nonprofits; we are trying to work on the language because nonprofit is a tax status, not a business strategy) just step into this work and do it. Like anything else, being the optimists that we are, we think it's going to take half the money, half the time, and half the number of people to get the thing we want to accomplish done. What are some of the evaluation challenges that nonprofits ought to avoid?

Annette: I'd say that some organizations may rush into impact before they really assess whether or not their programs are being implemented as they were intended or designed. Sometimes organizations think that more data is better. In those instances, they may have a lot of information that is not being utilized anyway; it simply is stressing out their staff. Too much data. Sometimes it's poor data. Sometimes it's the wrong information being collected, or information collected in the wrong way. For example, a while back, we worked with a foundation that was looking at the impact of their program on youth. They had these extensive surveys they designed, but only one question was designed to assess an outcome or an impact. Unfortunately for them, that particular question was not consistent across time. They put a lot of weight into that question, and the quality of the data was not where it needed to be. Things like that can happen. As you mentioned earlier, sometimes there is some resistance to evaluation and data. It's not exactly the sexiest of services out there. I think of my job as bringing my own passion to the data and numbers and making sure that quality programs are going out there as I work with my clients. It's not about pointing out what's wrong as much as it is giving organizations information to make good decisions.

Hugh: It appears from the experience that Russ and I have that organizations, especially in the nonprofit world, do not spend time assessing any of these metrics or the outcomes or impact. A lot of the individual funders want to see results. We call it Return on Life. The grantmakers demand it and say you have to do some evaluations. It would be good to set them up in advance. Put the tools into place. Part of it is the decision that you just triggered: What will we measure? How do we measure it? How do we quantify that? Our clients are our donors. We want to stay in touch with our sponsors. We want to measure eyeballs on their product. We have been working on an upgraded tool for SMART goals called SMARTER: Evaluate and Revise. Annette, I have thrown some things out at you, so I am going to shut up again since I have a froggy voice.

Annette: Is your question how do you determine what to measure?

Hugh: The whole thing of setting up a system. Is it virtual? Is it a tool you use electronically? Do you talk to people? As you are setting up these metrics, because sometimes people measure the wrong things, how do you have that conversation? How do you discern what are the right things? What does the mechanism look like to do that measuring?

Annette: Got it. I think what it really comes down to is whether or not your organization is measuring outcomes that are linked with your mission and vision and goals. It's one of those things where an organization needs to have a clear understanding of what those might be first. Why does your organization exist? Once you have that foundation, you're able to dive deeper into what might indicate whether or not you're having the impact that you think you're having. It really depends on where you want to focus your energies. If your organization offers one program, that would be the focus. You would be thinking about what kinds of activities you provide, what inputs go into those activities, and what outputs come out. At the end of the day, you might have a theory of change. Essentially, a theory of change says we exist because of A, B, and C, and because of A, B, and C, we are going to have this particular impact. Then it's a lot easier to identify what metrics will help you measure whether or not you had the impact that you thought you had.

Russell: And for the people out in the field delivering the services, how important is it that they be able to use the tools that you create to track the data that you need?

Annette: It's really important. At the end of the day, I think that having good intentions about the programs and services that you're offering is fantastic. Having that commitment is even better. But when you are able to quantify it, and not only see whether one person is having an impact but also how we as an organization are collectively making a difference, that is where the data comes in and provides that kind of critical information. What those specific tools look like, whether you have paper surveys or iPads, can vary. It should be based on who your target is.

Russell: Different people and different ages all communicate differently. People are on different platforms. Pip was here talking about marketing a few weeks ago, about being in a lot of places and using as much social media as you can to get that message out there. How important is it for people to have the big picture of the organization, so that it's clear what they are trying to do, in order to build a good evaluation program?

Annette: I think that it really depends on how motivated the organization may be. Something that I recommend is starting small. Maybe you don't need to evaluate your entire organization. Perhaps there is one specific program that is new and you need to find out whether it's working. For example, we have a client that we're working with right now. We did a case study. They had a new program. They had never tested it out before. They wanted us to take a deep dive into that one program because they were going to use that information to solidify some of that content and training. Other organizations have been around for a long time and are ready to look at everything: the good, the bad, the ugly, everything. In those cases, there is definitely a greater need to take a more big-picture look.

Russell: That’s for sure. Strategy, that is our go-to word. What are some common trends that you have noticed in your work with nonprofits? Back in the days when I wrote my first grant, you could really focus on the emotion. I think you have to hit both cylinders now. That might be harder to do. Can you speak to that?

Annette: Definitely. What I have learned from funders is they want to have the feels. They want to feel that excitement and passion that the stories bring. But they also want the evidence. They want the numbers. They want proof that what you're doing is making a difference. Going back to that earlier part of the question about what trends I have been observing: even just in the last couple of years, I have noticed a shift. I think that a lot of nonprofit leaders are a little bit more concerned about funding and resources. There have been a lot of cuts at the local, state, and federal government levels. They are having to pivot and find new ways to generate funds moving forward. But what that can mean is there is more pressure. For example, I have seen how an organization that had been around for a long time went out of business. That is the worst-case scenario. That is what an organization wants to avoid, but it does happen.

I have also seen that there seems to be more competition between nonprofit organizations than there might have been before. In Colorado, and I know that we are reaching out beyond Colorado, but let's use this as an example, we have over 20,000 nonprofits, and that number is growing every single day. I think that sometimes people get really excited about something they're passionate about, or about their idea, and they go and start a nonprofit. I admire that. But sometimes what they don't realize is there might be other organizations that do the exact same thing. There is a certain amount of money that is allocated to that one particular cause, and they are all fighting for the same resources. What I think organizations and leaders might want to do is conduct some research prior to starting an organization to make sure their programs or services are needed. Maybe they are, but maybe there is a niche that they would be able to fill that would better serve their community in a way that is less competitive and more complementary.

Russell: Berny Dohrmann, CEO and founder of CEO Space, actually calls competition a virus. That can keep people narrowly focused. How important do you think it is for collaboration and building partnerships? Does that actually make evaluation easier?

Annette: I think that collaboration is key. I think that in this day and age, we cannot be experts at everything. We can't provide programs and services to every single type of person out there. My personal opinion is that if you know what you're good at, and you really focus on making those things as strong as possible, all of a sudden you're able to go out there in the world and speak with other leaders and individuals and talk about how you can work together rather than competing for the same resources. I don't think this is just for the nonprofit sector. As an entrepreneur, I notice that some folks have more of a scarcity mindset and others an abundance mindset. I am the latter. I have a lot of colleagues who are evaluators. We might go after the same RFP or RFQ. It's up to the client to decide who they want to work with. It's part of the way the world is right now. I do think that when you know, individually or organizationally, what your strength is, you're better able to attract the opportunities that will support you.

Russell: It's about working from strengths, which is one of the things we talk about at SynerVision in helping people create a strategy. We did a podcast a while back with one of our good friends, Dr. David Gruder, one of our WayFinders. There is a relationship with money that people have that is an unspoken thing. A lot of people look at evaluation as, "I don't want to do this, but it says in the RFP we have to do it to get the money." How does evaluation fit in the nonprofit landscape as a concept?

Annette: I think that organizations that utilize evaluation and leverage the findings can differentiate themselves from other organizations that might just be going through the same motions year after year after year. At the end of the day, things are constantly in flux and changing. Even more well-established organizations may benefit from working with an evaluator to make sure the programs and services they offered a decade ago are still making the kind of impact that they think they're making. If so, great. If not, maybe it's an opportunity to pivot and grow in a new area.

Russell: A strategy is not forever. It's a 3-5-year window as we practice it. We're always looking at things. That is what evaluation is for. One of the earmarks of a high-performance nonprofit is keeping things on track: it's the systems they have, it's their measures for each program. In order to make a difference, you have to have a target audience to serve. One thing I have run across before, and I don't know how common it is: I had a talk about two years ago with someone in the workforce development community for one county. We were talking about a program that he created. He wanted to pick my brain. People were not accessing the program. I discovered that he had not really talked to the end users to find out what challenges they would have. Transportation to this facility was among the challenges. Do you find that it's a common problem that people have gone out and created programs without gathering enough information from the people they are trying to serve? How do you help them work around that?

Annette: I have seen this. In an ideal scenario, as you are developing programs or services, you have conversations with those end users. You find out what's important to them. What might be the best way to make that information accessible to them? You give them that connection. You might give them a snippet of what that training or program is, get their feedback, and refine. As you mentioned, sometimes there can be a disconnect. I think that what a nonprofit organization can do, if they have an idea, is talk, as early in the process as possible, to the people who are going to benefit from that idea and make sure it's something that they need or want.

I remember a number of years ago, when I was in graduate school, I attended a presentation on lessons learned from a researcher. The basic idea was that they had designed a program for single mothers to help them lift themselves up and get better jobs. What they found was that those women were not enrolling in the program the way they thought they would. After they did more research, they found that when those women had upward mobility, they lost a lot of the child care benefits and supports they had received. They were making more money, but they were losing money if you looked at those other metrics. They had no motivation to go through the program because it was a cost to them rather than a benefit.

Russell: Is that something that seems to be trending upward? Or has it stayed level? Or has it decreased?

Annette: I think that nonprofits and leadership are talking more with the communities they are serving. I think that they get that it's not just about what they think is important, but it's really about what the end users think is important. There has been a similar dialogue in the evaluation community. I was part of a group of evaluators who had a long discussion about equitable evaluation and what that looks like. We had a lot of fascinating conversations on the topic. The way evaluation typically works is that someone from the outside looks in; it's something that we do to other people rather than a collaborative process or effort with those who are being served. I do think that there are conversations out there about how to include those end users much earlier in the process.

Russell: People can view these things like a bad homework assignment or a long paper that they put off to the end of the semester. Are there some instruments or methods that lend themselves better to gathering data from the people you serve?

Annette: I think the most important thing is to know who that audience is. You would design a different kind of survey for youth versus adults. Education is definitely a factor to consider. It's usually a best practice to write something at a third- or fourth-grade reading level if it goes out to the public. When you're working cross-culturally, you may have a survey tool in English and need it translated to Spanish. But it's not as simple as translating it word for word. It's about having that similar meaning so that the two items are equivalent. Something that we do is work with a native speaker to translate a survey, but we also have someone who is more of an academic, and we have them work together to create that tool. If you are doing something completely different, say international research, the people you're surveying may not be acquainted with going online and completing surveys. In those instances, you may need to conduct an interview or a focus group. It might also depend on who that population is. If you are working with a vulnerable population, I would not recommend a focus group, because if you were in their situation, would you want to open up to a bunch of strangers about something traumatic that happened to you? Maybe it's more of that private space. There are a lot of nuances when you are working with individuals.

Russell: We had Mary Putman from the Reciprocity Collective a few weeks back. She connects homeless people with jobs. She talked about trying to get them connected with other services. It's a population that is hard to track, and I have learned from her that sometimes these folks don't really talk to you. I have had some experience trying to talk to a man whom I've seen here locally and connect him with help. There seems to be something there that he doesn't trust. Are you finding that nonprofits that work with vulnerable populations are having trouble reaching some of these people they want to reach?

Annette: I think that it can be a challenge. I think the most important thing is having that trust and rapport. There are instances where maybe it's not someone like myself going out into the field to speak with individuals who are homeless. Maybe it's the client themselves doing that work, because they've had those conversations. They've learned more about those individuals. They are able to ask for feedback. You're able to create that bridge. From a research perspective, there is something called the promotora model, used with Latin American populations. If you're providing goods and services, you're not the one collecting the information; you have someone directly from the community get trained, go in, and collect that important information.

Hugh: This hour has flown by pretty fast. I haven't used my voice much; it was going away. Annette, you can tell I have been under the weather. I wanted to ease in a bit and let you have a chance to share a closing thought for people. Russell has done an outstanding job, as he always does, of interviewing today. I have tried to stay out of the way a little bit.

It occurs to me through your expertise of guiding people through this, I remember the old Albert Einstein quote, “Not everything that counts can be counted, and not everything that can be counted counts.” Sometimes we get skewed. Using your expertise in how to measure things and what counts and why it counts, this is such important work.

*Sponsor message from SynerVision’s Community for Community Builders*

Annette, what do you want to leave with people before Russell closes out this really great interview? Thank you for being our guest.

Annette: Thank you. I think my final thought is that evaluation is really something that can make a positive difference in the social sector. It's not something to be afraid of. It's not something that is going to make things worse. If anything, it will make things better. It will help organizations have better and more effective programs. It will help organizations position themselves better for sponsors and donations. And it will help them pivot when they need to as things change. It's what differentiates those organizations that just survive from those that thrive.

Russell: Annette, it's been a very short hour. Thank you so much for sharing your brilliance with our audience here. It makes a difference. The third step to building a high-performance nonprofit is staying on track. You have to measure everything you do. It's well worth the effort.
