MEDA Blog - Stories from the Field

What is it that you do, exactly?

I think everyone in international development has been asked that question, and almost everyone dreads it! People have an idea of aid work as handing out rations to starving children (usually the ones in those old-school Sally Struthers commercials) while wearing khaki. The truth is, though, that there are a ton of different paths you can take in development. The one I've chosen is monitoring and evaluation, and the internship I'm currently doing is in impact assessment.

Impact assessment is exactly what it sounds like: a way to evaluate whether a program is working as intended. It falls under the "Monitoring & Evaluation" (M&E) umbrella, sometimes expanded to "Monitoring, Evaluation, and Learning." Though some development firms only do M&E as required by the terms of reference in their government contracts, many are moving toward more rigorous in-house methods to track and improve program effects. (Since the 2008 recession, government agencies like CIDA, DFID, and USAID have also been requiring better M&E from bidders on grants and contracts, since M&E can improve program efficiency.)

How can you tell if something's working? Well, in my case, data analysis is an essential part of impact assessment. The project I'm working on right now is an evaluation study of MEDA Maroc's training programs. These programs - most notably the 100 Hours to Success training course - are meant to improve youth access to financial services. In September last year, about a year into the program, they gave surveys to clients who had participated in training programs, asking about a range of topics the programs had covered, such as savings, loans, and employment. They also asked about how the clients and their families had changed in the last year; for example, had they bought a refrigerator? A car? Had their monthly household income increased? This kind of data can tell us a lot about possible program effects.
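(To make that concrete, here's a minimal sketch in Python with pandas of the kind of data involved. Everything here is made up for illustration - the columns, values, and tooling are hypothetical, not the project's actual data or setup.)

```python
import pandas as pd

# Hypothetical follow-up survey responses; the columns and values are
# illustrative only, not MEDA Maroc's actual data.
surveys = pd.DataFrame({
    "client_id":        [1, 2, 3, 4, 5, 6],
    "gender":           ["F", "F", "M", "M", "F", "M"],
    "region":           ["urban", "rural", "urban", "rural", "urban", "rural"],
    "has_savings":      [True, False, True, True, True, False],
    "income_increased": [True, True, False, True, False, True],
})

# A first pass at possible program effects: simple proportions
# across all respondents.
print(surveys[["has_savings", "income_increased"]].mean())
```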

A lot of it is descriptive statistics - anyone who's ever taken a research methods course knows the drill on that one! But you have to be smart about what you run; you can't just describe a couple of variables and get useful information. One of MEDA's particular focuses, for example, is gender; when you ask, "What did people think about this program?", you also want to know, "What did girls think about this program?" and, "Compared to boys, how did girls perceive this factor?" You want to know what girls in urban areas thought versus girls in rural areas; you want to disaggregate the data as much as possible so that you know as much as possible. Part of data analysis is turning over rocks in the data set, looking for results that are unexpected or interesting.
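(Continuing the hypothetical sketch from above, that kind of disaggregation is one groupby away in pandas - again, the data and tooling are illustrative, not the project's actual setup.)

```python
# Overall proportions can hide subgroup differences, so break them out.
print(surveys.groupby("gender")["has_savings"].mean())

# Disaggregate further: girls vs. boys, urban vs. rural, in one table.
print(
    surveys
    .groupby(["gender", "region"])[["has_savings", "income_increased"]]
    .mean()
)
```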

(If you're super nerdy, like me, that's the fun part. Other than making beautiful graphs in Excel.)

Numbers can tell you a lot, but you also need the other side of the coin - qualitative data in the form of open-ended questions, focus groups, and case studies. It's really important to get as much depth as possible; although I believe in the power of quantitative data for giving a big-picture overview of a population, I also think that letting clients speak for themselves, and offer suggestions and solutions that work for their lives, is an integral part of delivering sustainable development solutions. Mistakes aren't always as clear-cut as delivering spoiled food to a refugee population; sometimes the problems are subtle, and experience shows us that minor tweaks or additions can have a lasting impact on program effectiveness.

Anyway, that's a little bit about my job! When we talk about statistics, just remember that they're only as good as the person doing them.

Until next time...
