Studying the Past, Looking to the Future: Still Searching for Benefits
By Larry Reed
Larry Reed is the director of the Microcredit Summit Campaign. He has worked for more than 25 years in designing, supporting, and leading activities and organizations that empower poor people to transform their lives and their communities. For most of that time, Reed worked with Opportunity International, including five years as their Africa regional director and eight years as the first CEO of the Opportunity International Network. Reed has taught at the Boulder Institute of Microfinance for 15 years, served as the chair of the SEEP Network, and consulted with industry-wide initiatives like the Smart Campaign for Client Protection and Microfinance Transparency.
We shall not cease from exploration, and the end of all our exploring will be to arrive where we started and know the place for the first time. —T.S. Eliot
Academics published their review of microfinance projects, using some of the most popular methods available to economists at the time. They found some positive benefits, but in most cases, the benefits were so close to zero that they seemed to damn with faint praise. The organizations that ran microfinance projects were upset. The methods used focused solely on economic benefits and did not include the full range of positive changes that microfinance could bring—things like increased education for children, empowerment for women, and better nutrition and housing for families.
I’m not writing about the past ten years and the work of professors at Yale, MIT, and NYU to apply randomized controlled trial methodology to microfinance impact evaluation. The story I’m writing about took place three decades ago this month, when Peter Kilby and David D’Zmura published "Searching for Benefits," a review of five small enterprise development programs from Africa (Upper Volta) and Latin America (Brazil, Dominican Republic, Honduras, and Peru). They used the favorite tool of economists in the ‘80s, cost-benefit analysis, to evaluate these programs. And they did search for benefits. Only one of the programs earned enough money to cover its operating costs, so Kilby and D’Zmura included things like increased wages and profits, backward linkages, import substitution, consumer benefits, and income redistribution as benefits of the programs. In their final calculations, the authors found that benefits exceeded costs in all five programs, but just barely in three of them.
This paper, with its emphasis on counting a wide range of economic benefits but little else, motivated 25 development organizations to begin working together to form the Small Enterprise Evaluation Project and come up with a better way to evaluate programs designed to increase incomes and employment opportunities for those living in poverty. Three years of brainstorming, arguing, refining, and testing led to Monitoring and Evaluating Small Business Projects: A Step by Step Guide. This workbook provided multiple ways of analyzing the economic and social benefits of small enterprise programs, but, at its core, emphasized three key points:
1) organizations should be clear about what positive benefits they seek to foster in their programs;
2) they should come up with simple and cost-effective ways to measure whether or not these positive changes are occurring; and
3) they should use the data they gather from these measurements to improve their programs and expand the benefits they promote.
USAID played a key role in providing financial support for this group to get together and then carry out field tests. Then the Private Voluntary Cooperation (PVC) office of USAID developed its Matching Grant program, which supported many U.S. PVOs (Private Voluntary Organizations in AID speak) in initiating small enterprise projects and building the management capacity to support them.
The group that came up with the Step by Step Guide decided that they had learned a lot from working with each other, and had a lot more they needed to learn. So they transformed the group into the Small Enterprise Education and Promotion (SEEP) Network with start-up funding coming from the Matching Grant office of USAID. Their next project was to create financial reporting standards for microfinance programs, which led to common acceptance of ratios like operational and financial sustainability, portfolio at risk, and clients per account officer.
The PVC office at USAID that funded the Matching Grant invested a lot in developing management capacity and training. Looking back now, it becomes clear what a big role this investment in learning, collaboration, and professional capacity played in building the microfinance sector. Without the PVC office there would be no SEEP, no Boulder Institute, no Village Banking. As the PVC office gave way to the Microenterprise Office, the investments in capacity and learning continued, leading to an emphasis on “massification” and on converting programs into financial institutions capable of becoming banks.
Ironically, as the scope and scale of microfinance institutions grew, the desire to evaluate their performance dwindled into an accounting exercise tracking client growth and sustainability. What was called the “Ohio State school of thought” took hold, which held that if clients were voluntarily taking out loans and paying them back and the organizations providing those loans were covering their costs, then something good must be happening, and there was no need to do much further analysis.
It took a new century and a few wake-up calls to discover that credit could cause harm, that clients might repay even when they were not benefiting, and that the benefits did not always go to the people we intended. As microfinance institutions grew profitable and attracted investors, we also learned that microfinance could generate huge benefits, with the largest share going to investors rather than clients.
Lately we’ve been trying to recover, relying on some of the same collaborative skills that USAID fostered in the ‘80s. We started with the “do no harm” principles of the Smart Campaign—basic tools to make sure that clients are not hurt by microfinance. MicroFinance Transparency shows us how to keep interest rates in check, rather than following the old dogma that “it’s access, not interest rates, that matters to the poor.” The Social Performance Task Force takes us back to some of the original processes of the Step by Step Guide, bringing together a large group of stakeholders to come up with good practices and guidelines for making sure that our methods and results match our intentions.
Lastly, Truelift helps apply the principles of the Step by Step Guide to the mission of the majority of microfinance institutions—to provide those living in poverty with tools that will help them on their journey out of it. For these institutions, using measurement tools will help them make sure they are reaching the people they want to reach, and making operational decisions with that client knowledge can improve products and services so that clients experience positive results over time.
After three decades we are still searching for benefits and the best ways to track and report them. But more importantly, as we learn to track what is happening to clients over time, we are developing the full range of financial services and combining them with other developmental services that can lead to lasting benefits for the poor.