Introduction
On November 20-21, 2009, the Public Policy and Third Sector Initiative of the School of Policy Studies, Queen’s University, held its Ninth Annual Public Policy and Third Sector Forum. Over the last nine years, the Annual Forum has provided a unique setting for learning by bringing together public servants, academics, and community practitioners for dialogue and debate on various themes. This year’s theme was assessing the impact of community and voluntary sector activity. Our goal was to help people learn more about effective and meaningful ways to capture the outputs, outcomes, and impact of community and voluntary sector activity. The many speakers dealt with some of the challenges of going beyond measuring inputs, activities, and outputs in order to assess outcomes and impact. They also presented promising practices in the establishment of indicators, data collection, and analysis. This article discusses some of the innovative practices that were presented. The list is by no means exhaustive. Readers who want to learn more about current practices are invited to visit the conference website (www.queensu.ca/sps/events/third_sector).
Impact assessment has generated considerable interest in both government circles and the voluntary sector. In today’s fiscal and economic context, governments face a long-term expenditure challenge and are increasingly focused on obtaining value for money. This new focus implies choices and tradeoffs regarding where to invest resources. Governments need assurance about the value of community and voluntary sector activities, and they need evidence to back it up. In the philanthropic world, funders and donors alike have also become more selective in their giving, seeking higher returns on their investment. Nonprofit organizations are well aware of these challenges; they have felt the effect of funding cuts and tightening resources for a number of years. As a result, voluntary organizations are keenly mindful that their credibility in the political realm often rests on their capacity to present that evidence.
Value for money will most likely continue to feature in debates about the role and place of voluntary organizations in policy. Achieving value for money broadly means making the best use of the resources available for the purpose at hand—using inputs in a way that maximizes outputs. While this rhetoric is not new, what is new is that governments and philanthropic organizations are seeking a different sort of return. They are willing to look beyond short-term outcomes in order to acknowledge the social and environmental benefits of their investments. Impact assessment can provide one means of reframing the worth of the sector in this broader social, economic, and political context.
Leaders in both the voluntary sector and governments share a common desire to develop outcome measurement processes that are more reliable, are of high quality, and generate comparable information that will help to achieve policy goals. This must begin with a common understanding of the key outcomes to be achieved and how to measure them. Most frequently, economic indicators have been used to assess social progress because they are relatively easy to measure. Yet, hard quantitative measures of progress don’t always reveal meaningful information about a project’s wider social impacts. While most observers recognize the importance of capturing social dimensions, the development of social indicators has been fraught with challenges given the wide range of issues with which they are concerned.
Nevertheless, over the past decade, there has been growing interest in developing social indicators to measure social progress. At the last Annual Forum, we had the opportunity to showcase the work of the Institute of Wellbeing which, with the help of the Atkinson Foundation, has taken a first step in this direction. The Institute developed a new and more comprehensive indicator to track quality of life, the Canadian Index of Wellbeing (CIW). The Honourable Roy Romanow noted that “Most Canadians realize that our wellbeing is not measured by just narrow economic measures like the GDP … The Canadian Index of Wellbeing will change that. It will give a quick snapshot of how we, as Canadians, are really doing.” The CIW covers eight areas of life in Canada: standard of living; health; vitality of communities; education; time use; democratic engagement; arts, culture, and recreation; and environment.
The CIW is one of many initiatives currently underway around the world aimed at measuring social progress and wellbeing. It was developed through a collaborative effort involving several nonprofit organizations, experts, and citizens across Canada. This new approach to measurement offers an integrated set of social and economic indicators that together provide a comprehensive picture of how wellbeing is evolving in Canada. While there remains much to learn about the development of social indicators, Canada now has a basis for reporting and improving health and wellbeing outcomes in a way that reflects what matters most to its citizens. It is hoped that over time the CIW will give citizens the information needed to hold politicians accountable for Canada’s progress.
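To illustrate the mechanics of a composite index of this kind, the short Python sketch below normalizes a set of indicators within each of the eight CIW domains and averages them into a single headline number. Only the domain names are drawn from the article; the equal weighting, baseline normalization, and function names are simplifying assumptions for illustration, not the CIW’s actual methodology.

# Purely illustrative: a generic composite index over the CIW's eight domains.
# The domain names come from the article; the indicator values, equal weights,
# and baseline normalization are hypothetical, not the CIW's actual method.

DOMAINS = [
    "standard of living", "health", "vitality of communities", "education",
    "time use", "democratic engagement", "arts, culture, and recreation",
    "environment",
]

def normalize(value, baseline):
    """Express an indicator as an index relative to its baseline-year value."""
    return 100.0 * value / baseline

def composite_index(indicators, baselines):
    """Average normalized indicators within each domain, then across domains.

    Both arguments map domain -> {indicator name -> value}.
    """
    domain_scores = []
    for domain in DOMAINS:
        scores = [normalize(value, baselines[domain][name])
                  for name, value in indicators[domain].items()]
        domain_scores.append(sum(scores) / len(scores))
    return sum(domain_scores) / len(domain_scores)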
Another very exciting approach to measuring impact presented at the last Annual Forum was the work of Dr. Gina Browne from McMaster University. By establishing a benchmark analysis, Dr. Browne was able to demonstrate the cost effectiveness of recreation and childcare programs for single mothers after a single year. Moving beyond a focus on effectiveness, her model also measures outcomes in economic terms. Dr. Browne’s analysis provides a comprehensive evaluation by focusing on the costs of both health and social services before and after a particular recreation program. The model looked at five different ways of serving single mothers on welfare, including the option of doing nothing. She then established comparative baseline data by measuring health status and use of services at the beginning of the program and upon its termination. Her data illustrate that, although the programs were equally effective, their costs differed because one actually enabled single mothers to exit from social assistance, an important effect that would not have been captured under a traditional evaluation model that looks just at direct outputs (for more information visit www.fhs.mcmaster.ca/slru/home.htm). What was also exciting about Dr. Browne’s model is that it connects the utilization of health and social services. This comprehensive lens, in effect, captures some of the downstream effects of programs. A theme that emerged from the proceedings is that we possess little such comparative data on the impact of the voluntary sector relative to the public and private sectors. In the absence of such data, it may be difficult to come to terms with the sector’s added value.
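The arithmetic behind this kind of benchmark comparison can be made concrete with a brief sketch. The Python snippet below tallies per-participant costs before and after two hypothetical program options and reports the net change; all figures, option names, and cost categories are invented placeholders rather than Dr. Browne’s data or model.

# Illustrative before/after cost comparison across program options, in the
# spirit of the benchmark analysis described above. All figures and option
# names are hypothetical placeholders, not Dr. Browne's actual data.

def annual_cost(health_services, social_services, social_assistance):
    """Total per-participant annual cost across health services, social
    services, and social assistance payments (dollars)."""
    return health_services + social_services + social_assistance

options = {
    "do nothing": {
        "before": annual_cost(4000, 1500, 9000),
        "after": annual_cost(4200, 1500, 9000),
    },
    "recreation and childcare": {
        "before": annual_cost(4000, 1500, 9000),
        "after": annual_cost(3200, 2100, 6500),
    },
}

for name, costs in options.items():
    change = costs["after"] - costs["before"]
    print(f"{name}: change in annual cost per participant = {change:+,.0f} dollars")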
Dr. Browne’s model was unique in that it presented evidence showing that governments can obtain a significant shift in outcomes for the same cost and within a short time frame. Her research also illustrates the downstream effects of investment in recreation, health, and wellness. Despite this evidence, persuading all relevant funders that an investment in recreation reduces health costs remains a challenge. Government departments at the provincial and federal levels continue to operate in silos and do not see the economies of scale that can be gained by thinking holistically about programs and services. Nevertheless, many municipal governments, and some provincial government departments, have found her evidence compelling. She has received multiple research grants, and a number of municipal governments have implemented her approach on a trial basis. We were lucky to count amongst the Annual Forum’s participants David Szwarc, Chief Administrative Officer of the Region of Peel, who shared his own experience with the implementation of Dr. Browne’s recreation strategy in his community. His administration saw returns on its investment within two years.
Despite the progress made in the development of indicators, there remains no way to ensure that the information obtained will be used to improve decision-making and, ultimately, the results of programs and activities. Participants at the Annual Forum, even Dr. Browne herself, were careful to point to the limits of evidence. The development of social indicators such as the CIW and of benchmark analyses like Dr. Browne’s requires sophisticated techniques and skills. They illustrate that impact assessment is a complex process that requires a diverse range of methodological tools and approaches. There are no simple indicators that capture the full extent of community and voluntary sector activities.
While empirical information is necessary, on its own it is insufficient to effect policy change. Identifying causal pathways from specific activities, relative to other drivers of change, remains difficult. The reliability and credibility of social indicators, compared with more established economic indicators, are still being questioned. Most participants at the conference agreed that this complexity should not prevent efforts to develop strong, high-quality information, both qualitative and quantitative. If we are truly to realize this potential, or “to measure up to” the challenges we face, innovation and leadership will be vital.
An approach that has gained a great deal of traction among voluntary organizations is participatory evaluation, or empowerment evaluation, which involves all or selected stakeholders in the evaluation process. One positive aspect of this approach is that it prevents the dilution of ownership of accountability that usually occurs when mandated activities are spread across organizations. Another is that involving the community in impact measurement is a good mechanism to ensure buy-in, which may reinforce the usability of the measurement in the longer term. Finally, engagement in the process generates greater transparency for beneficiaries and stakeholders. It may even motivate staff and volunteers.
Two notable examples were discussed at the conference, both in the area of poverty reduction. In Ontario, the Daily Bread Food Bank, in collaboration with the Caledon Institute of Social Policy and with the support of the Metcalf Foundation, first developed the idea of a “deprivation index” as a means to distinguish the poor from the non-poor based on an assessment of their standard of living. The deprivation index is rapidly becoming a new standard in the measurement of poverty. Michael Mendelson of the Caledon Institute of Social Policy credited the legitimacy of the measure and the resulting community support to the unique engagement approach taken to develop it.
Indeed, from the start, the Daily Bread Food Bank and the Caledon Institute of Social Policy engaged citizens in debate and discussion around the development of deprivation indicators. In the first stage of the process, they interviewed 1,775 food bank clients across Toronto. As they narrowed down the indicators, they led ten focus groups with 49 participants. Throughout the process, they also employed people living in poverty as researchers and facilitators. Reflecting upon the value of this new indicator, Michael Mendelson noted that it represented the real-life experiences of people living in poverty. As such, it provided an important complement to the current portrait of poverty in Ontario by capturing social dimensions of poverty that traditional indicators did not. This measure gained such community support that the Ontario government adopted it as part of its poverty reduction strategy.
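For readers unfamiliar with how a deprivation index is scored, the sketch below shows one common scoring logic: count the items a household lacks because it cannot afford them, and flag deprivation when the count meets a threshold. The items, threshold, and data structure are hypothetical placeholders, not the indicators actually selected in the Ontario work.

# Illustrative scoring of a household against a deprivation index. The items
# and threshold are hypothetical placeholders, not the indicators actually
# selected by the Daily Bread Food Bank and the Caledon Institute.

ITEMS = [
    "eats fresh fruit and vegetables daily",
    "can replace broken appliances",
    "takes part in a hobby or leisure activity",
    "can get dental care when needed",
]
THRESHOLD = 2  # lacking at least this many items for affordability reasons flags deprivation

def deprivation_score(household):
    """Count items the household lacks because it cannot afford them.

    `household` maps item -> {"has": bool, "cannot_afford": bool}.
    """
    return sum(
        1 for item in ITEMS
        if not household.get(item, {}).get("has", False)
        and household.get(item, {}).get("cannot_afford", False)
    )

def is_deprived(household):
    """Flag a household as deprived when its score meets the threshold."""
    return deprivation_score(household) >= THRESHOLD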
In Quebec, a different approach was taken in the selection of indicators to assess progress in poverty reduction. A consultative committee on poverty and social exclusion, chaired by Dr. Alain Noël of the University of Montreal, was established to produce recommendations on targets and actions to reduce poverty, reporting directly to the department in charge of the poverty reduction strategy. While the committee was mostly composed of government officials and academics, two members of voluntary organizations working with people living in poverty also participated. They were key players in the transfer of knowledge to the various stakeholders engaged in the fight against poverty and social exclusion. A good evaluation system should integrate the needs and purposes of the various actors and stakeholders, and at the conference Alain Noël discussed the important role these two representatives played within the committee in bridging academic knowledge with lived experience. Their participation will also help ensure that the recommendations of the committee gain buy-in from the community.
Both the Ontario and Quebec examples illustrate the important leadership role voluntary organizations can play in the development of indicators. Organizations took it upon themselves to create opportunities for citizens to engage in the process and, as a result, these indicators are more likely to gain credibility and favour with the citizenry. This engagement can also enhance public debate and discussion regarding the impact of policy choices, ultimately leading to greater accountability. Since this work is still at an early stage, leadership will be necessary to keep the indicators on the policy agenda.
Conclusions
There is no one accepted and acceptable philosophy of evaluation, and it is not our intention to impose one. We have seen some new developments over the past decade in terms of impact assessment: a widening of the perspective to include social dimensions, a refining of benchmark analyses, and the greater use of community engagement in the process. More innovation will likely follow. The current context of fiscal constraint may provide some opportunities to reposition the relationship between government and the voluntary sector. Over the past decade, both have struggled mightily to connect with one another. On one hand, the voluntary sector has gone to great lengths to quantify its economic contribution and value, and to measure the size and scope of its activity, in order to be seen and acknowledged politically. Yet those data have done little to change the policy and political orientation of current governments. On the other hand, governments require the knowledge base and information accumulated by voluntary organizations; they are profoundly dependent on the voluntary sector and its activities in their operations and the delivery of services. Therefore, initiatives such as those presented at the Annual Forum need to be encouraged and supported. Innovation and learning need to occur in a safe environment. Until the environment becomes more conducive to innovation, and until capacity issues are properly dealt with, perhaps this is as good as it will get for a while.
Rachel Laforest is Assistant Professor and Head of the Public Policy and Third Sector Initiative, School of Policy Studies, Queen’s University. Email: laforest@queensu.ca.