Disrupting Disadvantage
Finding what (really) works and what doesn’t
For service-based organisations working to break cycles of disadvantage and reduce poverty, there's an imperative to measure their impact: to understand whether the design of their services, and the way they're being delivered, is having a positive effect and benefiting people as intended.
Strategic planning tools such as Theory of Change, Program Logic and the co-design of Service Delivery Models have introduced more rigour and efficacy into service delivery organisations - but these tools are sometimes filled with assumptions and hypotheses that collide with the complexity of real-world environmental factors, human behaviour and unintended consequences.
There are now tools that make impact measurement more reliable and affordable, such as the Centre for Social Impact's (CSI) Amplify Online, which puts validated and reliable social impact indicators in the hands of for-purpose organisations so they can conduct independent outcomes measurement. Folk was CSI's strategic design partner in developing Amplify, and in consultation we heard first-hand about the practical challenges organisations face in impact and outcomes measurement, particularly when delivering government programs, where evaluation is primarily associated with performance, compliance and competition for funding.
For government there are much bigger questions. At the top of the list: how can government get a better return on the programs it funds? How can it deliver the greatest positive impact, for the largest number of people, with taxpayers' dollars? Which policies and programs are the most and least successful? Are we funding ineffective programs while withdrawing funding from effective ones?
To answer these questions, governments look to systematically evaluate the programs they fund and use the results to improve decision-making, returns and outcomes. But, according to a new research report from the Committee for Economic Development of Australia (CEDA), not all evaluations are equal. And it sounds like others have come to a similar conclusion.
Dr Andrew Leigh spoke at this week's report launch, alongside CEDA CEO Melinda Cilento and report author, Senior Economist Cassandra Winzar. The new report is the third in CEDA's research series on entrenched disadvantage, which explores how Australia can disrupt the poverty cycle through better evaluation of the programs designed to tackle it. Disrupting Disadvantage - Finding What Works focuses on evaluating community services for their effectiveness and value. The report outlines how governments can use data collection to build more disciplined and consistent program evaluation, and how to foster a culture that enables this.
CEDA examined 20 Federal Government programs with a total program expenditure of more than $200 billion. Ninety-five per cent were found not to have been properly evaluated. And the Federal Government isn’t alone in this problem – analysis of state government evaluations shows similar results.
The report highlights how effective evaluation starts in policy and program design: clearly stated objectives, a definition of success or intended outcomes, and a plan to collect data over the life of the program to inform evaluation.
To help ‘build in’ evaluation, CEDA supports establishing an ‘Office of the Evaluator-General’ to champion and steward evaluation and develop capability and capacity across the public service, alongside new data investments and legislation requiring a regular review of all programs.
The report authors remind us that, ultimately, evaluations are about accountability and transparency: they should give communities the information they need to hold governments to account for the success, or otherwise, of their policies and programs.
You can read Disrupting Disadvantage - Finding what works, along with a snapshot of the findings.
And the Australian Government has just opened the second stage of consultation on the Measuring What Matters statement - relevant if you want to have your say.
In related news, for the last two years I've been happy to be part of CEDA's Better Human Services member advisory committee. The committee brings together people from member organisations to advise on practical ways to improve the design and delivery of human services - services critical to the health and wellbeing of individuals, to broader economic, social and community development, and to improved standards of living. There's more here if you're interested in joining CEDA.