The Importance of Making Meaningful Measurable
August 10, 2015
For many in the United States, the United Kingdom (UK) has been a standout among political powers because of its treatment of youth work—afterschool programming, voluntary services, job training, housing—as a public good. Most youth services were primarily funded by local municipalities. In 2010, this public support was revoked as a part of broader austerity measures enacted in the wake of the global financial crisis. Many youth programs, already reeling from reduced resources under the Blair government, found themselves defunded overnight. Those serving the poorest children and youth were among the hardest hit.
Not all programs survived those deep spending cuts. The ones that did became increasingly fragmented, working hard to remain independently viable. Over time, youth work in the UK came to resemble the United States: a range of loose networks and individual programs reaching out to the same participants and competing for limited resources.
There remains one key distinction between the two countries’ systems, however, as I learned during a recent trip to the UK. There, youth workers seem far more concerned than their US peers with quantifying program impact, quality and effectiveness—with making “meaningful” work measurable. Perhaps made wiser by their struggles, many UK youth workers are vocal that programs must be able to prove their value. If they cannot, they risk a repeat of 2010, with funders or the government deciding they do not deserve continued support.
In June, I joined a group of youth development and education leaders for the National Afterschool Association’s (NAA) inaugural International Learning Exchange. In each meeting we heard some version of this cautionary tale, and of the importance of making the meaningful measurable.
Leon Feinstein of the Early Intervention Foundation said it best. In 2010, he was at the Treasury. Austerity programming was in full swing, and Leon—an academic and analyst—was one of those tasked with figuring out the measurable impact of various social policies and programs.
Meeting Leon, it is clear that he holds a deep belief in the value of youth work. Even so, against a dismal backdrop of shoddy data and reporting, and in comparison to data-rich systems like healthcare and education, he could not make the case that youth work was worth the money. Although there must have been parents in Parliament whose kids benefited from neighborhood youth services, in this case, stories of meaning were no match for measurable results and political will.
During hard times, stories are no match for proof points. It’s human nature. As a mom, I take into consideration friends’ opinions and anecdotes when I decide what my children should eat or wear; but when my kids are sick or in trouble, I look for facts about the problem and proven solutions to fix it. When making decisions with potentially serious consequences, I always choose what I believe is safest and most effective.
I left London wondering if the UK’s experience might be a prophetic call to those of us across the pond. We differ from the UK’s pre-austerity youth services because we cobble funding from multiple sources, not just public dollars. As a field, youth workers seem to share an ambivalent, even laissez-faire attitude toward data reporting and evaluation. Many of us see reporting, logic models, and outcomes statements as a time-sucking requirement of grant applications, funding reports and new proposals. Few are trained and motivated to use data as a tool to make our programs matter even more. Will it take a crisis for those of us in the US to fully embrace the positive potential of measuring quality and impact?
Resources seem to be getting tighter, and funders are making tough decisions. I have been on the receiving end of letters informing me that our organization did not get the grant because we were one of 20 applicants for two awards, or 200 for ten. Funders are demanding better data—proof that what they fund works. Youth programs’ anecdotal evidence of impact will no longer suffice. It’s not that these stories are unimportant; they illustrate the human impact and give the data meaning. They are necessary, but no longer sufficient.
Austerity and those deep cuts of 2010 changed everything for youth services in the UK, making tough times tougher for many programs, and for young people and their families. Yet the cuts also prompted critical improvements to practice. Today, UK leaders in youth policy and practice are working together and devoting more time to measuring their programs’ impacts. From “What Works” centers to national evidence repositories, the field is actively curating and promoting evidence of what works and why. In the US, let’s not wait for our own crisis; let’s learn from this shift in focus and start making youth work meaningful and measurable. It matters.
The second annual NAA International Learning Exchange, to Dublin and Belfast, is scheduled for June 27–July 1, 2016. For more information, visit www.naaweb.org.