Youth Today: Evaluation: Risk or Responsibility?

By Karen Pittman, October 2004

The July/August issue of Youth Today featured not one, but two lengthy stories on evaluation. The first, sharing the front page with a horrific story of fatal neglect within a juvenile justice facility, detailed the fallout after a recent evaluation of the Court Appointed Special Advocates (CASA) program. The second, a less controversial report on the results of a multifaceted evaluation of YouthBuild, ran inside.

I applaud the editors of Youth Today for devoting so much space to this important topic. I fear, however, that as is often the case in mainstream news, the messages in the front-page article overshadowed the messages in the second.

The YouthBuild article reported evaluation methodologies, caveats and findings in a readable, straightforward format. The main takeaway: Evaluations provide systematic answers that anecdotes simply cannot, as well as information that can help improve programs.

The CASA article, on the other hand, reinforced every program director’s worst fear: exposure. The main takeaway in this story was not only implied by the facts, but actually articulated by the reporter, who suggested the field take note of “the risks groups take when complying with mounting demands from funders to prove what they do works.”

I disagree. This is not the lesson I want the field to absorb.

The story said that glowing consumer satisfaction surveys emboldened CASA executives to commission a more rigorous control-group evaluation. CASA executives, according to reporter Barbara White Stack, assumed this move was “without risk.”

The more rigorous evaluation, however, not only challenged the effectiveness of the court volunteers’ services, but suggested that they spend little time on cases, particularly those of black children, and are associated with more removals from the home and fewer efforts to reunite children with parents or relatives.

The evaluation methodology admittedly had some weaknesses, and the findings raised important questions. So far, according to the article, the findings have confirmed the suspicions of CASA critics and have been discounted as evaluation flukes by CASA executives.

The moral of this story is not to avoid evaluation, but to start early. Don’t wait, as CASA did, until there is so much at stake the results have to be good.

YouthBuild and CASA are large national programs that receive significant public dollars and make significant claims about their impact. Rigorous external evaluation should be a given for them. (That there should be equally “given” ways to fund these evaluations could likely be the topic of another column.) The high visibility of these kinds of programs raises the stakes higher, which is reason to be cautious, not complicit.

The youth field, however, is made up mostly of programs no one has heard of. And local communities, even if they are blessed with YouthBuilds, CASAs, YMCAs and the like, still have the right to ask if the affiliates in their neighborhoods are delivering what the national studies show.

It is time to stop portraying evaluation as a risk and embrace it as a responsibility. This is true for the large nationals as well as for local networks. Consider the story of YouthNet of Greater Kansas City, a coalition of youth-serving organizations that includes national affiliates such as Camp Fire USA and locally based groups such as Visible Horizons, which serves Native American youth. After five years of careful work building relationships, defining standards and improving capacity, YouthNet has reinvented itself as a transparent, results-driven network. According to its new vision:

“[A]gencies collaborating with YouthNet will set themselves apart from other local youth serving organizations because of their willingness to share individual agency assessment data with local stakeholders.”

Participating agencies agree to integrate the shared standards into their programs, share quality assessment ratings with relevant stakeholders (including funders and parents) and commit executive time to participate in the network. YouthNet agrees to coordinate technical assistance, training and capacity-building.

All 18 organizations that joined YouthNet’s slow transformation process voluntarily signed collaboration agreements. Why? A sense of responsibility, as articulated in the rationale statement:

“The release of assessment results of out of school time programs [is a] very important next step in the evolution of the youth development sector overall and in Kansas City... Only by taking this leap will money continue to flow to the youth serving sector and only then can there be any hope of increased investment.”

I spoke with YouthNet President Deborah Craig shortly after the 18th organization signed the agreement. Instead of expressing anxiety about the road ahead, Craig appeared to be ecstatic about what had transpired and proud that all of the YouthNet members had chosen to make this public commitment.

Is this type of action risky? Yes. But these organizations are banking on the fact that the distinction of being responsible rather than reluctant monitors of quality will help them leverage the additional investment dollars needed to sustain and expand accountable, quality-driven programs. YouthNet will accept new members in 2007. I’m betting that 18 new organizations will apply.

Read More:
White Stack, B. (2004, July/August). “An Evaluation of Volunteers Courts Controversy.” Youth Today. Retrieved October 5, 2004, from www.youthtoday.org/youthtoday/July_Aug04/story2_7_04.html.

YouthBuild USA Web site. Retrieved October 5, 2004, from www.youthbuild.org.

National Court Appointed Special Advocates for Children Web site. Retrieved October 5, 2004, from www.nationalcasa.org.

YouthNet of Greater Kansas City Web site. Retrieved October 5, 2004, from www.kcyouthnet.org.

Visible Horizons Web site. Retrieved October 5, 2004, from www.visiblehorizons.org.

Harvard Family Research Project. (2004, Spring). The Evaluation Exchange: Evaluating Out-of-School Time Program Quality. Retrieved October 5, 2004, from www.gse.harvard.edu/hfrp/eval/issue25/index.html.

Harvard Family Research Project’s Out-of-School Time Program Evaluation Database. Retrieved October 5, 2004, from www.gse.harvard.edu/hfrp/projects/afterschool/evaldatabase.html.

Issue by Issue: Program Quality, a Web-based resource available from the Forum for Youth Investment at www.forumforyouthinvestment.org/issues/quality.htm.

_______________
Pittman, K. (2004, October). “Evaluation: Risk or Responsibility?” Washington, DC: The Forum for Youth Investment. A version of this article appears in Youth Today.

Karen Pittman is executive director of the Forum for Youth Investment.
