University of Minnesota Extension
http://www.extension.umn.edu/
612-624-1222

Beyond boring data


By now, we are all convinced of the importance of doing evaluation of our programs. I hope we've all begun to collect data to inform our stakeholders and ourselves about how our programs are doing. I have blogged about practical evaluation in youth programs, and the theme of evaluation has been echoed by others in their posts.

Let's assume that you are collecting and analyzing data about your program -- what next? I argue that you must put in as much effort in communicating data as you did in collecting it.

Before making choices about how to package your data, think about:

  1. What data do you have?
  2. Who is the target audience for the data?
  3. What do you want your target audience to know?

In the fall 2011 issue of New Directions for Evaluation, Stephanie Evergreen makes a case for thinking like a graphic designer when communicating data. She states, "Evaluators have a responsibility to make their work as clear and accessible as possible, both to enhance the evaluation's credibility and to encourage the use of evaluation in program change." I agree with her but also think youth workers who do evaluation carry this same responsibility.


Evergreen says that we have a bad habit of making our communication of data boring: "The disconnect lies between our desire to have our findings used and our methods of presenting them." Are you boring your audience with data?

In youth work we have the tendency to be so pleased that we've conducted evaluations that we neglect to think about use and communication. What good are the data if we can't communicate them in a compelling manner? How can we best create communications with users in mind?

Here are some ways to create more compelling communications of data. Compare them with what you do:

  • Jot down the key messages from your evaluation. Build your presentation around these. Think about how you can make these 2-3 ideas stick.

  • Ask youth to help. Chances are they will be able to help you get your creative juices flowing. Plus the act of engaging others in discussing your communication methods has to help you break out of your presentation rut.

  • For an oral presentation, follow the 10-minute rule: if you can't get your point across in 10 minutes, restructure what you're talking about; otherwise your audience will be snoozing.

  • Think about creating two evaluation reports -- one with more depth for stakeholders who want the details and another short, 1-2 page summary that can be shared widely.

Have you found interesting, engaging ways to share evaluation data? What difference has it made?

--Samantha Grant, assistant Extension professor, program evaluation

6 Comments

Hi Samantha,

This blog post makes me so happy! "I argue that you must put in as much effort in communicating data as you did in collecting it." I couldn't have said it better. So true. And in my experience, asking youth for their ideas about how to present data usually brings up more options, and much more creative options, than I could ever devise on my own. I'd love to see how your reporting looks after you try some of the strategies you suggested!

Stephanie

Samantha Grant said:

Thanks for your comment, Stephanie. I hope this post will encourage some people to jazz up their presentation of data. Right now I'm in the middle of a report, and although the funders are looking for a formal report, I hope to make it meaningful and then spend time talking about the results with a team. We can then take that information and share it in creative ways. How do others balance the demands of creativity with grant stipulations?

Deborah Moore said:

Sam, you have hit on a big, important practice dilemma in youth work. I think getting to the purpose of why we evaluate is also key. What if we reframed the purpose to include our natural curiosity? Could we answer "I wonder if..." or "I wonder why...?" questions as we start the evaluation process? We have moved to conducting evaluation primarily to please our funders -- which is not always motivating. It then becomes a check-it-off, drudgery kind of task. Thinking of evaluation as a way of reflecting, learning and sharing what we are learning is far more inviting and useful.

Samantha Grant said:

Hi Deborah,

I appreciate that you are seeing the value of evaluation in building a reflective practice. If we always view evaluation reports as meeting funders' needs, we lose sight of the value that an evaluation can give -- information that can support the program. One thing I warn the groups I work with about is not to follow every "interesting" lead, because it often leads us to collect information that isn't useful. In reflecting on evaluation, if you think about what information would either improve your program or tell about its effects, you are more likely to collect useful information that can be translated into communication pieces. Have others struggled with focusing on why to evaluate?

Brian Hubbard said:

What are your thoughts on youth participatory evaluation? Do stakeholders and policy makers value this method? Do you have any additional resources that may shed light on this?
Thanks for this interesting topic.

Samantha Grant said:

Hi Brian,

I apologize for the late response. I see that technology isn’t helping me out because my first response didn’t make it through.

Some of the suggestions that I offered are based in a participatory approach. The key when involving participants, whether youth or adults, is to get feedback that makes an evaluation more grounded in the culture of the program and also increases buy-in during the process. In the field, there are mixed views on participatory evaluation. The biggest factor that weighs against this approach is the control that is lost when involving a team of participants. My suggestion is to look at what you value most in your current evaluation project -- the ability to be responsive to the program or the ability to have a tight evaluation design. Sometimes one weighs heavier than the other.

I would encourage you to check out resources from the American Evaluation Association on this topic. In addition, the Center for Youth Development published a research report on using youth as quality assessors. Although this wasn’t a complete participatory project, it is a nice example.
http://www.extension.umn.edu/youth/research/quality/docs/Minnesota-4-H-Quality-Improvement-Study.pdf

Thanks for your questions!

Samantha


  • © 2014 Regents of the University of Minnesota. All rights reserved.
  • The University of Minnesota is an equal opportunity educator and employer.