University of Minnesota Extension
http://www.extension.umn.edu/
612-624-1222

Recently in the Evaluation Category


The first year of an event is exciting! Reflecting on the process of program design, youth experience, success stories and unexpected challenges is an opportunity to ask ourselves where we can invest our time and resources to make our program stronger.

The final report of the 'Engineer It' workshop is now available. The workshop was designed by the state STEM team and implemented across Minnesota during the past 12 months by staff, volunteers and youth leaders. I encourage you to review the report, including the data collected, lessons learned and recommendations for future events. The report is available on the YD Intranet here. If you have any questions regarding the report, please contact me via email (hauge257@umn.edu) or telephone (763-482-1323).

If you see an opportunity for improvement or collaboration I ask that you share it with us.

Acknowledgements: Report compiled by Mark Haugen and Sherry Boyce, Extension educators.

Mark Haugen

Extension educator

From the director



Dear YD colleagues,

4-H is committed to helping youth learn and lead, and we know that school success is a very important outcome for youth. So we invested in a study to see how youth who participate in 4-H perform on standardized math and reading tests, and in school attendance, compared with other similar youth, and to understand how parent engagement and duration of 4-H participation affect youth achievement and attendance.

Dale Blyth was the lead investigator of the study, which was conducted in partnership with the Center for Advanced Studies in Child Welfare. Pam Larson Nippolt and others on our staff also contributed to the study, along with a group of internal and community advisors over the past two years. Josey Landrieu and I acted as advisors.

The study looked at a matched sample of 20,000 youth, using data from the Minnesota Department of Education and Department of Human Services, and tracked math and reading scores and attendance over five years for youth who were in grades 3 through 8 in 2006.

The results show that Minnesota youth who participated in 4-H had better math and reading scores, and better attendance, than the matched sample of youth who were not in 4-H. Youth who stayed in 4-H longer did better than youth who were in 4-H for shorter periods of time. And 4-H'ers whose families volunteered in 4-H did better than youth whose families did not.

The data show that youth who join 4-H come into the program already performing better in math and reading than other youth, and maintain (but do not increase) that better performance over time. The data do not show why or how the difference occurs. The difference may be the result of a variety of unknown factors that lead to a higher level of learning like parental support, more resources or other factors that come together to create a difference.

In summary, the findings are:

  • Youth who participated in 4-H had consistently higher attendance and better math and reading scores than their non-4-H peers.
  • Parent involvement in 4-H was associated with increased math scores, but not increased reading scores or school attendance.
  • 4-H youth with more extensive involvement over time had higher attendance and better math and reading scores.
This is very exciting news! You can access a copy of the Academic Achievement of Youth in 4-H report brief here.

Over the next month, we will develop key messages and tools to help you communicate about the study and findings to your local stakeholders.

Sincerely,

Dorothy McCargo Freeman

Associate dean & state 4-H director

YD evaluation tips



Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. Please send any questions or suggestions for future topics to samgrant@umn.edu.

4-H Youth Participation Patterns: What can we learn from enrollment data?

In the first decade of the new millennium, Minnesota 4-H invested in and implemented a statewide enrollment system for the first time in the organization's 100-year history. Newly released results from a descriptive longitudinal study paint a valuable picture of the 4-H "career" of youth participants. The findings can be used for program design and improvement efforts as Minnesota 4-H staff and volunteers, and other youth-serving organizations, gain a clearer understanding of the importance of reaching new audiences, welcoming new members, and engaging youth in meaningful programmatic experiences.

We learned that the first and last grades of participation varied for youth based on their gender, where they lived, and their reported race/ethnicity. We learned that most youth participants joined in the early elementary grades and that these "early joiners" were most often male, lived on farms, and were white. Rural, female, and white youth were also likely to have more years in 4-H overall. Given that some groups were more likely to join early or stay longer, the opportunities and benefits that come from duration and longevity in a youth program were more likely for youth in those groups. This provides an impetus both to reach new audiences earlier and to work to retain them longer.

The findings show that the largest proportion of youth left after their first year in 4-H, and that this likelihood rose the later youth joined. The first year of participation is a critical year for welcoming and retention efforts. Satisfaction surveys show that this is especially important for "first generation" 4-H youth and families, who have less knowledge of the program. This study makes the case for urgent program improvement efforts that engage new audiences and first-time youth and their families in Minnesota 4-H, so that all youth can benefit from the difference 4-H programs make.

Pamela Larson Nippolt

Evaluation and research specialist


The YD September State and Regional Report is now available online! This report includes data gathered from 8th-12th grade 4-H youth across the state using the Universal Common Measure, the statewide results from the post-secondary plans survey project with 4-H seniors, and a report on higher education enrollment of 2013 4-H seniors.

The purpose for this report is to generate discussion about options for measuring two of the three target goals - 21st century skill development and enrollment in higher education for 4-H youth. Consider setting aside time on the agendas that you are helping to set in October and November to discuss and draw lessons from this latest addition. Sam Grant, Sherry Boyce and Pam Larson Nippolt are also available to work with you to use and apply the data in your work and planning.

Pamela Larson Nippolt

Evaluation and research specialist


Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Here are a couple of great resources that you can use to add flair to your evaluation reports:

  • Kuler is a site that lets you play with the color wheel. Choose a main color and then select from six different color schemes to ignite your creativity. You can also import a photo and this tool will pull colors from the photo for you to consider using in graphics.
  • Tineye has a multicolor engine that will search for photos in Creative Commons on Flickr. For example, you can select the colors in the Extension template and Tineye will select photos with those colors. It's a great way to brainstorm different possibilities for illustrating points in your presentation.
Sherry Boyce

Extension educator for program evaluation


The program evaluation team is gathering data and producing a second 2014 report for regional and state program teams. This report will include data gathered from 8th-12th grade 4-H youth across the state using the Universal Common Measure, the statewide results from the post-secondary plans survey project with 4-H seniors, and a report on higher education enrollment of 2013 4-H seniors (along with other reporting). Learn which institution of higher education had the highest Minnesota 4-H enrollment during 2013!

Consider setting aside time on the agendas that you are helping to set in October and November to discuss and draw lessons from this latest addition. Sam Grant, Sherry Boyce and Pam Larson Nippolt are also available to work with you to use and apply the data in your work and planning.

Program evaluation team


Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Member and volunteer data from the 4-H Plus and 4-H Online enrollment are now available for review at the county and state level on the Minnesota 4-H Enrollment Data Tool designed by Robert Neff from Cornell University. Margo Bowerman, YD Educator in the northwest region, connected Minnesota 4-H to Robert and he graciously agreed to upload Minnesota data. The database has launched with 2007-2011 data, and more recent data will be added this fall. You can find it on the internal program evaluation website at: Minnesota 4-H Enrollment Data Tool.

How to use it

State-wide: The database can be used to understand and track annual trends in the youth member and adult volunteer enrollment at the state level. The youth data can be reviewed and compared by grade and school level, demographics, and retention from year to year. The adult volunteer data can be reviewed and compared by type of volunteer, place of residence, and retention from year to year.

County-wide: The data tool also shows trends in youth enrollment for each grade within each program year. Youth and volunteer gender, race and ethnicity can be compared each year with the state-level data. Retention (and dropout) rates can be compared across program years.
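If your team exports enrollment lists, the same retention comparison is easy to reproduce outside the tool. Here is a minimal sketch in Python with pandas; the column names and data are made up for illustration, and the actual Enrollment Data Tool export format may differ:

```python
import pandas as pd

# Hypothetical export: one row per youth member per program year.
enroll = pd.DataFrame({
    "member_id": [1, 2, 3, 4, 1, 2, 5, 1, 5, 6],
    "year":      [2009, 2009, 2009, 2009, 2010, 2010, 2010, 2011, 2011, 2011],
})

# Retention for a year = share of that year's members who re-enroll the next year.
members_by_year = enroll.groupby("year")["member_id"].apply(set)
for year in sorted(members_by_year.index)[:-1]:
    current = members_by_year[year]
    following = members_by_year.get(year + 1, set())
    print(f"{year} -> {year + 1}: {len(current & following) / len(current):.0%} retained")
```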

Ideas for regional and county teams

This database can help teams to better understand enrollment trends and patterns at the county level and compare these local trends to other counties and the state as a whole. Teams can also set goals based on the information provided related to reaching new audiences of youth and volunteers, and retaining youth and volunteers from year to year.

We want to hear from you! Let us know how you are using this information on your own and with teams. If you have questions or comments, please contact Pam Larson Nippolt at 612-624-8445 or nippolt@umn.edu, Sherry Boyce at sboyce@umn.edu or Sam Grant at samgrant@umn.edu.

Pamela Larson Nippolt

Evaluation and research specialist


Infographics are starting to pop up everywhere - take these examples: here, here and here. When it comes to sharing our data, educators are starting to understand that creative methods draw attention to what we are saying by telling a story.

The problem with infographics is that the very technical ones require the use of a graphic designer, and even these pros can mess it up by focusing more on the image than what the data should convey. (Spoiler: don't use an infographic unless you know exactly what point you want to share.)

My new favorite tool allows infographics to be designed without fancy graphic design software. Piktochart was designed to help the non-designer build beautiful infographics. Their tag line is, "We are infographics made easy." What's not to love about that?! (It's also free for the basic version which has limited graphics and features.) Check out the infographic Shipi Kankane designed using Piktochart for her YD tip on the First Year Member experience.

I recently used this tool with the quality statewide team. I shared some information from our quality monitoring data to let quality contacts know where observations are taking place around the state. The whole piece took about an hour to complete once I had figured out how the tool worked.

Try it out and let me know what you think!

Sam Grant
Extension educator for program evaluation


All 8th-12th grade youth who enrolled in 4-H this program year, and their parents/guardians, will be contacted in the next 3-4 weeks by family email with an invitation to participate in a statewide survey project. These youth will be asked to complete a brief survey, developed through National 4-H Council and National 4-H Headquarters, about how youth see themselves in the areas of communicating effectively, making positive choices, building connections, and making contributions. Minnesota results will be reported to your regional teams in the fall for planning related to the target area of 21st century skills.

Please encourage parents and youth to participate in this important project. If you are contacted with questions or have any questions of your own, you can contact Pam Larson Nippolt at nippolt@umn.edu or 612-624-8445. Thank you in advance for supporting this work.

Pam Larson Nippolt

Evaluation & research specialist

Staff news



  • Ann Nordby published in Journal of Extension

    Ann Nordby, web manager for our center, has published an article "Using Twitter to deliver 4-H show announcements." It appears in JOE's Tools of the Trade section, along with other web-based tools that Extension professionals can use to do their work. Read the full article here.

    Minnesota 4-H will use Twitter for announcements at the state fair again this year, and uses it year-round for social media conversation @MN4H.


  • Congratulations to Shipi Kankane!

    Congratulations to Shipi Kankane on her Graduate Education Diversity Internship (GEDI) graduation! The American Evaluation Association's (AEA) GEDI program engages and supports students from groups traditionally underrepresented in the field of evaluation. Along with other interns, Shipi made her final presentation of the internship this June during an intimate luncheon at the AEA/CDC Summer Evaluation Institute in Atlanta. The luncheon was attended by special guests, including AEA President Beverly Parsons.

    A mainstay at the institute, the GEDI graduation celebrates the current cohort's successful completion of all internship program requirements. GEDI interns spend the academic year actively participating in specially designed training in culturally responsive evaluation. The training culminates in deliverables that exercise and apply the concepts learned. Joining the ranks of the 50 successful GEDI alumni, this year's impressive cohort is eager to bring its expertise to the world of evaluation.

    Learn more about the GEDI program here, and get to know this year's cohort here.

    Pamela Larson Nippolt

    Evaluation and research specialist


Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

First year of 4-H enrollment: We asked, they responded.

Why focus on the first year? We recently completed "number crunching" of youth program data from more than 10 years ago and focused on the youngest 4-H'ers - those who enrolled in 4-H early. Why look backwards? Because we can tell a story about youth who enrolled and did not re-enroll over time, and we can understand more about who the youth are who stay and leave.

Important lesson. The first year of 4-H enrollment matters. A lot. We say this because a larger percentage of youth fail to re-enroll after their first year of 4-H than after any other year of participation. This trend holds regardless of the grade or age at which youth first join, but it becomes more pronounced for older youth - those who join 4-H for the first time in grades 6, 7, 8 and beyond. Perhaps they enroll in a 4-H program that is designed as a one-year experience but, even so, the program is designed to provide greater benefits with longer involvement, right? Generally, we hope that youth re-enroll.

Fast forward to this year. There were more than 5,600 youth who enrolled in Minnesota 4-H for the first time last program year. We recently asked them and their parents to tell us about their first year in 4-H.

Some tough feedback.

  • Nearly 1 out of 5 parents thought the enrollment process was challenging.
  • 1 out of 5 parents thought that the policies were difficult for their family to understand.
  • Nearly 1 out of 5 parents did not feel welcomed in the 4-H program.
  • 1 out of 5 parents of younger youth, and 1 out of 4 parents of older youth were not satisfied with the first year of 4-H.
  • Parents wrote comments about the first year in 4-H. Many of the comments related to county fair and how this experience could be improved for newcomers.
Some great feedback.
  • Nearly all parents agreed (92%) that their child had a chance to work on a topic that was of interest to him/her while in 4-H for the first year.
So now what? We are an organization that has committed to regularly asking youth and their parents about their first-year experience. Now that they have told us, our job is to ask, "What is the most effective action we can take based on the feedback provided?" As you prepare for your county fair and the fairs across your region, how can your team let parents know that you listened and that you take their feedback seriously?

Pamela Larson Nippolt

Evaluation and research specialist


Making the first year count! Sharing insights from the First Year Member Survey (youth)

While you are in the midst of county fair, YELLO! and other event planning activities at this time of the year, take a few minutes to see what youth members are saying about their first year in 4-H.

In the 2012-2013 program year, we had approximately 5,000 new 4-H members. Who were they and where did they come from? Click on the infographic to get to know them better!

Every program year, thousands of new youth members begin their exciting journey with 4-H, and our programs impact the lives of Minnesota youth in meaningful ways. An annual First Year Member Satisfaction Survey asks first-year members and parents to reflect on their involvement and experiences in 4-H. This year, parents were the primary audience for the project, and their responses were recorded in the monitoring reports. We also received 54 youth responses from across the state that were specific to the youth experience, reflecting on their first-year participation in 4-H. A summary of these responses is shared in this update; because of the small sample size, they are reported at the state level only.

Key findings

Of the youth respondents, 35 were female and 17 were male, roughly a 2:1 ratio. A majority of respondents spent most of their 4-H time in clubs that met in homes or public settings (36%) and at county fairs (35%).

4-H experiences. Fifty-one percent of the females and 44% of the males felt 'very welcomed' into the 4-H program. When asked, "How easy was it to make friends in 4-H", 65% (35 youth) responded either 'very easy' or 'fairly easy'. On the other hand, 17% (9 youth) responded either 'very difficult' or 'fairly difficult'. For example, one youth mentioned, "It was kind of hard to make friends because everyone was in their own little groups and [didn't] ever talk to me." While the numbers highlight a positive trend, there is still room for improvement.

Youth have learning and leading opportunities. Eighty-seven percent (47 youth) "agreed" or "strongly agreed" that they had opportunities to choose among topics that were interesting, and 4-H allowed them to explore topics in the way they wanted to. Similarly, 89% (48 youth) felt that the topics they worked on were interesting to them and as a result, 78% (42 youth) acknowledged that they had become really good at a project, skill, etc.

Organizing the experience. Youth mentioned that more structured information would have been useful to them in their first year. Quotes from youth responses include:

  • "In the club I am in, it was very disorganized. We take too long discussing things we are going to do."
  • "It was confusing how 4-H worked at first, but I am slowly understanding it."
  • "Didn't work out well."
  • "The projects could be explained better. It was slightly confusing because there was a ton of information thrown at you in a very short time!"
  • "There were some points where my family/parents and I felt uninformed about events, change in event times, or just what to do or where to go in general. The monthly meetings are good to have and a good way to know what's going on in your club and county."
  • "Make more of an effort to inform newcomers on the actual projects. Not just all the stupid club meeting stuff."
  • "It was a little stressful trying to figure out how everything works and what to do for projects the first time."
Some of these quotes suggest that 4-H can be a maze, especially to new members. Opportunities might therefore exist to streamline processes and help first-year members navigate through the system.

Sticking to our motto. As you will hear more in the coming weeks, data from the "4-H Impact Study" and "Descriptive 4-H Career Study" reveal that the first year 4-H experience is a crucial component of 4-H youth participation. It therefore warrants conscious efforts on our part to enhance the experience and make it engaging to our new members!

Shipi Kankane

Program evaluation team


Evaluation study of the 4-H online training modules

A small group of 4-H adult volunteers will be contacted in April and asked to participate in an evaluation study of the 4-H online training modules. The study includes an e-mailed survey and a telephone interview for volunteers who have recently completed one or more of the modules. The results will be used for program improvement purposes. If you are contacted with questions or have any questions of your own, you can contact Pam Larson Nippolt at nippolt@umn.edu or 612-624-8445. Thank you in advance for supporting this work.

Regional monitoring reports available this month

The program evaluation team members will be contacting Educators in each region to introduce the regional reports and to offer technical assistance to your teams. Look for Sherry, Sam and Pam at YOUth and U - we will have limited hard copies available and will send electronic links to all in the coming weeks!

Pam Larson Nippolt

Evaluation & research specialist


Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Conducting interviews with youth is one way to gather information for program improvement, especially when you want to bring youth voice into your improvement process. While more time-intensive than a survey, interviews will provide you with rich contextual information. Below is some information about planning interviews, as well as tips for conducting them.

Planning Phase

Interviews with youth for program improvement are done within the philosophy and intentionality of the program itself - based on positive youth development principles and practices. Even when done for program improvement purposes, it is a good idea to comply with IRB regulations for data practices and protection for youth. Other major decision points in the planning process include how many individuals you will interview, how you will choose your participants, and how you will collect and analyze your data. In addition, you must decide what type of interview you want to conduct. There are three main types of interviews for research purposes:

  • Structured: Interviews that are structured ask all of the participants the exact same questions in the exact same order. There is little flexibility with this interview, meaning that follow-up questions should not be asked. This type of interview should be used if you are looking for specific information and is often used for quantitative research.
  • Semi-structured: Semi-structured interviews are more flexible and responsive to information that comes forward during the interview than structured interviews. While there is still a set protocol of questions, you are able to probe the participant for additional information as appropriate. This type of interview is recommended if you would like to elicit specific information but would like some flexibility to explore and dig deeper with the participant's responses. Also, compared with unstructured interviews, this type of interview forces the researcher to be more intentional about the information that they want to gather.
  • Unstructured: This type seems to be the most "laid-back" of the three types but also calls for the greatest level of skill on the part of the interviewer. There are no pre-determined questions, only a checklist of topics to be covered. This interview often flows more like a conversation with a specific purpose. A benefit of this type of interview is that, because of its free-flow nature, topics may be uncovered that do not get discussed in other interviews. A drawback, however, is that since there are no set questions, it might be hard to compare or aggregate the data between participants.

Conducting interviews

Here are some tips for conducting interviews:

  • Practice: Test the use of the protocol with a colleague (or a young person who you know well) and ask for feedback about the questions themselves and how they fit together.
  • Space: Find a quiet, secluded space for the interview in a public setting (or in a private home if the young person's guardian can be present). You don't want other people overhearing your conversation or your participant being distracted by anything.
  • Warm up: Start the interview with some informal chit chat. This will build your rapport with the participant and ease the participant's (and your) nerves.
  • Probe: If you are not doing a structured interview, make sure to ask participants to clarify or elaborate on their responses. This will provide you with much better data.
  • Notes: If you are using an audio recorder, don't trust it (fully). Jot down some quick notes during the interview. You can elaborate on these later if the audio recorder malfunctions.
  • Relax! If you mess up, that's okay. Also, if you're nervous and tense, the participant will sense that. Do whatever you can to put the participant (and yourself) at ease.
  • Learn More: A good resource for learning about how to conduct interviews is this newly released, comprehensive overview of the interviewing process: InterViews: Learning the Craft of Qualitative Research Interviewing (3rd Edition) by Brinkmann and Kvale (2014).

Siri Scott
Doctoral student in family, youth, and community education


Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Have you ever been using Excel to make a chart and struggled to make it look exactly how you wanted it to look? This happens to me all the time. I have an idea of what the final product should look like, but I can't make Excel work for me. Then I stumbled upon a handful of blogs that take the mystery out of formatting in Excel. Today I'll share one of my favorites.

Ann Emery is an evaluator who educates others about how to use Excel rather than paying for higher priced programs. She shares examples on a website and blog. Some of my favorites include:

  • A list of videos with screen shots of how to use Excel to do things like: create data labels, remove grid lines, and order categories in your charts.
  • How to make a stacked bar chart. Talk about wowing your audience if you do this trick in your next report or presentation.
  • How to make a circle chart in Excel. Check out the Urban Impact Report for an example of how this chart has been used in one of our 4-H Youth Development reports.
Take some time to look around this website for some informative tips that may just make your life a little bit easier.
Sam Grant

Extension educator for program evaluation

Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Many of you are solidifying plans for summer camps and day camps. This is also the time to think about how you might evaluate these programs.

As you set up camp logistics, also add to your file some notes on the intent of the content or activities in the camps:

  • Describe the context. Why is it important to offer this content or these specific camp activities to youth in your county?
  • What do you and families expect as a result of attending camp?
  • What do you expect that youth will learn by attending camp?
  • Do the planned activities have the potential to add up to the learning outcomes that you have determined?
Think ahead to your evaluation:
  • What are the intended outcomes of the program? What is expected to change as a result of the program?
  • What is the purpose of the evaluation?
  • What do you want to learn from the evaluation?
Asking some of these questions in the planning stages will position you better for your evaluation process.

Sherry Boyce

Extension educator, program evaluation

Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

The following evaluation tip by Pam Larson Nippolt was featured on the Youth Development Insight blog on Dec. 18, 2013:

If you are like me, you are often asked to rate your level of satisfaction with quality -- at the doctor's office, at restaurants, at the service station, while shopping online. This practice takes extra time and resources both on the part of the provider AND on the part of the participant.

So why do so many businesses and organizations want to know our opinions about their service, product or program?

The answer is deceptively simple. High satisfaction is a key sign that program participants will continue their participation in the program. As youth development professionals, we understand that program retention increases the chances that young people will reap the benefits - also known as program outcomes - from a high-quality program.

So, a comprehensive monitoring and evaluation approach for a youth development program has, at its foundation, a system for measuring participant satisfaction. In the case of youth programs, Caller, Betts, Carter & Marczak outlined three groups who are important to involve in determining satisfaction with programs - the youth participants themselves, their parents, and the program stakeholders in the community.

We know that parents play a big role in paving the way and making it possible for young people to participate in 4-H. For the past four years, Minnesota 4-H has asked parents and older youth to complete a satisfaction survey after their first year of involvement in 4-H.

The First Year Member Survey is currently "in the field" as they say in the survey business. You can look forward to receiving regional and state level reports for teams to use to understand, manage, and improve the 4-H program in the first quarter of 2014.

Pam Nippolt

Evaluation and research specialist

Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Want a fun and interactive method to engage youth in dialogue? Read on!

For the 4-H Middle School-Aged Youth Learning and Leading Study, we developed a 2-hour activity to help youth discuss the abstract concepts of learning and leading. The purpose of the activity was to better understand learning and leading from the youth's perspective and how 4-H fit into their view of themselves as learners and leaders in their areas of interest.

This activity had two main parts: a poster sticker activity and a focus group discussion. For the poster sticker activity, each youth received a poster where they listed their main areas of interest. Then, under each area of interest youth placed stickers with icons representing aspects of learning and leading as appropriate. Afterward, there was a focus group discussion where youth discussed their perspectives, definitions, and thoughts on learning and leading both in and outside of 4-H.

The findings from this activity showed that youth contribute important information about their own learning and leading if engaged in an age-appropriate manner. The poster sticker activity allowed youth to focus on their own areas of interest, which is naturally engaging. Furthermore, they were challenged to think about how these areas of interest incorporated different aspects of learning and leading. Most importantly, this activity was fun, interesting, and warmed youth up to talk about the complex concepts of learning and leading in the focus group discussion. The questions for the group discussion were intentionally broad to allow youth ample room for discussion. Also, poster paper was used to write ideas and thoughts, which provided a focal point for the discussion that helped youth stay on task.

This method has potential for use in 4-H programs when staff want to encourage youth to reflect on and discuss complex topics. In addition, it can be used for needs assessments, portfolio planning, or incorporating youth perspectives into programming.

Interested? Learn more and try it out! A lesson plan, the template for the poster, the sticker icons, and an example focus group protocol have been uploaded onto Google Drive. We also highlight what worked well with this method, and how to use it or adapt it for your own purposes. You can access it here. Soon, these documents will be uploaded onto the Extension research section of the YD webpages. Please read the instructions file first to answer any questions on downloading, descriptions of documents, or printing.

Have questions? Comment below! Also, feel free to contact Siri Scott (scot0398@umn.edu) or any program evaluation team member with questions. Stay tuned for the next eval tip where we will share the findings from this study.

Siri Scott

Doctoral student in family, youth, and community education

Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Last year, I attended the American Evaluation Association's conference.

Stephanie Evergreen presented a workshop on delivering a strong message in presentations. She laid out what she called "the architecture of a great presentation":

  • 5% Background
  • 20% Bottom Line
  • 50% Explanation
  • 15% So what
  • 10% Call to Action
Key takeaways: Spend a small amount of time on background information; push the key messages up earlier in the presentation rather than as the summary; and rather than ending with the questions or a "thank-you", provide cues for the audience on what you want them to do next.
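To make those proportions concrete, here is a quick sketch (Python, with a hypothetical 30-minute slot) of what the architecture implies for your time budget:

```python
# Evergreen's architecture applied to a hypothetical 30-minute slot.
slot_minutes = 30
architecture = {
    "Background": 0.05,
    "Bottom Line": 0.20,
    "Explanation": 0.50,
    "So what": 0.15,
    "Call to Action": 0.10,
}
for segment, share in architecture.items():
    print(f"{segment}: {share * slot_minutes:.1f} minutes")
# e.g., Background gets only 1.5 minutes; the Bottom Line gets 6 minutes up front.
```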

I think that her breakdown of presentation segments is an excellent filter to use in reviewing PowerPoint presentations, whether they are evaluation reports or educational presentations.

You can watch all three of Stephanie's presentations at Potent Presentations and find more evaluation resources at American Evaluation Association.

Sherry Boyce

Extension educator for program evaluation

Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Building evaluation capacity is crucial for our organization. We all need to know something about evaluation and incorporate evaluation into the design and delivery of our programs.

As you begin to work with your new statewide or regional teams, think about how to build the capacity of that team. Building capacity doesn't happen overnight, but with a few tactical strategies, you will be on your way. Here are a couple of tips that I've learned.

Start where the learner is

Before embarking on capacity building, gain a good understanding of your team's competency in evaluation. Tailor training to the readiness of the group. Some learners may be ready for more advanced training while others are just getting a handle on the basics.

Build confidence and affirm expertise

We work with an incredibly skilled group of youth workers who are naturally building evaluation into their practice without even realizing it. Talk about the ways that they are evaluating or reflecting in their program; how they present data to stakeholders; and how they improve their programs with participant feedback. Knowing that they already act like an evaluator helps to build their confidence in gaining more skills.

Get creative

Use creative, hands-on strategies to get people engaged in the materials. I've found resources from people conducting youth-focused evaluations to be especially hands-on. Materials created for use with youth often work with learners of all ages.

Good luck with your future capacity building!

Samantha Grant

Extension educator for program evaluation

Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

I'm not afraid to admit that I often "steal" resources from other evaluators. One of my favorite resources comes from University of Wisconsin Extension. Evaluating 4-H Youth Development Programs: Measuring and Communicating Value is a comprehensive website for all things evaluation.

Here you can get information on:

  • Getting started evaluating
  • Focusing your evaluation
  • Collecting evaluation data
  • Analyzing data
  • Communicating results
The Focus Your Evaluation section is one of my favorites. You can access information on building a logic model and designing strong outcomes and indicators.

This website is full of examples that are specific to 4-H programs. So often I will review an evaluation resource but struggle to see how I could use it with 4-H programs. With this resource, you don't have to worry. The information was designed (and utilized) with 4-H audiences in mind.

Chances are that if you ask me for some evaluation advice, I'll turn you to this website. Why reinvent the wheel, right? Check out these resources the next time that you conduct an evaluation.

Samantha Grant

Extension educator for program evaluation

From the director


Dear YD colleagues,

In the last edition of YD Update, Pamela Larson Nippolt shared an update from the evaluation team and a 2012 evaluation report that highlights some of their evaluation work last year.

One of Extension's strategic priorities is to use evaluation to ensure the relevance and effectiveness of our programs. Effective program evaluation can show whether a program's identified goals and outcomes are being achieved, point to areas for potential improvement, demonstrate the value of our efforts, and provide evidence to aid funding or policy decisions.

The Center for Youth Development has made evaluation a priority in our work, and continues to invest in building strategies and support for staff to be successful in their evaluation efforts. In the near future you will begin to hear us talk more about common measures that have been developed by National 4-H as part of their effort to develop a 4-H logic model, common outcomes and measures for 4-H programming.

I am excited about this report, and I hope that our regional teams will review, discuss and use the valuable information in the 2012 evaluation report with stakeholder audiences to share the message that evaluation is important and to show how we are measuring the difference 4-H makes in the lives of Minnesota youth.

Sincerely,

Dorothy McCargo Freeman

Associate dean & state 4-H director

The Youth Development program evaluation team members worked with many of you to evaluate programs and report on results during 2012. Check out a summary of that work in the 2012 Youth Development Evaluation Highlights (located on the staff only page under evaluation resources), which provides an overview of some of the evaluation work in Youth Development last year. If you also conducted an evaluation study in 2012 that you would like to share with others, let us know and we will post it on our staff only page.

Our team, with support from the Youth Development Leadership Team, is starting to clarify the need for both program evaluation studies and a strategic, thoughtful approach to monitoring what we do and who we do it with across the organization. What is the difference between evaluation and monitoring? Evaluation studies are based on the questions that program designers want to answer about their program model so that they can move the program ahead to the next level of development. When a program model is ready, evaluation studies measure outcomes (the difference the program makes for participants) as identified in a theory of change or a logic model.

Another important measurement activity that we will begin to put more of our energy toward is "monitoring." Monitoring is the regular and consistent attention to strategic indicators that help us to understand our work. Monitoring data helps us to "course correct" during the program year, to show our work to others, and to plan. The choice of measures for monitoring is based on what we decide to pay attention to on a regular basis.

For example, a youth development team may decide that it is important to understand if a youth audience in a program represents the diversity of race and ethnicity in a given community. In order to pay attention to that, the team could gather, summarize, and reflect on data on race and ethnicity of youth in the program on a regular basis. It sounds simple and straightforward but it takes a lot of work "behind the scenes" to do this. The regular habit of looking at the data becomes part of how the team is strategic about working toward their goals and the goals of the larger organization. When done well, it helps make all of our work easier. In what ways do you already "monitor" your work in 4-H and Youth Development by paying attention to data?
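As a concrete illustration of that monitoring habit, here is a minimal sketch (Python with pandas; the column names and categories are invented, not from any actual 4-H data system) of the kind of summary a team might review on a regular basis:

```python
import pandas as pd

# Hypothetical program roster with a reported race/ethnicity column.
youth = pd.DataFrame({
    "member_id": [1, 2, 3, 4, 5, 6, 7, 8],
    "race_ethnicity": ["White", "Hispanic/Latino", "White", "Black/African American",
                       "White", "Asian", "White", "Hispanic/Latino"],
})

# Program percentages, to be set next to community demographics
# (e.g., county census figures) at each regular review.
program_pct = (youth["race_ethnicity"]
               .value_counts(normalize=True)
               .mul(100)
               .round(1))
print(program_pct)
```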

Pamela Larson Nippolt

Evaluation and research specialist

Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

We can learn about our programs by collecting multiple types of data. Certainly, we need to collect data regarding our outcomes. But, it can also be helpful to collect information for the purpose of program improvement.

When we offer learning experiences with multiple activities, gathering satisfaction data after each activity is one way to sort out where to make changes. Mat Duerden introduced a "Fun-o-meter" in the American Camping Association (ACA) webinar, "Why Evaluations are Awesome!"

Dr. Duerden suggested using a birdhouse with multiple holes as the "Fun-o-meter" to use with younger campers. The top hole was labeled "lots of fun" and the bottom hole was labeled "not fun". After each activity, kids were given a marble to put into the hole to indicate their satisfaction. Staff could count the marbles in each compartment to get feedback.

This is where reusing comes into play. Do you have items in your office storage room that you could use or adapt to develop your own "Fun-o-meter"? Please share your ideas.

Dr. Duerden also offered three statements that he likes to use to get satisfaction feedback from youth:

  • I liked the program
  • I would sign up for the program again
  • I would tell my friends to sign up
This is the recycle part of this tip. Samantha Grant wrote about doing evaluation at camp in her evaluation tip for YD Update in June 2011.

Sam offered several different ideas for gathering data in the camp setting, including her handout on Creative Evaluation Strategies. These ideas are great and worth reviewing as we enter camp season.

Sherry Boyce
Extension educator, program evaluation

Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Starting point.

Involving youth in the inner workings of programs through evaluation seems like the "right" thing to do but it is easy for adults to lose steam while we figure out how to do it "right". Understandably, adults working with youth want to offer a meaningful learning experience. Why not test the waters by involving youth in conducting a needs assessment to inform new club or program approaches for youth?

Focus with a question.

Help youth get started by creating energy with a question about a problem, an opportunity, or about youth in the community. For example, a great starting question might be something like, "What neighborhoods in our community have the most youth residents?" Youth can then gather information by getting acquainted with existing data sources to answer their question. Minnesota Compass is a great resource for community and neighborhood level data, and some of it is directly about youth demographics and social issues.

Facilitate.

Once the youth pose the question and gather the data, facilitation of the discussion is key. After youth select and present their data, ask for their recommendations about what data are important to consider in designing programs for youth, and what the data mean to them. Do these data reflect your experience in this community? What else do we need to know?

It is almost a sure-fire thing that youth will have questions (or doubts) about the validity and reliability of the data. Their questions can lead to creative and fun methods that will help them to gather information. Observations, participant counts, photography, video tours, mapping ... the list is endless, so focusing on the question will help determine the best method. This is also a great opportunity to learn or strengthen skills.

Learn more about how the Minnesota Youth Council members surveyed 1,000 youth about issues of concern for an example of how Minnesota youth are involved in assessing needs: http://www1.extension.umn.edu/youth/training-events/docs/myc-youth-survey-results.pdf.

There are many of us who are exploring ways to involve youth in evaluation and research. Let's support each other!

Pamela Larson Nippolt
Evaluation and research specialist

Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

How many of you can talk about a recent presentation that you attended that was really bad? I think we've all had experiences of watching a presenter deliver good content in very ineffective ways.

Recently, I've been learning more about effective presentations with the help of the American Evaluation Association. Stephanie Evergreen, a presentation and evaluation guru, has spearheaded an effort called Potent Presentations with the American Evaluation Association. The website is full of amazing tools that will help you to improve your presentations.

The tools are divided into three sections:

  • Message
  • Design
  • Delivery
All three of these components are important to delivering a dynamic presentation. I'm sure there's one area that feels like more of a struggle for you, so here's your opportunity to make some improvements.

One of my favorite resources is the Messaging Model Handout found under the Message section. This worksheet will help you to organize a presentation that builds upon brain research and the audience's need to get right to the point. Get rid of presentations that waste the first precious 20 minutes on background information. This tool forces you to organize your presentation in a new and compelling way.

Do you want your information to stick with participants? Try the Delivery Glue handout in the Delivery section, and you'll ensure that participants leave knowing your key messages.

Try one of these tools out the next time that you're planning to teach or present to a group. I hope they'll soon become a favorite resource for you. Now when you attend a really bad presentation, take notes of what the presenter is doing and compare to the Potent Presentation resources. Sometimes the worst examples can be the most enlightening about what not to do.

Samantha Grant

Assistant Extension professor, program evaluation

Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Do you want to get more from your data, know that you have questions you can answer through statistics, but don't have access to a statistical package? I just learned about something and can't wait to share this information with others who want to be able to analyze data sets but who don't want to download or purchase software to do so. If you want to tinker with data and be able to run simple statistics - keep reading. You probably have access to this resource with just a few clicks of a mouse in your friendly Excel package!

First of all, make sure that you have the Data Analysis ToolPak add-in in Excel. Thanks to Beverly Dretzke, a University staff member, author, and educator (her book is entitled Statistics with Microsoft Excel, 5th ed., Boston: Pearson), I now know how to do this and will share it with you:

  • Click Data at the top of the Excel screen. Double-click Data Analysis in the Analysis group at the right.
  • If Data Analysis does not appear as a choice, you will need to load the Microsoft Excel Analysis ToolPak add-in. First click File at the top left of the screen and select Options. Select Add-Ins from the list on the left. Analysis ToolPak and Analysis ToolPak - VBA should both be listed under Active Application Add-ins. If you need to add either of these, first click the name to select it, then click Go at the bottom, check the box to make the add-in active, and click OK.
  • Then start again by clicking Data at the top of the screen and then double clicking Data Analysis.
Once you have the add-in, you have access to a number of basic statistical tools. This is where I recommend purchasing Beverly's book in order to learn how to use the tools for various functions. Her clear and practical writing style makes her book easy to follow and will have you analyzing your data within minutes.
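If you (or a colleague) are comfortable working outside Excel, much of the same basic output can also be produced with free tools. Here is a minimal sketch in Python with pandas, offered as an aside rather than a substitute for the ToolPak; the scores are made up for illustration:

```python
import pandas as pd

# Hypothetical survey scores; describe() mirrors much of the ToolPak's
# "Descriptive Statistics" output (count, mean, std, min, quartiles, max).
scores = pd.Series([4, 5, 3, 4, 5, 2, 4, 4, 3, 5], name="satisfaction")
print(scores.describe())
print("standard error:", round(scores.sem(), 3))  # standard error of the mean
```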

Pamela Larson Nippolt

Evaluation and research specialist


Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Many of you are already planning promotions for 4-H summer events. We want these marketing and communications pieces to showcase all of the great things youth and families can expect from their involvement. Working on these promotional materials signals the start of summer programming efforts.

An evaluation report of your summer programming is like the other bookend. We want reports that represent the changes that happened as a result of the programming. So, although February seems early to think about evaluation reports, this time provides a great opportunity to review the program outcomes and the activities you have planned for those summer events.

The program evaluation educators have been recommending that you use a series of questions early in your planning to get set up for developing a stronger evaluation and evaluation report.

  • What are the outcomes for the program? What is expected to change as a result of the program?
  • What is the program's target population? Who is expected to change as a result of the program?
  • What are the strategies and activities? Are they intentionally designed towards meeting the stated outcomes?
  • What is the purpose of the evaluation? What do you hope to have documented or see changed?
Use these questions to help you think about the evaluation reports you'd like to share with stakeholders in the fall.
Sherry Boyce

Extension educator, program evaluation

Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Happy New Year! The start of the year is a great time to take an inventory of your current programming efforts and your goals for the coming year.

As a member of the YD program evaluation team, I field many questions about how to best approach evaluation design. In the last month, I've worked with a few groups where I had to give the advice that evaluation may not be the next logical step. (Just because I have the word evaluation in my title doesn't mean that I believe we have to evaluate everything.)

Evaluation is a powerful tool to learn more about ways to improve your program or to demonstrate the success of your program. A tool, however, serves a specific purpose, and evaluation may not always be warranted.

Before starting an evaluation, ask yourself the following questions:

  • What is my motivation behind conducting an evaluation? What do I (or my team) want to know about the program?
  • Beyond what you want to know, what actions are you willing to take based on what you learn?
  • What is the dosage of the program, and does the time put into the evaluation outweigh programming time? (An hour-long workshop often doesn't warrant extensive evaluation.) Think about the experience of the participant. The total length of the program may differ from the participant's learning experience time.
  • Am I following a question that is interesting but maybe not relevant to the operation of my program?
  • Do I have the time, budget, and resources to carry out an evaluation?
As you answer these questions, you'll get a better idea about whether or not evaluation is the next logical step for your program. If you're still wondering, buy a colleague a cup of coffee and talk through your reasoning or call one of the program evaluation educators.

Happy evaluating in 2013!

Samantha Grant

Extension educator, program evaluation

Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Back in October I attended the session at Extension's program conference titled Tips on Successful Program Evaluation by Richard Krueger and Mary Anne Casey. One particular piece of their presentation stuck with me, and I find it very useful when talking to others about program design and program evaluation.

At one point during the presentation Mr. Krueger said, "We need to marry program development and program evaluation." I totally agree; these two program planning components go hand in hand. He showed the group Bennett's Hierarchy of Evidence, which gets at this idea of gathering the evidence needed both to inform our program development and to measure our program's impact.

[Image: Bennett's hierarchy of evidence, seven levels]

If we consider these seven levels of evidence, we can clearly see that the answers (evidence) at each level will both inform our program development (the creation of a theory of change, logic model, etc.) and provide evidence that we are achieving the impact and outcomes we set out to accomplish.

Keep this in mind as you set out to do program planning in your new Regional Teams, and you will have a more successful program planning and evaluation process!

Josey Landrieu

Extension educator, program evaluation

Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

At the recent conference of the American Evaluation Association, I picked up several great tips on youth-focused evaluation. One came from a presentation by a group from Claremont Graduate University: in developing a logic model, check the balance of activities related to positive youth development. Be explicit in your logic model about whether each activity is a skill-building activity, an activity about making a meaningful contribution, or an activity about building connections. Then check how those activities map to each outcome related to positive youth development. This tip will help you craft your evaluation plan.

Sherry Boyce

Extension educator, program evaluation

Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Do you want to take the guesswork out of assessing your program? Then a standardized measure may be a good option for you to consider. The perk of a standardized measure is that someone else has spent time designing, editing, and testing it with a group of youth. In some cases, there may also be research on the reliability and validity of the tool, and there may be a sample group for score comparison.

The University of Illinois, under the direction of Reed Larson, developed a tool called the Youth Experience Survey (YES). This measure was designed for high-school-aged adolescents to give feedback on their learning experiences in out-of-school time programs.

The YES measures six domains of development:

  1. Identity Work
  2. Initiative
  3. Basic Skills
  4. Teamwork and Social Skills
  5. Interpersonal Relationships
  6. Adult Networks
Something unique about this tool is that it also assesses negative experiences that are known to impact youth's sense of safety and belonging in youth programs.

This summer, the Rochester East/Region 15 camp team used this assessment as one piece of their camp counselor evaluation. The team found this measure adequately covered the leadership experiences that they hoped their counselors would get from serving in a counselor role.

Remember that a standardized measure isn't always the best fit for your program. First think about your program goals and outcomes, and then view measurement through this lens. A standardized measure may be an easy fix, but if it doesn't measure your learning outcomes, the data you receive will not be useful.

Take a look at some other standardized measures that Cornell University put together. These cover a variety of topics from youth leadership to youth development to food and nutrition.

Samantha Grant

Extension educator, program evaluation

Have you ever asked yourself, "Do I need to go through the IRB?" when working on a research or evaluation project? As the program evaluation team, we often get this question from many of you. It's a valid and important question.

A few days ago the program evaluation educators met with two Institutional Review Board (IRB) staff members so we could get more specific answers. Our main question went like this: I collect evaluation data (survey, interview, focus group, observations) to evaluate a program. Now that I am done, I want to present and/or share the evaluation findings with external and professional audiences. Do I need to go through the IRB? The answer is: No. If you didn't need IRB approval to begin the process (it was not considered human subjects research), you don't need IRB approval to share or disseminate the findings. In other words, if your purpose is to evaluate a program, then you will not need IRB approval to collect or disseminate the data. However, if you do want to conduct a research study, then you should begin by visiting the following website: Does my research need IRB?

So remember: when you are doing straight-up program evaluation and your data answer evaluation questions, you can share and present findings without IRB approval.

Nevertheless, as responsible evaluators and researchers our advice is to use safe and respectful practices when collecting any type of data about youth, families, and programs. Please let us know if you have any IRB-related questions!

Josey Landrieu
Extension educator, program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

I want to share my top two favorite resources that I've gotten from other evaluators.

  1. Likert scale examples. I love this resource for thinking about different ways to score questions. I can get very stuck trying to figure out how I want to ask a question on a survey, and this is a tool that has helped me time and time again. It is also my most-shared link with other educators, so I'm sure many of you are seeing it for a second time.
  2. Tagxedo. In one of our first YD evaluation tips, we told you about Wordle. Wordle lets you make word cloud images from text, giving greater prominence to words that appear more frequently in the source text. Now, think of Wordle on steroids and you get Tagxedo. What I love about Tagxedo is that it allows you to customize your images beyond what is available in Wordle, and the images can be downloaded for easy insertion into reports or PowerPoint presentations. You can also create word clouds in any shape; the evaluation team recently designed one for a presentation in the shape of the number 10. Definitely worth checking out (and if you'd rather script your word clouds, see the sketch below).
[Image: example Tagxedo word cloud]
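
For those who like to script their data work, here is a minimal sketch using the open-source Python wordcloud package, a stand-in for Wordle and Tagxedo rather than those tools themselves; the response text and output file name are made up for illustration:

    from wordcloud import WordCloud

    # Made-up open-ended survey responses; paste in your own text in practice.
    responses = """
    leadership teamwork fun friends learning leadership
    camp projects leadership friends teamwork learning
    """

    # Words that appear more often render larger, as in Wordle and Tagxedo.
    cloud = WordCloud(width=800, height=400, background_color="white").generate(responses)
    cloud.to_file("word_cloud.png")  # drop the image into a report or slide deck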

Hope you find some use for these tools in your upcoming projects!

Samantha Grant

Extension educator, program evaluation

Congratulations to Sherry Boyce, Samantha Grant, and Josey Landrieu for being selected as the winning team in the "Minnesota Next Top Amazing Evaluator" contest, held by the University's Minnesota Evaluation Studies Institute on July 12, 2012. The Minnesota Evaluation Studies Institute is an interdisciplinary training institute for evaluation studies at the University of Minnesota. The institute brings together faculty expertise from the College of Education and Human Development, the School of Public Health, the Humphrey School of Public Affairs, the Center for Urban and Regional Affairs, and Extension. This fun and creative contest involved submitting products in response to three challenges related to evaluation. Our own Sam, Sherry, and Josey rose to the challenges with wit and insight. Perhaps the most notable "product" that they submitted, and for which they received great acclaim, is the Evaluation Cloak, which is currently on display at the Cloquet Regional Office.


Pamela Nippolt
Assistant Extension professor and YD program leader, program evaluation

Note: The Youth Development program evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the staff only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

We are near the end of the 4-H program year. I know that many of you will be giving presentations to county Extension committees and writing articles for local newspapers as well as having conversations with volunteers and colleagues about 4-H in your counties.

The message from Dorothy Freeman in YD Update and February's evaluation tip, "Writing for Impact: The message, the audience, and the medium," highlighted the templates that are available for packaging your presentations and evaluation reports.

I'd like to encourage you to also think about the packaging or framing of the story you tell.

Revisit the outcomes for youth. Consider the intentions of the events and activities that young people and families participate in.

  • Tip #1: Use language from the outcome statements in your presentations, articles, and reports. It can be easy to fall into shorthand and talk about the number of exhibits at the fair, but this is a great time to talk about the learning and leading experiences of young people in 4-H.
  • Tip #2: Craft your statements around young people rather than numbers of exhibits or numbers of animals. Use the showcases as an opportunity to gather evidence for evaluation purposes.
  • Tip #3: Consider one of these possibilities: Implement a short survey at the end of demonstration day or after conference judging of a specific project. Systematically ask questions about the benefit of participating in the showcase and record the responses. Have a list of photos that you'd like to take that capture young people in active leadership roles.

Happy fair season,

Sherry Boyce
Extension educator, program evaluation

Juice Analytics Chart Chooser is a simple tool that allows you to think creatively but concretely about ways to visualize data that you've collected through an evaluation.

Each of the chart types serves a particular purpose. The tool won't give you detailed specifics, but it does allow you to sort by six purposes on the left-hand side. As you think about how you might use the tool, consider the following questions:

  • What are the reasons for displaying your data? Do you want to compare responses from two groups? A bar chart may be your best option. You can use the "Comparison" selection on the left-hand side of the Chart Chooser to see other charts that are handy for comparisons.
  • Do you want your readers to understand actual scores or average responses on an evaluation? A table may be the best way to present your data.

The three most common graphics that we use in our reporting are some variation of the column/bar chart, the pie chart, and the table. Keep in mind that pie charts are intended to show relationships within a common variable, usually categorical data (gender, age). Bar graphs are meant to show relationships between variables (test scores on multiple questions).
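
To make that rule of thumb concrete, here is a minimal matplotlib sketch in Python; all of the scores, counts, and labels are invented for illustration. It pairs a bar chart comparing two groups across questions with a pie chart showing parts of a categorical whole:

    import matplotlib.pyplot as plt

    # Invented data for illustration only.
    questions = ["Q1", "Q2", "Q3"]
    before = [2.8, 3.1, 2.5]  # mean scores before the program
    after = [3.6, 3.9, 3.4]   # mean scores after the program

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))

    # Bar chart: comparing responses between two groups.
    x = range(len(questions))
    ax1.bar([i - 0.2 for i in x], before, width=0.4, label="Before")
    ax1.bar([i + 0.2 for i in x], after, width=0.4, label="After")
    ax1.set_xticks(list(x))
    ax1.set_xticklabels(questions)
    ax1.set_ylabel("Mean score")
    ax1.legend()
    ax1.set_title("Comparison")

    # Pie chart: parts of a whole within one categorical variable.
    ax2.pie([12, 18], labels=["Boys", "Girls"], autopct="%1.0f%%")
    ax2.set_title("Participants by gender")

    fig.savefig("charts.png")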

In creating any type of chart, first think about what message you are trying to convey. Don't use a chart to fill space or to try to impress an audience. Charts should be used sparingly to tell your story. Sometimes the best story isn't told with a chart but with a quote from a participant or a more detailed explanation.

Make sure to bookmark the Chart Chooser for the next time you report data.

Samantha Grant

Extension educator, program evaluation

As the only federal funding stream that provides dedicated funds for afterschool programs, the 21st Century Community Learning Centers (21st CCLC) initiative plays an important role in supporting innovation in the out-of-school time (OST) field.

The latest issue in the Research Update series reviews evaluations and research studies that showcase innovations in afterschool programs supported by 21st CCLC funding, focusing on three areas of innovation:

  • Promoting academic achievement
  • Evaluating and continuously improving major OST initiatives
  • Providing high-quality OST experiences for youth

You can find profiles of all cited evaluations and research studies in the OST Research & Evaluation Database.

Download now!

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

At the end of March, the Program Evaluation team attended the Minnesota Evaluation Studies Institute. One of our favorite sessions was on 25 Low-Cost/No-Cost Tools for Program Evaluation by Susan Kistler. Check out this resource and you'll find some tips on:

  • Professional looking timelines for projects.
  • Three great sites to find photos for use in presentations.
  • Search tools to keep you up to date on current trends.
  • And so much more!

Let us know what resource helped you in your practice.

Samantha Grant

Extension educator, program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to support staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Have you been looking for new and exciting learning opportunities? Or even an easier way to build, customize, and share your program logic model? I invite you to sit back and enjoy the new and improved CYFERnet website.

The Children, Youth, and Families Education and Research Network web site includes interactive features and vetted resources designed to build evaluation capacity for programs and projects funded by USDA-NIFA as well as the general public. On this site you can learn from experts in the field about working with youth and build a logic model with an easy-to-use custom logic model builder.

You can LEARN, BUILD, EVALUATE, and REPORT! Here are some of the highlights:

  • In the LEARN section of the site, you can explore all aspects of program evaluation with Interactive Learning Modules. Watch videos addressing a range of evaluation topics discussed by experts in the field, and even take quizzes to test your knowledge.
  • In the BUILD section of the site, you can use the interactive Survey and Logic Model builders. (Seriously ... it doesn't get better than this!)
  • In the EVALUATE section of the site, you have access to evaluation resources vetted and reviewed by experts in the fields of evaluation and youth development. Try the Searchable Database of Evaluation Instruments for easy-to-use identification of valid and reliable measures.
  • Last but not least, in the REPORT section of the site, users who have used clickers to gather data are able to create their own reports.

Explore the site to find out more. And bring any of your questions, ideas, or evaluation needs to the YD Program Evaluation Team; we can learn, build, evaluate and report on our YD programming.

Josey Landrieu

Extension educator, program evaluation

Many parents ask their children about school, hoping to find out whether their child had positive interactions with other kids that day, felt a sense of belonging, had homework, or encountered something in the school day that sparked an interest that could be nurtured. But many questions result in less-than-satisfactory answers:

  • How was school today? "Fine."
  • What did you learn at school today? "Nothing."
  • Do you have homework? "No."

When working on a draft of an end-of-workshop survey for youth, I have often caught myself defaulting to questions parallel to those after-school questions:

  • How was the program?
  • What did you learn from the workshop?
  • What will you take away from today's workshop?

The responses on the evaluation forms have not always provided me with the type of information I had hoped for. As a parent, I eventually learned that I needed to change my approach to get more meaningful information: "Tell me about what happened at the lunch table today." Regarding homework, I found that I needed to be very specific. Do you have any worksheets to do? Do you have pages to read for social studies? Do you have an assignment to do problems from the math book? Do you have a report that has been assigned? Do you have a test later this week to study for? Do you need to practice your instrument?

Survey design should follow the same logic of digging for the meaningful information. Spending time to develop good questions can yield more useful data.

Tips for writing end-of-workshop surveys for youth:

  • Ask yourself and/or ask the committee you are working with, "What is it that we really hope to learn from the survey responses?" And further, "What action will we take based on what we learn?"
  • Be specific. Work on crafting questions so that young people understand the exact event or activity that you are referring to.
  • Check your questions for understanding. Terms (e.g., community service or long-term project) or titles (volunteer staff or LQA&E) may be jargon to young people.
  • Consider adding Questionnaire Design: Asking Questions with a Purpose and Collecting Evaluation Data: End-of-Session Questionnaires to your evaluation resources. Both of these, and more, are available at http://www.uwex.edu/ces/pdande/evaluation/evaldocs.html.

Have you found evaluation questions that are effective in getting meaningful responses from youth?

Sherry Boyce

Extension educator, program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Let's say you take the time to conduct an evaluation at the end of your program or training. (First of all, pat yourself on the back.) You maybe glance over the results, but now what? This is where many people get stuck. Posts from me in the next year will focus on interpreting data and building compelling reports, so I hope to give you tools for sharing your evaluation findings.

George F. Grob has produced a well-known article on Writing for Impact. He argues that to effectively communicate your findings, you have to think about three things: the message, the audience, and the medium.

The Message: What do you want people to take away from your evaluation? One of the best tricks is what Grob calls the "mom test." Think about the evaluation you just completed. If someone asked you, "What did you find out?", how would you answer in one or two sentences? That's the mom test. Distilling the information you collected into one thought is challenging, but I guarantee it will force you to really examine your evaluation data.

The Audience: Who do you want to read about or hear about your evaluation? Remember that not all audiences need to or want to hear the same thing. For instance, county commissioners may want to know about changes in learning, but parents may want to know about the quality of the learning environment. Being able to tailor the message to your audience makes your presentation much stronger.

The Medium: How do you package the information? You could deliver a PowerPoint presentation, a two-page fact sheet, or a full evaluation report. No matter what you decide, the presentation of your information is important. In future posts, I will talk more about elements of strong reporting. I recently wrote a blog post for Youth Development Insight that can get you thinking about presentation.

We are lucky in Extension because our communications team has made our lives easier by creating templates for us to use. The report template is my favorite for putting together a professional document.

As you begin to share your data, think about the message, the audience, and the medium.

Samantha Grant

Extension educator, program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

The Forum for Youth Investment, with support from the William T. Grant Foundation, recently released a reviewed collection of youth outcome measures of "soft skills": communication, relationships and collaboration, critical thinking and decision making, and initiative and self-direction. The report cites the Preparing Students for College and Careers policy report finding that "according to teachers, parents, students and Fortune 1000 executives, the critical components of being college- and career-ready focus more on higher-order thinking and performance skills than knowledge of challenging content." In my opinion, the concise review of eight measurement tools does three things very well:
  • Names outcomes that frame the niche of youth learning in community (that happens within intentionally-designed programs like 4-H and other community learning opportunities)
  • Lays out the measures in an easy-to-understand guide with details about reliability, validity, and costs associated with the use of the eight measures
  • Issues a call to action to advance the field by designing practical studies that are also technically sound, and by improving and advancing the measurement of soft skills.

This is a tool that should be on the shelf of youth development leaders, program designers and scholars. You can link to this resource at http://www.forumfyi.org/content/soft-skills-hard-data-

Pam Larson Nippolt
Program leader, Youth Development program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

How do we (as educators and program evaluators) remain responsive to program and participants' cultural understandings about what has merit, worth, and value?

Classical evaluation theorists such as Scriven (1991) define evaluation as determining the merit, worth, or value of things. However, notions of merit, worth, and value are constantly shaped and influenced by the cultural context of the program and its participants. A common thread between culture and evaluation is values.

Culturally responsive evaluation is defined as a "systematic, responsive inquiry that is actively cognizant, understanding, and appreciative of the cultural context in which the evaluation takes place; that frames and articulates the epistemology of the evaluative endeavor; that employs culturally and contextually appropriate methodology, and that uses stakeholder generated, interpretive means to arrive at the results and further use of the findings" (SenGupta, Hopson, and Thompson-Robinson, 2004).

What does this mean to our work in program evaluation and youth development?

  1. We must recognize, appreciate, and incorporate culturally related contextual factors in our design and implementation of our evaluation tools. These factors include but are not limited to: participants' socio-economic status, language, power relationships within and outside the program, and diverse cultural traditions and meaning.
  2. Practicing ongoing self-examination of values, assumptions, and cultural contexts is critical for the culturally responsive evaluator. Such an approach doesn't ask us to leave behind our cultural understandings, values, and skills; instead it challenges us to a heightened state of awareness as the first building block.
  3. Merryfield (1985) calls for a range of methods that include open dialogue and participatory ways to involve stakeholders throughout the process of conducting the evaluation.

How can we apply this to our practice?

  • Gather input from various participants and stakeholders about the evaluation goals, questions, and methods. Do they fit with participants' understanding of what has merit and value within the program?
  • Acknowledge the lenses and filters that frame our perceptions and meaning-making and discover what they can illuminate, but more importantly what they might obscure or ignore.

Adapted from SenGupta, S., Hopson, R., & Thompson-Robinson, M. (2004). Cultural competence in evaluation: An overview. New Directions for Evaluation, 102, 5-19.

Merryfield, M. (1985). The challenges of cross-cultural evaluation: Some views from the field. New Directions for Program Evaluation, 25, 3-17.

Scriven, M. (1991). Evaluation Thesaurus. Newbury Park, CA: Sage.

Josey Landrieu

Extension educator, program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

I like pie. When in doubt and I'm thinking of dessert, I make pie.

As we review data about program participants and develop presentations to share those data with stakeholders, we often default to a favorite way of presenting data about our programs, just as we default to making a favorite dessert.

We might use actual numbers, percentages, graphs, charts, photos, or stories.

I'd like to encourage you to use graphs or charts to present program data. These graphics are an excellent way to both simplify the presentation of data and emphasize important messages.

But before you decide whether to use a bar graph, a pie chart, or a line graph, consider the purpose of your presentation. In Using Numbers to Tell 4-H Success Stories, Mary Marczak recommends asking, "What do we want to show?"

Generally speaking, bar graphs show comparison, pie charts show parts of a whole, and line graphs show growth or progress over time. Choose the graphic that will help your audience learn and remember the key message. Using Graphics, a PowerPoint presentation on Wisconsin Extension's Evaluating 4-H Youth Development Programs web site, has excellent examples and practical tips for using graphics in presentations.
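
If the story you want to tell is growth over time, a line graph is the natural choice. Here is a minimal matplotlib sketch in Python; the enrollment counts are invented for illustration:

    import matplotlib.pyplot as plt

    # Invented enrollment counts for illustration only.
    years = [2007, 2008, 2009, 2010]
    enrollment = [120, 135, 150, 168]

    # Line graph: growth or progress over time.
    plt.plot(years, enrollment, marker="o")
    plt.xticks(years)
    plt.xlabel("Program year")
    plt.ylabel("Youth enrolled")
    plt.title("Enrollment over time")
    plt.savefig("enrollment_trend.png")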

And graphics are like dessert: good data, like good ingredients, will help make them more memorable.

Sherry Boyce

Extension educator, program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Would you like to have more control over the collection, analysis, and presentation of your program data? And would you like to do so in a time-efficient way? If your answer is YES then this YD Evaluation Tip is for you!

This past spring I did some work with Excel. Because of its relatively low learning curve and its powerful analytical and presentation capabilities, I would like to share some features you can use to work with youth development data.

The website SchoolDataTutorials.org has great resources to help us work with data in Excel. Although the name includes the word "school," its content is very pertinent to our work with data in out-of-school time programs.

I invite you to take a few minutes from your busy schedules and take an exciting and productive tour of this website and its tutorials. I've highlighted a couple here that I found most useful for our work in youth programs.

You can start here: These first tutorials show how to move around the Excel workspace. They teach useful "hot key" combinations that help you move around, select data, and save time, as well as how to rearrange, resize, and move rows and columns to enhance readability or create subsets of your data.

One of the most useful tutorials on the website is Entering Data in Excel. This tutorial shows how to use Excel's AutoFill feature to quickly create sequential items (e.g., Youth 1, Youth 2, and Youth 3) or pre-made lists (e.g., days of the week, months of the year).
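
If you ever script your data work instead, the AutoFill idea is easy to reproduce in Python with pandas. This is a minimal sketch, assuming you have pandas and the openpyxl Excel writer installed; the roster contents and file name are made up:

    import pandas as pd

    # Mimic Excel's AutoFill: generate sequential labels in code
    # instead of dragging the fill handle.
    participants = [f"Youth {i}" for i in range(1, 11)]  # Youth 1 ... Youth 10
    months = pd.date_range("2012-01-01", periods=10, freq="MS").strftime("%B")

    df = pd.DataFrame({"participant": participants, "enrolled_month": months})

    # Write to .xlsx so the file can go right back into Excel
    # (requires the openpyxl package as the writer engine).
    df.to_excel("camp_roster.xlsx", index=False)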

Another tutorial I recommend is on Formatting Data. This group of short tutorials shows how to format your raw data files for printing. Create customized headers and footers; put that one column that prints on its own page back with the others; and print column headings on every page, not just the first one!

Go ahead and try some of these with an existing data set you might have in Excel (it doesn't have to be large or complex; any data will work). Since Excel is already on your computer, there is no need to purchase additional software, and you can begin working with data immediately.

Please note: The tutorials were recorded in an older version of Excel, so buttons or commands may differ a bit in your version.

Josey Landrieu

Extension educator, program evaluation

YD evaluation tips: Learning


Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Learning: it underlies all of what we do in Youth Development. Whether you're planning what youth will learn about specific content or what youth workers will learn to become better practitioners, learning is key to our work. Donald Kirkpatrick has become well-known for his four levels of learning evaluation.

The four levels of learning evaluation are:

  • Reaction - what participants thought and felt about the training
    An example question: Would participants rate the session as a good use of time?
  • Learning - an increase in knowledge or capability
    An example question: What did you learn during this program that you will use in the future?
  • Behavior - extent of behavior improvement and implementation or application
    An example question: How have you used the information from the training you completed two months ago?
  • Results - the effects on the organization or environment due to the participant's learning
    An example question: How did what you learned impact your 4-H club?

As you can see from the four levels, reaction and learning are easier to assess immediately after the program or training. Behavior change and results are better suited to a follow-up evaluation. With some word changes, you could use these in a post-training evaluation as a way to assess the intention for improvement (e.g., How will what you learned impact your 4-H club?).
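
To make the timing point concrete, here is a minimal, hypothetical Python sketch. The level names are Kirkpatrick's; the question wording echoes the examples above, and the split between an immediate and a follow-up instrument is an illustrative assumption, not an official tool:

    # Hypothetical question bank keyed by Kirkpatrick level.
    # The wording is illustrative, not an official instrument.
    QUESTION_BANK = {
        "Reaction": ("immediate", "Was this session a good use of your time?"),
        "Learning": ("immediate", "What did you learn that you will use in the future?"),
        "Behavior": ("follow-up", "How have you used the information from the training?"),
        "Results": ("follow-up", "How did what you learned impact your 4-H club?"),
    }

    def build_survey(timing):
        """Collect the questions appropriate for a given survey timing."""
        return [q for (when, q) in QUESTION_BANK.values() if when == timing]

    print("End-of-training survey:", build_survey("immediate"))
    print("Two-month follow-up:", build_survey("follow-up"))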

Find out more about Kirkpatrick's model, including research and evaluation examples. I challenge you to include one of each type of question in your next evaluation of learning.

Samantha Grant

Extension educator, program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Summer is busy with county fairs, the State Fair, and getting ready for fall enrollment and 2011-2012 programming. As you finalize those plans, consider which program you will evaluate in the new 4-H year.

The webinar, Evaluation and Intentionality: You Can't Have One Without the Other, is a great way to jumpstart your thinking as you begin to develop an evaluation plan for one of your programs.

The webinar is presented by Mat Duerden, Coordinator of Texas A&M's Youth Development Initiative (www.ydi.tamu.edu), and Peter Witt, Professor and holder of the Bradberry Recreation and Youth Development chair at Texas A&M University. The link above provides access to the PowerPoint slides as well as the archived webinar.

Here are some highlights:

  • Five dimensions or focus areas of "intentionality" in programs.
  • Tips on developing strong outcome statements.
  • Questions at the end of the webinar that will help you examine the connection between your program's outcome statements and your evaluation.
So, savor those summer moments, and use this webinar as a launching point to jump into the new program year with some fresh ideas about intentionality and evaluation.

Sherry Boyce

Extension educator, program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Are you planning on conducting a focus group in the near future? This is a great data gathering technique that can elicit useful information about your programs.

A focus group is a method of group interviewing in which the interaction between the moderator and the group, as well as the interaction among group members, serves to produce information and insights in response to carefully designed questions. When the desired information about behaviors and motivations is more complex than a questionnaire can capture, focus groups with well-designed questions can often elicit more honest and in-depth information.

Selecting participants. Focus groups are typically composed of six to 10 participants who have a similar association with the topic (e.g., elementary teachers discussing a new reading curriculum). Selecting similar participants may help people share ideas more freely and may prevent results from being so mixed that no conclusions can be drawn.

Number of groups conducted. For evaluation purposes, two to three focus groups are often conducted. Using only one focus group to arrive at conclusions about a particular topic is risky, since the opinions expressed may have more to do with group dynamics than with the topic itself.

Organizing the meeting. Focus groups typically cover about five main questions (each with sub-questions or probes) in the span of 90 minutes.

Setting considerations. The setting in which a focus group is conducted should be comfortable. Quality refreshments and comfortable chairs may go a long way in making participants who have volunteered their time to participate in a focus group feel appreciated. Tables and chairs should be arranged so that all participants can easily see one another.

Facilitating a focus group. Keep the following considerations in mind: have an assistant moderator, be mentally prepared, use purposeful small talk, use pauses and probes, record the discussion, and use an appropriate conclusion (summarize, review the purpose, thank participants, and dismiss).

Finally, a few reminders about focus group questions: use open-ended questions; avoid yes/no and "why" questions; use "think back" questions (e.g., "Think back to a time when X, Y, or Z happened. What do you remember most about that experience?"); use questions that get participants involved; and order the questions from general to specific.

If you would like more information on how to prepare and conduct a focus group please visit the Richard Krueger website as well as the NY State Teacher Centers on Program Evaluation. Please share your questions or comments in the comments section below, which could inform our future tips!

Josey Landrieu

Extension educator, program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

The camp season is quickly approaching. For many youth, camp is an intense youth development experience and a wonderful leadership opportunity for camp counselors. As with any youth development program, it is critical to evaluate and measure whether you are achieving your intended goals. Staff can build in evaluation by:

  • Having an end-of-camp survey: Focus your measures around the key items that you are interested in knowing more about. Are you hoping that youth will gain leadership skills or try new activities? The Oct. 20 webinar on Analysis and Interpretation highlights findings from West Virginia 4-H camping; the way that team targeted their evaluation around key questions is worth noting.
  • Building evaluation into your activities: Ask quick evaluation questions as part of an activity (try some Creative Evaluation Strategies), or ask youth to rate their experience in the session they just completed. Make sure someone (maybe a camp counselor) is taking notes. This will not only help you analyze results but also help counselors gain skills in gathering data.
  • Using cameras to capture data: Have Flip or digital cameras available, and ask youth to take a photo of something that best describes their camp experience. You will begin to see trends in what youth capture as being most important.
  • Conducting mini-focus groups at the end of the day: Camp counselors could facilitate asking a series of questions with their cabin groups.
  • Mailing a survey home to youth and parents a week after camp is over: Find out if this experience left any lasting impressions.
Gathering evaluation data can help you to improve your camp program's quality and the camp experience for future years. It will also allow you to modify activities in "real time" to best meet the needs of your audience. Depending on the data, you may also learn what value your camp has for participants.


Best of luck evaluating your camp activities! I hope these ideas provide dialogue for your teams.

Samantha Grant

Extension educator, program evaluation


Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

There are many online survey tools that can help you in your evaluation efforts. The Program Evaluation team encourages you to consider UMSurvey. This online survey tool, managed by the Office of Information Technology at the University of Minnesota, is particularly helpful if you are doing a needs assessment or formative evaluation. Advantages of UMSurvey include assured data security and University of Minnesota branding; for these reasons, the University encourages students, faculty, and staff to use it, and offers technical support through the website.

Go to www.oit.umn.edu/umsurvey/ to access everything you need to know about utilizing UMSurvey in your work.

Tips:

  • Watch the Online Orientation listed under Quick Links.
  • Before you craft your survey questions, take a look at the Question types and Flexible label sets under Survey Options. Knowing the available question types is a great shortcut in developing your questions.
  • Watch the Online Orientation again and use it to guide you through the steps of creating your survey.
Sherry Boyce

Extension educator, program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Evaluation is fun, especially when it helps you know whether your program is working or how to improve it. At YOUth & U, I shared a handout on creative evaluation strategies. These three easy evaluation activities can be implemented with few supplies and can give you important feedback from your participants.

So what can you learn from these methods?

Checking the pulse of the group can help you gauge how participants are responding to your content. If you see that your group is too low or too high, you can quickly modify your plan or ask participants more questions to see what is making them uneasy.

Evaluations are typically anonymous, but the "Hands in the Air" activity allows you to scan for individuals at the extremes. From this, you may learn that one participant is still grappling with the content, so you could have a quiet conversation with them or provide more clarification.

All of these methods give instant feedback. Process the results as a group and ask participants to add explanations of their responses. This easily turns into a great reflection activity.

Think back to the outcomes of your program. What did you promise? These evaluation strategies can help you see if you're hitting your target.

I hope you use these and other creative feedback strategies to get more "real-time" evaluation results.

Samantha Grant

Extension educator, program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

With much of the program year still ahead, it is an excellent time to make a plan for taking photos that can be used in your evaluation reports. Photos can add a lot to your PowerPoint or written evaluation report, but remember that good data and good presentation of data are critical.

Richard Krueger's chapter on "Story-Telling for Evaluation" makes important points about the use of stories in evaluation reports. He states that "stories can help evaluators get their audience's attention, communicate emotions, illustrate key points or themes, and make findings more memorable." To some extent, good photos can tell a story and add value to your reports.

Krueger's chapter has important lessons that can apply to photographs and their use in evaluation reports.

Consider how photos can make the findings more memorable. The photos needed for enhancing evaluation reports are different from the ones we want for promoting our programs or recognizing the youth and adults involved in them. In evaluation reports, we can use photos to help the lessons from the data "stick," much the same way that stories are used. Photos can illustrate the data or provide insights into the experience.

Develop a "shoot sheet" or checklist for all the types of photos you want. Here are a few questions to help you think about the types of photos you need for evaluation reports:

  • What are the evaluation questions for the program you are photographing?
  • How will the photo help tell your evaluation story?
  • Is the photo representative of the experiences of many youth in the program?
  • Does the photo illustrate a young person's engagement with the learning?
  • Is mastery illustrated?
  • Does the photo tell the story about the program (rather than focusing on an individual's accomplishments)?

Truth is essential. Krueger points out the importance of truth in stories. It is the same for photos, especially in evaluation. Well-planned learning experiences don't require staged photos; taking photos of real experiences is what we are striving for.

References
Krueger, R. A. (2010). Using stories in evaluation. In Wholey, J., Hatry, H., & Newcomer, K. (Eds.), Handbook of Practical Program Evaluation (3rd ed.). San Francisco: Jossey-Bass.

Sherry Boyce

Extension educator, program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

What is a PI (principal investigator) and when is PI certification needed?

Have you found yourself ready to start an evaluation or research project only to learn that you are not PI certified? Have you considered being a PI for a grant only to find out you need PI certification? If so, this YD evaluation tip might have information you've been looking for!

We would like to draw attention to the different situations in which an Extension educator (or other Extension employee) might need PI certification. At the University, all of the training requirements, information, and form submissions go through the Office of the Vice President for Research.

One of the main areas of work that requires PI certification is when an employee wishes to lead a project as a principal investigator, often under a grant-funded project.

If that's the case, there is a simple way to get certified: complete the Responsible Conduct of Research (RCR) training. This in-depth training includes a short face-to-face session as well as an online component; schedules for the face-to-face sessions are available online. The training series takes approximately 4-6 hours to complete.

Within Extension, PI certification is expected when:

  • You receive grant funding or are involved in a grant-funded project (major grants require certification) with funds received through Sponsored Projects Administration (SPA).
  • You are conducting, or are on a team conducting, a study that requires Institutional Review Board (IRB) approval.

In addition, program leadership strongly recommends that Extension educators complete the RCR certification.

University of Minnesota Extension has additional resources to help you decide if certification is needed or not. You can take a brief questionnaire that will let you know if certification is necessary.

In addition, a project might require the employee to go through IRB. To see if IRB is needed, please visit the IRB website where you'll find all the necessary information to complete this process.

Information pertinent to program coordinators interested in being PI certified:

Although there is not an across-the-board expectation that all program coordinators be certified as PIs, some may wish to participate in the training and assume this role for a specific project. If so, they should visit with their direct supervisor about their desire to increase their knowledge/skills in this area and how their participation will enhance their ability to contribute to important work within their county.

Josey Landrieu

Extension educator, program evaluation


Kate Walker's article, "The Multiple Roles That Youth Development Program Leaders Adopt with Youth," has been accepted for publication in Youth & Society. This article, based on Kate's dissertation research, examines the complex roles program leaders create in youth programs and investigates how they balance multiple roles to respond most effectively to the youth they serve.

The full article is available through OnlineFirst:
http://yas.sagepub.com/content/early/2010/12/02/0044118X10364346.full.pdf+html

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Clean up your old practices and start the New Year by getting organized in building evaluations.

Do you get stuck building evaluations? With endless choices, how do you home in on what you really want to know? Ideally you will have a logic model or theory of change to guide your thinking, but sometimes all we have are outcome statements or goals. Use your outcome statements as the primary guide in building your evaluation.

Make a list of all the questions that you would potentially use in an evaluation. Compare this list with your goals and outcomes for the program. Draft three columns: yes, no, and maybe. Sort each potential evaluation question into one of these columns in response to the question, "Does this question fit with my outcomes?" By engaging in this process, you should come closer to building an evaluation that meets your program objectives and zeroes in on the necessary components.

Happy New Year. The Evaluation team wishes you a year of using data for decision making!

Samantha Grant

Extension educator, program evaluation

YD evaluation tips: Reflection


Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

As we approach the end of another year, reflect on our work, and prepare for a new one, I want to highlight the importance of REFLECTION in your evaluation efforts. Here are a few ways in which reflection can help:

  • In the initial stages of issue identification or stakeholder needs assessment, it is critical to engage in some sort of self-reflection. This could be done at the individual level (journaling, note taking, blogging, etc.) or as a group (guided conversations, note sharing, etc.).
  • As you begin working with new evaluation tools (surveys, questionnaires, etc.) try to make room in the document for some of your own ideas and thoughts. You can take notes on the side of the page. Notes can include your first reactions to the tool or the question, additional questions you might have, and topics or concepts that might relate to the particular question.
  • Throughout the data collection process, reflection continues to be a key element. Both the evaluator (or evaluation team) and participants can be involved in reflection through focus groups, journaling, midpoint check-ins, and the like. Journaling is especially useful since it captures real-time reactions to the tools and the process.
Reflection will not only help the evaluator become more effective; it also has the potential to provide much deeper insights into the evaluation process and tools being implemented, as well as into the reactions of the participating programs and individuals. The time you spend reflecting will pay off when you reach the analysis and reporting stages. Reflecting on how well you did in different dimensions not only helps you learn more about the efficiency and efficacy of your program, it also serves to build capacity.

Josey Landrieu

Extension educator, program evaluation


Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

The 2010 Data Scans provided selected data points from the 2007 Minnesota Student Survey. The statewide results from the 2010 Minnesota Student Survey have just been released. This is a great reason to visit the Minnesota Department of Education's web site.

This site is loaded with data that can be helpful in developing a County Needs Assessment. Here are some tips on navigating the home of the Minnesota Department of Education site: http://education.state.mn.us/.

  • Click on Learning Support on the search bar.
  • Then click on Safe and Healthy Learners.
  • Scroll down the menu on the right and click on Minnesota Student Survey. This page will give you access to the newly released 2010 state data as well as the 2007 data by county.
The Minnesota Student Survey is administered every three years to 6th-, 9th-, and 12th-graders. The report has information, reported by students, on a variety of questions, including risk factors and how students spend their out-of-school time. Almost all school districts participate, and data are available on this site for the state as well as by county. You need to contact individual school districts for access to local district data. Watch this site for the 2010 county reports. Snapshots on Minnesota Youth, a series of quarterly reports on special topics, is also found on the Learning Support page.


Sherry Boyce

Extension educator, program evaluation


Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

The beginning of the program year brings the opportunity to go back to the drawing board and craft new programs and modify existing ones. In being strategic about your work, try to build outcome statements into your program planning -- it will help you immensely in focusing your program and later in constructing an evaluation.

The Centers for Disease Control and Prevention has a great resource on developing SMART outcomes. As a recap, SMART outcomes are:

  • Specific
  • Measurable
  • Achievable
  • Realistic
  • Time-bound

Taking the time to develop SMART outcomes will put you on the road to success.

Samantha Grant

Extension educator, program evaluation


Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

In a past edition of YD Update, YD Program Evaluation Leader Pam Larson Nippolt shared information on accessing an Extension Teaching Evaluation for YD staff. Please use this as a guide in collecting consistent evaluation feedback on your teaching and adapt the measure to fit your needs.

At Extension's fall conference, Family Development Evaluation and Research Specialist Mary Marczak and EFANS AFE Evaluation Specialist Tom Bartholomay led a session on Reflective Practice in Teaching. The presentation discussed five focus areas in the pursuit of reflective teaching practice: sound theory of change, research on best practices, peer review, participant feedback, and facilitator journals. Evaluations of teaching effectiveness are only one piece in this puzzle.

I want to highlight the importance of multiple methods, to keep us from becoming too narrow by looking only at participants' evaluations of our teaching. Even among information that can be collected from participants, the teaching evaluation tool does not get at program changes or other outcome evaluation measures. These forms of documentation need to be developed in response to your unique program design and will likely change from program to program. Think about what other information you will need to document evidence of your teaching and build a reflective practice.

Samantha Grant

Extension educator, program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Let's revisit design for those end-of-meeting surveys. Whether you choose a four-point or a five-point scale, spend some time crafting your instructions and tables; it can make compiling results much easier.

One recommendation is to direct participants to circle their responses. This means putting the initials (SD, D, N, A, SA) on each line following the statement, so a line might read, for example: "I learned something new today. SD D N A SA". Survey instructions often direct participants to place an X in a box or a check in a column under a heading. However, it can be more difficult to discern which response is actually indicated (a check may be placed halfway between responses).

Do you have practical tips from your survey experience that you would like to share?

This tip is from Richard Krueger's class, "Principles and Methods of Evaluation."

Sherry Boyce

Extension educator, program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Sometimes survey design is more about personality than method. How do you choose between four and five response options for your scale?

There are pros and cons to both choices. A four-option scale limits the variability of responses. Without a midpoint, it forces participants to choose one pole. That can be a good thing, since it doesn't give participants the option to be neutral, but it may also force participants to choose a side they don't feel strongly about. It may also make people lean toward being more positive.

A five-option scale gives more variability in responses. I have found that it gives a clearer picture because our evaluations often draw positive reactions from participants (and a four-option scale really distorts the positive impact).

One thing to note with either scale is that very young youth (around 4th grade and younger) tend to miss the distinction between agreeing and strongly agreeing and will pick only the extreme ends of the scale. Thinking abstractly is a developmental milestone, so consider the age of your survey respondents in the development process.
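As a quick illustration of what the midpoint preserves, here is a minimal Python sketch summarizing a five-option item (all numbers invented):

    # Hypothetical responses on a five-option scale (1 = SD ... 5 = SA).
    ratings = [5, 4, 4, 3, 5, 3, 4, 5, 2, 4]

    favorable = sum(r >= 4 for r in ratings) / len(ratings)
    neutral = ratings.count(3) / len(ratings)

    print(f"Favorable (A or SA): {favorable:.0%}")  # 70%
    print(f"Neutral: {neutral:.0%}")                # 20%
    # On a four-option scale, that 20% would have been forced to
    # pick a side and would be invisible in the results.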

What design works best for you?

Samantha Grant

Extension educator, program evaluation


Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

The main role of an evaluation question is to focus the evaluation on the areas of program performance at issue for decision makers and stakeholders. With that said, good evaluation questions should:

  • Be reasonable and appropriate, addressing performance dimensions that are relevant to stakeholders and that represent domains in which the program can realistically have accomplishments. We should first relate the question to the context of the program and then analyze its appropriateness in relation to the findings reported in research.


    Example of an unreasonable question: "Are our education and outreach services successful in informing the public about the risk of AIDS?"

    Example of a reasonable question: "Do our education and outreach services raise awareness of AIDS issues among the audiences addressed?"

  • Be answerable; that is, they must be specific, concrete, practical, and measurable. For example, it would be difficult for an evaluator to determine whether a youth development program improved a community's competitiveness in the global economy. To avoid drafting an unanswerable question, include measurable performance dimensions.


    Example of an unanswerable question: "Does this program enhance family values?"

    Example of an answerable question: "Are we reaching the families most in need of this program?"

This tip was adapted from Rossi, P., Lipsey, M., & Freeman, H. (2004). Evaluation: A systematic approach (7th ed.). Thousand Oaks, CA: Sage Publications.

Josey Landrieu

Extension educator, program evaluation


Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Tip for this edition: when you start to say, "It would be interesting to know about...," stop yourself and exclude that item from your evaluation. I've worked with many groups in the brainstorming process of developing an evaluation tool and have often heard this line. Unless you are truly conducting an exploratory evaluation, chances are it's an accessory item that isn't important to your main question.

Most of the time, people say this when they want more information about their survey respondents: years of service, career choices, residence, and so on. These questions tend to be off-putting to your audience because they ask for identifying information, and chances are the answers never get used; they stay at just being interesting. So save yourself the effort and keep your evaluations focused on your meaningful, main points.

Samantha Grant

Extension educator, program evaluation

The Program Evaluation Educator team, with input from many and under the leadership of Samantha Grant, designed a tool for all Educators who want evidence from adult participants about the effectiveness of their teaching methods. The tool is designed for use across settings, curricula, and content areas and can supplement other evaluation efforts (for example, the Youth Work Institute has a set of items that are always used when courses are offered; these items could be added to that list for individual Educators to track and assess their teaching). This 13-item tool can be used to track responses across your teaching efforts for the purpose of summarizing your work and making adjustments to your teaching. Stay tuned for a session at Extension's fall conference in October about using tools for gathering evidence on teaching effectiveness. If you have questions about using this tool, please contact Samantha Grant or Pam Larson Nippolt. The tool is located on the Staff Only page under "Evaluation Resources":

http://www1.extension.umn.edu/Youth/staffonly/

Thanks,

Pam Larson Nippolt

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Non-formal learning programs can make improvements and justify their activities through data collection and analysis. It's that easy! So what can data do for you and your program? It can guide continuous quality improvement efforts to enhance the program offerings, identify opportunities for professional development, address and advocate for the changing needs of youth and their families, and share information with others.

What data could you collect?

  • Youth-level measures: program attendance, school attendance, grades, family structure, youth's interests, etc.
  • Program-level measures: positive youth-adult relationships, activities that offer leadership opportunities, activities for hands-on opportunities, etc.

What can you do with the data?

Accurate attendance records are essential for calculating the cost per child served, program participation, and yearly retention. Parent satisfaction surveys, along with youth attendance and retention records, are powerful datasets when you need to advocate for more funding and facilitate desired change.
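As a hypothetical illustration (all figures invented), here is how two of those numbers might be computed in Python:

    # Hypothetical figures for illustration only.
    total_program_cost = 12000.00   # annual program budget in dollars
    youth_served = 150              # unduplicated youth served this year
    returning_youth = 90            # youth who also attended last year

    cost_per_child = total_program_cost / youth_served
    retention_rate = returning_youth / youth_served

    print(f"Cost per child: ${cost_per_child:.2f}")   # $80.00
    print(f"Yearly retention: {retention_rate:.0%}")  # 60%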

More information can be found in the summer 2010 issue of After School Today, published by the National AfterSchool Association at http://www.naaweb.org/.

Josey Landrieu

Extension educator, program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

Sometimes it's challenging to know where to find good, reliable information about evaluation. I would recommend bookmarking the Harvard Evaluation Exchange. You can also subscribe to receive the information via e-mail.

The Exchange publishes information affecting the field of evaluation for children, families, and communities. The content goes beyond the school setting and is relevant to non-formal youth development contexts. I recommend checking out archived editions such as Evaluating Out-of-School Time Program Quality. As an added benefit, you can build your scholarship by contributing to the Evaluation Exchange after you design a superstar evaluation.

Samantha Grant

Extension educator, program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

It may be a little early to reveal my sources for tips and tricks, but I encourage all of you to check out the AEA365 alerts. Sponsored by the American Evaluation Association, this blog showcases a tip a day by and for evaluators. Topics span the spectrum, so make sure to search the archives for relevant content. You can subscribe to the blog and receive daily posts by e-mail or RSS; use the "Subscribe to Blog via..." link on the right-hand side of the webpage.

I had a tip published on June 25th. Check it out for this edition's tip.

I also serve as a curator for blog content, so if you are interested in contributing a tip or trick, I would be happy to work with you to develop an idea.

Samantha Grant

Extension educator, program evaluation

Note: The Youth Development Program Evaluation team will share tips and resources in YD Update to aid staff in program evaluation efforts. The information will be archived on the Staff Only web page. Please send any questions or suggestions for future topics to samgrant@umn.edu.

In survey design it is very important to use language that respondents will understand. This is especially true when working with youth, who have varying levels of reading ability. Have you ever wondered about the reading level of your own writing? Use a "readability calculator" to assess your text.

Just copy and paste your text, click "submit," and you're provided with a reading ease score as well as several grade-level scores; the complex words in the document will also be underlined. Deciding which scale to base your decision on is the only part that requires some thought.
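If you'd rather score text without the website, the same kinds of scores are available in code. Here is a minimal sketch assuming the third-party Python package textstat (one of several readability libraries):

    # A minimal sketch using the textstat package (pip install textstat).
    import textstat

    text = "Please tell us what you learned at camp this summer."

    # Flesch Reading Ease: higher scores mean easier reading.
    print("Reading ease:", textstat.flesch_reading_ease(text))

    # Flesch-Kincaid: an approximate U.S. school grade level.
    print("Grade level:", textstat.flesch_kincaid_grade(text))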

The readability calculator goes beyond survey design; it can also be used for marketing pieces, newsletter submissions, and e-mails.

This tip is written at a 6th grade reading level. Would you agree?

Samantha Grant

Extension educator, program evaluation

YD evaluation tips: Wordle


Greetings from the YD Program Evaluation team! Incorporating evaluation into programs is sometimes a challenging task. To help make this task a little bit easier, our team will be sharing tips and resources in YD Update that will aid you in your program evaluation efforts. We will also archive these resources and tips on the Staff Only web page. If there is a topic that you would like us to address, please contact me at samgrant@umn.edu.

For the first edition, I'm going to start out with my new personal favorite: Wordle.

Wordle: This tool will blow your mind. Are you overwhelmed with qualitative feedback that you don't know what to do with? Wordle is a data visualization tool that can help you make sense of your qualitative data. Simply copy and paste the text into Wordle to create an image. Words that are used more often carry more weight and thus appear bigger.

Ways to use Wordle:

  • To get a first glimpse at data.
  • To use as a graphic in a report.
  • To begin conversations about evaluation data. It's great for adults or youth who have little evaluation experience.
  • To see the differences between multiple groups of responders (e.g., comparing responses from adults versus youth).

Try it out to see all the neat features, like changing text size, layouts, and removing common words.

The one drawback to this tool is that you cannot save the image directly from Wordle. To keep the image, download Jing, which allows you to take a picture of any portion of your screen (another fun tool that will get plenty of use once you download it).
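If you'd rather skip the screenshot step entirely, a similar image can be generated and saved locally. Here is a minimal sketch assuming the third-party Python package wordcloud (Wordle itself is a website, not a library):

    # A sketch using the wordcloud package (pip install wordcloud).
    # Unlike Wordle, the image saves straight to a file.
    from wordcloud import WordCloud

    # Hypothetical qualitative feedback, pasted in as one string.
    feedback = "fun friends camp learning fun friends projects fun"

    cloud = WordCloud(width=800, height=400, background_color="white")
    cloud.generate(feedback)          # sizes words by frequency
    cloud.to_file("word_cloud.png")   # saves the image directly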

Samantha Grant

Extension educator, program evaluation
