University of Minnesota Extension
http://www.extension.umn.edu/
612-624-1222


Recently in the Evaluation Happenings Category

Evaluation data can be used to promote your program--not just to report for accountability purposes. While it is always a good idea to demonstrate the public value of your program to funders, when targeting future participants, you might want to emphasize the private value of your program. If you are purposeful about the data you collect, you can likely use the same data to do both; you will just want to present it differently for each audience. Here are some ideas on how to do this:

  • Statistics on knowledge gains and behavior changes. Public value emphasis: show the connection to your program's impact (improvements in the environment, economy and community). Private value emphasis: highlight what participants will learn in the program.
  • Positive feedback from open-ended survey questions. Public value emphasis: tell a story about how the things your program teaches participants lead to a real impact. Private value emphasis: quote past participants as testimonials about the program's value.
  • Data on participants' backgrounds. Public value emphasis: demonstrate the scope of your program's impact. Private value emphasis: communicate who will benefit from participating in your program.

-Whitney Meredith, evaluation specialist

While end-of-workshop surveys are a great way to gather data on your program's short-term outcomes, it is worthwhile to think about other data collection methods and evaluation questions. The following are alternative questions and ways to collect answers without using a survey:

What motivates your participants to attend your program?
Have participants interview each other as an ice-breaker. Break them into groups of three and assign one the role of interviewer, one the role of interviewee, and one the role of recorder. Give them a set amount of time (e.g., two minutes) to conduct the interview, then rotate the roles so everyone has a chance to be interviewed. Collect their answers.

What major challenges do participants face in tackling the issue your program addresses?
Give participants sticky notes and have them write down their greatest challenges and stick them on the wall. Their responses can not only start a discussion but also be collected and analyzed as needs assessment data.

How have your participants successfully applied what they learned in your program?
Hold a photograph or video contest to gather success stories (and promotional material for your program). Ask participants to post pictures or videos of their successes on your program's Facebook page and award a prize to the best one. Entrants also give you a list of people you could later interview for data that demonstrates your program's influence.

How do people use your website?
Use Google Analytics data to identify when people come to your website and which sections they use. Such data can help you understand what drives people there (a newsletter, an external event, attention in the press?) and which issues interest them most.
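Pulling drivers out of analytics data mostly amounts to grouping sessions by referral source. Here is a minimal Python sketch, assuming a hypothetical CSV export with `source` and `sessions` columns (the column names and values are illustrative, not an actual Google Analytics schema):

```python
import csv
from collections import Counter
from io import StringIO

# Hypothetical analytics export; in practice you would read a file
# downloaded from your analytics dashboard.
export = """source,sessions
newsletter,120
(direct),80
news-site.example,45
facebook,30
"""

# Sum sessions by referral source.
sessions_by_source = Counter()
for row in csv.DictReader(StringIO(export)):
    sessions_by_source[row["source"]] += int(row["sessions"])

# Rank traffic drivers to see what brings people to the site.
for source, sessions in sessions_by_source.most_common():
    print(f"{source}: {sessions}")
```

The top entries of the ranked list point at what is driving visits; comparing rankings before and after a newsletter mailing or press mention shows whether that event moved traffic.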

If you need help or want to brainstorm other ideas on how to collect data, please contact me. Also note that the new Extension evaluation web section (still a work in progress!) has an overview of different evaluation methods and the different types of evaluation you could use.

-Whitney Meredith, evaluation specialist

Use evaluation data to tell your story


When telling your program's story, use evaluation data to illustrate your program's value. In fact, this is essentially what you should do when asked for success stories for the Federal Report. Moreover, the three parts of a Federal Report success story (issue, what has been done and results) provide a useful framework to structure any story to communicate your program's scope, effectiveness and impact.

Learn about the types of evaluation data you can use to illustrate the three parts of a Federal Report success story.

1. Issue


  • Needs assessment data illustrate the problem your program addresses.

  • Data collection methods: surveys, focus groups and key informant interviews that ask about interests, priorities and needs; secondary data



2. What has been done

  • Output data give an idea of scope of your program.

  • Data collection methods: tracking the number of events, counting the number of participants, surveys to collect background information on participants



3. Results

  • Outcome data demonstrate your program's effectiveness. However, be sure to articulate how your outcomes can be linked to an impact of public value. Don't just say that participants intend to change their behavior; explain what that behavior is and how it could lead to a tangible impact, such as money saved or acres influenced. Personal stories or quotes can make your results more meaningful.

  • Data collection methods: surveys with retrospective pre-post questions (including follow-up surveys), survey questions or interviews on how new practices influenced participants' work, focus groups

The Centers for Disease Control and Prevention has a useful guide on how to tell a success story. (It is aimed at public health stories, but the ideas apply to our work too.)

Below are examples of how to use evaluation data from past Federal Reports.

Examples

FORESTRY (2013)

Issue
One example of a community-based approach to forest management exists in Itasca County. With two million acres of rural forested land, the county hosts 45,000 residents and cabin owners.

Historically, the county averages 60 wildfires each year, and experts predict an increase in the frequency and intensity of fires. Access is an issue for rural fire trucks and emergency service vehicles. According to the Fed Gazette, estimates of the total cost of wildfires to landowners, investors and taxpayers range from 10 to 50 times the cost of fire suppression.

What has been done?
Through education for property owners and facilitation across nine public service sectors in the county, including 18 rural fire departments, Extension helped mobilize the county to reduce risks from wildfire and improve the safety of Itasca County residents. In 2013, 276 property owners volunteered 19,891 hours to improve defensible space and remove hazardous materials around structures, improving access for emergency service vehicles.

Results
The value of this in-kind contribution is more than $440,000. In addition, property owners removed 1,089 tons of hazardous fuel. Deer River Hired Hands, a local nonprofit, hauled the materials to neighborhood consolidation sites, where they were chipped and used for renewable energy at the Minnesota Power Rapids Energy Center in Grand Rapids, Minnesota.
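As a sanity check, the hourly valuation implied by these figures can be backed out directly; this is just arithmetic on the numbers reported above (the report does not state what rate it used):

```python
# Figures from the report above
hours = 19_891          # volunteer hours contributed in 2013
in_kind_value = 440_000  # "more than $440,000"

# Implied valuation per volunteer hour
implied_rate = in_kind_value / hours
print(f"${implied_rate:.2f}/hour")  # roughly $22 per volunteer hour
```

A rate in the low twenties per hour is consistent with commonly used national volunteer-time valuations of that era, which suggests the in-kind figure is a standard hours-times-rate estimate.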


CROPS (2013)

Issue (Who cares and why)
In a highly scientific industry, producers need the newest information about crop and livestock production. One example is the need to examine and manage nitrogen content using recommended fertilizer nitrogen rates. With increasing costs for corn production and greater concern over environmental quality, it is critical that corn growers make sound decisions on purchased inputs. The most frequent and extreme cases of nitrogen over-application in corn occur in first- and second-year corn after alfalfa.

Minnesota Agricultural Experiment Station researchers conducted a statistical analysis using 259 site years of data from the literature and recent research conducted in Minnesota. They surveyed alfalfa-corn growers in Minnesota to quantify the extent to which they have adopted alfalfa nitrogen credits.

What has been done?
During 2013, follow-up educational presentations on alfalfa nitrogen credits to corn were given at five Extension workshops and at a program sponsored by a commercial soil testing laboratory. These presentations were given to producers and agricultural professionals managing over 1.9 million acres of land.

Results
According to participant evaluations, 55 percent of respondents said they would modify future fertilizer nitrogen management for first-year corn after alfalfa "much" or "very much." Assuming they reduce their applied or recommended fertilizer nitrogen rate by 40 pounds of fertilizer nitrogen per acre, and that first-year corn after alfalfa represents five percent of the cropland they manage or provide recommendations for, the educational presentations at these programs will cause growers to reduce fertilizer nitrogen use by 2.09 million pounds per year without reducing corn yield. This is an annual savings of $1.15 million at $0.55 per pound of fertilizer nitrogen. With this reduction in fertilizer nitrogen use, energy input to corn production will be reduced by 45.8 million megajoules per year.
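The savings estimate above is straightforward arithmetic on the stated figures; a quick check, using only numbers that appear in the paragraph:

```python
# Figures stated in the report
acres_managed = 1_900_000      # acres represented by workshop attendees
first_year_corn_share = 0.05   # share of cropland in first-year corn after alfalfa
adoption_rate = 0.55           # respondents intending to change N management
n_reduction_per_acre = 40      # lb fertilizer N saved per acre
n_price = 0.55                 # dollars per lb fertilizer N

# Acres actually affected, then pounds of N and dollars saved per year
affected_acres = acres_managed * first_year_corn_share * adoption_rate
n_saved = affected_acres * n_reduction_per_acre
dollars_saved = n_saved * n_price

print(f"{n_saved:,.0f} lb N/yr")    # 2,090,000
print(f"${dollars_saved:,.0f}/yr")  # $1,149,500, i.e. about $1.15 million
```

This reproduces the 2.09 million pounds and $1.15 million figures exactly, which is a useful pattern for any Federal Report claim: show the chain from survey result to acreage to dollars so stakeholders can audit the estimate.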

MASTER NATURALISTS (2012 Federal Report)

Issue (Who cares and why)
A significant share of Minnesota's geography is preserved as forests, waters and natural fields, and organizations struggle to provide all of the environmental education and protection the state needs.

What has been done?
Master Naturalists work with and through organizations that develop and deliver projects to educate and engage citizens and act to make a difference.

Results
With an increased participant pool and more instructors, volunteers and organizations across the state made a stronger impact on Minnesota's land and water. According to the longitudinal study, organizations find Master Naturalist volunteers to be useful in the following ways:

1. Building a network or community invested in their organization.
2. Producing an improvement or outcome for their environmental center.
3. Increasing educational support and leadership.
4. Increasing general awareness of the environment in the community and for organizations.

As one example of an impact, an organization reported, "We were lucky enough to have a Master Naturalist volunteer design our butterfly garden, a project that would have not come to fruition without that particular volunteer."

-Whitney Meredith, evaluation specialist

How Educator Abby Neu uses Qualtrics



For those of you who haven't started using Qualtrics for your surveys, it's time to start! As Abby explains, it's easy.

Qualtrics also has a useful support page where you can learn basic to advanced skills. You'll find step-by-step instructions with screenshots and videos. I am also compiling a list of helpful Qualtrics tricks. If you have discovered any, please submit them.

Read a previous newsletter article for the basics of Qualtrics. All you need is your x500 and password to create an account. Go to umn.qualtrics.com to get started.

-Whitney Meredith, evaluation specialist

As you're preparing for your spring and summer events, be sure to plan how you'll evaluate them; evaluations serve you best when they're part of your program planning process, not an afterthought. Event evaluation planning helps you:
  1. Refine your team's understanding of the purpose and objectives of your event and how that connects with your program's objectives and public value.
  2. Consider the interests of your target audience and other stakeholders and how you can best meet their needs.
  3. Identify what data will be useful to collect at your event and how you can use it to take your program to the next level.
  4. Strategize on the best way and time to distribute your evaluation--and to plan ahead to do so.
Download this worksheet to help with evaluation planning.
-Whitney Meredith, evaluation specialist
Most of you use some type of end-of-workshop evaluation survey to gather feedback on your program, which is great! However, as you go into the spring and summer workshop season, take some time to ensure you're getting the most out of your surveys. The following are some questions to ask yourself as you review your surveys:
  1. Are you asking the right questions?
    • What information would help take your program to the next level?
    • What information about your program would really impress your stakeholders?
    • Are the questions providing useable and useful data?
    • Do you have the skills and time needed to analyze the data and use the results?
  2. Are you implementing the survey in the most effective way possible?
    • Are you distributing your survey at a strategic time so response rates are maximized?
    • If you're using a paper survey, does it fit on one page (front and back)?
    • Do you communicate to participants why completing the survey is important?
    • What incentives can you raffle off to increase response rate?
    • Are you using the right mode of survey distribution? Would using TurningPoint Clickers or a follow-up online survey to collect data make more sense than a paper survey?
  3. Is an end-of-workshop survey actually the method of data collection you should be using? Here are some alternatives:
    • Have participants interview each other.
    • Have participants identify goals. Collect their goals and follow up with them in a couple months to check their progress.
    • Post questions on the wall and have participants stick their answers on the wall with sticky notes.
    • Collect demographic data using Qualtrics on a tablet when they check in. This frees you up to use other data collection methods--and saves you from having to enter their information later.
  4. Have you sent your survey to me for feedback? Please don't hesitate to ask me to review it! (I prefer to get it at least a week before you need it ready.)
-Whitney Meredith, evaluation specialist

Federal Report: highlights from our Center


Thank you for turning in your reporting data! Of those who were asked to lead the effort to collect data for their program team, 92 percent sent their data on time. This puts all of us in a position to show stakeholders the public value of your work.

Preliminary results show the following as some of what you accomplished as a Center in 2013:

  • The number of participants you reached could fill TCF Stadium more than 5.5 times.
  • You averaged almost 50 events a week.
  • Volunteers logged hours the equivalent of more than 250 full-time employees.
  • Your participants potentially influenced almost 6 percent of the land in Minnesota.
  • About 271,000 people know more as a result of your work, which is more than 3 times the population of Duluth.
  • About 261,000 participants said they would change their behavior as a result of your work, which is almost 4 times the population of St. Cloud.
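These comparisons are easy to sanity-check; the stadium capacity and city populations below are rough census-era estimates of my own (assumptions, not figures from the report):

```python
# Approximate figures (assumptions): stadium capacity and 2010-census-era
# city populations, rounded.
stadium_capacity = 50_000
duluth_pop = 86_000
st_cloud_pop = 66_000

# Figures from the bullets above
participants_knowing_more = 271_000
participants_changing = 261_000

ratio_duluth = participants_knowing_more / duluth_pop
ratio_st_cloud = participants_changing / st_cloud_pop

print(f"{ratio_duluth:.1f}x Duluth")          # about 3.2, "more than 3 times"
print(f"{ratio_st_cloud:.1f}x St. Cloud")     # about 4.0, "almost 4 times"
print(f"{5.5 * stadium_capacity:,.0f} seats")  # roughly the participants reached
```

Translating raw counts into familiar comparisons like these is a simple way to make output data meaningful to stakeholders, and the arithmetic takes only a minute to verify.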

I'll be sending summary statistics to you and your team in the next few weeks. Please look over what you and your coworkers accomplished and consider how you might use this data. Here are some ideas:

  • Meet with your team and use the data to talk about what you did well and what you can improve.
  • Take time now to plan how you will collect evaluation data for the upcoming year. Identify what data you need to collect for reporting purposes and what data will be useful to your program. I'm happy to help you think through this and develop tools.
  • Share your success with each other and your stakeholders. You did a lot of valuable work this past year!
Thanks again and good job!


-Whitney Meredith, evaluation specialist


Federal Report deadline approaching


The deadline for annual reporting data is Monday, February 10. At this time, you should have received a request either from me or one of your teammates regarding data. If you have not, please let me know.

After you finish gathering your reporting data, spend some time developing and/or revising your evaluation plan for the upcoming year. Design your evaluations so they collect both information that is needed for reporting and that is useful to you.


Questions to help get you started thinking about what data to collect:


  1. What do you want your participants to learn?

  2. What practices do you want them to implement as a result of what they learn?

  3. How could you deliver your program better?

  4. How can you best demonstrate the public value of your program?

  5. What information about your program would help you be more effective?

Get started by using this planning sheet (81 K DOC). If you need help developing evaluation tools, or if you want me to review what you have already developed, please contact me!

-Whitney Meredith, evaluation specialist

It's Federal Report time!


What is the Federal Report?
The Federal Report is a document we turn in to the federal government that demonstrates what we have accomplished in the past year. In turn, we receive funding. Think of the Federal Report as our chance to show off the public value of our programs to the federal government.

When is the Federal Report data due?

Federal Report data is due to Whitney Meredith by Monday, February 10.

What do you need to do to prepare?

I will be sending program team leaders specific requests for the Federal Report. Coordinating how you gather the requested data with the rest of your program team(s) will be the greatest challenge, so please put some thought into how best to approach this.
To help your program team(s) get the data ready, you should have the following information about your program(s) ready to go:

  • Target audience
  • # events
  • # direct contacts with adults
  • # direct contacts with youth
  • # nonwhite participants
  • # indirect contacts with adults (newsletter subscribers, website traffic)
  • # volunteers
  • # publications
  • % achieved significant learning gains in targeted knowledge areas
  • % reporting change in targeted practices/behaviors
  • Success stories illustrating quantitative results

Some programs have additional outputs and outcomes to report; your program team leader will be notified if yours does. Also, we recognize that gathering this data may be difficult. Your best estimates will suffice.

What's in it for you?
Gather this data to learn how your program is measuring up against your goals. You can potentially use the data to shape your plan of work, to start a discussion with your program team on how to improve your program, or to show off your program's accomplishments to key stakeholders.

-Whitney Meredith, evaluation specialist

All about TurningPoint Clickers


What are TurningPoint Clickers?

TurningPoint Clickers are an easy way to use technology to collect data during PowerPoint presentations. TurningPoint lets you add slides with questions that your audience answers using a "clicker": a small, remote-control-like device with buttons offering both numbers and letters as answer options. You collect the responses and see the results in real time, so your audience sees the results too. You can also export the data to Excel.




What kind of questions can I ask with TurningPoint Clickers?

Multiple choice questions are a natural fit with the clickers' buttons. This means you could use clickers to quiz participants during a presentation to see if they understand a concept or you could use the clickers at the end to assess their learning gains and intentions to change their behavior. You can also poll your audience, have them enter a numerical answer, or have them use the clickers' alphabet mode to type a short response.


Where can I get TurningPoint Clickers?

You can check them out at many regional offices. Extension IT also has clickers you can borrow.


How do I get started?

First you will have to download TurningPoint software, which is free. The clickers you check out should come with instructions on how to do so. (Instructions vary depending on the version of the clickers you're using.) Once you've downloaded the software, a TurningPoint tab will appear in your PowerPoint toolbar. Click on that tab to insert a question slide. You can find tutorials on how to use TurningPoint here.

-Whitney Meredith, evaluation specialist

Qualtrics 101


What is Qualtrics?
Qualtrics is an online survey tool. The University recently got a campus-wide license, which means it is now available for everyone to use. I have been using it for over two years now and love it!

Why use it instead of SurveyMonkey?
In my experience, Qualtrics is more user-friendly and has more features than SurveyMonkey, so if you've used SurveyMonkey, the transition should be relatively easy. In addition, Qualtrics meets the University's data security requirements.

How do I get an account?
Getting an account is easy! All you need is your x500 and password. Set up an account.

What features might be useful?


  • You can send personalized emails using the panel feature. This feature also lets you send reminders to those who have not responded.

  • You can use piped text, which allows you to insert respondent-specific information into questions. This might be information based on how a respondent answered a previous question, or it might be information from embedded data, which is information you have saved about a respondent, such as what workshop they attended.

  • At the end of your survey, you can redirect respondents to your website. Go to the "survey termination" section under "survey options" to do this.

  • You can share a survey. Go to the "collaborate" button and add people to review or edit your survey.

  • You can create basic survey reports and also export the data into Excel and SPSS.

For more information, go to the University's Qualtrics User Guide.

-Whitney Meredith, evaluation specialist

Program business plans: next steps


Thank you all for attending our panel session on Program Business Plans last Wednesday afternoon at Program Conference! Hearing your feedback was very useful, and we are working on figuring out how we can best share your program business plans within the Center so that one program can learn from another. Also, please feel free to let me know of other suggestions for improving the program business planning process.



Program business plan panel
(left to right) Dianne Sivald, grants manager; Tom Rothman, director of ag stakeholder outreach; Jeff Reneau, program leader; Sam Bauer, educator; Susanne Hinrichs, regional director; and moderator Whitney Meredith, evaluation specialist.


I hope you left the panel thinking about how your program business plans are not an assignment, but a tool that can be used for multiple purposes.

In fact, one way they can be used is to help you identify the data you need for the Federal Report, which is not that far away. All programs need to report their outputs and outcomes.

Outputs are what you include in your deliverable section. These are the things you count for the Federal Report. For example, you may have three workshops that train 50 people.

Outcomes are what you identify in your expected results section and results measurement section. They are what you hope your participants will do as a result of your program, such as increase their knowledge (learning gains) or implement best management practices (behavior changes). These are things you measure in your program evaluation.




If you have done a thorough job in these sections in your program business plan, collecting the data and turning it in for the Federal Report should be relatively easy. Please let me know if you have any questions!

-Whitney Meredith, evaluation specialist

  • © 2014 Regents of the University of Minnesota. All rights reserved.
  • The University of Minnesota is an equal opportunity educator and employer. Privacy