University of Minnesota Extension


Recently in the Evaluation Happenings Category

Most of you use some type of end-of-workshop evaluation survey to gather feedback on your program, which is great! However, as you go into the spring and summer workshop season, take some time to ensure you're getting the most out of your surveys. The following are some questions to ask yourself as you review your surveys:
  1. Are you asking the right questions?
    • What information would help take your program to the next level?
    • What information about your program would really impress your stakeholders?
    • Are the questions providing usable and useful data?
    • Do you have the skills and time needed to analyze the data and use the results?
  2. Are you implementing the survey in the most effective way possible?
    • Are you distributing your survey at a strategic time so response rates are maximized?
    • If you're using a paper survey, does it fit on one page (front and back)?
    • Do you communicate to participants why completing the survey is important?
    • What incentives can you raffle off to increase response rate?
    • Are you using the right mode of survey distribution? Would using TurningPoint Clickers or a follow-up online survey to collect data make more sense than a paper survey?
  3. Is an end-of-workshop survey actually the method of data collection you should be using? Here are some alternatives:
    • Have participants interview each other.
    • Have participants identify goals. Collect their goals and follow up with them in a couple months to check their progress.
    • Post questions on the wall and have participants stick their answers on the wall with sticky notes.
    • Collect demographic data using Qualtrics on a tablet when they check in. This frees you up to use other data collection methods--and saves you from having to enter their information later.
  4. Have you sent your survey to me for feedback? Please don't hesitate to ask me to review it! (I prefer to get it at least a week before you need it ready.)
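On the response-rate questions in item 2, the arithmetic itself is simple: completed surveys divided by surveys distributed. A minimal sketch in Python (the numbers are hypothetical):

```python
def response_rate(completed, distributed):
    """Return the response rate as a percentage."""
    if distributed <= 0:
        raise ValueError("distributed must be positive")
    return 100.0 * completed / distributed

# Hypothetical workshop: 40 surveys handed out, 28 returned.
rate = response_rate(28, 40)
print(f"Response rate: {rate:.0f}%")  # prints "Response rate: 70%"
```

Tracking this number across workshops is an easy way to see whether a change in timing or incentives is actually helping.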
-Whitney Meredith, evaluation specialist

Federal Report: highlights from our Center


Thank you for turning in your reporting data! Of those asked to lead data collection for their program team, 92 percent sent their data on time. This puts all of us in a position to show the public value of your work to stakeholders.

Preliminary results show the following as some of what you accomplished as a Center in 2013:

  • The number of participants you reached could fill TCF Stadium more than 5.5 times.
  • You averaged almost 50 events a week.
  • Volunteers logged hours the equivalent of more than 250 full-time employees.
  • Your participants potentially influenced almost 6 percent of the land in Minnesota.
  • About 271,000 people know more as a result of your work, which is more than 3 times the population of Duluth.
  • About 261,000 participants said they would change their behavior as a result of your work, which is almost 4 times the population of St. Cloud.
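If you're curious how comparisons like these are built, each one is just a count divided by a reference figure. A minimal sketch in Python, assuming approximate 2010 census populations of 86,265 for Duluth and 65,842 for St. Cloud (those reference figures are my assumption, not part of the report):

```python
def times_population(count, population):
    """How many times a count covers a reference population."""
    return count / population

DULUTH = 86_265    # approximate 2010 census population (assumption)
ST_CLOUD = 65_842  # approximate 2010 census population (assumption)

# Participant counts come from the bullets above.
print(f"Duluth multiple: {times_population(271_000, DULUTH):.1f}")
print(f"St. Cloud multiple: {times_population(261_000, ST_CLOUD):.1f}")
```

Framing raw counts against a familiar reference like this is an easy way to make reach numbers meaningful to stakeholders.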

I'll be sending summary statistics to you and your team in the next few weeks. Please look over what you and your coworkers accomplished and consider how you might use this data. Here are some ideas:

  • Meet with your team and use the data to talk about what you did well and what you can improve.
  • Take time now to plan how you will collect evaluation data for the upcoming year. Identify what data you need to collect for reporting purposes and what data will be useful to your program. I'm happy to help you think through this and develop tools.
  • Share your success with each other and your stakeholders. You did a lot of valuable work this past year!
Thanks again and good job!

-Whitney Meredith, evaluation specialist

Federal Report deadline approaching


The deadline for annual reporting data is Monday, February 10. At this time, you should have received a request either from me or one of your teammates regarding data. If you have not, please let me know.

After you finish gathering your reporting data, spend some time developing and/or revising your evaluation plan for the upcoming year. Design your evaluations so they collect both the information needed for reporting and information that is useful to you.

Questions to help get you started thinking about what data to collect:

  1. What do you want your participants to learn?

  2. What practices do you want them to implement as a result of what they learn?

  3. How could you deliver your program better?

  4. How can you best demonstrate the public value of your program?

  5. What information about your program would help you be more effective?

Get started by using this planning sheet (81 K DOC). If you need help developing evaluation tools, or if you want me to review what you have already developed, please contact me!

-Whitney Meredith, evaluation specialist

It's Federal Report time!


What is the Federal Report?
The Federal Report is a document we turn in to the federal government that demonstrates what we have accomplished in the past year. In turn, we receive funding. Think of the Federal Report as our chance to show off the public value of our programs to the federal government.

When is the Federal Report data due?

Federal Report data is due to Whitney Meredith by Monday, February 10.

What do you need to do to prepare?

I will be sending program team leaders specific requests for the Federal Report. Coordinating how you gather the requested data with the rest of your program team(s) will be the greatest challenge, so please put some thought into how best to approach it.
To help your program team(s) get the data ready, you should have the following information about your program(s) ready to go:

  • Target audience
  • # events
  • # direct contacts with adults
  • # direct contacts with youth
  • # nonwhite participants
  • # indirect contacts with adults (newsletter subscribers, website traffic)
  • # volunteers
  • # publications
  • % achieved significant learning gains in targeted knowledge areas
  • % reporting change in targeted practices/behaviors
  • Success stories illustrating quantitative results

Some programs have additional outputs and outcomes to report; your program team leader will be notified of these. Also, we recognize that gathering this data may be difficult. Your best estimates will suffice.
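For the two percentage items, the calculation is simply the share of paired respondents who met your threshold. A minimal sketch with hypothetical retrospective pre/post scores (treating a gain of at least 1 point as "significant" is an illustrative assumption, not a federal definition):

```python
def pct_with_gain(pre_scores, post_scores, threshold=1):
    """Percent of paired respondents whose post score exceeds
    their pre score by at least `threshold`."""
    pairs = list(zip(pre_scores, post_scores))
    gained = sum(1 for pre, post in pairs if post - pre >= threshold)
    return 100.0 * gained / len(pairs)

# Hypothetical 1-5 self-rated knowledge, before and after a workshop.
pre = [2, 3, 1, 4, 2, 3]
post = [4, 3, 3, 5, 4, 4]
print(f"{pct_with_gain(pre, post):.0f}% achieved learning gains")
```

The same function works for behavior-change items if you substitute practice-adoption scores for knowledge scores.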

What's in it for you?
Gather this data to learn how your program is measuring up to your goals. You can potentially use the data to shape your plan of work, to start a discussion with your program team on how to improve your program, or to show off your program's accomplishments to key stakeholders.

-Whitney Meredith, evaluation specialist

All about TurningPoint Clickers


What are TurningPoint Clickers?

TurningPoint Clickers are an easy way you can use technology to collect data during PowerPoint presentations. TurningPoint allows you to add slides with questions that your audience can answer using a "clicker"--a small, remote control-like device that has buttons with both numbers and letters as answer options. You collect the responses and see the results in real time, which means your audience gets to see the results too. You can also export the data into Excel.


What kind of questions can I ask with TurningPoint Clickers?

Multiple choice questions are a natural fit with the clickers' buttons. This means you could use clickers to quiz participants during a presentation to see if they understand a concept or you could use the clickers at the end to assess their learning gains and intentions to change their behavior. You can also poll your audience, have them enter a numerical answer, or have them use the clickers' alphabet mode to type a short response.

Where can I get TurningPoint Clickers?

You can check them out at many regional offices. Extension IT also has clickers you can borrow.

How do I get started?

First you will have to download TurningPoint software, which is free. The clickers you check out should come with instructions on how to do so. (Instructions vary depending on the version of the clickers you're using.) Once you've downloaded the software, a TurningPoint tab will appear in your PowerPoint toolbar. Click on that tab to insert a question slide. You can find tutorials on how to use TurningPoint here.

-Whitney Meredith, evaluation specialist

Qualtrics 101


What is Qualtrics?
Qualtrics is an online survey tool. The University recently got a campus-wide license, which means it is now available for everyone to use. I have been using it for over two years now and love it!

Why use it instead of SurveyMonkey?
In my experience, Qualtrics is more user-friendly than SurveyMonkey and has more features, so if you've used SurveyMonkey, making the transition should be relatively easy. In addition, Qualtrics meets the University's data security requirements.

How do I get an account?
Getting an account is easy! All you need is your x500 and password. Set up an account.

What features might be useful?

  • You can send personalized emails using the panel feature. This feature also lets you send reminders to those who have not responded.

  • You can use piped text, which allows you to insert respondent-specific information into questions. This might be information based on how a respondent answered a previous question, or it might be information from embedded data, which is information you have saved about a respondent, such as what workshop they attended.

  • At the end of your survey, you can redirect respondents to your website. Go to the "survey termination" section under "survey options" to do this.

  • You can share a survey. Go to the "collaborate" button and add people to review or edit your survey.

  • You can create basic survey reports and also export the data into Excel and SPSS.
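Once you've exported your data, a basic summary takes only a few lines. A minimal sketch using pandas (the column names and answer categories here are hypothetical, not Qualtrics's actual export format):

```python
import pandas as pd

# Normally you would load the export: df = pd.read_csv("survey_export.csv")
# A tiny hypothetical export stands in here so the sketch is self-contained.
df = pd.DataFrame({
    "workshop": ["Soils", "Soils", "Turf", "Turf", "Turf"],
    "will_change_practice": ["Yes", "No", "Yes", "Yes", "No"],
})

# Percent answering "Yes", overall and by workshop.
overall = (df["will_change_practice"] == "Yes").mean() * 100
by_workshop = (
    df.groupby("workshop")["will_change_practice"]
      .apply(lambda s: (s == "Yes").mean() * 100)
)
print(f"Overall: {overall:.0f}% plan to change a practice")
print(by_workshop)
```

A grouped summary like this is often all you need for team discussions; the same figures feed directly into Federal Report outcome percentages.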

For more information, go to the University's Qualtrics User Guide.

-Whitney Meredith, evaluation

Program business plans: next steps


Thank you all for attending our panel session on Program Business Plans last Wednesday afternoon at Program Conference! Hearing your feedback was very useful, and we are working on how best to share your program business plans within the Center so that one program can learn from another. Please also feel free to let me know other suggestions for improving the program business planning process.


Program business plan panel
(left to right) Dianne Sivald, grants manager; Tom Rothman, director of ag stakeholder outreach; Jeff Reneau, program leader; Sam Bauer, educator; Susanne Hinrichs, regional director; and moderator Whitney Meredith, evaluation specialist.

I hope you left the panel thinking about how your program business plans are not an assignment, but a tool that can be used for multiple purposes.

In fact, one way they can be used is to help you identify the data you need for the Federal Report, which is not that far away. All programs need to report their outputs and outcomes.

Outputs are what you include in your deliverable section. These are the things you count for the Federal Report. For example, you may have three workshops that train 50 people.

Outcomes are what you identify in your expected results section and results measurement section. They are what you hope your participants will do as a result of your program, such as increase their knowledge (learning gains) or implement best management practices (behavior changes). These are things you measure in your program evaluation.


If you have done a thorough job in these sections in your program business plan, collecting the data and turning it in for the Federal Report should be relatively easy. Please let me know if you have any questions!

-Whitney Meredith, evaluation

Thinking beyond surveys


Surveys are not the only way to gather evaluation data for your program! While many of you use surveys as your primary data collection method, it's worth thinking about other options, such as interviews, observations, focus groups, secondary data, and document review. In fact, using more than one method to answer your evaluation questions will not only give you a greater understanding of your program, but can also increase the validity of your conclusions. Ask Whitney to help you determine how to design an evaluation using multiple methods.

Evaluation data collection methods

More than just surveys!
Surveys
  Advantages:
  • Relatively inexpensive
  • Online surveys allow you to invite many people to participate at no additional cost
  • Online survey programs often analyze at least some of your data for you
  • Can gather the numbers you need for the Federal Report
  Disadvantages:
  • Good items are harder to develop than you think!
  • Hard to understand participants' thought processes and motives
  • Low response rate and response bias can decrease data quality

Interviews
  Advantages:
  • Can provide in-depth understanding
  • Researcher can clarify questions and ask for more detail when needed
  • Can collect stories that illustrate your survey data, which is needed for the Federal Report
  Disadvantages:
  • Time consuming and relatively costly
  • Analyzing data can be subjective
  • Cultural norms and rapport with the interviewer may prevent respondents from answering honestly

Observations
  Advantages:
  • Relatively unobtrusive
  • Relatively objective
  Disadvantages:
  • Hard to identify indicators you can observe
  • Participant behavior may be influenced by the presence of an observer
  • Can be time consuming

Focus groups
  Advantages:
  • More cost efficient and time efficient than interviews
  • Can provide in-depth understanding
  • Researcher can clarify questions and ask for more detail when needed
  Disadvantages:
  • Logistics may be challenging
  • Group setting may influence responses
  • Requires strong facilitation skills
  • Analyzing data is time consuming and can be subjective

Secondary data (e.g., census data)
  Advantages:
  • Unobtrusive
  • Don't have to spend time collecting data
  Disadvantages:
  • May be hard to access data or to find relevant data
  • Quality of data influenced by source

Document review
  Advantages:
  • Unobtrusive
  • Relatively inexpensive
  Disadvantages:
  • May be hard to gain access to needed documents

-Whitney Meredith, evaluation

Amy Rager on Evaluation


If you've ever wondered what evaluation can do for your team, I encourage you to watch this interview I conducted with Amy Rager.

-Whitney Meredith, evaluation


Evaluation can help you figure out the "why" of your program because:

  • Evaluation planning helps identify desired results and how your program achieves them.
  • Evaluation data provides evidence that your program is actually achieving results.
  • Evaluation data can unearth valuable results you had not considered.
If you're having trouble starting, ask yourself:
  1. What activities, events, and products do I currently produce?
  2. What learning gains and changes in attitudes/aspirations do I hope my participants will have after they experience my activities, events, and products?
  3. What practices do I want my participants to implement as a result of my program?
  4. What big social, economic, or environmental goal does implementing these practices help achieve?
  5. How will I measure my results?
This worksheet helps you determine how your program works and how to measure results.

-Whitney Meredith, evaluation specialist

Evaluation use


When planning your evaluation, it is very important to think about how the results will be used. In fact, there is a whole field of evaluation based on evaluation use! (Click here for more information.) After all, if no one is going to use the data you're collecting, why collect it?

Accordingly, when planning an evaluation, you should always keep in mind how the data will be used. These are some questions to think about:

  • What do you want to know about your program, and what do your external stakeholders want to know?

  • If you're writing a survey, does each question have a clear purpose?

  • Are your survey questions framed in a way that you're actually asking for the information you need/want?

  • Who should you enlist to help you interpret the meaning of the results?

  • How should the results be presented/communicated to key stakeholders?

  • What can you do to facilitate the use of the results?

If you need help figuring out what questions to ask or what to do with your results, don't hesitate to contact me!

-Whitney Meredith, evaluation specialist

You first need to identify what information you can actually use. Think about:

  • What do you want to know about your program?

  • What do your stakeholders want to know about your program?

  • How are you and your stakeholders going to use the data you collect?

Answering these questions should help you avoid collecting information just because it sounds good or you've always done it. Moreover, thinking about what you will do with the data should also help you decide how to frame your questions and when they should be asked.

Click here for a handout that provides an overview of different types of questions and how they might be used.

If you need help figuring out what information you want to collect or making sense of the data, please ask me!

-Whitney Meredith, evaluation specialist

  • © 2014 Regents of the University of Minnesota. All rights reserved.
  • The University of Minnesota is an equal opportunity educator and employer. Privacy