February 3, 2014

How do your programs create public value?

In the Building Extension's Public Value workshop, we highlight three main ways Extension programs create public value. Programs address concerns about fairness, close an information gap, or encourage actions that benefit the greater community (or equivalently, discourage actions that impose costs on the community). Each of these can be thought of as a criterion or justification for public sector involvement. In my experience, most Extension programs focus on the third type of value creation and base their public value message on the ways a program encourages beneficial activities.

During a recent webinar for University of Minnesota Extension, we conducted an onscreen poll asking which of the three criteria participants thought applied to their programs. Respondents could choose any criterion that applied, including choosing all three. Out of about 32 participants, five thought their programs addressed a concern about fairness, while the information gap and public benefits criteria received 20 votes each.

We can't generalize from this non-scientific poll, but I wondered about the lack of attention to the fairness criterion. In the workshop, I encourage program teams to use this criterion with caution. Whether the unfairness of a situation warrants public sector action is subjective, and stakeholders with different values may assess fairness differently. So, I think Extension makes a more effective case when it uses the fairness criterion selectively. For this reason, I wasn't surprised by the small number of responses for fairness. I wondered, though, whether it arose because respondents thought their programs did not address a concern about fairness, or if they thought the program did address fairness, but they planned to emphasize a different criterion in order to make a stronger case.

Do you think a relatively small share of Extension programs address a fairness concern? Which criterion would you have chosen for the Extension programs you work with?

November 19, 2013

Private benefits from short-term outcomes

The students in my recent lecture for the Penn State University Agricultural Extension and Education program asked me several useful questions. One was whether the public value message structure (below) implied that private benefits can accrue only after condition changes have occurred. Note that the thin, black arrow leading to the participant's private benefits implies exactly this.
[Diagram: public value message structure, redrawn with private benefits arising from intermediate outcomes]
In fact, I think the original construction--with the arrow extending from condition changes to private benefits--misleads. A program participant may very well enjoy benefits long before conditions have improved. Indeed, many participants directly benefit from improvements in knowledge and skills. For example, a participant in a leadership development program may see personal career benefits from the enhanced leadership and facilitation abilities he acquired through the program. And this may occur well before the program can claim improvements in community conditions. So, I wonder if the diagram should be redrawn (as I did above) to indicate that those private benefits can, indeed, arise from the intermediate stages in public value creation. What do you think (besides that I need to employ someone with better graphic design skills than mine)? :)

November 13, 2013

The private-public benefit intersection

In the public value message structure (seen here for example), I distinctly separate private benefits to program participants from the public value accruing to the rest of the community. In a recent seminar for the Penn State University Agricultural Extension and Education program, I was asked whether I saw an intersection between private and public benefits, or whether they must always be separated in the model. I think the intersection between private and public benefits occurs when the program participant is a member of the community that enjoys the subsequent public benefits. In those cases, the participant will benefit from her own involvement in the program--through gaining new skills or making behavior changes that personally benefit her--but along with her neighbors, she will also enjoy the community-level changes the program generates. For example, someone who participates in an entrepreneurship program may enhance her business skills and improve the profitability of her own business. Her business's success improves local economic conditions--perhaps attracting new customers or suppliers to the area or enlarging the tax base--which improve opportunities for everyone in the community, along with the original entrepreneur.

I don't typically emphasize this intersection, because the objective of the public value work is to adopt the perspective of the non-participant payers for Extension programs--the community members who are being asked to share in the cost of the programs through public funding, but who do not receive the private benefits of program participation. Nevertheless, the point that an intersection between public and private benefits exists is well-taken.


November 4, 2013

How long is the long-term?

In the diagram below, I map the outcomes section of the University of Wisconsin Extension logic model to the components of a public value message. In the parlance of the UWEX model, learning outcomes are short-term, behavior changes are medium-term, and condition changes are long-term.

Participants in two recent seminars--one for the UM Center for Integrative Leadership and one for Penn State University's Agricultural Extension and Education program--challenged this pattern. They argued that some programs may generate public-value-level outcomes in less time than it takes other programs to generate behavior changes. In these cases, doesn't labeling the outcomes as short-, medium-, and long-term cause confusion?

[Diagram: University of Wisconsin Extension logic model outcomes mapped to the components of a public value message]

I think this is a useful point. What matters for the generation of public value is that the desired community-level condition changes are achieved, not how long it took to get there. If a program is able to alter economic, social, civic, or environmental conditions in ways that matter to critical stakeholders, then those impacts can be the basis of a public value message, even if they arose in the course of weeks or months, rather than the years or generations it may take for other programs to see impacts.

October 23, 2013

Does public value magnitude matter?

Is it enough for a stakeholder to learn that your program produced public value, or do stakeholders want to know how much value was created? Put another way, is it adequate to demonstrate that a program has a positive return on investment for a community? Or does it have to have a higher return than all the alternative investments the community could have made?

I was asked this question today at a Center for Integrative Leadership Research Forum where I presented, "How Cross Sector Leadership Builds Extension's Public Value." It seems that the answer has to be yes: it does matter whether a program generates a large or small amount of public benefit relative to its cost.

A potential funder wants to direct dollars toward their highest use. Ideally, all programs would report a return using a common metric. The metric could be dollars for programs whose impacts can be monetized (i.e., converted to dollars); it could be some other common metric (e.g., amount of pollution remediation, high school graduation rate) for programs addressing similar issues. With common metrics, a funder could rank programs by return on investment and fund as many of the top programs as the budget allows.
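The ranking logic in that ideal case is simple enough to sketch. Here is a minimal, hypothetical illustration--the program names, costs, and benefit figures are invented for the example--of ranking programs by benefit-cost ratio and funding the top ones within a fixed budget:

```python
def rank_and_fund(programs, budget):
    """Rank programs by benefit-cost ratio, then fund greedily within budget."""
    ranked = sorted(programs, key=lambda p: p["benefit"] / p["cost"], reverse=True)
    funded, remaining = [], budget
    for p in ranked:
        if p["cost"] <= remaining:  # fund it only if the budget still covers it
            funded.append(p["name"])
            remaining -= p["cost"]
    return funded

# Illustrative (invented) programs with monetized impacts:
programs = [
    {"name": "water quality", "cost": 40_000, "benefit": 120_000},    # ratio 3.0
    {"name": "youth leadership", "cost": 25_000, "benefit": 100_000}, # ratio 4.0
    {"name": "farm safety", "cost": 50_000, "benefit": 75_000},       # ratio 1.5
]
print(rank_and_fund(programs, 70_000))  # → ['youth leadership', 'water quality']
```

Of course, this tidy arithmetic assumes every program's impacts are already expressed in a common metric--which, as I note below, is rarely the case.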

Such apples-to-apples comparisons must be rare, though, even for programs with common objectives. I also imagine that the magnitude of a program's expected public value--if it is known--will inform, but not drive, a funder's decision.

What has your experience been? Have you sought funding from a source that demands to know the expected return on their investment in terms of dollars or some other metric? Do you know of funders that use those metrics to rank programs by return?


October 14, 2013

Working Differently in Extension Podcast

Interested in a short introduction to "Building Extension's Public Value"? Check out this Working Differently in Extension Podcast, featuring a conversation between Bob Bertsch of Agricultural Communication at North Dakota State University and me. If you'd like to actually see us converse, check out the video of the podcast below.

September 10, 2013

Evaluation and accountability

Last month at the 2013 AAEA Annual Conference, I was one of three presenters in a concurrent session on "Creating and Documenting Extension Programs with Public Value-Level Impacts." I learned a lot from both of my co-presenters, but here are just two quick ideas I took away from the session:

From Jo Ann Warner of the Western Center for Risk Management Education: As a funder, the WCRME demands evaluation plans--and ultimately results--from grant applicants. But the Center itself is accountable to its own funders, who demand that the Center demonstrate the effectiveness of the programs it has funded. So generally, grant applicants should keep in mind to whom the granting agency is accountable and provide the kind of results that will help the agency demonstrate effectiveness.

From Kynda Curtis of Utah State University: Intimidated about evaluating the effectiveness of your extension program? Don't be! It's not as hard as you think it is, won't take as much additional time as you think it will, and costs less than you expect. No excuses!

September 3, 2013

Critical conversation about public value

Listen in on a thought-provoking conversation about "The role of evaluation in determining the public value of Extension" with this eXtension Network Literacy Critical Conversation video. Participants are Nancy Franz (Iowa State University), Mary Arnold (Oregon State University), and Sarah Baughman (Virginia Tech), and the host is Bob Bertsch (North Dakota State University).

The participants make a number of interesting points about where Extension program evaluation has been and is headed. Several of my own take-aways from the video influenced my recent presentations at the 2013 AAEA Annual Conference and the APLU meeting on Measuring Excellence in Extension. I've paraphrased three of those insights below.

==Nancy Franz: Extension should work toward changing the unit of program evaluation analysis from the individual to the system.

==Mary Arnold: Extension is still too often limiting evaluation to private value and isolated efforts. We need to move closer to systems-level evaluation.

==Sarah Baughman: We need to do more program evaluation for program improvement, not only evaluation for accountability.

July 31, 2013

Let's talk about public value in Washington, DC

Next week I will be in Washington, DC, presenting on Extension's public value at two venues. If you will be attending either of these meetings, I hope you will share your experiences with documenting public value impacts.

First, on Tuesday, August 6, I will be on a panel at the Agricultural and Applied Economics Association (AAEA) Annual Conference at the Marriott Wardman Park. The session, "Creating and Documenting Extension Programs with Public-Value Level Impacts," is intended as a resource for AAEA members in their first few years of a partial or full Extension appointment at the university level. In addition to my presentation, "Tell Us About Your Extension Program's Public-Value Level Impacts," the panel features Kynda Curtis of Utah State University and Jo Ann Warner of Washington State University Extension. The panel is moderated by my UM colleague, Elton Mykerezi. Ours is concurrent session 2003 from 9:30 to 11:00.

Second, on Wednesday, August 7, I will meet at the Association of Public and Land Grant Universities (APLU) offices with the Measuring Excellence in Extension Implementation Team. I'll be presenting on how to effectively incorporate public value into impact statements.

At both of these meetings, I'll be eager to hear about public value work and impact analysis taking place across the country.

July 1, 2013

Give voice to the public value experts

Occasionally I become aware that some of the participants in a BEPV workshop have had prior experience with the workshop. Some may have participated in a full workshop or completed the train-the-trainer course; others may have been introduced to the BEPV content in a speech or webinar. I am often uncertain about how to address the range of experience in the audience. If I teach primarily to the inexperienced, I run the risk of disengaging those familiar with the content. If I teach to those with experience, I may frustrate the newbies. Because I usually do the former, I am willing to bet that more than a few participants have emerged from one of my workshops mumbling, "Well, that was nothing new."

Last week at a training session for the LEAD21 leadership development program, a trainer used an approach that I think can be effective with a mixed-level-of-experience group. The trainer first asked group members who had been through a similar training to identify themselves. S/he then named these people as the group's experts on the topic, and said that s/he would call on them to enrich the training by sharing their own experiences. Instead of expressing unease that some people in the group were already familiar with the content (which I'm sure I have done), the trainer showed gratitude that the room was rich with peer expertise.

Here are some ways I can see using this approach in a BEPV workshop:

==Ask people with prior experience to not only identify themselves, but describe briefly the kind of experience they have had (e.g., prior workshop, writing public value messages).
==If time allows--and if the experts are few in number--ask them to explain why they have chosen to attend the training again. I might use that information to more effectively prioritize the program content.
==Arrange participants so that the experts are distributed among the work groups.
==Before setting groups to work on an activity, ask the experts what they recall as the pitfalls for that activity. For example, I can imagine someone saying, "I remember that it takes a while to get all the way through the stakeholder exercise. Make sure you quickly choose a program to work on and move ahead to the exercise."
==During the next steps module, ask the experts what steps they have taken since their original training, and what obstacles and successes they have experienced.

I am grateful to the LEAD21 trainers for the reminder to draw "expert" participants into the conversation and to encourage them to share their knowledge with their peers.

(Photo credit: Joe Shlabotnik on FlickR.)

May 28, 2013

Mark Moore: Recognizing Public Value

My UM Extension colleague Neil Linscheid has alerted me to this recording of Harvard Kennedy School's Mark Moore discussing his new book, Recognizing Public Value. I haven't read the book yet, but I've ordered it from the library, and I'll write more once I've read it. However, I can share some quick impressions from having listened to the recording of Moore's talk.

Both Moore and his discussant, Tiziana Dearing of the Boston College School of Social Work, commented on the limitations of monetizing public value created by service providers. Starting around 28:00 of the recording, Moore explains that to measure the value created by government, we are often challenged with "describing material conditions that are hard to monetize....We don't know how to convert them into some kind of cash value." He doesn't give examples, but it's easy to imagine changes in civic or social conditions as falling into this category. Moore then decries the amount of effort that goes into trying to monetize these effects and says that effort "mostly distorts the information rather than clarifies." For clarity, he would prefer a concrete description of the effect (presumably something like units of pollutant reduction or increase in graduation rates) to "somebody's elaborate method to try monetize it."

Dearing shares Moore's skepticism of monetizing government and nonprofit benefits, and around 42:00 of the recording says, "It's a very dangerous thing outside of the private sector to have the same enthusiasm for the bottom line." Later she warns that "it's so easy to follow the metrics that follow the dollar that it becomes shorthand for a theory of change."

In this post, I also urged caution about monetizing the impacts of Extension programs. Nevertheless, when they are carefully framed and rigorously executed, dollar-denominated metrics such as cost-benefit analyses allow for comparison across programs. I wonder how Moore and Dearing would advise a local government that is choosing between anti-pollution and education investments when the only impact data they have is a "concrete description of the effects"? How would they balance units of pollutant mitigation with a percentage change in the graduation rate?

Have you read Moore's new book or at least listened to his talk? What do you think about what Dearing calls our "enthusiasm for the bottom line"? Do we go too far in trying to translate program impacts into dollars? What other contributions of the new book are particularly relevant for Extension?

May 14, 2013

Telling 4-H's public value story

Nancy Franz of Iowa State University Extension has alerted me (and the rest of her Public Value Network listserv) to the California 4-H program's involvement of volunteers and staff in a statewide effort to develop public value stories. On the program's California Public Values web page, a survey link invites volunteers and staff to participate in the effort to "shape California 4-H's public values that will be shared with the broader community that has a stake in 4-H."

The survey asks respondents to identify (1) positive benefits to youth from participating in 4-H, (2) ways society benefits from those positive youth outcomes, and (3) community or state benefits from adult volunteer participation in the 4-H program.

"Crowdsourcing"--at least from select sources, such as those working with 4H across the state--sounds like a potentially efficient way to assemble a body of program impact stories. Local program staff can bring to administrators' and evaluators' attention impacts that might otherwise have been overlooked. I am eager to hear how the California 4H public value project develops.

May 7, 2013

Can public value help you get promoted?

Last year I presented to faculty from the University of Minnesota's Research and Outreach Centers (ROCs) about communicating the public value of research, outreach, and engagement scholarship for promotion and tenure purposes. Since then I've had others ask me whether and how the public value approach can be useful in documenting scholarly accomplishments.

The guidance I came up with is pretty similar to what I wrote about here: begin with the end in mind. Basically, a scholar who aims early on for public value-level impacts and outcomes, and then evaluates and documents those outcomes, will have built a strong case for her work's public value. Because each step in creating public value involves scholarship--of research, engagement, teaching, and evaluation--the scholar who meticulously documents her contributions should, in the end, be well-positioned to defend her record.

I know, with all of the demands on outreach and engagement faculty, this is easier said than done. I know clients and community members expect these faculty to engage in activities that are hard to classify as scholarship. But I do think that leading with the end game can help a faculty member prioritize for success.


I also wrote in this blog entry about ways that the public value approach can help close the loop between research and engagement. The research-design-engagement-evaluation loop illustrated in that entry provides a number of opportunities for a scholar focusing on engagement work--in contrast with outreach education--to document her contributions and impact. How did you contribute to (1) the research that underpins the program curriculum, (2) the program design, (3) the engagement itself, (4) the program evaluation? It seems to me that viewing your engagement program in the context of the loop can bring to mind scholarly contributions that you might not have thought to document. Perhaps it can even lead to a more complete promotion and tenure case.

May 1, 2013

National resources for impacts and public value

At last week's 2013 PILD conference, I heard about a couple of national initiatives that, once developed, should help Extension organizations share impact data that can inform public value messages. In my own comments at the conference, I supported cross-state sharing of ideas, so I was encouraged to hear about these national projects.

First, I think it was NIFA director Dr. Sonny Ramaswamy who mentioned an effort this summer to develop a national portal for impact reporting that will consolidate Extension program impact results.

Second, I believe it was ECOP chair Dr. Daryl Buchholz who highlighted ECOP's Measuring Excellence initiative. It appears that this project is meant to define and demonstrate excellence and to report impacts for Cooperative Extension as a whole. From the website: "Cooperative Extension has advanced from merely reporting inputs and outputs to documenting outcomes and impacts of its programs. However, most of these measures are tied to specific programs. They are not generally assessed or considered at the organizational level." While the "excellence" part of the website is well-developed, the pages having to do with impact are still under construction. I look forward to the work that will populate these pages with resources and guidance for Extension impact teams. Meanwhile, I have to give a shout out for the public value statements on the front page of the website!

By the way, my notes from the PILD keynote talks are a little sparse. If I am wrong about which speaker spoke about which initiative, please correct me in the comments. And if you know more about how the portal or the Measuring Excellence project can strengthen Extension's public value case, please share that, too!

(Photo credit: USDAgov on FlickR)

April 29, 2013

Do people's eyes glaze over when you talk about your Extension program?

A couple of posts ago I highlighted the California Institute for Regenerative Medicine's (CIRM's) elevator pitch challenge for its fund recipients. I linked to CIRM's #sciencepitch web page that contains links to the grantees' videos. But I failed to draw your attention to the video on the front page that depicts CIRM's Director of Public Communications, Kevin McCormack, introducing the challenge. He asks, "Do people's eyes glaze over when you talk about your research?" and "Do reporters hang up on you when you talk about your work?" Check out the video for amusing scenes of researchers struggling to hold a co-worker's or a reporter's attention. Do Extension advocates ever struggle in the same way? ;-)

In the video, McCormack offers tips for constructing an effective pitch: make the pitch "short, simple, clear, articulate, informative, engaging, even entertaining." All of those adjectives could apply to an effective public value message for an Extension program, with a few more suggestions shown in the slide below.

[Slide: additional tips for constructing an effective public value message]

