Evaluation rigour for harvesting
We are embarking on an innovative approach to a social problem, and we need a framework to guide the evaluation process. As it is a complex challenge, we’re beginning with a developmental evaluation framework. To begin creating that, I spent most of the morning putting together a meta-framework consisting of questions our core team needs to answer. In Art of Hosting terms, we might call this a harvesting plan.
When I work in the space of developmental evaluation, Michael Quinn Patton is the guy whose work guides mine. This morning I used his eight principles to fashion some questions and conversation invitations for our core team. The eight principles are:
- Developmental purpose
- Evaluation rigor
- Utilization focus
- Innovation niche
- Complexity perspective
- Systems thinking
- Co-creation
- Timely feedback
The first four of these are critical, the second four are corollaries to them, and the first two are essential.
I think in the Art of Hosting and Art of Harvesting communities we get the first principle quite well: participatory initiatives are, by their nature, developmental. They evolve and change and engage emergence. What I don’t see a lot of, however, is good rigour around harvesting and evaluation.
All conversations produce data. Hosts and harvesters make decisions and choices about the kind of data to take away from hosted conversations. Worse, we sometimes DON’T make those decisions at all, and we end up with a mess: nothing useful or reliable as a result of our work.
I was remembering a poorly facilitated session I once saw where the facilitator asked for brainstormed approaches to a problem. He wrote them in a list on a flip chart. When there were no more ideas, he started at the top and asked people to develop a plan for each one.
The problems with this approach are obvious. Not all ideas are equal, and not all are practical. “Solve homelessness” is not on the same scale as “provide clothing bundles.” No one would seriously believe that this is an effective way to make a plan or address an issue.
You have to ask why things matter. When you are collecting data, why are you collecting that data and how are you collecting it? What is it being used for? Is it a reliable data source? What is your theoretical basis for choosing to work with this data versus other kinds of data?
I find that we do not do that enough in the Art of Hosting community. Harvesting is given very little thought beyond “what am I going to do with all these flip charts?”, at which point it is too late. Evaluation (and harvesting) rigour is a design consideration. If you are not rigorous in your data collection and your harvesting methods, others can quite rightly challenge your conclusions. If you cannot show that the data you have collected is coherent with a strategic approach to the problem you are addressing, you shouldn’t be surprised if your initiative sputters.
In my meta-framework the simple questions I am using are:
- What are our data collection methods?
- What is the theoretical basis and coherence for them?
That is enough to begin the conversation. Answering these has a major impact on what we are hosting.
I highly recommend Quinn Patton et al.’s book Developmental Evaluation Exemplars for a grounded set of principles and some cases. Get rigorous.
Chris, good insights, thanks for sharing.
I am also reminded of the harvesting mantra “begin with a final harvest in mind” – harvest not as the finished result, of course, but as the type of data you want, informed by what you want to do with that data. With regard to the developmental purpose, I love it when the evaluation teaches us about the lenses through which we have been seeing the problems, which I believe is developmental in itself.
(As often happens, I understand your writing two years later 🙂 Only recently did it make sense to me that all hosted conversations are developmental in their aim, something you said in an email you sent me two years ago. See? Time is a dimension that cannot be underestimated, haha.) Best.
Thanks Chris, great points. When you’re creating your framework, is it worth adding one other question: when, on what basis, and how will we revisit our methods, and adjust them if needed, to remain responsive to what we’re learning?
Yes, certainly. The main feature of Developmental Evaluation (and the embodiment of the first principle) is that it is developmental, and so by nature it is iterative. Reflective practices, redesigns, and course corrections are all part of the ongoing work. These are often anxiety points for leaders needing to deliver on mandates, because they or their bosses need outcomes and certainty. Holding people in that anxiety and emotional and cognitive complexity is very challenging, but it ultimately leads to increased capacity for responsiveness and resilience.