Chris Corrigan Chris Corrigan Menu
  • Blog
  • Chaordic design
  • Resources for Facilitators
    • Facilitation Resources
    • Books, Papers, Interviews, and Videos
    • Books in my library
    • Open Space Resources
      • Planning an Open Space Technology Meeting
  • Courses
  • About Me
    • Services
      • What I do
      • How I work with you
    • CV and Client list
    • Music
    • Who I am
  • Contact me

Category Archives "Complexity"

Working with principles: some thoughts and an exercise

October 15, 2018 By Chris Corrigan Chaordic design, Complexity, Design, Featured 2 Comments

I’m continuing to refine my understanding of the role and usefulness of principles in evaluation, strategy and complex project design.  Last week in Montreal with Bronagh Gallagher, we taught a bit about principles-based evaluation as part of our course on working with complexity. Here are some reflections and an exercise.

First off, it’s important to start with the premise that in working in complexity we are not solving problems, but shifting patterns. Patterns are the emergent results of repeated interactions between actors around attractors and within boundaries. To make change in a complex system therefore, we are looking to shift interactions between people and parts of a system to create a beneficial shift in the emergent patterns.

To do this we have to have a sufficient understanding of the current state of things so that we can see patterns, a sense of need for shifting patterns and an agreed upon preferred and beneficial direction of travel. That is the initial strategic work in any complex change intervention. From there we create activities that help us to probe the system and see what will happen, which way it will go and whether we can do something that will take it in the beneficial direction. We then continue this cycle of planning, action and evaluation.

Strategic work in complexity involves understanding this basic set of premises, and here the Cynefin framework is quite useful for distinguishing between work that is best served by linear predictive planning – where a chain of linked events results in a predictable outcome – and work that is best served by complexity tools including pattern finding, collective sensemaking and collaborative action.

Working with principles is a key part of this, because principles (whether explicit or implicit) are what guide patterns of action and give them the quality of a gravity well, out of which alternative courses of action are very difficult. Now I fully realize that there is a semantic issue here around the term “principles”: in some of the complexity literature we use, the terms “simple rules” or “heuristics” appear instead. Here I am using “principles” specifically to tie this to Michael Quinn Patton’s principles-based evaluation work, which I find helpful in linking the three areas of planning, action, and evaluation. He defines an effectiveness principle as something that meets the following criteria:

  1. Guides directionality
  2. Is useful and usable
  3. Provides inspiration for action
  4. Is developmental in nature and allows for the development of approaches (in other words not a tight constraint that restricts creativity)
  5. Is evaluable, in that you can know whether you are doing it or not.

These five qualities are what he calls “GUIDE,” an acronym made from the key criteria. Quinn Patton argues that if you create these kinds of principles, you can assess their effectiveness in creating new patterns of behaviour or response to a systemic challenge.  That is helpful in strategic complexity work.  
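To make the GUIDE checklist a little more concrete, here is a minimal sketch (my own illustration, not anything from Patton’s book) that treats a candidate principle as a record checked against the five criteria:

```python
from dataclasses import dataclass, fields

@dataclass
class PrincipleCheck:
    """One candidate principle, scored against Patton's five GUIDE criteria."""
    statement: str
    guides_directionality: bool   # G: points action in a direction
    useful_and_usable: bool       # U: practitioners can actually apply it
    inspires_action: bool         # I: provides inspiration for action
    developmental: bool           # D: allows approaches to develop; not a tight constraint
    evaluable: bool               # E: you can know whether you are doing it or not

    def unmet(self) -> list[str]:
        """Names of the GUIDE criteria this principle fails to meet."""
        return [f.name for f in fields(self)
                if isinstance(getattr(self, f.name), bool)
                and not getattr(self, f.name)]

# A candidate principle from the workshop exercise described below:
p = PrincipleCheck(
    statement="Create processes that generate and reward small scale failure.",
    guides_directionality=True,
    useful_and_usable=True,
    inspires_action=True,
    developmental=True,
    evaluable=True,
)
print(p.unmet())  # → [] — all five GUIDE criteria are met
```

A statement like “Be innovative” would fail the usable and evaluable checks, which is the point of the exercise: the checklist surfaces where a principle is too vague to guide or assess action.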

To investigate this, we did a small exercise, which I’m refining as we go here. On our first day we did a sensemaking cafe to look at patterns of where people in our workshop felt “stuck” in their work with clients and community organizations. Examples of repeating patterns included confronting aversion to change, use of power to disenfranchise community members, lack of adequate resources, and several others. I asked people to pick one of these patterns and to use the GUIDE criteria to articulate the principle that seems to be at play in keeping that pattern in place.

For example, on aversion to change, one such principle might be “Create processes that link people’s performances to maintaining the status quo.” You can see that there are many things that could be generated from such a principle, and that perhaps an emergent outcome of such a principle might be “aversion to change.” This is not a diagnostic exercise. Rather it helped people understand the role that principles have in containing action with attractors and boundaries. In most cases, people were not working with situations where “aversion to change” was a deliberate outcome of their strategic work, and yet there was the pattern nonetheless, clear and obvious even within settings in which innovation or creativity is supposedly prized and encouraged.

Next I invited people to identify a direction of travel away from this particular pattern using a reflection on values. If aversion to change represents a pattern you negatively value, what is an alternative pattern, and what is the value beneath that? It’s hard to identify values, but these are pretty pithy statements about what matters. One value might be “curiosity about possibility” and another might be “excitement for change.” From there participants were asked to write a principle that might guide action towards the emergence of that new pattern. One such example might be “Create processes that generate and reward small scale failure.” I even had them take that one statement and reduce it to a simple rule, such as “Reward failure, doubt…”

The next step is to put these principles into play within an organization and create small tests of how effective each principle is. If you discover that one works, refine it and do more. If it doesn’t, or if it creates another poor pattern such as cynicism, stop using it and start over.


The limits of certainty

September 28, 2018 By Chris Corrigan Complexity, Evaluation, Featured

An interesting review essay by John Quiggin looks at a new book by Ellen Broad called Made by Humans: The AI Condition. Quiggin is intrigued by Broad’s documentation of the way algorithms have changed over the years, from originating as “a well-defined formal procedure for deriving a verifiable solution to a mathematical problem” to becoming a formula for predicting unknown and unknowable futures. Math problems that benefit from algorithms fall firmly in the Ordered domains of Cynefin. But the problems that AI is now being deployed on are complex and emergent in nature, and therefore instead of producing certainty and replicability, AI is being asked to provide probabilistic forecasts of the future.

For the last thousand years or so, an algorithm (derived from the name of an Arab mathematician, al-Khwarizmi) has had a pretty clear meaning — namely, it is a well-defined formal procedure for deriving a verifiable solution to a mathematical problem. The standard example, Euclid’s algorithm for finding the greatest common divisor of two numbers, goes back to 300 BCE. There are algorithms for sorting lists, for maximising the value of a function, and so on.


As their long history indicates, algorithms can be applied by humans. But humans can only handle algorithmic processes up to a certain scale. The invention of computers made human limits irrelevant; indeed, the mechanical nature of the task made solving algorithms an ideal task for computers. On the other hand, the hope of many early AI researchers that computers would be able to develop and improve their own algorithms has so far proved almost entirely illusory.


Why, then, are we suddenly hearing so much about “AI algorithms”? The answer is that the meaning of the term “algorithm” has changed. A typical example, says Broad, is the use of an “algorithm” to predict the chance that someone convicted of a crime will reoffend, drawing on data about their characteristics and those of the previous crime. The “algorithm” turns out to over-predict reoffending by blacks relative to whites.


Social scientists have been working on problems like these for decades, with varying degrees of success. Until very recently, though, predictive systems of this kind would have been called “models.” The archetypal examples — the first econometric models used in Keynesian macroeconomics in the 1960s, and “global systems” models like that of the Club of Rome in the 1970s — illustrate many of the pitfalls.
A vast body of statistical work has developed around models like these, probing the validity or otherwise of the predictions they yield, and a great many sources of error have been found. Model estimation can go wrong because causal relationships are misspecified (as every budding statistician learns, correlation does not imply causation), because crucial variables are omitted, or because models are “over-fitted” to a limited set of data.


Broad’s book suggests that the developers of AI “algorithms” have made all of these errors anew. Asthmatic patients are classified as being at low risk for pneumonia when in fact their good outcomes on that measure are due to more intensive treatment. Models that are supposed to predict sexual orientation from a photograph work by finding non-causative correlations, such as the angle from which the shot is taken. Designers fail to consider elementary distinctions, such as those between “false positives” and “false negatives.” As with autonomous weapons, moral choices are made in the design and use of computer models. The more these choices are hidden behind a veneer of objectivity, the more likely they are to reinforce existing social structures and inequalities.


The superstitious reverence with which computer “models” were regarded when they first appeared has been replaced by (sometimes excessive) scepticism. Practitioners now understand that models provide a useful way of clarifying our assumptions and deriving their implications, but not a guaranteed path to truth. These lessons will need to be relearned as we deal with AI.


Broad makes a compelling case that AI techniques can obscure human agency but not replace it. Decisions nominally made by AI algorithms inevitably reflect the choices made by their designers. Whether those choices are the result of careful reflection, or of unthinking prejudice, is up to us.

In general I think that scientists understand the limits of this approach to modelling, and that was borne out in several discussions I had with ecologists last week in Quebec. We do have to define what we mean by “prediction,” though. Potential futures can be predicted with some probability if you understand the nature of the system, but exact outcomes cannot be predicted. However, we (by which I mean the electorate and the policy makers who work to make single decisions out of forecasts) do tend to venerate predictive technologies because we cling to the original definition of an algorithm, and we can come to believe that a model’s robustness is enough to guarantee the accuracy of a prediction. We end up trusting forecasts without understanding probability, and when things don’t go according to plan, we blame the forecasters rather than our own complexity illiteracy.
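Quiggin’s distinction between the two senses of “algorithm” can be sketched in a few lines of code. This is purely illustrative (the chaotic logistic map here stands in for “a complex, emergent problem”; it is my example, not Quiggin’s or Broad’s): Euclid’s procedure yields a verifiable answer every time, while a fully deterministic but sensitive system defeats exact long-range prediction, leaving only probabilistic forecasts.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a well-defined procedure with a verifiable result."""
    while b:
        a, b = b, a % b
    return a

def logistic_trajectory(x0: float, steps: int, r: float = 4.0) -> list[float]:
    """Iterate the chaotic logistic map x -> r * x * (1 - x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

print(gcd(48, 36))  # → 12, every time; the answer can be checked

# Two initial measurements differing by one part in ten billion:
a = logistic_trajectory(0.3, 60)
b = logistic_trajectory(0.3 + 1e-10, 60)
divergence = max(abs(x - y) for x, y in zip(a, b))
print(divergence)  # the tiny measurement error has grown to macroscopic size
```

The measurement error roughly doubles each step, so after a few dozen iterations the two trajectories bear no useful resemblance to each other, even though the rule generating them is perfectly known. That is the sense in which a model of a complex system can only ever be probabilistic.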


Selecting weak signals and building in diversity and equity

June 14, 2018 By Chris Corrigan Art of Harvesting, Complexity, Emergence, Facilitation, Featured 3 Comments

When working in complexity, and when trying to create new approaches to things, it’s important to pay attention to ideas that lie outside of the known ways of doing things.  These are sometimes called “weak signals” and by their very nature they are hard to hear and see.

At the Participatory Narrative Inquiry Institute, they have been thinking about this stuff.  On May 31, Cynthia Kurtz posted a useful blog post on how we choose what to pay attention to:

If you think of all the famous detectives you know of, fictional or real, they are always distinguished by their ability to hone in on signals — that is, to choose signals to pay attention to — based on their deep understanding of what they are listening for and why. That’s also why we use the symbol of a magnifying glass for a detective: it draws our gaze to some things by excluding other things. Knowing where to point the glass, and where not to point it, is the mark of a good detective.

In other words, a signal does not arise out of noise because it is louder than the noise. A signal arises out of noise because it matters. And we can only decide what matters if we understand our purpose.

That is helpful. In complexity, purpose and a sense of direction help us choose courses of action, from making sense of the data we are seeing to acting on it.

By necessity that creates a narrowing of focus, so paying attention to how weak signals work is also important. Yesterday the PNI Institute discussed this on a call, which resulted in a nice set of observations about the people seeking weak signals and the nature of the signals themselves:

We thought of five ways that have to do with the observer of the signal:

  1. Ignorance – We don’t know what to look for. (Example: the detective knows more about wear patterns on boots than anyone else.)
  2. Blindness [sic]- We don’t look past what we assume to be true. (No example needed!)
  3. Disinterest – We don’t care enough about what we’re seeing to look further. (Example: parents understand their toddlers, nobody else does.)
  4. Habituation – We stopped looking a long time ago because nothing ever seems to change. (Example: A sign changes on a road, nobody notices it for weeks.)
  5. Unwillingness – It’s too much effort to look, so we don’t. (Example: The “looking for your keys under the street light” story is one of these.)

And we listed five ways a signal can be weak that have to do with the system in which the observer is embedded:

  1. Rare – It just doesn’t happen often.
  2. Novel – It’s so new that nobody has noticed it yet.
  3. Overshadowed – It does happen, but something else happens so much more that we notice that instead.
  4. Taboo – Nobody talks about it.
  5. Powerless – Sometimes a signal is literally weak, as in, those who are trying to transmit it have no power.

You can see that this has important implications for building equity and diversity into sense-making processes. People with different lived experiences, ways of knowing and ways of seeing will pay attention to signals differently. If you are trying to build a group with an increased capacity to scan and make sense of a complex problem, having cognitive and experiential diversity will help you find many new ideas that are useful in addressing complex problems. Furthermore, you need to pay attention to people whose voices are traditionally quieted in a group, so as to amplify their perspectives on powerless signals.


How complexity principles can inform participatory process design

March 28, 2018 By Chris Corrigan Complexity, Emergence, Facilitation, Featured 7 Comments

Sonja Blignaut has been blogging some terrific stuff on Paul Cilliers’ work on complexity. Specifically, she has been riffing on Cilliers’ seven characteristics of complex systems and the implications of complexity for organizations.

Yesterday I was teaching an Art of Hosting here in Calgary, where we were looking at Cynefin and then followed with a discussion about how the nature of complex systems compels us to make important design choices when we are facilitating participatory processes to do work in organizations.

This is a cursory list, but I thought it would be helpful to share here. Cilliers’ text is in bold.

Complex systems consist of a large number of elements that in themselves can be simple. 

If you are planning participatory processes, don’t focus on working on the simple problems that are the elements in complexity. Instead, you need to gather information about those many simple elements and use facilitation methods to look for patterns together. We talk about describing the system before interpreting it. Getting a sense of the bits and pieces ensures that you don’t begin strategic process work with only high-level aspirations.

The elements interact dynamically by exchanging energy or information. These interactions are rich. Even if specific elements only interact with a few others, the effects of these interactions are propagated throughout the system. The interactions are nonlinear. 

Non-linearity is truly one of those things that traditional planning processes fail to understand. We always want to be heading towards a goal, despite the fact that in complex systems such controlled progress is impossible. What we need to do is choose a direction to move in and make decisions and choices that are coherent with that direction, all the while keeping a careful watch on what is happening and what effect our decisions have. Participatory processes help us make sense of what we are seeing, and convening regular meetings of people to look through data and see what is happening is essential, especially if we are making decisions about innovative approaches. Avoid creating processes that assume causality going forward; don’t make plans that are based on linear chains of events that take us from A to B. Traditional vision, mission, goals and objectives planning has little usefulness in a complex system. Instead, focus on the direction you want to move in and a set of principles or values that help you make decisions in that direction.

There are many direct and indirect feedback loops.

The interactions between the parts of a system happen in myriad ways. To keep your strategy adapting, you need to build in feedback loops that work at a variety of time scales. Daily journalling, weekly sensemaking and project-cycle reporting can all be useful. Set up simple and easily observable monitoring criteria that help you watch what you are doing and decide how to adjust when those criteria are triggered. Build in individual and collective ways to harvest and make sense of what you are seeing.
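As a sketch only (the signal names, cadences and thresholds here are invented for illustration, not drawn from any real monitoring framework), feedback loops at several time scales with simple trigger criteria might look like this:

```python
from dataclasses import dataclass

@dataclass
class Monitor:
    """A feedback loop: a review cadence plus an observable trigger criterion."""
    name: str
    every_n_days: int   # cadence: daily journal, weekly sensemaking, etc.
    threshold: float    # simple, easily observable trigger level

    def due(self, day: int) -> bool:
        return day % self.every_n_days == 0

def triggered_reviews(day: int, signal: float, monitors: list[Monitor]) -> list[str]:
    """Return the loops whose cadence is due today and whose criterion fires."""
    return [m.name for m in monitors if m.due(day) and signal >= m.threshold]

monitors = [
    Monitor("daily journalling", every_n_days=1, threshold=0.2),
    Monitor("weekly sensemaking", every_n_days=7, threshold=0.5),
    Monitor("project cycle report", every_n_days=30, threshold=0.8),
]

# On day 7 a moderately strong signal fires the daily and weekly loops:
print(triggered_reviews(7, 0.6, monitors))  # → ['daily journalling', 'weekly sensemaking']
```

The design point is simply that different loops run at different tempos and with different sensitivities, so a weak signal gets caught by the fast, fine-grained loop long before the slower project-cycle review would notice it.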

Complex systems are open systems—they exchange energy or information with their environment—and operate at conditions far from equilibrium.

You need to understand that there are factors outside your control that affect the success or failure of your strategy. You and your people are constantly interacting with the outside world. Understand these patterns, as they can often be more important than your strategy. In participatory process and strategy building, I love it when we bring in naive experts to contribute ideas from outside our usual thinking. In natural systems, evolution and change are powered by what happens at the edges and boundaries, where a forest interacts with a meadow, or a sea with a shoreline. These ecotones are the places of greatest life, variety and influence in a system. Build participatory processes that bring in ideas from the edge.

Complex systems have memory, not located at a specific place, but distributed throughout the system. Any complex system thus has a history, and the history is of cardinal importance to the behavior of the system.

Complex systems are organized into patterns, and those patterns are the results of many, many decisions and actions over time. Decisions and actions often converge around attractors and boundaries in a system, so understanding these “deep yes’s and deep no’s,” as I call them, is essential to working in complexity. You are never starting from a blank slate, so begin by engaging people in understanding the system, look for the patterns that enable and the patterns that keep us stuck, and plan accordingly.

The behavior of the system is determined by the nature of the interactions, not by what is contained within the components. Since the interactions are rich, dynamic, fed back, and, above all, nonlinear, the behavior of the system as a whole cannot be predicted from an inspection of its components. The notion of “emergence” is used to describe this aspect. The presence of emergent properties does not provide an argument against causality, only against deterministic forms of prediction.

So again, work with patterns of behaviour, not individual parts. And of course, as Dave Snowden is fond of saying, to shift patterns, shift the way the actors interact; don’t try to change the actors. Once, when I was working on the issue of addictions stigma in health care, the health authority tried running a project to address stigmatizing behaviours with awareness workshops. The problem was, they couldn’t find anyone who admitted to stigmatizing behaviours. Instead, we ran a series of experiments to change the way people work together around addictions and people with addictions (including providing recognition and help for health care workers who themselves suffered from addictions). That is the way to address an emergent phenomenon.

Complex systems are adaptive. They can (re)organize their internal structure without the intervention of an external agent.

And so your strategy must also be adaptive. I’m learning a lot about principles-based evaluation these days, which is a useful way to craft strategy in complex domains. Using principles allows people to make decisions consistent and coherent with the preferred direction of travel the strategy is taking us in. When the strategy needs to adapt because conditions have changed, managers can rely on principles to structure new responses to changing conditions. Participatory processes become essential in interpreting principles for current conditions.

 

This is a bit of a brain dump, and as usual it makes more sense to me than perhaps it does to everyone else. But I’d be very interested in your reflections on what you are hearing here, especially as it relates to how we craft, design and deliver participatory processes in the service of strategy, planning and implementation.

 


Shallow dives into chaos in teaching and leading

December 14, 2017 By Chris Corrigan Complexity, Facilitation, Featured, Leadership, Learning

In the Cynefin framework, the domains are really shades with some clear boundaries. Strategic work using Cynefin is about making various moves between different domains for different reasons. This is called Cynefin dynamics, and there’s an old but good paper on it here.

In Cynefin dynamics there is a strategic move of “taking a shallow dive into chaos,” which is useful when one needs to break pattern entrainment. It is a very useful move in teaching contexts when we are trying to get people to let go of some of their fixed ways of seeing and doing things. Even putting a group in a circle can be a shallow dive into chaos. The idea here is that in complexity you have a system with a permeable boundary and lots of connections between the elements in the system (people, ideas, resources). That allows emergence to happen. In chaos, the connections break down and you need to hold a tight container – nothing is emerging, everything is breaking. So if you want to take a shallow dive into chaos, the container needs to be very tight, very constrained, while the relationships between the people and ideas within that container are very open. That’s how you break patterns without creating a deep experience of chaos, which would be when everything breaks down, including the container. Sometimes that is required, but there is a much lower likelihood of recovering from that kind of thing. I wouldn’t call that “leadership.” It’s more like “abandonment.” No one should create a deep dive into chaos unless they want to create a civil war or a revolution, and even then they have no right to expect they’ll survive it.

Chaos is a very high energy state, and it costs a lot to be in it. As a result systems (or learners) that are in a state of chaos won’t stay there for long.  Typically they will respond to the first person that comes along and applies tight constraints (think about a paramedic arriving on the scene of an accident).  From the perspective of the person in chaos, anything that helps stabilize the situation is welcome.

This can make chaos in systems VERY VERY vulnerable to unchecked power.  In times of war, fear or conflict, it is very easy for people to choose and trust despotic leaders that bring tight constraints to the situation, because bringing constraints is actually the right move.  I have seen meetings and gatherings happen where chaos was deliberately triggered (sometimes under the guise of “there’s not enough happening in this container”) and then people come in and hijack the agenda and apply their own power.  In my experience, very few people are deeply skilled at initiating deep levels of chaos to break patterns and then creating complexity responses (rather than imposing their will), but on the national scale perhaps Iceland is an example.

In workshops, participants sometimes want to question or check the power of the facilitators. This has happened twice to my colleague Tuesday Ryan-Hart and me, when we have taught groups of activists who seized on her power teaching to question the power dynamics of teacher and student within the workshop. In both cases we took responsibility as hosts to hold a tight container in which the relationships could dissolve, so that the group itself could discover what to do next. We did this by suspending the agenda and hosting a circle and a Council. The decisions that came out were group-owned and, I think, made the workshop a better learning experience for everyone AND proved the efficacy of our tools and processes. I have seen other examples where the hosts did not take that responsibility and instead the participants were left designing their own gathering. That kind of thing is poor strategy in chaos, unless you are planning on just abandoning the situation and letting others take over, in which case it’s an excellent strategy to ensure you’ll never be invited back. (I have done this myself, sometimes intentionally and sometimes accidentally.)

So that is the kind of decision that you have to make from time to time.  Working with constraints is what leaders and teachers do.  Being conscious about that is good practice.

At his two day class last week in Vancouver, Dave Snowden presented this constraints based take on Cynefin and shared the evolution of the framework.  There is now a new version of this known as “liminal Cynefin” that explores the boundary conditions between complicated and complex and complex and chaotic.  I like this because it begins to highlight how dynamic the framework is.  I use Cynefin to explain systems and I use the Chaordic Path to talk about developing the leadership capacity to stay in the dynamism of flows around these types of systems.



© 2015 Chris Corrigan. All rights reserved. | Site by Square Wave Studio
