Selecting weak signals and building in diversity and equity
When working in complexity, and when trying to create new approaches to things, it’s important to pay attention to ideas that lie outside of the known ways of doing things. These are sometimes called “weak signals” and by their very nature they are hard to hear and see.
The folks at the Participatory Narrative Inquiry Institute have been thinking about this. On May 31, Cynthia Kurtz posted a useful blog post on how we choose what to pay attention to:
If you think of all the famous detectives you know of, fictional or real, they are always distinguished by their ability to hone in on signals — that is, to choose signals to pay attention to — based on their deep understanding of what they are listening for and why. That’s also why we use the symbol of a magnifying glass for a detective: it draws our gaze to some things by excluding other things. Knowing where to point the glass, and where not to point it, is the mark of a good detective.
In other words, a signal does not arise out of noise because it is louder than the noise. A signal arises out of noise because it matters. And we can only decide what matters if we understand our purpose.
That is helpful. In complexity, purpose and a sense of direction help us to choose courses of action, from making sense of the data we are seeing to acting on it.
By necessity that creates a narrowing of focus, and so paying attention to how weak signals work is also important. Yesterday the PNI Institute discussed this on a call, which resulted in a nice set of observations about the people seeking weak signals and the nature of the signals themselves:
We thought of five ways that have to do with the observer of the signal:
- Ignorance – We don’t know what to look for. (Example: the detective knows more about wear patterns on boots than anyone else.)
- Blindness – We don’t look past what we assume to be true. (No example needed!)
- Disinterest – We don’t care enough about what we’re seeing to look further. (Example: parents understand their toddlers; nobody else does.)
- Habituation – We stopped looking a long time ago because nothing ever seems to change. (Example: a sign changes on a road and nobody notices it for weeks.)
- Unwillingness – It’s too much effort to look, so we don’t. (Example: The “looking for your keys under the street light” story is one of these.)
And we listed five ways a signal can be weak that have to do with the system in which the observer is embedded:
- Rare – It just doesn’t happen often.
- Novel – It’s so new that nobody has noticed it yet.
- Overshadowed – It does happen, but something else happens so much more that we notice that instead.
- Taboo – Nobody talks about it.
- Powerless – Sometimes a signal is literally weak, as in, those who are trying to transmit it have no power.
You can see that this has important implications for building equity and diversity into sense-making processes. People with different lived experiences, ways of knowing and ways of seeing will pay attention to signals differently. If you are trying to build a group with an increased capacity to scan and make sense of a complex problem, having cognitive and experiential diversity will help you to find many new ideas that are useful in addressing complex problems. Furthermore, you need to pay attention to people whose voices are traditionally quieted in a group, so as to amplify their perspectives on powerless signals.
Great post, Chris! I love Cynthia’s work….
About “weak signals”: I just spent some time with the vTaiwan folks, and they were talking about how the pol.is software they use in the early parts of their participatory public policy processes is intentionally designed to keep track of those perspectives that are in the “numerical minority”. Of course it is valuable to do that with in-person processes as well. By attending to and giving some air-time to what is a “minority perspective”, we can address several of the points mentioned above. Agazarian’s work is super helpful in this regard as well… she may have influenced the Deep Democracy approach of “amplifying the weak signals” by asking for others to “join”.
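To make that idea concrete: pol.is works from a participant-by-statement vote matrix and clusters participants into opinion groups (its real pipeline applies dimensionality reduction before clustering and is more sophisticated than this). Below is a minimal, hypothetical sketch of the core move, with an invented vote matrix; it simply treats the smallest cluster as the “numerical minority” and surfaces the statements on which that group diverges most from everyone else:

```python
# A toy illustration (not pol.is's actual code): cluster voters, then look at
# what the smallest cluster agrees with most distinctively.
import numpy as np
from sklearn.cluster import KMeans

# Rows are participants, columns are statements:
# +1 = agree, -1 = disagree, 0 = pass / not yet seen. Data is invented.
votes = np.array([
    [ 1,  1, -1,  0],
    [ 1,  1, -1, -1],
    [ 1,  0, -1, -1],
    [-1, -1,  1,  1],   # a lone voice voting against the majority pattern
])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(votes)

# The "numerical minority" is simply the smallest cluster.
minority = np.argmin(np.bincount(labels))

# Rank statements by how far the minority's mean vote diverges from the rest.
in_group = votes[labels == minority].mean(axis=0)
out_group = votes[labels != minority].mean(axis=0)
divergence = in_group - out_group

for stmt in np.argsort(-np.abs(divergence)):
    print(f"statement {stmt}: minority {in_group[stmt]:+.2f}, "
          f"others {out_group[stmt]:+.2f}")
```

In a live, in-person process the same move is a facilitation choice rather than a computation: notice the smallest cluster of opinion in the room and deliberately give it air-time.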
This resonates with the findings of research reported in “Bridging the Risk Gap: The Failure of Risk Management in Information Systems Projects” by Elmar Kutsch, Tyson R. Browning, and Mark Hall, Research-Technology Management, March–April 2014, pp. 26–32. Eleven large high-tech companies were asked to identify disruptive events that had affected major projects. People involved with the nineteen projects put forward were interviewed to discover why the disruption occurred in spite of various forms of risk management being in place. The researchers found four clusters of factors at work. They gave the raw numbers; my summary is that:
– 2% were considered to have been unknowable (one can argue about that)
– 6% of the disruptions were visible, but no one involved had the capacity to spot them; the expertise and experience of those who took part did not intersect with the nature of the potential disruption
– 17% were identified and understood to be worth exploring, but those concerned did not feel competent to analyse them, so they did not work out how serious they were and set them aside
– 21% were understood to be significant, but those concerned declined to escalate them or take serious action for one of three reasons: [1] they feared it would undermine confidence in the project (personal versus corporate objectives); [2] they were unwilling to spend resources now to affect something that would not arise until a long time in the future (could be a form of personal versus organisational objectives or the time value of resources); [3] the project manager lacked the authority to act and, presumably, the fortitude to escalate the matter (culture).
Those four clusters together account for about 46% of the disruptive factors, which means that only about half the factors that put projects at risk were visible, being watched and being managed. In the world of risk management, this represents a soft target for improved performance, as none of those issues, apart from the unknowable ones, is beyond rectification.
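A quick tally makes the arithmetic behind “about half” explicit (percentages as reported above; the cluster labels are my own shorthand):

```python
# Sum the four clusters of unmanaged risk factors and take the complement.
clusters = {
    "unknowable": 2,       # could not have been foreseen
    "not spotted": 6,      # visible, but no one had the expertise to see them
    "not analysed": 17,    # spotted, but never assessed for seriousness
    "not escalated": 21,   # understood, but no one acted or raised the alarm
}
unmanaged = sum(clusters.values())   # 46
managed = 100 - unmanaged            # 54, i.e. "about half"
print(f"unmanaged: {unmanaged}%, visible and managed: {managed}%")
```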
I am sure there is a lot of scope to work on the overlap between risk and complexity. The parallels between the factors outlined in this post and the paper by Kutsch et al. hint at some deep connections.
Brilliant! Thanks.