A Trace in the Sand
by Ruth Malan
To get an idea of the kinds of topics my Trace has covered, you might like to take a look at one or other of the following maps/topics lists:
I think this Trace was a good thing. Different, to be sure. But sparkling with insight and life. Well... it's something, anyway.
Articulating Appreciation and Gratitude
It would be so nice if, from time to time, we saw it as doing our part in the web of community responsibility to just put an expression of enthusiasm, or appreciation and gratitude, for a piece of work out there.
Positive words on the backchannel are nice. But these deflate the value of the second point above -- something that is easy to say and doesn't cost much of ourselves doesn't carry as much weight. And they contribute nothing to the third. Sometimes we're shy, or feel awkward about being affirmative on the front channel, and that's fine. We just need to be careful to make sure that it isn't all backchannel.
Web of Responsibility
When I say something like "we need to step up the web of responsibility," one won't ponder what that means if one doesn't expect that I mean anything profound.
Profound? Well, maybe not so much profound as a nudge to our mental model. We need to see ourselves as being in a shared web -- vast and intertwingled -- of responsibility, and we need to step up, step in, fill the breach, when we see something being neglected in a way that pushes the system towards a vulnerability. We need to bring this into our mindset just as surely as we need to advocate the "avoid blame" shift, steering away from that single point of failure/find a person to blame mentality. This is a hard cultural shift, for largely it means we need to empower ourselves and each other more. We compartmentalize and locate authority in expertise and so forth, and these can be good in their way, but we need to become more wise about these boundaries and our responsibility to each other.
So far, I've been hearing a lot about the one side of the web of responsibility coin: complex systems fail. And they fail because the web fails -- the web of interacting factors that keeps the system working fails, not for one reason we can seek out and plug up to prevent another incident. The other side of that coin is not viewing responsibility in isolation, but rather in an interacting web where we impact others' responsibilities and abilities to keep the system from sliding into vulnerability and even failure. This puts "can do" back on the table. Systems will still fail, but webs of people stepping up to the responsibility to proactively reduce vulnerabilities and stave off disaster will help fend off or reduce the impact of more of them.
Um... I realize that may sound like common sense. "It takes a village to raise a child" and other such "web of responsibility" notions are in our lexicon. But so is the culture of partitioning and compartmentalizing. It gave rise to the need, in the resilience and safety field, to raise awareness of the web of responsibility -- which means it is not just fallacious but counterproductive to attempt to locate the cause of failure in order to lay blame (to "enforce" accountability). The stopping rules tend to be self-serving, stopping when blame can be assigned to satisfy the powers pulling the strings of command. But the causes are multiple, and can be unwound further and further in webs of drift into brittleness and failure. And if it makes sense for books to be written to make that case, it makes sense that we need to raise the profile of the flip side of that coin -- there is a web of responsibility to proactively step it up to avert drift into vulnerability. And doing so requires a cultural shift in many organizations, even "villages" that have forgotten that we share responsibility.
Okay. That's repetitive/needs to be edited. That's what happens when I invent your objections and try to stave them off. Failure is a slip-slide-y beasty that edges in wherever attention lapses -- and it will lapse. :-)
I simply needed to get the idea off my fingertips... The pendulum swings between "OMG catastrophic failure: find the cause so we can hold an accounting and regain a sense of control" and "chill out already; complex systems fail, dude." But all along the way, we aren't helping people understand that humans -- as integration and dynamic response points between increasingly complex and failure-prone (in novel ways) systems -- are themselves failure-prone systems. So we need to amp up our sense of mutual responsibility, or responsibility across role compartments.
Here's an example. A kid crashed and from the helmet damage, concussion should have been a concern. The adult in charge was under pressure to move things along, and made the kid get back on the grueling course, despite the helmet damage and concussion concern being raised. It is hard for kids to assert a decision that runs counter to a call made by an adult authority figure. But if we share the sense that any one of us, under annoyances (hey, we're human; emotional and stress prone) and pressure, will make poor judgment calls some of the time, then we open up the space for others to step in, no matter their age or standing in the pecking order, and avert potentially devastating vulnerability. It's still not going to always work, because stressed authority figures are blind to their own weakness in that state. Power corrupts -- people and systems. Etc. But shifting our emphasis to webs of responsibility, means we can start to open up the expectation that we learn how to cope with our own fallibility, and proactively work to increase the resilience of our systems -- especially the human component thereof.
7/16/15: We might (usefully) frame this as good citizenship and common sense. The issue is to loosen the edges we have around assigned responsibility and charter that confine or restrict our empowerment to given roles and status or standing. To empower acting across boundaries or charter when that is what is needed to ensure the ongoing resilience of the system, or to proactively counteract drift. We push the envelope of system performance, and this advances what we can do. But we need to create a shared sense of responsibility for ensuring that we invest in resilience buffers, for working at system integrity, and for enabling and empowering ourselves to make accommodations for the very humanness of the humans we put in the loop to act as dynamic response mechanisms when compute/machine-intensive systems go awry.
I don't know if that makes sense. John Gall's Systemantics was first published in 1978. Long before that, Schopenhauer explored how motives behind causes are themselves products of complex interactions of circumstance, history, character and more. So this sense that there are webs behind webs of interacting factors leading up to some observable incident of failure, is also part of our gestalt -- when we stop and think about it. We need to become more self-conscious and explicitly build that understanding into how we respond to critical events like impactful system failures. Systems have just gotten more and more complex and intertwingled, and in many cases older and more stressed by decrepitude from decades of drift as complexity is added and maintenance and restructuring is deferred or ignored.
Maybe this is all obvious. What made me stop and think, was how unsettled I get when overreaction (and the call for consequences to be meted out to those "responsible") to system failure is met with "complex systems will fail from time to time" -- which is true, but a bit too laissez-faire given where we're at and headed. We need to hasten to add "yeah, we do need to take stock -- we will keep pushing these systems into the danger zone and we're going to see more of these knock-on cascading failures that impact lives and livings. We need to adapt our education, our social processes, our business investments in civic responsibility and sustainability, and our engineering, to better address and ensure resilience." And a good place to start is with viewing ourselves as part of a broader web of responsibility than our charters and turfs typically bind us to. Social forces discourage acting outside our social frame, but our (technologically enabled) systems are so interwoven that we need to catch our social expectations, and reward/punishment mechanisms, and more, up to this more intermeshed world.
Practical implications? More awareness of our cognitive limitations and biases, and how they trip every one of us up from time to time, and some strategies for dealing with our humanity -- (in)glorious as it is -- would be a good start. We might think of soft skills as essential survival skills when it comes to an increasingly richly enabled world... but one that is underpinned by technology that is imperfectly created and imperfectly sustained and evolved, while coupled into systems of systems to such an extent that interactions and side-effects are poorly understood, even as the systems age and decay... "Good citizenship" has brought us a long way. But we have to step it up. Way up. Climate change is an illustration of how clueless we are about what it means to really be good citizens, and something that will challenge our social and socio-technical systems in ways we can barely conceive -- when we try. And we avoid trying. To face what we need to face with our science and our engineering, we have to deal with our bias-fraught, fallible humanity. Hard skills, including the soft skills. Imagination, perception, empathy and consideration, ... negotiation and collaboration, ... empowerment and leadership...
We seek to cope, to control what we can enough to get by:
We need to tell ourselves new stories, stories that are less about the rare insular hero and more about efforts to be proactive and considerate of each other and our fellow creatures.
Rah rah boom de ah dah (NSFW, if your workplace is more PC than a high school....??? ;-)
7/18/15: We have limited lives, limited bandwidth, limited perception, and limited tolerance... Nevertheless. To "have nice things" like more complex and interwoven systems (internet of things?? Watson on the march, ...), we have to up the "maturity" of our civic sensibilities -- not just in the (don't) blame department, but in the proactively-see-other-perspectives, proactively-up-the-level-of-integrity-of-our-systems, etc., departments. To create more resilient systems -- proactively, and also when something does go awry, learning from that, to improve the system's integrity and resilience. Which means improving (also) the ability to be proactive and considerate -- improving our imagination and ability to shift perspectives, to understand more, to see connections...
John Gall wrote Systemantics decades ago. He followed that up with "How to Use Conscious Purpose Without Wrecking Everything" (.pdf on Tom Gilb's site) a few years ago. There are no easy solutions, but all divide is not going to conquer. Not any more, if it ever did...
Later: Oh, here's a point made in the (nicely done!) “Principles for Building Resilience: Sustaining Ecosystem Services in Social-Ecological Systems” video -- "sanctioning or punishment that occurs when someone breaks a rule" is an example of a dampening feedback mechanism. That is, punishment, like social shaming, is a feedback mechanism... But morning overtook Shahrazad... I mean, punishment may be a mechanism that has unpredictable action-at-a-distance type consequences, and may damp desired behaviors like open conversations that foster understanding.
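To make the dampening-feedback notion a little more concrete, here's a toy sketch (my illustration, not from the video): a proportional negative feedback shrinks a deviation each step -- and if a desired behavior is coupled to the same feedback channel, it gets damped right along with the sanctioned one. The function name and numbers are hypothetical, just for illustration.

```python
def damped_series(value, damping, steps):
    """Apply a proportional dampening (negative) feedback each step.

    Each step subtracts `damping * value`, pushing the value toward zero --
    the way sanctions are meant to damp rule-breaking.
    """
    series = [value]
    for _ in range(steps):
        value = value - damping * value  # feedback opposes the current level
        series.append(value)
    return series

# The sanctioned behavior is damped...
rule_breaking = damped_series(1.0, damping=0.5, steps=4)
# ...but so is anything coupled to the same feedback, e.g. open conversation
# about what went wrong -- the unintended action-at-a-distance consequence.
open_conversation = damped_series(1.0, damping=0.5, steps=4)
```

The point of the toy model is only that a dampening mechanism is indiscriminate about *what* it damps: whatever feeds the feedback loop shrinks.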
Perspectives, From Another Angle
Today is the 9th anniversary of my dad's death, so a day washed with memories and gratitude for and to him. So often I'm struck by the memories we have of our fathers. The lessons we learned from them. And my experience was somewhat different, for my mother went away for 4 years to study nursing, and my dad took care of us 6 kids -- while working. So he was, in a way, a "single mom" during that time, but also very much a dad. He was there when we were sick, he read and made up bedtime stories. He did all the things. He was both ordinary and extraordinary, for he was smart and sensitive.
My dad's dream, and the project he was working on, was a micro-investment approach -- it was, if you like, a little like Kickstarter or Kiva, but more like providing an opportunity for the everyman/woman to have access to the VC game, investing micro amounts and spreading the risk and potential gain. It would be a way to fund risky research, for example. When I brought it back up today, Dana mentioned something I hadn't thought of -- when it is a small amount, we are more likely to "vote" (that is, invest) our conscience, what we would like to see become true in the world, what we would like to help make possible. Which is a different driver than VC funding, it would seem... There was a lot to work through, in terms of laws. Then cancer.
I also write at:
- Bredemeyer Resources for Architects