A Trace in the Sand
by Ruth Malan
For 5 1/2 years, this journal has contained notes I've taken as I explored what it takes to be a great software, systems and enterprise architect. ... This is where I think "out loud" -- in my quiet way. I write to think, to learn, to nudge ideas around and find the insights they were hiding... So, a new (pop) characterization for my Trace emerges -- this is my own personal "maker" space, where what I am building through exploration, discovery and experimentation is myself, my point of view on architecture. Seen that way, it kind of argues for not making it public, doesn't it? I mean, ewwww!!! The learning lab/playground of a curious mind is... messy! Definitely NOT recommended!!
Point of view? My PoV is: where I see from, what I (seek out and) look at, and what I stand for.
Ok, alternately put, it is the narration and trace of my journey of discovery.
10/2/11 Brain Treats
Important post from Doug Newdick on climate change and IT!
10/3/11 Competition for Watson
Roll on Watson, there'll be a new apple of the media's eye:
"Make no mistake: Apple’s ‘mainstreaming’ Artificial Intelligence in the form of a Virtual Personal Assistant is a groundbreaking event. I’d go so far as to say it is a World-Changing event. Right now a few people dabble in partial AI enabled apps like Google Voice Actions, Vlingo or Nuance Go. Siri was many iterations ahead of these technologies, or at least it was two years ago. This is REAL AI with REAL market use. If the rumors are true, Apple will enable millions upon millions of people to interact with machines with natural language. The PAL will get things done and this is only the tip of the iceberg. We’re talking another technology revolution. A new computing paradigm shift."
-- Norman Winarsky, interviewed in Co-Founder of Siri: Assistant launch is a “World-Changing Event”, 3 October, 2011
We'll know more tomorrow at the [much anticipated] iPhone 5 announcement.
10/4/11: uh, make that the iPhone 4S announcement... media underwhelmed...
I like this alternative wording for relationship platform: (experience-based) engagement platforms (CEOs Must Engage All Stakeholders, Venkat Ramaswamy and Kerimcan Ozcan, October 4, 2011).
Relationship platform leaves the concept open to encompassing relationships between systems as well as engagement or relationships among people and groups or entities, so perhaps it is better for my purpose. Still, engagement platform conveys a defining element well.
10/4/11 From the Stream: Leadership and (organizational) Politics
“The second key is to see yourself not only as a fierce competitor, but also as a broad collaborator. Don't get me wrong: Competition is essential as a spur to innovation. But in a world of increasingly interdependent systems ... the Wild West of competition needs to be complemented and tempered by far more collaboration across old boundaries. Across academic disciplines ... and industries ... and nations ... and even among competitors.” -- Irving Wladawsky-Berger
10/4/11 Watson, meet Siri!
Just kidding! Low hanging humor fruit... But there is a point with respect to user interfaces in that gestural and voice interfaces will find ever more application, not just on mobile devices (though arguably largely driven by such devices because the interaction surface is so small).
And then, the biggie that was missed -- AI redefining search and serve. That has to be on the radar, at any rate. Big mucky data meets intelligent assistant and the world changes! I can't believe how ho-hum the reaction to the iPhone 4S has been. Ok, Steve Jobs wasn't there to rub the iPhone and have its genie appear, but really! Where's the imagination? Think about it -- Apple has given Wolfram/Alpha a voice on the iPhone. Among other things -- like context awareness, machine learning, natural dialog. This holds promise! This low-key introduction is ... downright foreboding! [First they serve and enable us, then they replace us!] And exciting! [First they serve us. ;-) ] We'll have to get our hands on one and see how far along it is on delivering on that promise. 10/12/11: The big question is -- will we recognize that Apple mainstreaming AI on a consumer device was Steve Jobs' astonishing-achievement-capping swan song, and did Steve die thinking we were idiots not getting it, but also knowing we will, in time?
And of course there's Watson. Watson -- Doctor Watson to you, Watson to Sherlock Holmes -- the archetype of a sidekick, and a (medical) doctor at that. What a perfect name. A serendipity, since it was named for the founder of IBM, and perhaps also prescient -- as Watson's derivatives emerge, Watson will surely find a role as the intellectual sidekick to many a professional. First up, a physician's assistant.
Kinect helps us see Microsoft as interesting beyond the "productivity" space, but... Apple and IBM sure have the innovation limelight and it's all about AI. Google decided it was all about social, so they're going to have to scramble... Although... you know I'd like an affordable robot to taxi my kids, clean my house and keep my yard in order. Who's working on that? Yes, as cars go, Google. But humanoid robots seem to be coming out of Asia...
In the US, we've focused on software that serves mind treats (social, search, Watson and Siri). But there is plenty of room for engineering of stuff, creating tangible products that would make people's lives more the way we want them to be... Smarter stuff -- ecologically and in the sense-respond and sense-inform sense too.
I know that's not an enterprise app... but just translate that to sensors and pulling the finger off the trigger on algorithms that sink stock markets... Not enterprise significant? How about commodities markets? Hm, word of social net and production forecasting? Ahhh. It's all in the analogy dear Watson. :-) The other day I joked to Dana that the Ruffian[-inverse]-Turing Test will be: when humans don't understand it but a robot can, we'll know it's a robot. I think that will go for analogies too. We're getting more binary and AI deals better and better with fuzzy and unstructured.
All told, we're hurtling through revolutions in how we conceive of our humanity (neuroscience, behavioral economics and more are revealing our stunning perceptual inaccuracies, among other flaws), extended humanity (digital information, AI), and ... replacements for humanity (manufacturing robots, digital assistants, ...)
We have to get our house in order with respect to sustainability, but there is so very much to be done, there's so much to be optimistic about! If we can just get the full workforce employed and spending... on sustainable lifestyles, of course...
"When you wonder where all the youthful creativity is; where good old “Yankee ingenuity” has gone, it’s still here. But not in formal education. Anyone who is looking to find the next generation of engineers, technologists and free-thinkers need only go to one of these Faires or visit the thousands of Hacker Spaces springing up across the country. It will leave you breathless…and hopeful."
-- Ira Flatow, DIY Sci, October 3, 2011
Ok, so I totally know the chances are good that this post is a tad exaggerated... but whether experience with the 4S proves the promise of Siri or not, we know that this is an unstoppable direction and we may as well get excited about it so we see what possibilities it holds. And we may as well get excited about regrooving for sustainability because it will create jobs, save money, and leave less impact on the planet and all its lovely creatures. (Uh, you may as well get excited. I'm bleeping terrified. ;-) Or I would be, if you didn't get excited. So let's all get excited and change this world!)
10/10/11: IBM has Watson, and a primary focus on vetted knowledge sources. Apple has Siri giving voice to Wolfram/Alpha -- though I expect the relationship will have its trials... It will be interesting to see what else Siri does, AI wise. Google trawls everything i-way with impressive diligence -- imagine what that could turn into! Amazon's Silk approach is interesting.
10/11/11: This explains more: Why is Siri So Important?
10/12/11: Dana, who cuts wood as well as code, tells me that there is a big furor about the tablesaw blade sensor thing in woodworking circles.
Apple's Siri Is as Revolutionary as the Mac,
James Allworth, October 13, 2011
A certain Professor Alexander told us in his strategy class that managers don't read, so use short sentences, short paragraphs, and write brief briefs. I broke every one of his rules. Got an A. Flout confining expectations. Break rules made to level. I did and still do. I break rules of grammar. When it suits me. I nest. Clauses. I iterate, and conceive in parallel. And expect you to too. I don't try to be colorful. But shucks (the word of the day today on Twitter; getting old already), no-one at all would read this were I not colorful! There are just too many bases on which to discount me. Oh. Right. I am discounted. Few read here. Rats. Short sentences then. That'll do it! Right? Oh.
The biggest key to success as a writer is not short sentences. It is writing. And living an interesting mind-life, so we have grist for our thought mill. An architect as writer needs to have something interesting and important bubbling under the surface. So interesting and so important that she -- or he -- cares enough to write it out, think it through, improve and learn how to pitch it. Or pitch it. Bob Dylan and Leonard Cohen both write, and walk away. From a lot of what they write. Writing is to learn. To conceive. To frame. To draw out. So we learn what we know, what we have within us melded from our experience and interacting with others' minds and their experience and sense-making. So we see what is important -- useful and a priority to draw attention to. And how to express it. Thinking it through shapes how we draw it. How we talk about it. What we draw attention to. And explain or justify and persuade others to help us do, or to do differently.
Our audience is busy. Much to attend to. So sure, simple factors. And being interesting and meaningful. And relevant.
Make your own rules, but don't feel you have to start from scratch -- here's a collection and a reminder:
As essayist, programmer, and investor Paul Graham has written, "Writing doesn't just communicate ideas; it generates them. If you're bad at writing and don't like to do it, you'll miss out on most of the ideas writing would have generated." --
Jocelyn K. Glei, 25 Insights on Becoming a Better Writer
And sure, code counts as writing. But code tends to obfuscate architecture, and we need to set architecture in bold relief so that we can study and improve it, test it out not just in our mind's eye (with that so-limited working memory) but with the interaction of our own and other minds on it.
The sad news reminds us of our mortality. We could not hold onto Steve Jobs, nor he onto life, despite all the will and money in the world. There are many ways we can make a difference and one of them is to see the great in what others do. If there is anything I admire Steve Jobs most for, it is that. The more greatness we see in the world, the more we have to draw on to make something new -- something "insanely great." But first to see and appreciate. Combining an aesthetic appreciation, the ability to wonder at, with a sense of possibility, the ability to wonder. Then to envision a new concoction of greatness to make something fantasmically new in the world. And then to demand greatness in execution of the idea, making tough choices to retain the essence that is great, and to appreciate it enthusiastically, even child-joyfully, as it comes to fruition.
10/6/11 Famine in Somalia
Children by the tens of thousand are dying from the tortures of famine -- "a tragedy unfolding" (thanks for the pointer Daniel). Drought-induced famine that is complicated by war. We can do something immediate -- the Red Cross, UNICEF and WorldVision, among others, are providing aid in the drought-hit region.
If climate change goes the way scientists are predicting, horrors like this will mount. Our neighbor is an environmental physicist. He has installed solar, cycles to IU, and drives a hybrid. His actions match his words. Our tastes and priorities are changing, but I feel like conspicuous consumption and climate evil personified driving an SUV. We are changing habits and purchasing choices, but on a budget some changes just take time to roll over. But. Too many "but"s. We have to get past them.
I also write at:
- Bredemeyer Resources for Architects
I see Doug Newdick illustrated what Conceptual Architecture is good for -- a Conceptual Desktop Architecture. Great post Doug! And (serendipitously) well timed!
Aside. I like Doug's use of hand-sketching on his diagrams. They are very useful illustrations that really add to his text. And the hand-drawn conveys the human touch so eloquently.
I believe Doug is right about Conceptual Architecture being important. And right that it is a neglected view. But not in our book. Well, our book is under substantial (complete!) revision. Which is why, among other things, I'm trying to get the framing (more) right -- or at least as right as it can be given the current set of understandings we (the field) have. So thanks for carrying the conversation along -- even though Doug wasn't aware of my piece of the conversation/this Trace entry [which I have trimmed down now that it served its purpose, after its own fashion].
I liked this (very short) animated video: The Ride - by King & Country. These Rube Goldberg-styled "machines" so make me think of visualization of executing software, and this one is even more delightfully apt as it is a fantasy world with allusions to the real world -- a construction of the (analogical) mind, as software is.
And the visualization (right), from the video, of a system as interacting mechanisms is wonderful!
Isn't that just what our systems would look like, if we could take the skin off?... :-) Well, as I see it, it would have various defense shields. But our systems have to survive in a hostile world.
And so it goes.
Now you're convinced I've lost my marbles... ;-)
Image source: The Ride - by King & Country
10/6/11 EA KPIs
This appears in KPIs for Enterprise Architecture:
# Frequency of updates to the enterprise architecture
Now, would frequent updates be an indicator of goodness or badness?? Good -- responsive to emergent need. Bad -- was way off base to begin with... but good if responsive enough to fix it... And so it goes.
Well, indicators are just that -- signals to look deeper. Just so long as management understands that.
10/6/11 Why We Sketch
Via Peter Bakker:
"Up until now, they were talking about WHAT they were trying to do. Now, they could talk about HOW they would do it.
The WHAT was now on the whiteboard—and in everybody's head. For the first time, it was the same WHAT everywhere."
-- Jared M. Spool, Why We Sketch, Sep 22, 2010
10/6/11 Decisions of the Moment
Not all decisions are equal (in import, scope of impact and consequence). The architect needs to be able to assess (and given the authority to decide) which are architecturally significant, and when to make* those that are. Some need to be made early, just to get ground under the feet to move forward on.
"a long and rapid succession of suboptimal design decisions taken partly in the dark."
-- Philippe Kruchten, "The Architects -- The Software Architecture Team," Proceedings of the First Working IFIP Conference on Software Architecture (WICSA1). Kluwer Academic Publishing 1999
* collaboratively, of course, but with the authority to make a decision if the collaborative/consensus process stalls.
10/6/11 Great Talk!
Michael Feathers "Code Blindness" talk is wonderful, and he really positions software visualization well!
Michael Feathers: to get past code blindness:
- more monitoring of applications -- find indicators of problems in code bases
- boiled frog syndrome -- those closest to team are not able to tell where the problems lie
- planning for replacement
Michael Feathers: metrics are dangerous when you lose context, or when they become goals (in and of themselves)
10/7/11 Opportunities to Influence
The Fall Ballet at IU tonight was wonderful! And our world is decked in resplendent color.
A good note to close on...
Tim O'Reilly held Google's feet to the fire in his "What Android can learn from Steve Jobs" keynote address at Android Open -- in the most gracious way, of course. His address was a mix of what Google is doing well -- where it expresses and aligns with its soul -- and a call to do more of that. It was a courageous (ok, Tim can afford to be more courageous than most, but still he didn't try to protect powerful relationships by pulling his punches) and inspiring keynote and I'm glad I was able to catch it on livestream (even though it puts me seriously behind on today's commitments!). The slides are here, but I recommend the video (scroll down to get to it) -- along with the slides.
The address is more pitched at Google than the (rest of the) Android community, but we can see Tim really set out to speak for the community, to call Google to be its best self, to be true to its stated, projected image -- to do no evil, to not be closely controlling and so close off others from leadership roles in the Android ecosystem.
It is a wise talk, full of strategic savvy, and I highly recommend it to architects wondering what strategy means to technologists. :-)
By extensively quoting the "founding fathers" of Google, Tim not only made his intended points, but built the image of the evolving "creation story" of Google, strengthening his "be true to the self you project" call. We do, after all, want to be known for the best we see and value in ourselves. As Tim points out (quoting Kurt Vonnegut's "You are what you pretend to be, so be careful what you pretend"), by stating how we want to be seen, we call upon ourselves to live the image we project. If we do not strive to live up to that projection, the world calls us on the disjunction, and we are seen as inauthentic and lose trust.
This, from a slide in Tim's talk, is an inspiring -- and daunting (will we fix the environmental mess, or make it worse) -- way to position what we're about:
The "hardship for us all" is a bit jarring coming from Sergey Brin, but he does make a good point all the same!
Showing how bright ideas can be really simple: water bottle lighting.
And dim ones can backfire: Netflix VP: Why We Moved "Too Fast," And Why "We Were Wrong" On Qwikster, Austin Carr, 10/10/11
At least they're saying: "We were wrong."
Ok, HP board -- your turn!
What happened at Netflix and HP sure demonstrates the power of social media.
10/10/11 Serendipity Serves
I relate this (my italics):
More than a week after Stalin’s death, Eisenhower was talking with speechwriter Emmet Hughes about the address. “Look, I am tired—and I think everyone is tired—of just plain indictments of the Soviet regime,” Ike said. “I think it would be wrong—in fact, asinine—for me to get up before the world now to make another one of those indictments. Instead, just one thing matters. What have we got to offer the world?”
-- The Origins of That Eisenhower 'Every Gun That Is Made...', Robert Schlesinger, September 30, 2011 (via Tim O'Reilly)
to Jobs' key point in a presentation to Apple employees in 1997 (via Tom Graves):
"To me, marketing is about values. This is a very complicated world, it's a very noisy world. And we're not going to get the chance to get people to remember much about us. No company is. So we have to be really clear on what we want them to know about us."
Me-too products (simply copying others' feature sets) lead to more complexity with no distinctive meaning and value, forcing a battle on price leading to ever more tightening of cost reduction screws.
What is common to those two "great speeches" -- Eisenhower's and Jobs' -- is the call to find and be true to distinguishing value.
Aside: I like the image of the "bite" out of the Apple apple that is Steve Jobs' profile, and the way the apple logo itself suddenly now seems to be a candle! Who did that? It's really impressive!
Looking again at modularity, I discovered by a neat serendipity that Bill Shackleton tweeted a set of links to very useful papers on modularity last night! Timed that well, he did! :-)
So via Bill:
I hadn't read the last of those, and my Modularity and what we can learn from Trek blog post was independently derived, but serves as a complementary companion, perhaps.
Now I need to read Technology, Organization, and Society (Richard N. Langlois, Economics Working Papers, 1999) for its implications as we move to less hierarchically composed organizations. The section on modularity theory is a wonderful trace of (some of) the history.
My interest in modularity is from a software and system perspective, but I got pulled off course for a spell by Carliss Baldwin's article, which intrigued me because we think that organizations will move in a direction that is more networked and "podular" with more fuzzy boundaries, which raises interesting questions about transaction costs. Peter Bakker pointed to the wikipedia entry on community structure, which is interesting.
It is interesting that we have moved from Douglas Engelbart's conception of augmented human intellect more to human-computer symbiosis (with roots in Licklider's work, but a referent, for example, in Tim O'Reilly's articulation of a network-mediated global mind). That is, implicitly and I think unselfconsciously, we have shifted from computing as enhancement to computing as peer. The next step in the trajectory -- human-augmented computing?
In important ways, we're already there. For example, in software visualization, we more sensibly depict system structure derived from dependency and semantic analyses (algorithms) when we factor in the (human) understanding of the team. That is, we're moving into the sphere where our (so-far uniquely) human proclivity for 'hunches, cut-and-try, intangibles, and the human "feel for a situation"' (Douglas Engelbart) is needed to augment the powerfully "rational cogitations" of compute intelligence.
Of course, all this is in service of humanity (environmental catastrophes aside). For a while yet. At least. ;-)
10/12/11 Agile Architecting
Our What it takes to be great paper played a cornerstone role in our field, building recognition that architects are not uni-dimensional technologists. I think the Getting Past ‘But’: Finding Opportunity and Making It Happen paper was likewise a scene-setter, but more, it is one of those inspire and enable pieces. I would write it a lot differently today. But that doesn't make it irrelevant or not worthwhile.
As for Fractal and Emergent... :-) Well, it pairs well with A Kodak Moment to Reconsider the Value of IT (Robert Plant, October 12, 2011) and, actually, Steve Yegge's Google Platforms Rant. Not to mention Greg Satell's A Radical Shift Toward Design (October 2, 2011).
10/12/11 Alistair Cockburn
I enjoyed Alistair Cockburn's Effective Software Development presentation (although I had to forward through the certification bits, and such; not enough time in a day for every distraction... oh yes, and work)! I wonder if Alistair adopted the accent intentionally, or if he just has a talent for absorbing context?
10/12/11 Get It While You Can!
Read this right now (you'll thank me!): Steve Yegge's Google Platforms Rant
10/12/11 Some Neat Visualizations
10/12/11 Sound Familiar?
'“architecture is experienced habitually, in a state of distraction.” Architects must then weave ways to flow people and resources through built environments' -- Danzico, Enforced Listening Moments, Dec 1, 2009
Isn't this what your system needs: for Dummies, Jeremy Denk, Nov 30, 2009? Well, of course, not exactly that, but that kind of enthusiastic and acute-but-sensitively informed telling of the meaning and import and intent of its structure and mechanisms?
Visual Bits from the Tweet Stream
Via Peter Bakker, among others:
Some other sites were mentioned, but they are already on my list: Visualization in Other Fields.
Interesting piece of visual art history:
100,000-Year-Old Art Studio Discovered, Cynthia Graber, Scientific American, October 13, 2011 (via Maria Popova/@brainpicker)
So Sorry to Hear That!
Dennis Ritchie died on October 8. It makes news today? Now that was a life that made a difference! Funny how we don't think of it that way, until it's in the rear view mirror. Well, it's humbling and enriching to think of what Dennis Ritchie gave us. Unix and C. And all that stood on the shoulders thereof.
10/14/11: Rob Pike's reflection is worth reading.
Of note: he doesn't take away from Steve Jobs to give to Dennis Ritchie. This is not a competition among dead heroes for our worship! This is our huge shout (and sad sigh) of thanks, and an expression of wonder at having lived when these men lived and made a difference -- changed the world, even!
Mortality sucks. But it reminds us to care for and appreciate one another because we blaze for such a short time. We have this chance -- this short lived opportunity -- to feel the presence of others in our lives. To connect through the rare act of seeing -- of putting aside the huge mirror of self that prevents us from seeing -- into the mind and heart of another. And by seeing, by putting our great big oofish self aside, we are enriched, in our view, our empathy, and the material we have to draw on to make fresh connections, lovely in their unique composite form -- more lovely because they drew from beyond the limits of our self.
We can choose how we see ideas and the people who associate themselves closely with their "thought children." We can see them as threats to our dominion, or we can see them as they are, especially as they are in their best light. I suppose we need warriors -- not convinced, mind*. But I do so much more lean to those who are compassionate, kind and generous. They make the world better in our experience of it.
"What unites the CIRTL Network universities is a commitment to developing a national STEM faculty better prepared to teach, through three core ideas: teaching-as-research, learning communities, and learning-through-diversity," -- Robert Mathieu, ACM Newsgram, 10/14/11
I was struck by that last: learning-through-diversity. I realized that I have an opportunity I wasn't valuing highly enough -- being in a distinct minority, everyone I work with is very different from me. That's a learning-through-diversity opportunity of huge proportion! Right? I get more opportunities than you do to work in contexts where styles and viewpoints are very different than my own. Rats! I should be so much better at this than I am. All that opportunity 'n all.
* Given that there are warriors, we need warriors, but
what if we were free of warriors?
Call for submissions for the SEI Architecture Technology User Network (SATURN) 2012 Conference. The SATURN 2012 Conference will be held in collaboration with IEEE Software magazine and will take place May 7-11, 2012 in St. Petersburg, Florida.
I enjoyed doing a tutorial at SATURN in 2010, but unfortunately the conference overlapped with Ryan's graduation (the culmination for most in his class of 9 years at the Montessori School, so a big deal -- with speeches!) so I missed the chance to sit in on other presentations and tutorials. Linda Rising, Rob Nord and George Fairbanks sat in on mine. Rob Nord is beyond compare -- wise yet child-open and joyful at discovery. He enriches experience, and it was a privilege! I've since watched Linda Rising on youtube, and she's the Ellen DeGeneres of software, or something like that! I mean, she's an engaging and dramatic speaker.
Yeah -- Tell Meg!
And HP could make it happen! Turn them engineers loose on defining the future! Tell Meg Whitman I said so. That'll have some influence.
I mean, I love Apple and Asia and all, but the future needs options.
There's so much opportunity! To something like Watson or Siri, add
Computing has so far to go, even when we only take into account what we already know! Just think -- the "not your father's PC" of the future will not just sit there on your desktop, mute and inert! Between here and there, there's a whole lot of innovation to conceive, realize and popularize!
Or going the other direction (from more human-like computers to more computer-like humans), imagine adding this to something like this! It doesn't have to be like this. Makes Kinect look passé doesn't it?!
I know that's a far cry from the cut-throat business HP's computing business is in today. But competition is only cut-throat if you can't imagine -- and execute on -- being compelling!
Ryan told me he came across this (I'm paraphrasing) on the i-way yesterday: "With Lion, Apple reveals that it has given up on dragging us into the future, and has left without us."
But hey, it would be so much fun to make this movie today -- you know, a movie set in 2036, or thereabouts. The backstory is here: The Making of Knowledge Navigator, Hugh Dubberly, Mar 30, 2007. And here's a downer (not). :-) In that light though: Paul Allen: The Singularity Isn't Near. ;-) I mean, consider this: Robot Biologist Solves Complex Problem from Scratch, October 14, 2011 (via Bill Shackleton).
Two Kinds of Critics
"Any time you do something big, that’s disruptive — Kindle, AWS — there will be critics. And there will be at least two kinds of critics. There will be well-meaning critics who genuinely misunderstand what you are doing or genuinely have a different opinion. And there will be the self-interested critics that have a vested interest in not liking what you are doing and they will have reason to misunderstand. And you have to be willing to ignore both types of critics. You listen to them, because you want to see, always testing, is it possible they are right?"
-- Jeff Bezos, Amazon Shareholder Meeting, June 7, 2011
Read the whole transcript -- lots of lessons!
Follow Up Required
Note to self: need to follow up/verify this.
10/15/11: That is, I wanted to look up the PopSci reference -- here it is:
and also remind myself to dig out Ralph Merkle's take -- here:
Twitter Occupied Too?
Well, Ryan is finding this a good time to be singing Bob Dylan songs from the '60s. The youth of today care. That matters!
Stuart Boardman's Monet Revisited post makes points along the lines of several I've made -- except he does so with eloquence and style (of the kind I think is iconically captured in the xkcd hat guy) and a unique slant lighting new insights! So, yeah, sure, I droll on about the value of sketchy (double entendre intended) and hand rendered in a very technology-glitzy and implicitly mechanically prescriptive world. And, of course, Stuart's wonderful post goes further too, to survey other innovations in terms of how we conceive and practice EA (and the very enterprises themselves).
Because I think what Stuart was saying in Monet Revisited is complementary to, and enhances, enriches, expands, deepens, provides a different angle on, some of the points I've made, I'll quote myself to add to the conversation -- by which I mean, I'm not trying to say "I told you so" but rather, I'm trying to say this is an "and" world, where we brighten each other's conceptions by holding conversations (albeit asynchronously and using our blogs as avatars that speak for us). In that spirit then, this from my Trace on July 5, 2011:
"Though just an anatomical study, it foreshadowed the sculptor's later efforts to reveal essence rather than merely copy outward appearance." -- wikipedia
"There are idiots who define my work as abstract; yet what they call abstract is what is most realistic. What is real is not the appearance, but the idea, the essence of things." -- Constantin Brâncuşi, wikipedia
I thought that was interesting. Software architecture is like that, isn't it? About finding and expressing the essential structure and nature of the thing. So we struggle with how to talk about it -- abstraction, compression, ...? To me, in advance (of the built system) we could say it is abstract (as in the mechanism of neo-modern art) though by reference or allusion (metaphor/analogy/symbol/imagery, ..., patterns) we draw in potentially huge influence and experience so compression plays a very important role. Once built (at all, and as the system evolves), compression (the very real, actualized meaning of whole chunks of the system gets compressed into the elements and mechanisms we represent) and abstraction (selectively eliding and occluding detail to reveal the essential) figure. In any event (expressing design intent or reflecting design as built), abstractions are central. I suppose, if I must, I could dance on the head of a pin and say these abstractions (entities in essential form) are compressions (they draw into compact form much meaning).
The neat thing is that whole fields of representation (including aesthetics and semiotics) swoop into relevance. :-)
Brâncuşi was, I gather, objecting to bundling his work into a class of art for which there is no evident relationship between what is real and the art; rather, he protested, his work captures the essence or the idea of the thing. It cleaves away the inessential and compresses the essential into a powerful expression of the essence of the reality. In other words, rather than convince me that his work is not abstract, Brâncuşi defined abstract for me, at least in so far as it applies to his work. For Brâncuşi was very much after identifying and capturing the essential identity and form (with its implications for function) of the thing he was sculpting. (See, for example, the evolution of his Bird in Space.) In discovering software abstractions, we can go from concrete instances to a generalization, eliminating detail from the concrete to find the more general, more abstract common form. But seeking the idea, the essence of the thing, supports more degrees of freedom in realizing concrete forms that retain the essence.
I was pitching that at software architecture. But, about a dozen years ago, when our IT architect clients were urging us to help them figure out what to do in the Enterprise Architecture space, they made the point that the way we think about architecture of systems speaks, with a context shift of course, to other kinds of systems -- those that are more software-intensive and others that are software-complemented-wetware intensive (the wetware, in my use, being a reference back to that Merkle piece).
I alluded to Brâncuşi, and Stuart to Monet, to make the point that detail, inherent in realism, obfuscates. My points are shaded differently than Stuart's, so, as I said, they are complementary, and add up to a richer view.
What pops out freshly for me, from revisiting my thinking extended and freshly nuanced by Stuart's points, is that there is huge value in that "room for interpretation" that more abstract representation, rather than rigorously detailed specification, leaves open. This also harks back to one of our earliest principles -- that of minimalism (see here and here). We want to fix in detail only what we need to, to focus on and enable strategic outcomes. For the most part, we want simply to set sufficient context so that we enable good, right things to happen. Where "good" (in our good, right, successful sense) is technically sound, and "right" is from a stakeholder point of view -- meets stakeholder needs and goals, alleviates frustrations, delights them, makes their lives more the way they want them to be, creates and fulfills aspirations, provides meaning, and so forth. Which also relates to the Fractal and Emergent, which probably bears a closer reading now... ??? ;-)
Talking through some related ideas, Dana said "That's why Jesus taught in parables." In other words, stories play a similar role.
Key points I take from the interweaving conversations then:
And some other stuff. (I thought of a 3rd point while offline... that "memory almost full" phenomenon, or the impact of just 4 hours' sleep last night???)
Well, I could pull out other pieces (of my long and arduous conversation with myself that is this Trace) that relate to this vein of conversation, but onward! New fields to plough!
Aside: Remember that the Less is More paper was written in 2002. I would write it differently now (so don't be too critical of dated wording), but it still is ahead of many people's thinking about architecture...
But quickly, just to pull these into my notes:
Above and below -- studies in networks of influence. Both abstract. (Dana took the photo below in Canada a few weeks ago.)
Yep, Dana's been traveling here, there and everywhere -- just in the past two months he's been in Africa, Canada, variously in the US, The Netherlands, etc.
But... Dana needs to go back to South Africa -- we're about out of rusks.
10/17/11: Daniel, most valued and awesome scout that he is, pointed to this piece on Gertrude Stein which well fits this thread. You know, Gertrude Stein of The World is Round, and other things. ... I suppose I have either irreparably sunk or redeemed myself in your estimation by invoking children's literature in the context of computing and enterprise architecture. Sunk...? Oops. Well, at least... I'm... courageous. ;-) But The World Is Round has a poignant message in this world where the frenzy of social connection is... um... a deep and important post could be written...
Drawing on the Human Side
Our Visual Architecting Process poster (below) has always been hand-drawn, reflecting our values around communicating organically and in high human-simpatico terms. In most workshops, Dana never turns the projector on, and I'll go days without turning it on. A few people hate it, being concerned we aren't "covering the material," but for the most part that responsive-to-the-moment style works well -- for people who will give up the need to have some kind of paper control, and trust the instructor to set context and create that learning crucible so that the architectural thinking lessons are drawn out and practiced.
The Role of the Architect
Depicting some dimensions of the role:
Lead -- raising the ceiling for others
Strategize -- thinking about how to solve the puzzle
Architect -- drawing the big picture view
Politics -- relationships (involves, like, taking showers and communicating and stuff)
Politics -- shielding the team from organizational crud
Just a taste of some of the flavors of the role. Not exactly/entirely representative. But enough to cure most developers of coveting the role.
Uh. Actually, that's just a page from some sketchnotes I took back when. I have no idea what I was thinking at the time, so I came up with some words that seem to fit. Oh, yeah? Like you never do that?
Monty Python-esque
Ryan was wondering who we have now, to fill the gap John Cleese left. I relayed some of Michael Feathers' tweets of late. I was reminded of that when I saw this:
It's really so much like this, isn't it?
I think Michael would take that as a compliment of the highest order. Certainly it is intended as such.
When I quoted "It's raining diligently?" the chortle was delicious. So (sur)real. It's that amplify thing taken to the kind of peak where exquisite and excruciating meet. Um. Um. No. I didn't mean that. Or... maybe I did.
Shallows or (dis)Connected Long Tail?
This brain-of-brains thing we've created with the internet has the potential to shift the state of humanity -- in good ways, sure. But also in bad. There are concerns that we and our interpersonal lives are becoming more shallow. We flit through zingy morsels of brain treats served up on the i-way, rather than reading books. We text and tweet rather than conversing and writing letters. We watch YouTube on our computers and don't support live performing arts, pressing them to evolve away from their classical roots to be more electrifying. Our tastes shift to satisfying an addiction to the buzz of lots of little eurekas.
I think there's an alternative way to see what is happening, at least in part. Which is to say, for example, we communicate more frequently and across more channels, so if you look at any one channel, it will appear more "thin" but it's just a slice -- diced, moreover, into smaller chunks. But is the communication shallower or more rich, taken across all the parts and allowing, too, for emergence? Are we reading more shallowly, or are we following our interests more passionately, going deep but also leveraging the facility the i-way has given us to tie together sources that bring the everyman into the state of being a polymath or Renaissance man? Are we interacting less, or more with people with like interests, distributed globally, who encourage and fuel our thinking?
I think the potential is there. But we have to make better use of it.
I so liked Jeremy Denk's "Love is Complicated" post -- and Janet's wise comment (August 17), along with bratschegirl's (August 26), in response.
You might want to take a note of bratschegirl's observation (quoted below), bearing it in mind when next you're discussing your architecture with the management team:
"In the orchestra world, we have a similar phenomenon when it comes to contract negotiations. It's the inevitable discussion of dress code. Janice Galassi had a wonderful take on this some years ago, and what she said basically boils down as follows: Board presidents often don't understand the ins and outs of multiple doubles or why the principal horn needs an assistant on a program containing the complete orchestral works of Richard Strauss. But clothing? That they get, and therefore they bring it up. Once it's all over, of course, it's rather amusing to recount such things as the objection to velvet onstage because it's "too black," but it's rather tooth-gratingly hard to get through with a straight face at the time." -- bratschegirl, comment on Jeremy Denk's "Love is Complicated" post
"You can see a lot just by looking." -- Yogi Berra
When I heard that, I thought about software visualization and how it enables us to see a lot (anomalies, patterns, relationships, etc.) just by looking. But, seeking the context in which Yogi Berra was supposed to have said that, I found this: She Was Seeing At Me, Mark Liberman, Feb 9, 2008
The cartoon connects nicely with the Howl's Moving Castle reference (which I eliminated from this facade). The snippet from A Scandal in Bohemia is also marvelously architecturally significant!
Sometimes we look without seeing, and see without observing. And sometimes we're not even looking, until someone changes how we view something, and then we see. Like code we know like the back of our hand. But really, how well do you know the back of your hand? Its structure and texture, anomalies and emergent... omw! ;-)
On The Funny Side
I was pointed to Greg Wilson and Jorge Aranda's article in American Scientist titled "Empirical Software Engineering." And that article pointed me to this wonderful paper on sketching in software development, and the use of diagrams to understand, to design and to communicate:
Now I do think that reflecting on the results of one study done internally within one company and treating UML as a closed matter is... unfortunate. That the company was Microsoft is at least interesting, given rivalries and NIH... If the company studied had been an active user of UML, the motivation for and use of sketching may have been about the same, but the notation used less ad hoc. An interesting study would be to compare the retention and evolution of design diagrams in companies where UML is used versus those where it is not. And to compare measures of structural integrity over time.
One might further ask if the "awkward result" we should attend to is the rampant problem of the erosion of code quality over time, and the failure to instill a design discipline that values designs enough to evolve them in lock-step with code -- maintaining designs and their context and rationale as first-class citizens of the software body of work, valued for their contribution to simpler, cleaner code designs. Useful design views, mind you. Which is to say, as far as computer-rendered views go, we need to be able to zoom in and out on levels of abstraction and drill into mechanisms with more levels of control over what is depicted.
It is unfortunate that we, as a field, got so enamored with the tool-as-shiny-object thing, and then so disenthralled with the overload of class-level models that we swung the pendulum the other way. We use UML in a lightweight way for simple sketches, and it would be useful, I think, if developers were widely taught to match model rigor -- not to mention levels of abstraction -- to the demands of the occasion they face.