Known and unknown unknowns

Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns — the ones we don’t know we don’t know.

– Donald Rumsfeld

Martin Weller has written a thought-provoking blog post on the possibility that sometimes it’s detrimental to have too much information. It builds on James Surowiecki’s work on the wisdom of crowds and applies it to Tony Blair’s decision to invade Iraq. Martin argues that many people opposed the war precisely because they had so little information. Blair, on the other hand, had too much and faltered under its weight. To quote Martin’s post:

Blair had an excess of information, while the crowd, deprived of all the intelligence reports he was privy to, had been forced to see the salient features of the war, and had instinctively judged it to be ‘wrong’. […] In short, Blair suffered from a deficit of ignorance, which enabled the crowd, lacking the vast quantity of (meaningless) intelligence to isolate the significant factors in the build up to the war.

One way of approaching this is to think about what happens when we don’t have too much information. One suggestion raised in Martin’s post, implicitly perhaps, is the role that instinct, and its bedfellow intuition, play in decision-making. The implication is that an absence of information leads to action based upon instinct and intuition rather than rational consideration: the people “instinctively judged it to be ‘wrong’”. Despite an acknowledgement that instinct plays a part in our daily lives, there are clear problems with relying on so irrational an approach; and I don’t think anyone would suggest that a war could be waged on a ‘hunch’. So clearly we need some information.

So how do we know when we’ve got too much? Perhaps the answer is that you can only have too much information when a decision is wrongly made because there was too much information… ah, hold on, that’s circular and won’t do: defined that way, the claim could never fail to be true.

This tempts us to try to quantify the amount of information at which a mistake becomes inevitable. This is difficult. Try answering this kind of question: when does a man become bald? Is there a tipping point at which what’s recognised as a full head of hair suddenly becomes baldness? Probably not, just as we can’t say that Blair reading 99 documents is fine, as long as he doesn’t read 100. So we can’t put a number on it. But that’s OK, because we often speak about things that we can’t quantify or define: we still have baldness and we still have art, respectively.

What’s more problematic is the extent to which the idea of having too much information is not so much the product of deduction as of induction. In other words, it isn’t a repeatable, observable phenomenon grounded in a rational investigative process. We can’t possibly hope to know even some of the many variables involved in the journey to war, and so we can’t suggest with any certainty that too much information was a root or central cause of Blair’s mistake. In a recent elaboration on the problem, Martin suggests that having too little information may sometimes lead to innovation, because those who know the rules and expectations are stifled by them. (This is a different kind of information from that in the Blair example; I haven’t touched upon that difference here, despite its importance.)

But we can’t be sure that is the cause because we haven’t… got enough information. Besides, innovation has taken place in areas where a great deal of information is present (just as innovation sometimes fails to take place where there is too little information). The problem when we’re gathering information is that there are things we know, things we don’t, things we know we know, and things we don’t know we don’t know. Some things we will never know, and of others we will never even know that we didn’t know them. This is why Rumsfeld, for all his faults, is on to something.

It is the job of politicians, aided by their officials, to pursue knowledge so that the unknowns do not unduly affect the outcome of the decision-making process. So Blair can’t leave it to chance that something he doesn’t know might tip the scales (what he knows he doesn’t know), or that there is an avenue he has yet to pursue because he is so far unaware of it (what he doesn’t know he doesn’t know).

We could therefore argue that Blair had too little information, for if he had known what the crowd knew then he wouldn’t have acted as he did. More probable is that he knew but ignored it, and that’s where individual judgement comes in. But learning this doesn’t mean he needed to unlearn everything else. The motives, thoughts and feelings of the crowd, like everything else, can be distilled and presented as evidence amongst a store of other evidence – which he can choose to listen to or ignore.

It’s implied that ignorance – in Martin’s case, the crowd’s lack of exposure to military intelligence – makes for a more straightforward process of decision-making, one that, unfettered by the complexities of competing paths, is somehow more authentically engaged with the truth. But I don’t think such simplicity is real or desirable, not least because it would give ill-informed politicians a chance to wriggle out of uncomfortable questions about the decisions they make. Ignorance might be bliss, but it’s not a firm footing for political decisions.

I’m clearly doing Martin’s post a disservice when I take issue with a number of points that a short blog post like his cannot possibly deal with. I might be guilty, too, of focussing too intently on the idea of instinct, which I don’t think was the point of Martin’s post. In another blog discussion of Martin’s ideas, which claims that the lack of an adequate filter is the problem rather than the information itself (but which amounts to the same thing – too much information), he suggests he might ‘expand on this in a blog post’ and maybe even a book. There is certainly room for one.

2 thoughts on “Known and unknown unknowns”

  1. Good elaboration Phil. To clarify – more information and experience is nearly always better. Given the choice between a naive pilot (or surgeon) and an experienced one, I’d go experience every time! But, I think there are conditions when this isn’t the case. Usually that is when you are entering a new domain, for which existing information is not applicable. We see this in organisations a lot – too often what you learn by being in an organisation for a long time is all the reasons why you _can’t_ do something because you know how difficult it will be. You become an expert in how the organisation operates. But a newcomer doesn’t have that knowledge and often will push something through that a more experienced hand won’t.

  2. @ Martin Thanks Martin. Yes, I see your point about innocence being central to innovation in these circumstances. I can think of examples of people saying:

    We can’t do it like that.
    Why not?
    Because we never have.

    Sometimes there’s wisdom in that – experience is a great teacher and all that – but you’re right, when something completely new takes place there isn’t any information about the particulars of the innovation.

    I’ve been thinking about some experiments to ‘test’ the wisdom of the crowd compared to singular expertise, which I plan to blog about (a rough sketch of the kind of thing I have in mind follows the comments).

    Phil
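
As a purely illustrative aside on the experiment mentioned in the comment above: the classic ‘bean-jar’ version of the wisdom-of-crowds test can be sketched in a few lines of Python. This is a minimal sketch under invented assumptions – the true value, the error levels and the crowd size are made up for illustration and are not taken from the post or the comments.

    import random
    import statistics

    # Many people independently estimate a quantity (say, the number of
    # beans in a jar); we compare the crowd's average estimate with a
    # single, better-informed "expert" guess. All numbers are invented.
    random.seed(42)

    TRUE_VALUE = 850   # hypothetical number of beans in the jar
    CROWD_SIZE = 200

    # Each crowd member guesses with large, independent error.
    crowd_guesses = [random.gauss(TRUE_VALUE, 200) for _ in range(CROWD_SIZE)]
    crowd_estimate = statistics.mean(crowd_guesses)

    # A single expert guesses with smaller error, but only guesses once.
    expert_guess = random.gauss(TRUE_VALUE, 100)

    print(f"Crowd average error: {abs(crowd_estimate - TRUE_VALUE):.1f}")
    print(f"Expert error:        {abs(expert_guess - TRUE_VALUE):.1f}")

The point of the set-up is that although each individual guess is noisier than the expert’s, averaging a couple of hundred independent guesses usually drives the crowd’s error well below the expert’s – the statistical effect behind Surowiecki’s argument. Whether the crowd has the right kind of information to average in the first place is, of course, the question the post raises.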
