Sunday, September 11, 2005

me no know

This is interesting.

Theory-Driven Reasoning About Plausible Pasts and Probable Futures in World Politics: Are We Prisoners of Our Preconceptions?
(Word document)
Philip Tetlock (American Journal of Political Science 43(2): 335-66).

Political analysts are wont to lapse into tautological thinking, and most knowledge claims are inherently politicised. It's something I've long thought, but this paper gives the observation some analytical rigour.

It's difficult to know what to do about it, though. There is an intractable epistemological conundrum at the heart of human consciousness that we have yet to solve. For the time being the best work-around we have developed is the empiricist privileging of concrete experience, but the simple fact of the matter is that the world has too many variables, all changing too quickly, for empiricism to be much more than a valiant attempt to impose rigour on thought. When it comes to analysing fast-moving, stochastic datasets and trying to extrapolate future trends, empiricism is just too cumbersome for the job. So political analysis largely comes down to sophisticated heuristics.

There's a further problem here, though. As any (serious) political analyst will tell you, fortune-telling is a mug's game, but it's also what the readers really want you to do. There's little alternative but to draw vague, ephemeral conclusions that delimit the possible range of future outcomes, all hedged about with weasel words. And even then, as Nassim Nicholas Taleb points out in a fascinating paper (pdf) on prediction (it's in draft, so don't quote it), when people limit themselves to delimiting a range they still get it very wrong - worse, experts get it even more wrong, largely because they find it harder to believe that they could be wrong.
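To make that calibration point concrete, here's a toy sketch of my own (the numbers and the interval_hit_rate function are invented for illustration, not taken from Taleb's paper): if an analyst issues ranges they claim to be 90% sure about, you can score their calibration simply by counting how often the actual outcome lands inside the stated range. Overconfidence shows up as a hit rate well below the claimed 90%.

# Hypothetical sketch: scoring the calibration of "90% confident" interval forecasts.
# The data are invented for illustration; only the counting logic matters.

def interval_hit_rate(forecasts, outcomes):
    """Return the fraction of outcomes that fell inside the stated (low, high) range."""
    hits = sum(1 for (low, high), actual in zip(forecasts, outcomes) if low <= actual <= high)
    return hits / len(outcomes)

# Each forecast is a (low, high) range the analyst claimed to be 90% sure about.
forecasts = [(2.0, 3.0), (1.5, 2.5), (0.0, 1.0), (4.0, 5.0), (2.5, 3.5)]
outcomes = [2.4, 3.1, 1.8, 4.2, 0.9]  # what actually happened

rate = interval_hit_rate(forecasts, outcomes)
print(f"Claimed confidence: 90%, actual hit rate: {rate:.0%}")
# A hit rate far below 90% (here 40%) is the overconfidence Taleb describes.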

The thing is, though, that we're all in business, which means that we can't really just throw our hands up in the air and say "Well, how the hell would I know?". Tempting as it is. So what we end up doing is playing an elaborate confidence trick on - or perhaps more accurately, with - the readers.

The ranges of possible future outcomes are selected heuristically - that is to say, in a rough'n'ready 'kinda fits' way - and then hedged about with words like 'perhaps' and 'possible' so that there's a get-out clause if we get it completely wrong. What our heuristics come down to, though, is a whole bundle of unarticulated biases, assumptions and beliefs about what the world is and how it works - that essential epistemological conundrum to which I referred earlier.

The thing is that the reader doesn't have any alternative mechanism by which to assess the validity of our truth claims. All the reader is able to do is to deploy precisely the same sort of heuristic reasoning to see if the analysis 'kinda fits' their own interpretation of the world. If it does, then the material is deemed 'good' and they'll probably read it again.

Which brings us to the great danger of this process. It's become glaringly obvious over recent years that the popular press in most countries has abandoned the old ideal of journalism and has cynically embraced this feedback loop of heuristic reasoning. People buy newspapers these days not so much to read news, but to have their own views recycled back to them clad in the psychologically imposing (qua 'authoritative') form of the printed word. It's symbiotic: they like the reinforcement of their own personal paradigms, and the papers employ people who share the paradigm so that they can write copy that the readers will like. The way that I'm describing it makes it sound very transparent, but it's not - there are several layers of bluff and counter-bluff going on to ensure the suspension of disbelief, most of them to do with the 'personality' attributed to the outlet's editorial line. 'Edginess' or being 'hard-hitting' or being 'salt of the earth' are some of the more obvious traits cultivated to maintain that underlying suspension of disbelief.

And it seems pretty obvious that exactly the same thing is going on in all analytical professions, whether it's political analysis, business analysis or financial analysis: most analysts end up producing analysis that looks remarkably like what their peers and competitors are producing (not least because they spend so much time looking at their peers' and competitors' analysis). And if they get it wrong, then just about everyone gets it wrong. They do this even if they know deep down that they're just telling people what they want to hear - just look at the tech bubble. It's only in rare instances like that that the symbiosis breaks down and people are actually taken to task for perpetuating the myth that we can see into the future.

It's a bit like the early priesthood, in a way. The people in the funny costumes who speak in a funny dialect full of arcanery are the people who don't have to go out into the fields to work, because they're the guardians of 'the knowledge'.

Anyway. The problem is: knowing this, what are we actually going to do? Should we give up prediction? Should we abandon analysis? No, actually, I don't think we should. There are two things that make me think it is still worthwhile pushing on in this line of work. One is that when people make decisions they will, most of the time, be relying on heuristics themselves. Because heuristics are fuzzy, the best way to analyse them quickly is through fuzzy thinking. So when it comes to analysing decision-making, being imprecise isn't necessarily a flaw - so long as you recognise that the person you're analysing may be imprecise too.

The second is what this comes down to, for me: there has to be an element of intellectual humility involved, and a degree of empathy. It is possible to understand decision-makers so long as you recognise that their decisions may be driven by emotion as much as by rationality, and that they often get it wrong. Far too much political analysis assumes an incredibly high level of rationality, cohesion and purpose in decision-making, when what you're looking at is really only marginally more organised than pure chaos. The thing is that it's tempting for analysts to perpetuate the impression of complexity because it maintains their position as the guardians of 'the knowledge' - which is precisely what conspiracy theorists do at the lunatic fringe of the practice - when actually the analysis would be better if it were that much simpler and that much more sincere.

The most important part of that would be admitting to mistakes without trying to wriggle out of them. Playing with counterfactuals can indeed increase our understanding of the world, and when they are proven wrong we can learn even more by asking why. To err, after all, is human. Unfortunately, we're generally too damned proud to admit it when we do screw up.
