Thursday, May 31, 2012

Experts 3: Conspiracy Theories and Cranks



We have seen how vulnerable we are to the cult of expertise. Whose interests do experts serve? Are they practicing groupthink, or blinded by their narrowness? Why should I trust them? We don’t like the feeling of having to accept received opinion, either. 

And expert knowledge has in the past been overturned by outsiders and crackpots – people who weren’t considered experts by the experts. Whatever impression they might give, experts are not infallible. There are gaps in all knowledge. There are unexplained phenomena, statistical oddities, theoretical impossibilities and unexplored fields. 

There are three common temptations that flow from the evident limitations of expert knowledge. The first of these is to adopt a stubborn skepticism towards all expert knowledge. This is the temptation of the climate-change denier. Admittedly, the smugness of climate-change scientists regarding their claims can be galling to the ordinary person. Likewise, it is sometimes easy to associate them with ideological and political interests which may have a distorting effect on their science. It is epistemologically right to hold the claims being made for the science of climate change at arm’s length, for, like all human knowledge, they leave room for error. 

But the persistent skeptic points to the holes in the theory in question, and to the incompleteness or complexity of the available knowledge, and claims that the whole theory must therefore be bunk. Where there is not 100% certainty, there is room for doubt. But that room may be a very small alcove in which you can barely fit a pot-plant. Indeed: where there is a claim of 100% certainty, we should be skeptical of the maths involved! 

The second temptation – which is not unrelated – is to cling to conspiracy theories. If expert opinion is slanted in one direction, then the conspiracy theorist claims that the slant is due to some secret agenda or bias. Expert knowledge is extremely difficult to outflank, especially if there is a strong consensus. This leads to the disempowerment of those not in the know. The conspiracy theory is an attempt to subvert the power of expertise by claiming that the pose of ‘expert’ knowledge has been assumed in order to mask the real truth. It may be claimed, for example, that the 9/11 attacks on the World Trade Center never occurred, and that a conspiracy of government agencies pulled off an extraordinary fraud on the American public in front of the eyes of the world in order to provide a pretext for military action in the Middle East. Or, a young earth creationist may claim that the scientists who claim that the earth is old do so under the influence of a set of ideological assumptions that lead them to bracket out the real evidence – and that the worldwide scientific community, because of its bias towards atheism, places strong pressure on those who disagree. Perhaps this is better described as a ‘mass delusion’ theory, or a ‘mass ideological bias’ theory. Nevertheless, both are attempts to account for the way in which expert knowledge is weighted overwhelmingly against their view.

The fascinating thing about conspiracy theorists is the way in which they mimic the forms of expert knowledge that they are seeking to question. So, we find young earth creation ‘scientists’ parading their PhDs, and 9/11 conspiracy theorists consulting ‘experts’ and providing very detailed accounts of their ideas, complete with apparently plausible statistics and scientific information. Thus, conspiracy theorists both play the game of expert knowledge and at the same time seek to undermine it. If you’ve claimed that the whole guild of scientific experts is corrupt or fundamentally self-deceived, then why does having one of their qualifications matter a jot? 

The third epistemological temptation that is worth describing is the phenomenon of the ‘crank’. The crank is a person who holds stubbornly to a belief despite the overwhelming consensus of his or her contemporaries. The classic crank will spend years researching a theory that most experts in the field consider to be spurious or nonsense – becoming, in effect, an expert in nonsense. Cranks are often oblivious to their misunderstandings of fundamental concepts in their chosen fields, but highly resistant to any attempt to clear these up. 

The trouble is, sometimes the cranks are right. Modern-day cranks love to compare themselves to Copernicus or Galileo as examples of scientists who were independent spirits in their own times, and were considered cranks by their contemporaries. There are more recent examples, too. One of these is J. Harlen Bretz. In the 1920s, Bretz began to present evidence which challenged the prevailing view of geologists that the Earth’s features had been carved out by a gradual process of erosion and rock-forming over a vast span of time. What Bretz argued was that there were some significant events in geological history that could have an impact over a very short time. Volcanic eruptions or meteoric collisions could have dramatic, relatively instant effects on the Earth’s geological features. Bretz came to his conclusions after studying the Scablands in the American North-West and determining that only a massive deluge of water could have accomplished the things he observed there. 

Now, Bretz did have a PhD in geology; but his original training had been in biology. When he first published his ideas, he was dismissed out of hand by the leaders in the field. Furthermore, when another geologist, Joseph Pardee, decided that Bretz was right, he was put under enormous pressure not to support him publicly. Here was an egregious instance of groupthink at work. At a public forum of the Geological Society of Washington, Bretz was invited to present his theory. But he was met by an organized opposition of six expert Ivy League geologists. It was an ambush, designed to humiliate Bretz and drive him from the field. It wasn’t until the 1950s, once the impact of the Ice Age had become more widely understood, that Bretz’s views were fully vindicated. When he was finally recognized for his contribution to geology in 1979, at the age of 96, he rather ruefully commented: ‘my enemies are dead, so I have no one to gloat over.’

Bretz was a crank who turned out to be right, because sometimes cranks are right. The experts sought to bully him and to censor his views. The trouble is that examples like that of Bretz do not give us warrant to believe cranks (and not experts) simply because they, like Bretz, stand against prevailing orthodoxy. The reason Bretz was right was that he studied the evidence more carefully than his opponents. That the crank is sometimes right doesn’t mean the crank is always right.

Experts 2: Questioning the Experts?


Monty Python’s The Meaning of Life opens with a woman giving birth in a hospital. The camera gives us her point of view as her trolley-bed is wheeled into the theatre and as the two doctors shout commands to one another and the hospital staff. At one point she bravely asks ‘what can I do, doctor?’ – to which the doctors instantly reply ‘Nothing, dear – you’re not qualified!’ The point is clear: we aren’t supposed to resist, or even to try to help, the expert. We just have to lie back and let them do what they are trained and paid to do. Even something as normal and natural as giving birth is now carried out under the supervision of white-coated experts.

The problem with the cult of expertise is that we have no way really to check the expert's advice. We can get another opinion, but then it is just a case of expert against expert, and we are not necessarily better off. We just have to believe them: which means they have an incredible amount of power over us. I might possibly gain consolation by finding my own area of expertise, so that at least I have something to bring to the social table. But this does not mitigate the feeling that I have that if the expert is wrong, I am liable to suffer for their mistakes – and there’s nothing I can do about it.

The case of climate change is a good example. Is it the case that the climate is changing because of human-created environmental damage? Well, many of the experts are agreed that this is the case. I might say ‘science says so’; but when I do, I am just saying that the expert scientists are saying this at the moment. I personally have no way of knowing directly whether it is or not. I am completely unable to evaluate the scientific data for myself – and, even if I could, I wouldn't have time. So I am in the hands of the experts, who will decide government policies and how I should bag my rubbish. 

Now, you could have an attempt by experts to democratize their knowledge so that it becomes the possession of the masses. Al Gore’s lecture-film An Inconvenient Truth (2006) was an example of precisely this strategy. Clearly, Gore realized that a blather of scientific jargon wasn’t going to win over non-experts deeply enough to transform their behaviours. So he packaged the expert information in a mass medium and used clever diagrams and animations to communicate his message.

But it isn’t easy to do successfully. Gore’s film was inevitably open to the accusation that he had bent the truth to fit the medium. While he was, according to scientists who reviewed the film, accurate in his main thesis and in many of the details, in some areas he was clearly guilty of rhetorical excess. The connection between hurricanes and climate change, for example, is a contentious matter amongst scientists. Yet the images of the devastating impact of Hurricane Katrina provided an alarming backdrop to Gore’s message – one that was unjustified on the basis of the scientific evidence. Or at least: that’s what the experts I consulted said… 

So, the attempt to communicate expert knowledge in a non-expert way does not give a non-expert grounds to challenge the experts. I couldn’t justifiably contradict a climate-change scientist on the basis of a viewing of An Inconvenient Truth. The film gives me knowledge that is agreed to be largely trustworthy, but watching it does not convey to me the kind of status and authority that we accord an expert.

But clearly, experts are not always right. For a start, it is almost in the nature of expert knowledge that it is disputed. Expert disagreement is to be expected, because the level of detail and the sophistication required of expertise, in almost any area, means there is room for differing interpretations. This is more likely to be the case in more speculative fields of knowledge, of course. Expert economists will make widely varying, even contradictory, pronouncements about the future. And clearly they can’t all be right, even though they are acknowledged experts in their field. Though they have all the qualifications in the world, they may be exactly wrong. 

What’s more, studies in cognitive neuroscience have argued that the possession of expert knowledge may in fact make experts more prone to certain sorts of error. Psychologist Itiel E. Dror argues that through their long years of training, experts develop a certain ‘cognitive architecture’ which enables them to quickly access the kind of knowledge they need. That’s what makes them expert. However, Dror writes:

These information processing mechanisms, the very making of expertise, entail computational trade-offs that sometimes result in paradoxical functional degradation. 

That is: all the short cuts that experts develop in their brain structure may result in a vulnerability to certain kinds of error. Experts rely on quickly surmising the context of information; but these very habits of mind 

restrict flexibility and control, may cause the experts to miss and ignore important information, introduce tunnel vision and bias and can cause other effects that degrade performance. Such phenomena are apparent in a wide range of expert domains, from medical professionals and forensic examiners, to military fighter pilots and financial traders.

Thirdly, experts may be as prone to groupthink as any other group of human beings. They may act in a collective way, such that a powerful set of underlying assumptions is held to be unchallengeable. This may have financial consequences in the area of research grants and publications and so forth – which makes for a powerful pressure on individual experts to conform to the assumptions of the group. Or it could be that a pre-existing ideological commitment leads to a skewing of the evidence away from what is in fact the case. The impact of Marxism on the academic world of Eastern Europe in the middle of the twentieth century is a case in point. 

Given the trust we place in expert knowledge, the consequences of expert error are often woeful. Consider the case of Sir Roy Meadow, the British professor of paediatrics who made his name as an expert in the field of child abuse. His name became associated with ‘Meadow’s Law’, which stated that ‘one sudden infant death is a tragedy, two is suspicious and three is murder, until proved otherwise’. Meadow appeared as an expert witness in a number of trials, and his testimony was instrumental in gaining convictions of a number of people who had lost babies to cot death. Among these was Sally Clark, a solicitor who was tried in 1999 for the murder of her two babies Christopher and Harry. Meadow testified that the odds of a family experiencing two cot deaths were 73,000,000 to 1. A far more likely explanation, he claimed, was that Clark had smothered her children while suffering from post-natal depression. This was later shown to be a statistical nonsense. Clark’s conviction was eventually overturned on a number of grounds and she was released from prison in 2003; but the damage to her was done, and she died of alcohol poisoning four years later. 
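The arithmetic behind the blunder is worth making concrete. As a hedged sketch only: the figure of roughly 1 in 8,543 for a single cot death in a family like the Clarks is the base rate Meadow reportedly used, and the tenfold dependence factor below is purely hypothetical. Squaring the single-event probability gives odds of about 73 million to 1, but only if the two deaths are statistically independent, which shared genetic and environmental risk factors make very unlikely:

```python
# Base rate Meadow reportedly cited for one cot death in an
# affluent, non-smoking family (treat as illustrative).
p_single = 1 / 8543

# Meadow's calculation simply squared the single-event probability,
# which is valid only if the two deaths are independent events.
p_independent = p_single ** 2
print(round(1 / p_independent))  # roughly 73 million

# Cot deaths within one family are plausibly correlated. If a second
# death is, say, ten times more likely once a first has occurred
# (a hypothetical factor, for illustration), the combined odds shrink
# by the same factor of ten.
p_second_given_first = 10 * p_single
p_dependent = p_single * p_second_given_first
print(round(1 / p_dependent))  # roughly 7.3 million
```

Even this understates the error: the jury also needed the probability that Clark was innocent given two deaths, not the probability of two deaths given innocence, which is a separate fallacy again.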

The flaws in Meadow’s expertise were discovered, but only through the rigorous application of expertise against him. One problem was endemic to the cult of experts: Meadow was an expert in his own field, but strayed into an area – statistics – in which he was clearly only an amateur. Yet because of the aura of expertise, he was believed by intelligent and well-meaning people, from the police to the lawyers and the juries involved, whose job it was to scrutinize the evidence available. A person who knows a lot about one thing may easily convey the impression that they know a lot about everything, when in fact they do not at all.

But there’s no way around this impasse other than the rigorous application of…expert knowledge. The Meadow case shows us that expert knowledge has a narrowness which may give it blind spots. Exposure to other fields of expertise can remedy some of the damage an expert error may cause. However, it is a slow process of checking and re-checking data and of allowing public discussion of ideas. The best we can do is equip ourselves with the kind of healthy skepticism that prepares us for the discovery that what we took to be the considered consensus of expert opinion may turn out to be entirely mistaken. We can do no more than make ourselves aware of the humanness of all knowing.

Wednesday, May 30, 2012

Experts 1: Knowledge is power


It is not a novelty to observe that 'knowledge is power' - Sir Francis Bacon coined the epigram around the turn of the seventeenth century. But it is one of those truths that has become somehow more true since it was first uttered. In an age in which information is thought of as the one thing most worth having, knowledge - which implies being able to get something useful out of all that data - becomes a priceless commodity. Billions of dollars have changed hands for the right to own knowledge.

But this has become one of the most overwhelming aspects of contemporary life. The sheer volume of information that I can readily access is completely overwhelming. The internet has, it goes without saying, intensified this feeling many times over. How can you know anything when there is so much to know? And if knowledge is power, and the one thing I know is how much I don't know, then it is pretty disempowering, isn't it? The mystique of Google is in part due to the way in which they have mastered the organization, calibration and delivery of knowledge, and (perhaps most of all) made it saleable. 

And so, we turn to the experts. 

An expert is a person whom we recognize as having a very high degree of knowledge or skill in a particular area, usually combining experience with learning. For example, an airline pilot is not only someone who knows how to fly an Airbus. He (it is most often a he) has also recorded several thousand flying hours at the controls of a commercial passenger jet. Usually, he has first served as co-pilot to a more experienced captain, gaining the hours he needs. 

What’s more, the airline pilot, like other experts, has a licence to fly that is given to him by the statutory authorities. A person may or may not be skilled and knowledgeable in some area, but if he or she is not recognized to be such, their expertise may be useless. An unlicensed pilot simply does not get to fly an Airbus A330, unless he is involved in some terrorist action. (And, by the way, airlines have made sure that it is never the case that they have to make an announcement to the passengers like ‘does anyone know how to fly a plane?’) The point is: recognition is vital, because it conveys the authority and status to the expert that is needed to operate as an expert. There is a social dimension to expertise, in other words. 

We have a number of social institutions that are set up in order to recognize expertise so that we ordinary folks can quickly and with confidence place our trust in the hands of the expert. The professional guild is a group of experts whose job is to recognize the expertise of others and admit them to their number. Governments have regulatory authorities which combine peer assessment with the development of bureaucratic criteria which help mediate to the public the expertise of the expert. These bodies can have the power to disendorse the expert by removing his or her credentials. 

Imagine that I need to see a brain surgeon because (heaven forbid) a tumour has been discovered growing in my brain and I am having terrible headaches. I turn up to my appointment at the right time only to discover that the surgery is a very seedy part of town, that the receptionist is grumpy and sits at her desk painting her toenails, and that the magazines in the waiting room are several years old. When I meet Dr Thanatos – an unpromising name for a surgeon since it means ‘death’ in Greek – I notice that he has very shaky hands, bloodshot eyes and the faint whiff of alcohol on his breath. But I do notice that on his wall he has qualifications from the best schools of brain surgery. And my GP, who I have been seeing for years, has warmly recommended Dr Thanatos as a fine brain surgeon. 

The concept of expert knowledge is functioning here on a number of levels. In the first place, I have no idea even about how I would go about measuring for myself the expertise of a person who called themselves a brain surgeon. My degrees in English and Theology just won’t stack up. I simply could not with any credibility stand in judgment over the various aspects of brain surgery – unless there are rather obvious facts on the table, such as a pattern of terrible deaths. Dr Thanatos does not make a favourable first impression, to be sure. He won’t make eye contact, and he constantly checks his emails while I am talking to him. But his qualifications and the recommendation of my doctor testify loudly that this man is a very good choice when I am determining which doctor I want to cut open the top of my head and poke around inside. In fact, as he is a brain surgeon in good standing with the relevant government authorities and ultimately accountable to them, most people would be inclined to trust him without too much reflection. The whole system of trust in expertise operates as a kind of social short-hand, meaning that the time-consuming and inaccessible research is done for us already. In the case of the airline pilot, we don’t think it strange that we never meet or even learn the name of the person in whose hands we happily place our lives. We just take it as a given that the person up the front wearing a peaked cap is expert enough to fly 12,000 metres high at several hundred kilometres an hour for several hours, possibly in the dark.

Uniforms are another way in which a social convention confers authority and status on a particular person in such a way that it can be instantly recognized. We only really consider the way in which this happens when someone uses the convention to take advantage of us. But this is really evidence that, in the overwhelming majority of cases, the convention works very well. In a crowded railway station, looking for the toilets, I instantly look for a person who is wearing a railway employee’s uniform. At a restaurant, a waiter will indicate his or her status to act as a waiter by carrying a white cloth – or even by subtle gestures and body language. Travelling on an overnight ferry recently, I approached a lady who was wearing a lanyard around her neck and asked her what time the onboard cinema was showing its first film. It turned out that she was simply a member of the public – and the lanyard, had I inspected it more closely, contained nothing that would indicate that she was a steward at all.

What would we do without experts? We rely on them to fix our cars, to give us financial advice, to heal our bodies, to manage our diets, and to tell us that the computer is completely kaput because of the faulty motherboard - which I didn’t know I had until mine broke. Our dependence on the experts is endemic. We can’t begin to operate in any meaningful way in our uber-urban high-tech environment without the assistance of those who really know what they are talking about. Do-it-yourself is dead: the idea that I might fix my own car, for example, is now passé. There’s far too much complex gadgetry beneath the bonnet for that.

The extraordinary exploits of Frank Abagnale, Jr. were made into the film Catch Me if You Can, directed by Steven Spielberg in 2002. Before he was even 19 years old, Abagnale had managed to con his way onto over 250 Pan American flights and travelled to 26 countries. His method? He impersonated an airline pilot complete with uniform and ‘deadheaded’, which is when airline staff are ferried free of charge to meet flights that they subsequently work on. Abagnale claimed that pilots frequently offered him the controls of the plane, and that once he actually flew an aircraft for a brief period at 30,000 feet. He said later that he was ‘very much aware that I had been handed custody of 140 lives, my own included...because I couldn't fly a kite’. He also impersonated a doctor and served as a resident paediatrician in a Georgia hospital; and took the Louisiana bar exam so that he could work for the Attorney General’s office in that state. Abagnale’s confidence tricks relied on the huge faith that we place in experts and the readiness with which we trust them. We are practiced at believing them, and we know that life goes more smoothly when we just do. In performing his remarkable feats of impersonating experts, Abagnale of course earned himself the authority of an expert in security and fraud, and has made a very nice living since his release from prison as a security consultant. 

The spread of the cult of expertise is a feature of a highly specialized and high-tech economic system. This is in contrast to pre-industrialized societies in which most people would function as subsistence farmers; and even in contrast to many industrialized systems where a great number of people work in factories in which there may be specialized skills but not many of those regarded as ‘expert’. The post-World War Two prosperity and the silicon revolution of the 1970s have combined to ensure that Western people aspire to a career in which highly specialized skills and an in-depth education are necessary. Many of us spend years of our lives becoming experts – a process that takes hard work and focused study, but which is rewarded by the recognition of our society that we have authority in our area. The economic value of higher education is a measure of nothing less than the importance of knowledge in the psyche of Western individuals. Our expertise is a commodity we can sell in order to gain access to the advice of other experts. And one should not underestimate the importance of this factor: it helps us feel proud at dinner parties when people ask ‘what do you do?’ 

The most observant if not the most lucid of social critics of expert knowledge was the French philosopher Michel Foucault (d. 1984). He created several wonderful neologisms in an effort to describe how knowledge and power mutually constitute one another in human societies and especially in the modern era. It was Discipline and Punish, his work on the practice of judicial punishment in the modern era, that led him to observe the way in which gathering, analyzing and ultimately knowing information about people could be an instrument in changing their behaviour. The prison systems that developed in Western nations during the nineteenth century employed a series of ‘professionals’ – psychologists, criminologists, medical officers and so on – who were given power over the prisoner which was exercised not by brute force but simply by observing and categorizing behaviours. This was what Foucault labelled ‘power/knowledge’. And as far as he was concerned, what occurs in prisons lies on a continuum with the rest of society. He noted resemblances between the various institutions of social control – schools, the military, psychiatric institutions and hospitals – and described the way in which a culture of behaviour is inculcated in these merely by the collecting and reporting of information by people who have formed themselves into professional guilds – each of which accrues to itself an aura of authority and status. 

Think of how difficult it is to resist the word of an expert. I recall once taking my small son, then about a year old, to the casualty ward of the local hospital with a persistent high temperature. When after several hours we were admitted to the hospital itself ‘for observation’, it was clear that whatever the ailment was, it was very minor. But the regime of the hospital demanded that he be kept in overnight. When we went up onto the ward, the scene of chaos which greeted us was very distressing: another small child had just emerged from an operation on his leg and was being attended to by at least ten relatives – some of them also upset, some of them simply talking very loudly to each other. It took all the will power at my disposal (as a person who normally conforms to social expectations) to demand of the hospital staff that we be allowed to take our son home to his own bed. The series of release forms I then had to sign were clearly designed to intimidate me as much as possible by reminding me of my own amateurism. And yet: I was certainly making the right decision.

But Foucault would say that the aura of expert knowledge even extends to the way in which I govern my own actions and habits. I am not always being watched and evaluated, but the things I do even in private are patterned by the feeling that somewhere someone is. The reporting of expert findings in newspapers, magazines and other popular media achieves this effect very well. This is particularly the case in the area of what we physically consume. Our choices about what to purchase, cook and eat are frequently driven by the concern that we ought to conform to what the experts say about what we ought to eat, or by the feeling that a certain size of body is the healthy norm for a person of my age and gender. An expert dietician and an expert fashionista will probably say very different things about food; but both exert an influence on the non-professional world through their expertise.

Tuesday, May 22, 2012

Does it work?



Another condition which affects what we say we know and believe is the prevailing impact of utilitarianism on Western culture. That is, the criterion for evaluating whether a proposition is to be believed or not is not actually whether it is true but whether it works. Perhaps I have put the contrast too strongly: it would be hard to imagine that some element of ‘truth’ is irrelevant in most people’s thinking about what they believe. But pragmatism is a very powerful consideration, and increasingly so, since we think we can measure what works in a public and objective way and thus have an actual discussion about it with others. 

This may seem surprising since we live in the age where our heroes and saints are scientists, who seem to have a grasp on tangible reality more than the rest of us and so can pronounce about the truth of the material with authority. But the real power of the scientific paradigm is that it appears to work so successfully in making our lives more pleasurable. It offers us a great deal on truth. ‘The Truth’ itself is, in any case, some distance off from the ordinary person, who cannot possibly hope to evaluate the claims and counter-claims of specialists. 

Now, at the same time, many Westerners have found this materialist pragmatism crass and disheartening – and unspiritual. Counting the bottom line is objective, but it isn’t pretty. And at that level, it doesn’t ‘work’ as it claims it should. It certainly doesn’t produce the kind of happy lives that it promises. And so, many Western people have found themselves attracted to Eastern religions and philosophies. The irony is that Westerners cannot even at this point rid themselves of their utilitarianism – because they find that Eastern religion ‘works’ for them. Adopting an Eastern religion isn’t for Westerners usually a matter of believing various metaphysical propositions, but rather of experiencing the benefits of religious practices: a better work-life balance, a more peaceful inner world, opportunities for doing good works and a healthier lifestyle. 

What matters for people is what will work in the living of life. In a sense this is indisputable: who doesn’t want a life that actually works? What I think needs challenging are the criteria by which we think we rightly evaluate whether things are ‘working’ for us. What are these criteria? How can we accurately measure whether they are working or not? Over what period of time should we consider whether something is working?

There are two important things that religious beliefs introduce into the discussion here which make an evaluation by the simple criterion of ‘does it work’ even more complex. The first of these is that human suffering might be redemptive in some way. That is, something that might appear not to be ‘working’ might, through divine intervention or whatever means, turn out to be the epitome of human action. This is an alarming thought for relatively comfortable Westerners to confront. And I’d hasten to add that the religions have very different ways of accounting for suffering. Some in the Eastern traditions see suffering as a result of negative karma. Christianity, for its part, names suffering as evil in and of itself, but capable of being woven into the divine plan for extraordinary good. 

The second element of most traditional religious belief is that of the afterlife. Again, the religions differ massively on the details of the afterlife. But an afterlife of any kind offers a point of evaluation for human life that lies beyond this world. It speaks of a perspective on human events that is beyond our knowing at this point. It relativises our contemporary evaluations of the viability of the lives we are leading. We cannot yet see what it means for life to ‘work’, and so we need a framework that gives us more than pragmatic answers to our most profound questions.