Saturday, 24 January 2009

A worry about democracy

Let me say at once that I think democracy is broadly a good thing -- and, optimistically, a very good thing. My worry here would be better expressed as a distrust of utilitarianism and of baldly utility-based decision-making. One of the problems with utilitarianism, I think, is that people are often too incurious about (a) other people's interests and (b) alternatives to the options presented to make well-informed utility decisions. Democracy (both in the act of election and in the distribution of interests by elected officials) is, at a basic level, a simple utilitarian calculus. In simple language, it's basically a '51% thing'. Market crashes and the election of questionable governments by popular majority -- both occurrences that happen time and again, in history and in the present -- seem to be good evidence that crowds are not always wise about their own interests, let alone those of others.
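
To make the '51% thing' concrete, here is a toy calculation in Python -- the numbers are invented purely for illustration -- showing how a measure can pass by a clear majority while still reducing total utility, because the minority loses far more than the majority gains.

# Toy illustration with invented numbers: 51 voters each gain a little,
# 49 voters each lose a lot. Everyone votes their own interest.
winners, losers = 51, 49
gain_per_winner = 1.0    # small benefit to each member of the majority
loss_per_loser = -3.0    # large cost to each member of the minority

passes = winners > losers
net_utility = winners * gain_per_winner + losers * loss_per_loser

print(f"Passes by simple majority: {passes}")         # True
print(f"Net change in total utility: {net_utility}")  # -96.0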

Thursday, 22 January 2009

Appealing to authority

This is one of the most basic logical fallacies, yet the frequency with which it is committed, wittingly and unwittingly, is astonishing. The appeal to authority is the fallacy of citing an individual as 'proving' a position merely on the weight of his or her authority and assumed competence in the subject matter. It takes this form:
  1. Person A is (claimed to be) an authority on subject S.
  2. Person A makes claim C about subject S.
  3. Therefore, C is true.
It's a superficially appealing, and not infrequently strong, argument. But appealing to authority is an argumentative tactic subject to some very important qualifications. There are several instances in which it is simply wrong (logically and factually). Examples:
  • Where the supposed authority has cultural weight, but lacks technical authority on the particular question.
    So, if Stephen Fry holds an opinion on a question of politics, he may well attract agreement simply on the strength of his cultural weight, while being quite wrong or unconvincing in his argument or opinion. Celebrity endorsements of just about anything are often examples of this.
  • Where the speaker has moral weight.
    Elder statesmen and, of course, figures such as senior clergy and people who have endured suffering are quite frequently treated in this way. While their opinions may carry emotive weight, they are not immune to critical assessment.
  • Where the authority has a cognitive or ideological bias.
    Here, an authority may very well be competent, able to speak on the matter with all the right professional credentials, and in possession of deservedly recognised expertise in the field. However, it's rare that 'experts' are immune to cognitive and cultural biases. Noam Chomsky's occasionally dubious political opinions come to mind here. It's also a serious problem in newspaper commentary, where the 'expert' is playing to the gallery or being very selective with the facts in order to prove a pre-decided conclusion. Judges, and historians working on sensitive matters such as the Holocaust and the Arab-Israeli conflict, are also particularly vulnerable to deliberate as well as subconscious bias in presenting, or even recognising, the salient facts.
  • Where there is no consensus on the topic among the 'experts' in the field.
    Frequently, the opinion being advanced is one of many in a field with validly competing viewpoints. Politics, philosophy and even science will frequently have issues on which there is not yet any consensus among the community as a whole about the right and wrong answers.
  • Where they are just wrong!
    Experts frequently get it wrong -- wrong in light of new information, or through simple human error. Very often, in a given discipline, information is accrued systematically -- on autopilot, you might say. Economists and scientists do this all the time. They work using what Donald Rumsfeld once described as "known knowns", and occasionally are able to factor in "known unknowns". But very rarely, if ever, does systemic epistemology factor in what he described as "unknown unknowns" -- the famous black swans.
To overcome the expert problem, the best strategy to adopt in information gathering and argument construction is to avoid ultimate commitment to a single position. Each argument, no matter how great the authority making it, must be traced to its sources. Consider, then, when listening to an 'expert':
  1. What is his or her argument, and is it correct (i.e. both valid and containing true premises)? IS there one?! (Experts are often let off with rhetoric, because nobody bothers to challenge them -- and they KNOW they can get away with it.)
  2. What are his or her sources? Are they reliable, and are they the kind that can prove the claim? That a source comes from a 'reputable newspaper', for instance, is not what I mean by 'reliable'. Rather, for any claim that relies on proof, that proof must be, if not verified or verifiable, then at least entertained as highly plausible.
  3. Does he or she account properly for local conditions? Gathering information in a war zone, for instance, is fraught with political distortions, and epistemic confusions.
  4. Statistics are seldom proof of anything. A statistical model is like a drinks dispenser: it puts out a reflection of what has been put into it and of what it is programmed to put out (a toy sketch follows this list).
  5. Finally, information is not a 'thing': it's a description of the usability of data, and must be treated as dispensable or provisionally true.
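
On point 4, here is a toy sketch in Python -- the numbers and the 'model' are invented purely for illustration -- showing how a model fed a skewed sample faithfully reports that sample, not the world.

# Toy sketch with invented numbers: the 'model' here is just a sample mean.
# Feed it a sample that over-represents one group and its output reflects
# the sampling choice, not the underlying population.
population = [20_000] * 80 + [200_000] * 20     # true mean: 56,000
skewed_sample = [20_000] * 10 + [200_000] * 20  # survey mostly reached the well-off

def model(data):
    # A minimal 'statistical model': report the average of whatever it is fed.
    return sum(data) / len(data)

print(f"True population mean: {model(population):,.0f}")    # 56,000
print(f"Model's estimate:     {model(skewed_sample):,.0f}")  # 140,000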
The dominance of experts is always going to create some tension between the values of empirical progressiveness and 'open access' to knowledge on the one hand, and the importance of systematic inquiry and experience on the other. It's an inevitable result of the sociology of information that castes and 'professionals' will tend to dominate certain fields of study and inquiry. However, a willingness to treat these castes too readily as authorities in themselves can lead to a tyranny just as dangerous as any oligarchy; for these castes control access to reason itself.