
Experts and Risk

I hold that risk estimates by ‘Experts’ are unreliable.  Let me count the ways, blathering a bit along the path, then tell of two cases where I changed my mind about social choices that deal with risk.

MORAL:   If you want to convince somebody, first join his tribe, then don’t over-argue your point: leave a little room for him to add to it.

1.  Experts tend to low-ball risk estimates affecting the tribes that employ them.  How well do you trust a general who is asked what to do about Afghanistan?  How well do you trust the Chairman of the Fed who discounts the likelihood of a meltdown?

There is an exception to the sense of this bias, however, among physical scientists who warn of upcoming disasters.  Bland risks don’t get published.  They [we] have a bias towards disasterphilia: send us some money and we’ll study it.

2.  The myth has it that ‘rational behavior’ militates against accepting added risk.  But the presence of lotteries argues that there exists a population that seeks risk.  When experts warn of new risks, some [many?] of us are predisposed to welcome them.  “There is nothing more exhilarating than being shot at without effect.”  .. W.S. Churchill

Another reason not to trust the generals.

3.  Economists who attempt dollar estimates of risk get caught up in “discount curves”.  That is, by how much should one discount a risk of 1M$ damage at 10, 30, or 100 years from now, compared with the same damage tomorrow?  There exists no rational way of estimating this.  Economists appeal to consensus, which is both subjective and volatile.
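
The point can be made with one line of arithmetic.  Here is a minimal sketch [the damage figure and the two discount rates are invented for illustration, not taken from any study] of how strongly the chosen rate dominates the answer:

```python
def present_value(damage, rate, years):
    """Discounted present value of `damage` dollars incurred `years` from
    now, at an annual discount rate `rate` (compounded yearly)."""
    return damage / (1.0 + rate) ** years

# The same 1M$ of damage, discounted at two defensible-sounding rates:
for years in (10, 30, 100):
    low = present_value(1_000_000, 0.01, years)   # 1% per year
    high = present_value(1_000_000, 0.07, years)  # 7% per year
    print(f"{years:>3} yr:  1% -> ${low:>12,.0f}   7% -> ${high:>12,.0f}")
```

At 100 years the two rates disagree by a factor of several hundred.  That is the point: the conclusion is driven almost entirely by a subjective parameter.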

Non-dollar risks [to life, health, beautiful beaches, ..] are imponderable.  Risk analyses that include them are subjective. Risk analyses that exclude them are just plain wrong.

4.  The insurance industry estimates dollar-risks retroactively, looking backwards at the incidence and cost of previous risks and previous payout.  But risk regimes change suddenly, with new technology.  “Black Swans”.

There is an emerging discipline of professional risk analysis for future technologies, where no historical data are available to keep us honest.  One notorious misuse of formal risk analysis was a study by a prestigious professor at MIT some years ago, who attempted to quantify a fault tree for potential failures at a nuclear plant.  When examined retroactively, after Three Mile Island and Chernobyl, his estimates turned out to be low by a factor of 100.  Similarly with NASA estimates before the Challenger accident.  Similarly with BP.

The missing factor in these fiascos was “Pilot Error”, a generic term [not restricted to pilots] for irrational human behavior.  As with BP, for a very current example.  What, I wonder, was the basis for “expert” estimates of deep-water blowouts in the Gulf of Mexico?

What is the probability that some disgruntled employee will throw a stick of dynamite into the swimming-pool storage for spent nuclear fuel elements at the Indian Point reactor, north of New York City?  With early morning drainage winds down the Hudson Valley, much of the Bronx, Queens, Brooklyn, and Manhattan would become uninhabitable by rational persons, for many decades.  Perhaps centuries.  How do you estimate this risk?  How do you insure against it?  How do you estimate a ‘fair’ charge for insurance?

Answer: we ignore it.  Should we trust the experts who assure us of safety?  Should we trust Ben Bernanke, for example, who assured us three weeks before our recent financial meltdown that our fiscal economy was healthy?  Either he lied, he’s a fool, or he’s human.  Certainly, not an expert in fiscal catastrophe.

5.  This brings us to the problem of uncertainties that go along with very rare events that are very damaging.  The ‘mean expected damage’ is the integral, over all possible damaging events, of each event’s probability times the damage it would individually inflict.  As you approach the limit of very low risk per unit time but very high damage over very long times, the uncertainty associated with this integral approaches zero times infinity.  That is, it is ‘classically indeterminate’, and it equals or exceeds the integral itself.
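
A toy calculation [the probability and damage figures are invented for illustration] makes this concrete.  For a single rare, catastrophic event, model the annual damage as a Bernoulli trial; the standard deviation then dwarfs the mean:

```python
import math

def rare_event_stats(prob_per_year, damage):
    """Mean and standard deviation of annual damage from a single event
    that occurs with probability `prob_per_year` in a given year and
    costs `damage` dollars (damage with probability p, zero otherwise)."""
    mean = prob_per_year * damage
    std = math.sqrt(prob_per_year * (1.0 - prob_per_year)) * damage
    return mean, std

# A one-in-a-million-per-year event causing a trillion dollars of damage:
mean, std = rare_event_stats(1e-6, 1e12)
# The mean expected damage is 1M$ per year, but its spread is near 1000M$,
# a thousand times larger.  The 'expected' value tells you almost nothing.
```

The uncertainty exceeds the estimate itself, which is the sense in which no single number can summarize such a risk.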

What does a ‘rational’ expert do when confronted by a risk estimate that is “Low per unit time” but “Very large over longer times”?

Answer: “ILG&YLG” [“I’m long gone and you’re long gone.”]  He makes decisions affected by other concerns.  Again, is it irrational for the public to distrust him?

To summarize: I hold that distrust of experts is reasonable and rational.  If that is so, how do we actually decide issues of risk, in the presence of rational distrust?  How do we change our minds?

Answer: we select our experts according to other rules than their apparent credentials.  High among our criteria is the source: do we know the person ‘personally?’ A relative?  A friend?  A neighbor?  A respected teacher or minister?  Or [God help us] a politician?

Were I really-truly trying to influence you, I would drink a lot of coffee with you, chatter about our children, dogs, and the Burke Gilman bicycle trail.  That is, I’d try to establish trust first.  Then, later and sneakily, I’d slip in my secret agenda and corrupt your mind.  Works every time.  Mostly.  And sometimes I’d even change my own mind.  Fair is fair.

OK.  I promised two examples where I changed my mind about social choices with risk.

1.  I came to the University in 1970.  Up until about 1980 I was a mild skeptic about ‘global warming’.  That skepticism arose from uncertainties about urban heat-island effects and imprecise and noisy data, which could then fairly be challenged on statistical grounds.  But in 1980 I read two remarkable articles, written in 1901 and 1903, by Svante Arrhenius, who calculated [before computers] the effect of doubling atmospheric CO2 on surface temperatures.  That is, the ‘Greenhouse Effect’. He got an answer near 3 degF.

More importantly, he pointed out that nighttime minimum temperatures should rise more than daytime maxima, that winter temperatures should rise more than summer temperatures, and that polar temperatures should rise more than equatorial.

Confirmation of all these predictions became apparent in the late 70’s and early 80’s.

IT IS ALWAYS MORE CONVINCING WHEN PREDICTIONS PRECEDE THE OBSERVATIONS.

Arrhenius was a chemist [I was too] and he had won a Nobel in chemistry.  [Credentials.]

I had studied Arrhenius’ electrochemistry as a grad student, where my professors and my colleagues [that is, my ‘tribe’] much admired his work.  [Note: trust established AHEAD of time.] And I, too, had tried for a Nobel.  [Again, my tribe.]

So.  Since ~1980, I’ve been a convert.  But how much of this conversion derived from independent analysis [some], and how much from trust in an expert [some]?  Hard to tell, [a bit of both?] though certainly I was predisposed.

2.  In my callow youth I joined in the then-widespread community conviction that nuclear energy was the cat’s pajamas.  I remember glibly asserting that it would become too cheap to meter.

Then .. sometime near 1990 .. I attended a lecture at UW by Hannes Alfven, another Nobelist, on the safety of Nuclear Energy.  He spoke reasonably about its virtues: it worked, it was affordable, it was ‘reasonably’ safe [pre-Chernobyl!] compared with coal, to which ‘many’ premature deaths are attributed each year, and it emits no CO2.  He then spoke of the risks: the waste material is extremely poisonous, it requires custody for very long times, the industry is intimately connected with very destructive weapons, and, given time, the technology will inevitably diffuse into irresponsible hands.

We are witnessing this now in Iran.

Alfven did not talk about formal risk analyses, but sitting in the audience I realized that the ‘expected insurable damage’ from the nuclear industry was of the type I have discussed above: over longer times it would be large, and its uncertainty would be larger than the damage itself.  No ‘expert’ could allay this.

I was converted.  But note again that two things came together in this conversion: that I was predisposed to trust the speaker, and that I added something from my own expertise to augment his arguments.

MORAL:   If you want to convince somebody, first join his tribe, then don’t over-argue your point: leave a little room for him to add to it.

‘Nuff ..

Halstead

“Experts and Risk”, Halstead Harrison

Prof emeritus, UW

June 19th, 2010

harrison@atmos.washington.edu


4 Comments

  1. theaveeditor #

    Welcome to The-Ave. Maybe we need to have a store called the coffee-house or pub for this sort of discussion?

    My first reaction to skimming this is that you offer no alternatives. If one does not trust experts, whom does one trust?

    It seems to me that the answer is frightening .. you trust Sarah Palin?

    I would argue that the coffee part of your argument is not such a bad one. In my case I have a lot of folks who know more about some things than I do. I base my evaluation of them on the subjective criteria you described, which I call “coffee.”

  2. Halstead #

    The liberal left, with whom I mostly hang,
    cry indignant that experts are not believed
    about global warming, and much else. I hold
    that this is rational: experts are as fallible as the rest of us.

    How do we actually form opinions?

    We choose experts from among our own tribe,
    then add a little bit to the argument from our own experience.

    Halstead

  3. theaveeditor #

    Ted

    As one expert to another, I suggest that your problem is that you are uncomfortable with the mantle of expert. You want others to accept your truth because it is correct, not because it is your word.

    Saying humans (or liberals) should not trust experts makes no sense. We .. that is, humans .. need teachers; it is part of our nature as a species, just as humans spontaneously develop speech and form themselves into tribes.

    Of course the other tribe in our political dichotomy, the Republicans, does the same thing.

    They too have their authorities. Tea Baggers claim to be following the facts as revealed by Reagan, Palin, and Jesus. Even on global warming, they insist that they have their own experts.

    Your appeal to individualism is very much a tribute to our shared heritage in the Enlightenment. Ayn Rand and Jefferson both believed in the intellectual power of the individual, but in the real world we all turn to experts because, as humans, we have evolved to be students, to learn from others. The French Republic, lacking faith in their Adamses, Washington, Hamilton, and Franklin, failed.

    Why do most societies have a version of blasphemy laws?

    Back at the practical, you and I belong to a priesthood. The community of scientific experts evolved from the traditions of scholars called by different names in Confucian, Jewish and Islamic societies. In Judaism and Islam, the claim was that the Book of Law, the Torah or Koran, was the direct word of God. Teachers (Rabbis and Imams) claimed their authority not from some God-given aura, but from learning from their predecessors. That authority, that mantle of experthood, was the basis for their societies.

    Buddhism took a separate track. Divorced from the unknowable reality of a Deity, the Buddha used science .. pure logic .. to deduce rules of life. His rules became the dharma, the Law, and again were maintained by a tradition of experts.

    Perhaps the most doctrinaire of all expert led societies was the Soviet Union. Like the Buddhists, the heirs of Lenin depended on the authority .. the scientific authority .. derived from the insights of a human, Marx.

    Rather than cavil at the failure of experts, I would suggest you take a course in Talmud. Seriously, Judaism has been led by our rabbinate for 2000 years. In that time, however, the community of experts has evolved its ideas. I recommend four great teachers …

    Hillel: given the reality of Torah, Hillel argued that resistance to Rome must be passive. Much of what is good in Christianity came from Hillel’s amazing ideas, but he would claim that all he was doing was based on Torah.

    Maimonides, living in Andalusia and then in Egypt, said that both Torah and scientific reality reveal God’s truth. Since science and Torah must both be true, and science is a direct revelation, any discrepancy must be because we misread the Torah.

    Spinoza said the same thing as Maimonides, except that Baruch Spinoza went the final step of denying that the Torah or any written word could be accepted on an equal footing with science.

    Soloveitchik, a physicist and rabbi of the last century, saw science and Torah as separate disciplines. His ideas are rather Buddhist in that Rabbi S. believed Torah and Science were different but equally valid disciplines.

    What unites Hillel, Maimonides, Spinoza, and Soloveitchik is the mantle they all claimed, that of expert.

    ********************

  4. Charles Tuna #

    Experts can be sued. And .. if he is an expert from a university, the university may be sued too.

    Erroll Davis, Chancellor of the University of Georgia, was hired by BP to serve on its board and on the four-member safety committee because the guy was an EXPERT. Professor Davis served the University and BP for 12 years, BUT RETIRED from BP just five days before the oil rig exploded.

    Doesn’t the University of Georgia have some responsibility for its Chancellor’s activities? What happens if UG is named in the class action suit against the four-member safety committee?

    NIKE?


