This site may contain copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in an effort to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues, etc. We believe this constitutes a ‘fair use’ of any such copyrighted material as provided for in section 107 of the US Copyright Law.
In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit to those who have expressed a prior interest in receiving the included information for research and educational purposes. For more information go to: http://www.law.cornell.edu/uscode/17/107.shtml
If you wish to use copyrighted material from this site for purposes of your own that go beyond ‘fair use’, you must obtain permission from the copyright owner.
Here we will discuss the problem of denialists, their standard
arguing techniques, how to identify denialists and/or cranks, and
discuss topics of general interest such as skepticism, medicine, law and
science. I’ll be taking on denialists in the sciences, while my
brother, Chris, will focus more on the legal and policy
implications of industry groups using denialist arguments to prevent
sound policies.
First of all, we have to get some basic terms defined for all of our new readers.
Denialism is the employment of rhetorical tactics to give the
appearance of argument or legitimate debate, when in actuality there is
none. These false arguments are used when one has few or no facts to
support one’s viewpoint against a scientific consensus or against
overwhelming evidence to the contrary. They are effective in distracting
from actual useful debate using emotionally appealing, but ultimately
empty and illogical assertions.
Examples of common topics in which denialists employ their tactics
include: Creationism/Intelligent Design, Global Warming denialism,
Holocaust denial, HIV/AIDS denialism, 9/11 conspiracies, Tobacco
Carcinogenicity denialism (the first organized corporate campaign),
anti-vaccination/mercury autism denialism and anti-animal testing/animal
rights extremist denialism. Denialism spans the ideological spectrum,
and is about tactics rather than politics or partisanship. Chris will be
covering denialism of industry groups, such as astroturfing, and the
use of a standard and almost sequential set of denialist arguments that
he discusses in his Denialist Deck of Cards.
Denialists use five general tactics to sow confusion: conspiracy,
selectivity (cherry-picking), fake experts, impossible expectations
(also known as moving goalposts), and general fallacies of logic.
Throughout this first week we’ll be discussing each of these 5
tactics in turn to give examples of how they are used, and how to
recognize their implementation. We’ll also introduce our handy little
icon scheme that we’ll attach to each post discussing denialists. If you
just can’t wait a whole week, well, visit our old blog’s definition to see what we’re talking about.
Finally, some ground rules. We don’t argue with cranks. Part of
understanding denialism is knowing that it’s futile to argue with them,
and giving them yet another forum is unnecessary. They also have the
advantage of just being able to make things up and it takes forever to
knock down each argument as they’re only limited by their imagination
while we’re limited by things like logic and data. Recognizing denialism
also means recognizing that you don’t need to, and probably shouldn’t
argue with it. Denialists are not honest brokers in the debate (you’ll
hear me harp on this a lot). They aren’t interested in truth, data, or
informative discussion; they’re interested in their world view being the
only one, and they’ll say anything to try to bring this about. We feel
that once you’ve shown that what they say is deceptive, or prima facie
absurd, you don’t have to spend a graduate career dissecting it and
taking it apart. It’s more like a “rule-of-thumb” approach to bad
scientific argument. That’s not to say we won’t discuss science or our
posts with people who want to honestly be informed, we just don’t want
to argue with cranks. We have work to do.
Second, denialism isn’t about name-calling or the psychological
coping mechanism of denial. The first reaction of any denialist to being
labeled such is to merely reply, “you’re the denialist” or to redefine
the terms so that it excludes them (usually comparing themselves to
Galileo in the process). However, denialism is about tactics used to frustrate legitimate discussion, not about name-calling. It’s about how you engage in a debate when you have no data (the key difference between denialists and the paradigm-shifters of yesteryear). There are a few more common defenses that we’ll discuss in time.
So while the denialists will inevitably show up and suggest my belief
in the validity of carbon dating shows I’m a Bible denialist, or my
inability to recognize the wisdom of some HIV/AIDS crank shows I don’t
understand biology, we won’t tend to engage them. They’re cranks and we
aim to show how you can instantly recognize and dismiss crank arguments.
Finally, just because some people believe in stupid things doesn’t
make them denialists. A lot of people get suckered in by denialist
arguments and benefit from having the record corrected or being shown
how to recognize good scientific debate versus unsound denialist
debates. We aren’t suggesting everybody who has a few wacky ideas is a
crank; part of the reason denialists abound and are often successful in
bringing the masses over to their side is that their arguments don’t
necessarily sound insane to the uninitiated. Denialist arguments are
emotionally appealing and work on a lot of people. We’re trying to
inform people about denialism and how to recognize denialist arguments
so that ultimately they will be less effective in swaying those that may
not be fully informed about science. Hopefully, by creating awareness
of the ground rules of legitimate scientific debate, citizens, policy
makers, and the media may better distinguish between sound and unsound
scientific debate.
What are denialist conspiracy theories and why should people be
instantly distrustful of them? And what do they have to do with
denialism?
Almost every denialist argument will eventually devolve into a
conspiracy. This is because denialist theories that oppose
well-established science eventually need to assert deception on the part
of their opponents to explain things like why every reputable
scientist, journal, and opponent seems to be able to operate from the
same page. In the crank mind, it isn’t because their opponents are
operating from the same set of facts, it’s that all their opponents are liars (or fools) who are using the same false set of information.
But how could it be possible, for instance, for nearly every
scientist in a field to be working together to promote a falsehood? People
who believe this is possible simply have no practical understanding of
how science works as a discipline. For one, scientists don’t just
publish articles that reaffirm a consensus opinion. Articles that just
rehash what is already known or say “everything is the same” aren’t
interesting and don’t get into good journals. Scientific journals are
only interested in articles that extend knowledge, or challenge
consensus (using data of course). Articles getting published in the big
journals like Science or Nature are often revolutionary (and not
infrequently wrong), challenge the expectations of scientists or
represent some phenomenal experiment or hard work (like the human genome
project). The idea that scientists would keep some kind of exceptional
secret is absurd, and the notion, in the instance of evolution deniers, that we only
believe in evolution because we’ve been infiltrated by a cabal of
“materialists” is even more so. This is not to say that real
conspiracies never occur, but the assertion of a conspiracy in the
absence of evidence (or by tying together weakly correlated and
nonsensical data) is usually the sign of a crackpot. Belief in the
Illuminati, Zionist conspiracies, 9/11 conspiracies, Holocaust denial
conspiracies, materialist atheist evolution conspiracies, global warming
science conspiracies, UFO government conspiracies, pharmaceutical
companies suppressing altie-med conspiracies, or what have you,
almost always rests upon some unnatural suspension of disbelief in the
conspiracy theorist that is the sign of a truly weak mind. Hence, our
graphic to denote the presence of these arguments – the tinfoil hat.
Another common conspiratorial attack on consensus science (without
data) is that science is just some old-boys club (not saying it’s
entirely free of it but…) and we use peer-review to silence dissent.
This is a frequent refrain of HIV/AIDS denialists like Dean Esmay or Global Warming denialists like Richard Lindzen
trying to explain why mainstream scientists won’t publish their BS. The
fact is that good science speaks for itself, and peer-reviewers are
willing to publish things that challenge accepted facts if the data are
good. If you’re just a denialist cherry-picking data and nitpicking the
work of others, you’re out of luck. Distribution of scientific funding
(another source of conspiracy from denialists) is similarly based on
novelty and is not about repeating some kind of party line. Yes, it’s
based on study-sections and peer-review of grants, but the idea that the
only studies that get funded are ones that affirm existing science is
nuts; if anything, it’s the opposite.
Lately, there’s been a lot of criticism of the excess focus on
novelty in distribution of funding and in what gets accepted into
journals. I encourage all scientists and those interested in science to
watch this video of John Ioannidis giving grand rounds at NIH on how
science gets funded, published, and sadly, often proven wrong. I put it
up at google video. He is the author of “Why most published research findings are false”
published in PLoS last year. It’s proof that science is perfectly
willing to be critical of itself, more than happy to publish exceptional
things that often turn out wrong, but ultimately, highly
self-correcting.
I realize it’s an hour long, but it’s really a great talk.
For our next installment of the big five tactics in denialism we’ll
discuss the tactic of selectivity, or cherry-picking of data.
Denialists tend to cite single papers supporting their idea (often you
have to squint to see how it supports their argument). Similarly they
dig up discredited or flawed papers either to suggest they are supported
by the scientific literature, or to disparage a field making it appear
the science is based on weak research. Quote mining is also an example
of “selective” argument, by using a statement out of context, just like
using papers or data out of context, they are able to sow confusion.
Here at denialism blog we’ll use the cherries to denote the presence of
selectivity in a denialist screed.
Examples abound. Such as when HIV/AIDS denialists harp about Gallo
fudging the initial identification of HIV (a famous dispute about
whether or not he stole Montagnier’s virus) to suggest the virus was
never actually identified or that the field rests on a weak foundation.
Jonathan Wells likes to harp endlessly about Haeckel’s embryos to
suggest that the tens of thousands of other papers on the subject of
evolution, and the entire basis of genetics, biology and biochemistry
are wrong.
One of the main reasons this is such an effective tactic to use on
science is that when something is shown to be incorrect, we can’t
“purge” the literature, so the bad papers stay there forever. Only when a
paper is retracted is the literature actually restored, and there’s a
lot of research and researchers that got things wrong on the way to
figuring out a problem. It’s really just the nature of research: we make
mistakes, but the self-correcting nature of science helps get us
incrementally closer to some form of scientific truth. It is up to the
individual researcher to read and quote more than the papers that
support their foregone conclusion, as one has to develop theories that
effectively synthesize all the data and represent an understanding of an entire field, not just quote the data one likes.
Then there is the issue of selective quotation of perfectly good
science or scientists. For example, see our post on how the Family
Research Council misrepresents data on contraception to promote their political agenda. Talk Origins has an entire quote-mine project devoted to documenting how creationists misrepresent scientists to advance their agenda.
This tendency towards quote-mining and misrepresentation of science
is really the clearest proof of the dishonesty inherent in denialist
tactics (with the possible exception in the case of Intelligent Design
Creationism of the wedge document
– but an internal statement of denialists’ goals is usually hard to
come by). Selectivity is exceedingly common, and proof that many
denialists aren’t just intellectually, but morally bankrupt.
You know who they are – those organizations that have words like
“freedom” and “rights” “choice” and “consumer” in their names but always
shill for corporate interests…those occasional MDs or engineers
creationists find that will say evolution has nothing to do with
science. They are the fake experts.
But how do we tell which experts are fake and which are real?
To figure out who is a fake expert you have to figure out what a real
expert is. My definition would be a real expert is someone with a
thorough understanding of the field they are discussing, who accurately
represents the scientific literature and the state of understanding of
the scientific enterprise. There has been some other discussion on
scienceblogs from Janet at Adventures in Ethics and Science; she reiterates some of the same points in relation to what she
feels comfortable discussing as an expert, and stresses the
importance of context in evaluating the validity of expert opinion. But
I’m not the god of the dictionary so let’s consider some other
definitions.
The OED gives the definition
simply as “One whose special knowledge or skill causes him to be
regarded as an authority; a specialist. Also attrib., as in expert
evidence, witness, etc.”
I don’t think this is adequate to describe what we really mean
though, that is, how do you identify a trusted source of scientific
information?
Legally (in the US), from 1923 until 1993 scientific expertise was defined by whether
the testimony the expert provided conformed to the so-called Frye rule;
then the Daubert v. Merrell Dow Pharmaceuticals
case changed the definition to be consistent with the federal rules of
evidence. The Frye rule was that scientific testimony was valid if the
theory it was based on was “generally accepted” – that is, it was
admissible if the theory on which the evidence was based had a somewhat
arbitrary critical mass of followers in the scientific field.
In many ways Daubert was a big improvement, although it puts more
onus on the judge to determine if the science presented should be
considered valid as it merely stated that experts were defined by the
federal rules of evidence, which allow the judge to determine:
If scientific, technical, or other specialized knowledge
will assist the trier of fact to understand the evidence or to determine
a fact in issue, a witness qualified as an expert by knowledge, skill,
experience, training, or education, may testify thereto in the form of
an opinion or otherwise.
(A good article on this issue here from the NEJM and a more updated article.)
Luckily the justices didn’t just leave it at the federal rules of
evidence and Blackmun created a set of guidelines for judges to
determine if the expert was “reliable”. For a scientific theory to have
some validity, these guidelines require the theory presented by the
witness to have undergone peer review and to show falsifiability,
empirical testing, reproducibility, and a known error rate, in addition
to the general acceptance rule of Frye. While the individual states
remain a patchwork of Frye, Daubert, and Frye-plus rules for
admissibility of evidence, at least federally this is the new
requirement (although it still does suffer from being a bit vague).
The experts that present such evidence must have some credentials
and/or experience with the discipline, and the evidence they present
must pass these tests. It’s actually not a half-bad way to identify a
trusted source, in particular if the judge is intellectually honest
about the witness meeting these requirements. Although the law currently
allows a lot of latitude on this as it’s really up to the judge to
determine if the expert testimony satisfies the Daubert requirements.
The commonalities between the different accepted definitions are that experts have experience in their field,
and that they can provide useful answers consistent with the state of
knowledge in that field. The legal definition appears
more stringent, in that it requires the expert to speak in a clear
fashion and discuss science that actually meets Popperian requirements
of epistemology (falsifiability, testing, etc.) – but I’m not about to
jump into that quagmire today.
Clearly, the exact definition of an “expert” still eludes us,
but it becomes readily apparent from the legal, dictionary and common
practice definitions employed by scientists what experts are not.
They aren’t merely an empty set of credentials and they aren’t merely
people who have at some point published in some random field. Even the
rather silly expert wiki would seem to agree on this.
Therefore I would say a fake expert is usually somebody who
is relied upon for their credentials rather than any real experience in
the field at issue, who will promote arguments that are inconsistent
with the literature, aren’t generally accepted by those who study the
field in question, and/or whose theories aren’t consistent with
established epistemological requirements for scientific inquiry. Sheesh.
I just described Michael Egnor, Bill Dembski, Michael Fumento, Patrick
Michaels, Steven Milloy, Richard Lindzen…
So, in honor of the false experts hired by everyone from creationists
to global warming deniers, I present to you, the thinking chimp. Our
mascot of the false expert, who isn’t as good at telling you accurate
information about science as he is at flinging poo.
**Janet points us to another post of hers discussing how to identify a trusted source.
I’m sorry for mixing terminologies. But moving goalposts isn’t
adequate to describe the full hilarity of the kinds of arguments
denialists make. For instance, the goalposts never have to be moved when
they require evidence that places them somewhere in the land before
time. What I mean is the use, by denialists, of the absence of complete
and absolute knowledge of a subject to prevent implementation of sound
policies, or acceptance of an idea or a theory.
So while moving goalposts describes a way of continuing to avoid
acceptance of a theory after scientists have obligingly provided
additional evidence that was a stated requirement for belief, impossible
expectations describes a way to make it impossible for scientists to ever prove anything to the satisfaction of the denialist. They’re related though so we’ll group both together.
Let’s take the example of the global warming deniers. One finds that
they harp endlessly about models, how much models suck, how you can’t
model anything, on and on and on. True, models are hard, anything
designed to prognosticate such a large set of variables as those
involved in climate is going to be highly complex, and I’ll admit, I
don’t understand them worth a damn. Climate science in general is beyond
me, and I read the papers in Science and Nature that come out, blink a
few times, and then read the editor’s description to see why I should
care. But with or without models, which I do trust the scientists and
peer-reviewers involved to test adequately, that doesn’t change the fact
that actual measurement of global mean temperature is possible, and is
showing an alarmingly steep increase post-industrialization.
The next thing the global warming deniers harp on is about how we
don’t have enough records of temperature to make an educated statement
about whether our climate is really heating up that much as the
instrumental record only goes about 150 years back. Then you show them
proxy records that go back a thousand years, and after they’re done
accusing people of falsification, they say it’s still not enough, then you
go back a few tens of thousands of years, and it’s still not enough,
then finally you go back about 750 thousand years and they say, that’s
just 0.0001% of the earth’s history! That’s like a blink of the eye in
terms of earth’s climate. Then you sigh and wish for painless death.
I’ll let real-climate
fight the fights over proxy records and CO2 lag, because, simply, they
know a lot more than me, and if you really want to argue with global
warming denialists I recommend reading A Few Things Ill Considered’s Faq
first. But what I can recognize is the tendency of the global warming
deniers to constantly move the goalposts back and back, and once they
whip out the argument we’ve only got proxy measurements for a fraction
of earth’s life (a mere few hundred thousand years), you know they’ve
graduated to impossible expectations.
A person who wasn’t just obviously stonewalling would say after
you’ve shown them this much data that maybe we should take the data as
is before we’re all under water. You don’t need to know the position of
every molecule of air on the planet, throughout the entire history of
earth to make a prudent judgement about avoiding dramatic climate
change. (If they say that we don’t know what the ideal is, say, “yeah, but
Florida will still be under water.”) You don’t need to know the
position of every molecule in the galaxy before deciding you need to
jump out of the way of a speeding train. Similarly, we don’t need to
have a perfect model of the earth’s climate to understand that all the
current data and simulations suggest decreasing carbon output is of
critical importance right now, and not when humans have obtained some
impossible level of scientific knowledge.
The honorary gif for making these tiresome arguments is – the goalpost (and no Chris you may not animate it).
P.S. This does not mean that I endorse all efforts to model complex
systems. In the future I’ll probably complain about some modeling
implementation of systems biology which I tend to think is total BS.
I’ll explain the difference then.
P.P.S To see an example of some really hilarious creationist goalpost moving see our post on Michael Egnor demanding biologists provide an answer for something he can’t even define.
Almost everybody knows about the fallacies of logic,
formal and informal, that are routinely used in arguments with
denialists. While spotting these fallacies doesn’t prove an argument is
always wrong, they are good rules of
thumb to tell when you’re listening to bunk, and if you listen to
denialists you’ll hear plenty. I wish they’d teach these to high school
students as a required part of their curriculum, but it probably would
decrease the efficacy of advertisement on future consumers.
The problem comes when the denialists get hold of the fallacies and then accuse you, usually of ad hominem! It goes like this.
Denialist says something wacky…
Commenter or blogger corrects their mistake…
Denialist says same thing, changes argument slightly…
Commenter or blogger again corrects their mistake…
Denialist says something even wackier, says it disproves all of a field of science…
Commenter or blogger, exasperated, corrects it and threatens disemvowelment…
Denialist restates original wacky argument…
Commenter or blogger’s head explodes, calls denialist an idiot.
Denialist says he won because commenter or blogger resorted to ad hominem.
The thing to remember about logical fallacies is that their violation
isn’t proof or disproof of the validity of the opponent’s argument.
Your opponent might just be an idiot, but ultimately right. Some people
just don’t know how to argue or keep their temper. Logical fallacies are
rules of thumb to identify when portions of arguments are poorly
constructed or likely irrational. They are dependent on context, and
aren’t really rigorous proofs of the validity or invalidity of any
argument.
Further, some fallacies, like ad hominem, are poorly understood, so
when an opponent says you’re wrong because of this, this, and this,
therefore you’re an idiot, the poor victim of the ad hominem feels like
they can claim victory over the argument, when in reality ad hominem
refers to the dismissal of an argument by just insulting the person.
Time and time again you see someone exasperated by the crank who won’t
budge despite being shown again and again where their error is, who
finally just calls the guy an idiot. That’s actually not an ad hominem.
That might be totally true and highly relevant to the argument at hand.
Sometimes people are just too stupid or too ignorant to realize when
they’ve been soundly thrashed, and true cranks will stubbornly go on,
and on and on…
But that doesn’t mean the fallacies of logic aren’t useful
as rules of thumb for detecting the BS. The ones you hear most are
arguments from metaphor or analogy (prime creationist tactic), appeals
to consequence (creationist and global warming denier), appeals to
ignorance (all – see moving goalposts), appeals to authority (all),
straw men and red herrings.
For instance, the classic creationist example of using the analogy of
the mouse-trap to suggest “irreducible complexity” as a problem for
biology. Fallacies let you dismiss this instantly by saying, analogies
aren’t science pal, how about some data. Analogies are often helpful for
getting concepts across, but you routinely see them used by denialists
as evidence. And more frequently you see that their analogies aren’t
even apt. For instance, the mouse-trap’s constituent parts are perfectly
functional on their own. It’s a platform, a spring and a hook; just because
they’re not assembled doesn’t mean they’ve lost their function. They
just can’t kill mice anymore unless you throw them with sufficient
velocity at rodents. Similarly the watchmaker analogy, the jet airplane
analogy, or when a few months ago I saw this endless silly analogy about
arsonists and design. Uggh. Pointless. Don’t even bother: if you see
things like this being used to challenge actual honest-to-goodness data,
you’re done. If you spend too much time looking for a
method to the madness you’ll end up like our poor robot. He’s the mascot
for logical fallacies.
Poor guy. One too many fallacies, now he’s broken.
Well, I’ve outlined what I think are the critical components of
successful crankiness. Ideally, this will serve as a guide to those of
you who want to come up with a stupid idea, and then defend it against
all evidence to the contrary.
Here’s how you do it:
Step one: Develop a wacky idea.
It is critical that your wacky idea must be something pretty
extraordinary. A good crank shoots for the stars. You don’t defend to
the death some simple opinion, like Coke is better than Pepsi. You’ve
got to think big! You’ve got to do something like deny HIV causes AIDS,
or relativity, or reject an entire field of biology, or deny the earth
is older than 6000 years. If you can’t think of anything, try reading
the Bible for claims that are now obviously ludicrous – like the
possibility of climbing into heaven using a ladder. Insist on its
literal truth.
The thing you deny has to be something that’s so obvious to the
majority of people that when they hear it, they want to hear an
explanation, if only because it’s clearly going to be nuts.
This is critical to all successive steps. If you don’t say something
outrageous and contrarian, no one will ever see you as the iconoclastic
genius that you are.
The presentation of this idea is also important. Remember that really
important people with really important ideas don’t have time for
grammar or spelling. Also try interesting use of punctuation!!!!,
CAPITALization and textcolor. When you EMPHASIZE things people will inevitably take you more seriously.
Make sure that you develop new physical laws, name them after
yourself, and if you must cite anything, either cite your own name or
work, or that of another crank. If you’re feeling bold cite some famous
scientist, like Einstein, but don’t list a specific passage, just assume
that they said or did something that supports your idea. After all
you’re both geniuses, you must think alike!
It’s also important during your research of this new idea, never to
be worried about preserving the original intent of other authors you
quote or cite. If any words they say can be construed to mean something
else, that’s ok too. Academic license is part of academic freedom.
Whenever possible try to include figures. Line drawings and diagrams
with complicated mathematical symbols are ideal. Remember, most people
don’t know calculus; include equations you find in other books to prove
the mathematical or physical relationship you have discovered. The type
of people who will believe your idea aren’t big into checking others’
work for consistency, so it will be OK. Those that do would never
believe you anyway, but by the time they get around to that, you’ll have
a cult following.
Step two: Disseminate your idea
This can be done many ways.
The old-school method is to spend your day job writing angry letters
to politicians, newspaper editors, and anyone else that you thought
might listen to you.
Cranks with independent wealth can self-publish their own book (I
have many of these provided courtesy of an astronomer friend whose
institute regularly receives such works and places them in their “crank
file”). A book lends credibility, especially to other cranks who think
that anyone who could actually focus their intellects for long enough to
write a book, must be onto something. Ideally, send your book
to scientists in the field you are trying to undermine; they’ll know
just where to put it. If your idea has a more mainstream appeal, send
it to church leaders and various pundits who might give it some play in
their pulpits.
These days, technology has provided us with what is known as a blog. Your
target audience, despite the improvements in technology, is just as
likely not to care as before – less so, in fact, because now they don’t even have
to experience the inconvenience of opening your crank letter or having
to file your crank book. The secret to generating traffic, then, is
exploiting the fact that the internet gives access to all sorts of
people who will be irritated by your mere presence. Leave comments in
others’ blogs that describe how you have solved this big problem where
everyone else has failed. Ideally, get a minion to constantly extol your
virtues and genius. If one is lacking just sockpuppet yourself from
another computer. It’s not even necessary to leave comments at science
blogs or (real) skeptic sites. Any site will do, bother cat fanciers,
tech geeks, whoever. Traffic will inevitably follow.
Technology has also made it easy to make videos and DVDs, and
provided internet radio outlets for crankery. Do you have a new idea for
how the twin towers fell? Well, put it up on YouTube and embed it in
your blog.
Podcasts also serve this function nicely – and since none of your
critics will waste their time transcribing the nonsense you say in order
to debunk it, videos and podcasts tend to be a good way to avoid excess
criticism.
Do you have access to a religious mailing list? Send out your
informational DVD on your new proof that all science is a lie to those
that might receive it as gospel.
If you’re very adventurous, try submitting a paper to a scientific
journal. First try big: Science and Nature are ideal. If it’s medicine
try the New England Journal or JAMA – they are pretty good examples of
the stodgy orthodoxy who will no doubt persecute you. When they reject
your paper, remember, you’re just like Galileo, or Einstein. They
rejected your ideas because they’re just not ready to accept them. Remember, you’re a skeptic! You’re one of those people keeping science honest by making them consider new ideas
(except when they’re very old ideas recycled). Don’t let them brush you
off easily, resend your manuscript multiple times. If they reject it
claim victory! It means you’re a true original. You’ve come up
with something the scientific establishment just can’t deal with because
of their small-mindedness and bigotry. Ideally keep sending it to
publications, to editors at their home addresses, to their children’s
schools, etc. If they get a restraining order claim victory! You’ve been
persecuted! You now are a true heir to Galileo.
If you want your manuscript (it may make you sound smarter to call it
your “treatise” or “monograph”) to actually get published, try
something like Medical Hypotheses.
Journals with an impact factor of less than 1 might actually be
desperate enough to publish something cranky, especially if you can
jargonize it enough to make yourself sound smart, or create enough fake
data to trick the editors. If it has to do with global warming consider a Wall Street Journal Op-Ed. The Creation Research Quarterly
is perfect for anything disproving some facet of evolution, geology,
astronomy, or physics. You don’t have to be a creationist for them to
like your crank theory, anything that pokes holes in dastardly consensus
science is a victory.
Then try journals that don’t require real experiments, rigorous trial
design, peer review or anything that actually indicates actual science
has been done. Other cranks in your “field” may have started just such a
journal – like the Journal of 9/11 studies.
There are about as many places that will publish crank work as there
are crank ideas; don’t stop trying! If you get your ideas published in
such a journal claim victory! You have mainstream acceptance and a
publication record now.
There are also many message boards that might like your idea. If you have a crazy new ideology about evolution try the International Society for Complexity Information and Design. If you have a new idea for what causes AIDS, a great starting point is the Dissident Action Group.
Search for forums that might be amenable to your idea and post it
there. Make sure to re-post it after every ten replies or so, so people
can read it again. Another good starting place is Newiki
which has the stunning tagline “If Copernicus or Galileo were alive
today, this is where you would find their work.” They clearly love the
crankery.
Finally, don’t forget other cranks are an excellent resource! Cranks
usually like to hear about other cranks’ ideas, even if they conflict
with their own crank ideas (9/11 conspiracy cranks might be an
exception). Remember, intellectual consistency doesn’t matter
as long as you are both criticizing the orthodoxy. These other cranks
can mention your idea. They will undoubtedly find it “interesting” if
they mention it, even if they don’t agree with all aspects of it. See our recent post on Denyse O’Leary and the Creation Museum,
a perfect example. Ideally they will link your site, join your webring,
mention your ideas, and many other cranks will promptly arrive to
acknowledge your genius (sorry, only other cranks will ever do this –
ever). Don’t forget this means you will have to help them promote their
crank ideas.
Cranks also have a major presence on radio – both internet and terrestrial. Are you anything like this crank? Or this one? Maybe they’ll have you on their radio show to discuss your new crackpot theory.
Follow these steps and soon your idea will be a topic of discussion
everywhere. Don’t forget to routinely make claims that the views of
orthodox science are imperiled by the threat of acceptance of your
ideas; it will make people more likely to believe your later claims of
persecution and to visit your site to see whether anything has changed
(it hasn’t). Frequently suggest that the valid scientific theory is debunked,
or will be within a decade, and routinely declare victory over
the mainstream theory.
Step three: (Not) Responding to Criticism
All great minds will be criticized by peon scientists who have grown fat
and bloated with public grant funds. They’ve been feeding at the public
coffers for so long, they wouldn’t know an original idea if it fell out
of the ether and struck them on their thick skulls. Here are some
simple responses to common criticisms:
Accusation: “You haven’t published in a real peer-reviewed journal”
Response: Either say “Peer review is just an old-boys network for peon
scientists to pat each other on the back”, or accuse journal editors of
persecuting you. Compare yourself to Galileo.
Accusation: “You don’t have solid proof”
Response: Either restate what you said already, restate it slightly
differently, call your accuser a name, or suggest they are part of the
conspiracy to hide the truth. Compare yourself to Galileo.
Accusation: “Because of X, Y, and Z, your theory is false and you’re an idiot”
Response: Yell “That’s Ad Hominem – I win the argument” (and that they’ve persecuted you).
Accusation: “Because of X, Y and Z, you are wrong”
Response: If they fail to call you an idiot, there are a few ways to
respond. You can nitpick an aspect of their argument so that you
can ignore the rest while diverting the discussion into a meaningless
tangent. Or you can cut and paste large sections of print or references to
papers that may or may not agree with you (the exhaustion strategy).
Finally, it’s always a good idea to just ignore them and restate your
original argument. Alternatively demand they provide you with *scientific* evidence that their theory is the correct one. If they do, ignore it and restate your original argument.
Accusation: “No credible scientists or scientific agencies believe this theory”
Response: “That’s because they’re part of a conspiracy to hide the
truth!” In addition, assert motives for the conspiracy, like maintaining
control over the populace, spreading materialistic atheist dogma,
acquiring grant money, etc. Don’t forget to challenge orthodoxy and
compare yourself to Galileo! He was persecuted by the orthodoxy too!
Remember, whenever a majority of scientists believe anything, that means
it’s wrong. Cite Kuhn, compare yourself to Galileo again.
If they show up at your blog and leave comments, remember to delete
anything critical at all, dissent must not be tolerated on your home
turf. Anything critical might damage the proof of your unassailable
intellect, and the absence of critique will make it appear as if your
critics are afraid to engage you on your own turf.
You see? It’s easy! All you have to do is ignore anything that contradicts your theory, nitpick others’ arguments, force them
to explain themselves, accuse them of lying, accuse them of conspiring
against the truth, exhaust them with dumps of links or citations, repeat
yourself, and compare yourself to Galileo, because he had problems
convincing the orthodoxy too. Also, don’t forget to call yourself a
skeptic, or dissident, or iconoclast.
Step four: Get Persecuted!
You haven’t graduated to being a full crank until you’ve been persecuted. Here are some suggestions:
If you are faculty at a university, make sure to write a book about
your crank idea. When the other members of the department decide to deny
you tenure because of your moronic ideas or call you an idiot claim
persecution!
If you work at an office, make sure you spend your time promoting
your crank idea. Tell everybody about it. Send mass emails about it.
Leave copies of your “monograph” where your boss and others can find it –
like the breakroom. If you’re fired for pursuing your crankery on the
job claim persecution!
If someone shows up at your website or forum and points out the flaws in your argument claim persecution!
If anyone calls you an idiot, a moron, a pseudoscientist, a crank, or denialist claim persecution!
If people don’t immediately accept your idea upon hearing it claim persecution!
If they won’t teach your idea in public schools as fact claim persecution!
If they won’t teach the controversy over your ideas in public schools claim persecution!
If people criticize journals for publishing your papers claim persecution!
If people circulate petitions against teaching your ideas claim persecution!
If a journalist covers only the scientific side and doesn’t cover yours claim persecution!
If no one visits your site or listens to you claim persecution!
If no one persecutes you claim persecution!
In this modern world there is such a thing as “parity of ideas”.
Everything must be balanced against its opposite. If anyone says
anything that contradicts you, it is your right to be able to counter
what they say for “balance”, even if you don’t have proof or
credibility. If they don’t do this you are being persecuted.
You see? It’s easy to be a crank. Just follow these simple guidelines and remember, you’re never wrong. No matter what.
I’d like to thank Chris Noble (not for being a crank or anything but for this idea) and lab lemming’s pseudoscientific method for inspiration for this post.
*Update* I’ve added some additional material based on comments (Thanks Pat, Marc and Mongrel)
Here at denialism blog, we’re very interested in what makes people
cranks. Not only how one defines crankish behavior, but literally how
people develop unreasonable attitudes about the world in the face of
evidence to the contrary. Our definition of a crank, loosely, is a
person who has unreasonable ideas about established science or facts
that will not relent in defending their own, often laughable, version of
the truth. Central to the crank is the “overvalued idea”. That is some
idea they’ve incorporated into their world view that they will not
relinquish for any reason. Common overvalued ideas that are a source of
crankery range from bigotry and antisemitism (Holocaust deniers) to biblical
literalism (creationists – especially YECs) to egotism (a complete
unwillingness to ever be proven wrong) to an indiscriminate
obsession with possessing “controversial” or iconoclastic ideas. Some
people just love believing in things that no one in their right mind
does, out of some obscure idea that it makes them seem smart or
different.
The OED definition of a crank seems to be a little old-fashioned:
5. colloq. (orig. U.S.). A person with a mental twist; one who is apt
to take up eccentric notions or impracticable projects; esp. one who is
enthusiastically possessed by a particular crotchet or hobby; an
eccentric, a monomaniac. [This is prob. a back-formation from CRANKY,
sense 4.] Also attrib. and Comb.
The OED etymology suggests the word has been in use for about 180 years, but I
don’t think it was until that Nature quote in 1906 (which
very poetically describes the problem) that the definition took its
modern shape. Cranks aren’t interested in debate, nor do they respond to
reason, they’ll just blather on about their idiotic pet theory until
everyone in the room has fled or opened a vein. Another take on that
quote might be that a crank can only be turned one way, which would fit with the mechanical metaphor and suggest they’re only ever interested in spouting one line of reasoning.
Cranks overestimate their own knowledge and ability, and underestimate that of acknowledged experts.
Cranks insist that their alleged discoveries are urgently important.
Cranks rarely if ever acknowledge any error, no matter how trivial.
Cranks love to talk about their own beliefs, often in inappropriate
social situations, but they tend to be bad listeners, and often appear
to be uninterested in anyone else’s experience or opinions.
Now, in our terminology not every denialist is a crank, but cranks
use pretty much exclusively denialist arguments to make their point.
Cranks are a bit more deserving of pity, a bit closer to delusion and
mental illness than the pure denialist, who knows that they are spouting
BS to sow confusion.
Most people have a pretty good gestalt for what one is, and the
standard definitions are pretty accurate. But we’re more interested in
how people, sometimes perfectly reasonable people, turn into cranks. An
interesting resource to understand the phenomenon is this article in the Journal of Personality and Social Psychology
by Justin Kruger and David Dunning about how people who are incompetent
not only have an inflated sense of their own competence, but are also
incapable of even recognizing competence. Take for example this figure
from the paper (it’s not Wiley so hopefully I won’t be sued). It’s
pretty self-explanatory.
What’s even more amazing is that when they then shared the
performance of other participants with the people who performed poorly
(hoping that they would then adjust their self-perception downward)
people who scored poorly failed to adjust their self-perception of their
performance. In other words, they are completely unaware of their own
incompetence, and can’t detect competence in others.
Now, doesn’t this explain a lot? It explains the tendency of cranks
not to care if other cranks (and denialists in general for that matter)
have variations on their own crazy ideas, just as long as the other
cranks are opposing the same perceived incorrect truth. Cranks and
denialists aren’t honest brokers in a debate, they stand outside of it
and just shovel horse manure into it to try to sow confusion and doubt
about real science. They don’t care if some other crank or denialist
comes along and challenges the prevailing theory by tossing cow manure,
as long as what they’re shoveling stinks.
For instance, you notice that Dembski doesn’t spend a whole lot of
time attacking Ken Ham, nor does the DI seem to care a great deal about
any kind of internal consistency of ideas. Michael Behe, for example, is
a raging “Darwinist” compared to Michael Egnor, whom any scientifically competent person would recognize as, well, a crank.
So what we have from the DI, the other denialists and their
organizations, is evidence of people with no competence in understanding
science, who overestimate their own abilities, and are incapable of
recognizing competence in others.
Next time I think we’ll discuss how people might start out as
reasonable people and then become cranks because they’re more interested
in being “right” than actually pursuing any kind of scientific truth.
Some cranks seem to be defined not so much by incompetence, but by their
obsession with their overvalued idea that ruins their ability to think
rationally.
P.S. I wrote this piece over the weekend and then PZ published this piece on a crank named Gilder
on Sunday. He’s a real textbook case. Crazy, gibbering, throwing out
lingo and jargon, and clearly not competent to recognize that he’s
completely wrong about information theory.
**Unskilled and Unaware of It: How Difficulties in Recognizing One’s
Own Incompetence Lead to Inflated Self-Assessments, Justin Kruger and
David Dunning, Cornell University, Journal of Personality and Social
Psychology, vol. 77, no. 6, pp. 1121–1134 (1999)
It’s good news though! A description of the tactics and appropriate response to denialism was published in the European Journal of Public Health
by authors Pascal Diethelm and Martin McKee. It’s entitled “Denialism:
what is it and how should scientists respond?” and I think it does an
excellent job explaining the harms of denialism, critical elements of
denialism, as well as providing interesting historical examples of
corporate denialism on the part of tobacco companies.
HIV does not cause AIDS. The world was created in 4004
BCE. Smoking does not cause cancer. And if climate change is happening,
it is nothing to do with man-made CO2 emissions. Few, if any, of the
readers of this journal will believe any of these statements. Yet each
can be found easily in the mass media.
The consequences of policies based on views such as these can be
fatal. Thabo Mbeki’s denial that HIV caused AIDS prevented
thousands of HIV positive mothers in South Africa receiving
anti-retrovirals so that they, unnecessarily, transmitted the disease to
their children.1 His health minister, Manto Tshabalala-Msimang,
famously rejected evidence of the efficacy of these drugs, instead
advocating treatment with garlic, beetroot and African potato. It was
ironic that their departure from office coincided with the award of the
Nobel Prize to Luc Montagnier and Françoise Barré-Sinoussi for their
discovery that HIV is indeed the cause of AIDS. The rejection of
scientific evidence is also apparent in the popularity of creationism,
with an estimated 45% of Americans in 2004 believing that God created
man in his present form within the past 10 000 years.2 While successive
judgements of the US Supreme Court have rejected the teaching of
creationism as science, many American schools are cautious about
discussing evolution. In the United Kingdom, some faith-based schools
teach evolution and creationism as equally valid ‘faith positions’. It
remains unclear how they explain the emergence of antibiotic resistance.
In particular I found their inclusion of a tactic of inversionism interesting:
There is also a variant of conspiracy theory,
inversionism, in which some of one’s own characteristics and motivations
are attributed to others. For example, tobacco companies describe
academic research into the health effects of smoking as the product of
an ‘anti-smoking industry’, described as ‘a vertically integrated,
highly concentrated, oligopolistic cartel, combined with some public
monopolies’ whose aim is to ‘manufacture alleged evidence, suggestive
inferences linking smoking to various diseases and publicity and
dissemination and advertising of these so-called findings to the widest
possible public’.9
This is in a subsection on their coverage of conspiracy in denialism
and it rings very true. Often those who function with a conspiratorial
mindset project their motives, tactics, and style of thinking on their
opponents. It’s nice to have a word for it, but I usually think of
inversionism as the tendency of some to readily believe anything that
inverts a commonly held belief. A tendency which many scientists
manifest, probably because the science and facts often contradict
intuition and “common-sense” beliefs. Maybe we can think of a better
word for this, or maybe simply refer to it as projection.
I also enjoy their conclusion:
Whatever the motivation, it is important to recognize
denialism when confronted with it. The normal academic response to an
opposing argument is to engage with it, testing the strengths and
weaknesses of the differing views, in the expectations that the truth
will emerge through a process of debate. However, this requires that
both parties obey certain ground rules, such as a willingness to look at
the evidence as a whole, to reject deliberate distortions and to accept
principles of logic. A meaningful discourse is impossible when one
party rejects these rules. Yet it would be wrong to prevent the
denialists having a voice. Instead, we argue, it is necessary to shift
the debate from the subject under consideration, instead exposing to
public scrutiny the tactics they employ and identifying them publicly
for what they are. An understanding of the five tactics listed above
provides a useful framework for doing so.
Excellent! I couldn’t have said it better myself.
Many of the letters in reply are also pretty fascinating. You see a lot of feelings of persecution:
Clearly, no dissent is allowable from the doctrines of
tobacco control in Diethelm’s and McKee’s perspective. This perspective
brands hundreds of reputable scientists throughout the world as
denialists, no different from Holocaust deniers. While I disagree
wholeheartedly with these scientists, I will stand up for their right to
express their dissenting opinions without having their characters
assassinated because of the direction, rather than the scientific
reasonableness, of their positions.
Criticism of scientifically untenable positions is suppression of
dissent! It’s punishing heresy! Pointing out that denialists use
dishonest methods is like Hitler! This is the classic example of
self-persecution you always see when it’s made clear that the methods of
denialists are the same, from holocaust denial to evolution denial.
This, whether they like it or not, is a factual statement.
Denialism is pretty predictable and consistent in form no matter what
the topic. The response is inevitably “You’re comparing me to a
holocaust denier!”, when in reality all we’re doing is comparing the
tactics. Holocaust deniers and tobacco/cancer denialists are both human
beings, is that an unfair comparison? Or, one could argue that those who
deny tobacco smoke causes cancer might actually be worse than holocaust deniers, as holocaust deniers, while they are despicable bigots, are not defending an ongoing campaign of death.
The authors’ reply is perfect.
Finally, D&M consider Galileo as a reference that
tobacco “denialists” should refrain from citing in support of their
unacceptable views. What is not understood here is that the problems
related to tobacco and drug research and policy are very similar. From
there, Galileo is and will remain a universal reference [13].
The ego on these people is astounding. There might be one or two
people who are as paradigm-shifting as Galileo in a generation, or even a
century, but all these cranks seem very comfortable in assuming his
mantle. Some humility please. Their reply again
is perfect. But I worry. At some point this will just devolve into
arguing with cranks; something to be avoided at all costs. And when you
consider one of the complaining letters is from a guy who doesn’t even
think nicotine is addictive, well, what’s the point of arguing?
Diethelm, P., & McKee, M. (2008). Denialism: what is it and how
should scientists respond? The European Journal of Public Health, 19
(1), 2-4 DOI: 10.1093/eurpub/ckn139