10 September 2020 | doi: 10.5281/zenodo.3964396

How Conspiracy Theorists Get the Scientific Method Wrong

Philipp Hübl on the characteristics of conspiracy theories, the motivation behind their spread, and rational thought as a shield against it.


SCIENCE VERSUS PSEUDOSCIENCE

Most conspiracy theorists make abstruse claims: The world is flat and run by an alien species of “reptiloids” covered in human skin (Ronson 2001). Governments use airplanes to disperse chemicals into the atmosphere, known as “chemtrails”, in order to regulate the size or behavior of the population. The Rothschild family has been in charge of global finance and trade for centuries. 9/11 was an inside job. And Bill Gates is developing a vaccine against the coronavirus in order to secretly implant chips under our skin that can be used to control our minds. During the current Covid-19 pandemic especially, new varieties of such conspiracy ideas have emerged and spread widely. The idea behind conspiracy theories can be summarized as follows: Nothing is what it seems, and there is a master plan behind all major events in world history (Butter 2018; Barkun 2013). Conspiracy thinking underlies the entertaining of conspiracy theories; at the very least, it is a disposition to hold them to be true.

At first glance, conspiracy thinking bears a striking resemblance to scientific thinking. To make a name for yourself in science, you have to attack the standard view and come up with an original alternative explanation. You have to cultivate a skeptical stance and find a general principle underlying apparently heterogeneous phenomena (like the law of gravitation or the structure of DNA). This is precisely what a conspiracy theorist does, or so it seems.

At second glance, however, the differences between the conspiracy mentality and scientific thinking are even more striking. In fact, philosophers of science like Karl Popper, who coined the phrase “conspiracy theory of society”, and others went to great lengths to pinpoint the demarcation between science on the one hand and pseudoscience, a mere dummy or superficial imitation of real science, on the other (Popper 1945). This enterprise is commonly called the “demarcation problem” (Hansson 2008).

Conspiracy theories often incorporate elements from pseudoscience, for instance by employing scientific terms or referring to “data”, “research”, “sources” and statements from “experts”. Yet they typically shy away from employing the scientific method. Though there is no final consensus on the list of criteria for the demarcation between science and pseudoscience, there are some clear candidates: Scientific claims are based on systematic observation and/or experimentation, they should be consistent with other findings, and they should ideally be expressed in precise logical and mathematical terms. Such statements need to be falsifiable; in other words, it must be possible to show that they are not true (Popper 1935). And the theories have to be ontologically parsimonious, a requirement typically expressed by Ockham’s razor: “don’t introduce entities without need”, or, put more colloquially, “the simplest explanation is the best” (on parsimony, see Quine (1948)). Through critical self-examination, like peer review, scientific theories can be improved and enhanced. Scientific progress and societal progress in technology, economics and social conditions go hand in hand.

By contrast, pseudoscience rarely emerges from systematic observation, and it is rarely expressed in precise terms, let alone based on qualitative or quantitative studies (for a discussion, see Popper (1935), Bunge (1982), Hansson (2008), Thagard (1978)). Its claims show no regard for consistency, are often immune to falsification, and introduce mystical powers and forces instead of respecting parsimony. Pseudoscience lacks peer review and shows neither scientific progress nor subsequent technological, economic or social progress.

THE COGNITIVE FALLACIES OF CONSPIRACY THEORISTS

Conspiracy theorists, too, rarely have formal training in science or, more broadly, in rational thinking, yet they have a penchant for the extremes. A newspaper didn’t publish the “exact” number of participants at a political demonstration? Then the entire “mainstream media” must be lying. The wrong person was arrested after the 2016 terrorist attack in Berlin? The government is obviously behind it. The philosopher Jerry Fodor characterized this appetite for excess: “Apparently the rule is: if aspirin doesn’t work, try cutting off the head” (Fodor 1986, p. 1).

As research in psychology shows, conspiracy theories are fuelled by two factors, one cognitive and one motivational: fallacies in reasoning on the one hand and certain emotional attitudes and personality traits on the other. Together they constitute the conspiracy mentality (Imhoff 2015).

First, let’s look at the cognitive fallacies (Kahneman 2011). Consider the above-mentioned conspiracy theory about reptiloids, mighty alien lizards controlling the world. This story is reminiscent of John Carpenter’s 1988 film They Live, in which the entire upper class consists of extraterrestrials who disguise themselves as human beings, forcing the world population to work for them. The film is an allegory about capitalism, hidden power and critical outsiders who ultimately blow the rulers’ cover. A conspiracy theory often starts from a point of healthy skepticism towards authorities, but then overshoots the mark, turning into the absurd opposite.

Studies show that conspiracy theorists are likely to have an intuitive thinking style that is guided by gut feelings, with little to no knowledge of statistics. They are typically young, have a low level of education, are more religious and spiritual than average, believe in the supernatural, and are generally prone to extreme views (Bartoschek 2015). As some experiments indicate, conspiracy theorists don’t find it problematic to hold two contradictory beliefs at the same time. For example, in one study, subjects with a conspiracy mentality considered it more likely that Lady Di was killed by the secret service than that she died in a car accident (Wood 2012). At the same time, they considered it more likely that she is still alive, living under a secret identity, than that she died in the accident. In other words, conspiracy theorists find it more likely that Lady Di is simultaneously dead and alive. Schrödinger’s Lady Di, so to speak. Conspiracy theorists also fall prey to fake news more easily than average, according to research on 2.3 million Facebook users (Mocanu 2015).
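The incoherence becomes visible once the beliefs are treated as probabilities. The sketch below is purely illustrative, with made-up credence values rather than data from the Wood study: since “dead” and “alive” are mutually exclusive hypotheses, coherent credences over them can sum to at most 1.

```python
# Purely illustrative: a toy coherence check with hypothetical
# credence values (not data from Wood et al. 2012).

def is_coherent(credences: dict) -> bool:
    """Credences over mutually exclusive hypotheses are coherent
    only if each is non-negative and they sum to at most 1."""
    values = list(credences.values())
    return all(p >= 0 for p in values) and sum(values) <= 1.0

# Rating both "murdered" and "alive in hiding" as more likely than not
# cannot fit into any single coherent probability distribution.
beliefs = {
    "killed by the secret service": 0.6,   # hypothetical credence
    "alive under a secret identity": 0.6,  # hypothetical credence
}
print(is_coherent(beliefs))  # False: 0.6 + 0.6 = 1.2 > 1
```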

When it comes to pseudoscientific beliefs, the conspiracy theorist’s naïve and fallacious theory of causation is particularly striking, namely the idea that a small elite secretly guides the destiny of mankind. Instead of accepting that human suffering often results from complex and elusive political, social and economic circumstances, they opt for personalized causes, namely a group of powerful Svengalis.

A theory of causation says something about cause and effect (for an overview, see Schaffer (2016)). We all tacitly employ a folk theory of causation, even if we cannot frame it in explicit terms. For example, we know that dropping the vase causes it to break on the kitchen floor. We know that insulting people makes them angry, in other words, that insults cause anger. And we know that sunburns are caused by rays of light.

Assumptions about causation are not only at the heart of the sciences, but so essential for everyday thought and action that Immanuel Kant considered the principle of causation (“Every event has a cause”) one of the basic principles of reason (Kant 1781/1787). We cannot try to understand the world without assuming that causes are followed by their effects. Many causes are complex, invisible, and have indirect and thus often distant effects. An Azores anticyclone causes cloud formations over Paris. Hundreds of wrong decisions caused the financial crisis in 2008. And viruses like the coronavirus SARS-CoV-2, too tiny to be detected by the naked eye, can cause a pandemic killing hundreds of thousands worldwide. Although causation is the “cement of the universe”, as David Hume pointed out, it is challenging even in science to corroborate causal claims through experimental or other evidence.

Many people have a hard time dealing with distant, complex and invisible causes, because experience only acquaints us with everyday causes in our vicinity, occurring to medium-sized objects like persons, cars and vases. The most salient causes we know from experience are human agents, or more precisely, their actions. This is probably the reason why the ancient Greeks regarded Zeus as the one who threw lightning bolts and why members of many ethnic religions still blame demons and witches for inexplicable happenings, for example for their house collapsing or for their child getting sick (Boyer 2001).

Conspiracy theories employ similar reasoning by making a small group of agents responsible for the suffering of the world: the CIA, the Freemasons, the Zionists, or simply “the elites”. For them, agent causation is closer to home than complex and abstract explanations that take into account multiple events as well as the dispositions and powers of the objects that appear in those events.

If you have formal training in dealing with probabilities, you know that if two people have a fatal accident on the same day on two different continents, it is almost certainly due to pure chance, even if, a few decades before, they had both worked side by side for the government. However, the conspiracy theorist will reject mere coincidence and suspect a hidden message or a higher plan concocted by a person or a group. Psychologists call this tendency hyperactive agent detection (Barrett 2000).
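The arithmetic behind this is the classic “birthday problem”: across enough independent events, same-date coincidences are not just possible but expected. A minimal sketch, with illustrative numbers that are not from the article:

```python
# A minimal "birthday problem" sketch: how likely is it that at least
# two of n independent events fall on the same day of the year?
import math

def prob_shared_day(n: int, days: int = 365) -> float:
    """Probability that at least two of n independent, uniformly
    distributed events share a day of the year."""
    p_all_distinct = math.prod((days - k) / days for k in range(n))
    return 1 - p_all_distinct

# Among just 23 unrelated events a shared date is more likely than not;
# among 50 it is nearly certain.
print(round(prob_shared_day(23), 2))  # ~0.51
print(round(prob_shared_day(50), 2))  # ~0.97
```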

Still, how do some people come up with the insane idea that 9/11 was an inside job? The government of the United States attacking the World Trade Center, killing over 3,000 US citizens, in order to have a reason to invade Afghanistan? This is where a second fallacy comes into play, the cui bono principle, which asks: “To whom is it a benefit?” The principle is attributed to the Roman philosopher Cicero, who used it in a criminal trial to uncover the motive of the actual perpetrator, thus proving the innocence of his client. From an evolutionary perspective, the principle makes sense. When it comes to actions, it is quite plausible to first ask about the motive, because this is the only way to explain our deeds and those of others (Davidson 1980). The police, for example, have a hard time solving murders if there is no motive, as we know from television shows.

From the point of view of argumentation theory, however, the principle is fallacious. That the consequences of an act are useful to someone does not imply that this person committed it. Incidentally, common sense alone can refute the conspiracy theory behind the 9/11 attack on the World Trade Center: The US could have invaded Afghanistan on a far less dramatic pretext. They could have simply said, “The country has weapons of mass destruction.”

THE MOTIVATION BEHIND THE CONSPIRACY MENTALITY

In addition to cognitive fallacies and biases, the conspiracy mentality has a second aspect, which typically leads to motivated cognition aimed at protecting one’s moral identity, especially when it comes to authority and power (Bruder 2013; for more details, see Hübl (2019)). In this regard, the conspiracy mentality resembles Theodor W. Adorno’s concept of the authoritarian character, a person who is particularly receptive to right-wing extremist ideas (developed by Adorno together with Max Horkheimer and Erich Fromm, see Adorno et al. (1950)). Such a person is not only drawn to authority and power, but is also prone to superstition, belief in destiny, a social dominance orientation towards outgroups, and a belief in evil (Hübl 2018a).

Since people with a conspiracy mentality have similar characteristics, it is no surprise that the two groups largely overlap. Conspiracy theories are particularly popular among right-wing extremists, and many conspiracy theories employ topics from right-wing extremism (Bruder 2013, p. 10). However, the conspiracy mentality can also be found among left-leaning liberals and progressives (Miller et al. 2016). Progressive conspiracy theories often concern topics of nature, health and medicine, such as genetically modified food or vaccination being linked to evil plans by “big pharma” or “the government”.

In any event, the majority of conspiracy theorists hold right-wing attitudes and are concerned with power and authorities. Since they entertain an authoritarian thinking style, they want to dominate others. Hence, they are particularly bothered if authority lies in the wrong hands (van Prooijen 2013). Moreover, those who follow one conspiracy theory tend to think that other conspiracy theories are true as well, as expressed by psychologist Stephan Lewandowsky’s tongue-in-cheek title “NASA faked the moon landing, therefore, climate science is a hoax” (Lewandowsky 2013). This stance indicates a general suspicion towards out-group authority. From an evolutionary point of view, we may have evolved to be sensitive to and distrustful of power in our social group, especially since power is often complementary: When I have no power, others typically have it. But the inference from “others have power” to “others use their power against me” is a fallacy. It comes as no surprise that, according to many studies, conspiracy theorists typically suffer from a feeling of powerlessness and inferiority (Goertzel 1994). They have the impression that nothing can be done against those in charge of politics and the economy. Often, they also feel socially excluded, disintegrated and alienated (Graeupner/Coman 2016).

This feeling can even be intensified experimentally, generating conspiracy thoughts. Students interviewed in a classic anxiety situation, shortly before an important exam, made stronger conspiracy assumptions than a relaxed control group (Grzesiak-Feldman 2013). In another experiment, after being presented with unsolvable puzzles, subjects had to look at randomly generated pictures (Whitson/Galinsky 2008). They recognized significantly more “patterns” in these images than subjects from the neutral control group did. This suggests that when people are unable to bring order to the world around them, they will seek order and structure elsewhere.

These experiments could also explain why conspiracy theorists often present themselves with a gesture of superiority (Imhoff/Lamberty 2017). They think they possess secret knowledge that is kept from others and enjoy the feeling of being special. While other people are naïve and need to wake up from their dogmatic slumber, conspiracy theorists view themselves as skeptics who don’t fall for deception. Plus, they always have a surprising story to tell.

As a coping strategy, this feeling of superiority could have two functions: On the one hand, it gives conspiracy theorists an impression of power as a substitute for their actual lack of it. As opposed to the blind majority, they are “red pilled”, seeing through the veil of deception. On the other hand, the narrative of mighty puppeteers serves as a rationalization of their own shortcomings (Imhoff/Bruder 2014). This second function would also explain why anti-Semitism is a frequent element of conspiracy theories: Anti-Semitic conspiracy theorists consider Jews inferior, but still powerful (Fiske 2002).

RATIONAL THINKING AS BULLSHIT RESISTANCE

The current pandemic is characterized by a loss of control due to an invisible threat. Both aspects promote conspiracy thinking. While full-fledged conspiracy theorists are often self-proclaimed skeptics and critical thinkers, they typically conflate science, which combines a method with a body of positive knowledge, with the current state of knowledge alone. Science as a method is the systematic discovery of the truth, while the current scientific knowledge is by its nature preliminary. Conspiracy theorists take revisions in science (e.g. initially considering direct contact the main pathway of coronavirus infection, and later aerosols) or ignorance in certain fields of study (e.g. about the origins of the coronavirus) as indicating that the methodological enterprise of science itself is questionable.

What conspiracy theorists don’t see is that self-correction is an integral part of the scientific method, a fact that is often not stressed enough in the popular communication of science. This method has proven its worth for several reasons. First, it is based on the insight that knowledge is fallible: what you believe to be true can be proved wrong later (Popper 1935). Second, scientific theses are always empirically underdetermined, no matter how good and large the data set is (Quine 1960). Later discoveries can throw new light on your approach. Third, humans are fallible beings: we make mistakes, and even when we take all precautionary measures, we tend to employ typical heuristics and biases. This is why peer review and criticism from colleagues are such an important corrective in science.

Treating knowledge as preliminary and theories as underdetermined is challenging for many in everyday life. Even people who are resilient to conspiracy thinking prefer to know what is going on instead of constantly dealing with ambiguity and indeterminacy. Why is that the case? Evolutionary psychologists tell something like the following story: Our everyday thinking is not geared towards virology, statistics and the philosophy of science, but evolved for survival in a harsh and dangerous environment (Stanovich 2010). For our ancestors, living in small groups during the Stone Age, these thought patterns were advantageous even though they contradicted the principles of science. It was better on average, for instance, to follow those who appeared confident, because every full-throated boast about one’s immediate surroundings was easy to unmask: Those who could not correctly interpret the track of a wild boar immediately lost their reputation (Pinker 2018, p. 354 f.).

Yet group cohesion might have been more important than the truth (Shaver/Sosis 2018). The group’s identity included superstitious causal explanations of how the world worked, such as gods hurling lightning bolts. Those who questioned the standard assumptions of the tribe were ostracized or even killed. Moreover, the safest assumption was that correlation implies causation: If a companion ate a toadstool and died afterwards, the natural assumption was that the mushroom had caused his death. It would have been absolutely ruinous to conduct a double-blind study with fly agarics and placebo mushrooms to systematically test the hypothesis “fly agarics are poisonous”.

Today, we are still susceptible to these archaic thought patterns. They are harmful when it comes to pandemics and the like, however, because science does not work that way. Researchers revise their assumptions and formulate their hypotheses carefully, which makes them appear uncertain and fickle to laypeople. Causes are rarely agents; in the case of pandemics they are invisible entities like bacteria and viruses. To investigate them, you have to conduct experiments. In science, at least on average, anyone who disproves the group’s mainstream view with good arguments and solid data is rewarded with fame, prizes and a tenured position instead of being ostracized. While science is arguably not as open and transparent as it could be, and while researchers, too, sometimes misuse their power, only very few have become rich or powerful from their discoveries. And finally, correlation does not imply causation. For instance, the decline in the number of storks and the decline in birth rates in Europe have been strongly correlated for decades, as have margarine consumption in the USA and the divorce rate in the state of Maine (Matthews 2000). But those correlations tell us nothing about causation. In order to bring the nature of things to light, you need experimental and statistical methods. Claims from conspiracy theorists such as “the experts lie” or “Bill Gates is behind it” fall short of this standard.
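A small simulation makes the stork example concrete. The sketch below uses synthetic data, not the real stork or margarine figures: two independently generated series that merely share a downward trend end up strongly correlated, even though neither causes the other.

```python
# Synthetic illustration: a shared trend produces a strong correlation
# without any causal link between the two series.
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

random.seed(1)
years = range(40)
# Two independent noisy series, both declining over time (made-up numbers):
storks = [1000 - 15 * t + random.gauss(0, 40) for t in years]
births = [900_000 - 12_000 * t + random.gauss(0, 30_000) for t in years]

# Strong correlation (roughly 0.9 or higher), yet no causation:
print(round(pearson(storks, births), 2))
```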

We are made from crooked timber, as Kant pointed out (Kant 1784). Empirical research on heuristics and biases corroborates and specifies this observation. We are prone to prejudices and mental shortcuts, because evolution has equipped us with an intuitive thinking apparatus that is not tailor-made for the modern world of science and technology. Fortunately, however, evolution has also given us reason, that is, the ability to distance ourselves from our own thinking and to think critically and scientifically, so that we can shield ourselves against mental shortcuts and recognize our own prejudices (Stanovich 2016). This is tiring, since it requires attention and intensive training, but it can save lives and keep us from making fools of ourselves by believing ludicrous fake news and conspiracy theories. Hence, in schools and universities and all essential political, social and scientific institutions, we need obligatory courses in “critical thinking” or, as I like to call it, bullshit resistance (for more details, see Hübl (2018b)).


Philipp Hübl is a philosopher working in the philosophy of science, the philosophy of mind and on moral identity. He is the author of the books „Die aufgeregte Gesellschaft“ (2019), „Bullshit-Resistenz“ (2018) and „Der Untergrund des Denkens“ (2015), among others, and has published articles on political and social topics in media like DIE ZEIT, FAZ, taz, NZZ, Die Welt, Republik, and El País. Hübl taught theoretical philosophy at RWTH Aachen University and Humboldt University in Berlin, and as Associate Professor (“Juniorprofessor”) at Stuttgart University. In 2020, he is a fellow at the Weizenbaum Institute in Berlin.


REFERENCES

Adorno, Theodor W. et al. (1950) The Authoritarian Personality. New York: Harper and Brothers

Barkun, Michael (2013) A Culture of Conspiracy. Berkeley: University of California Press

Barrett, Justin L. (2000) “Exploring the Natural Foundations of Religion” Trends in Cognitive Sciences 4, 1: 29–34

Bartoschek, Sebastian (2015) Bekanntheit von und Zustimmung zu Verschwörungstheorien. Hannover: JMB-Verlag

Boyer, Pascal (2001) Religion Explained. The Evolutionary Origins of Religious Thought. New York: Basic Books

Bruder, Martin et al. (2013) “Measuring Individual Differences in Generic Beliefs in Conspiracy Theories Across Cultures: Conspiracy Mentality Questionnaire” Frontiers in Psychology 4: 1–14

Bunge, Mario (1982) “Demarcating Science from Pseudoscience” Fundamenta Scientiae 3: 369–388

Butter, Michael (2018) “Nichts ist, wie es scheint.” Über Verschwörungstheorien. Frankfurt am Main: Suhrkamp

Davidson, Donald (1980) Essays on Actions and Events. Oxford: Oxford University Press

Fiske, Susan T. et al. (2002) “A Model of (Often Mixed) Stereotype Content: Competence and Warmth Respectively Follow from Status and Competition” Journal of Personality and Social Psychology 82: 878–902

Fodor, Jerry A. (1986) “Banish Discontent” in Butterfield, Jeremy (ed.) Language, Mind, and Logic. Cambridge: Cambridge University Press

Goertzel, Ted (1994) “Belief in Conspiracy Theories” Political Psychology 15, 4: 731–742

Graeupner, Damaris and Coman, Alin (2016) “The Dark Side of Meaning-making: How Social Exclusion Leads to Superstitious Thinking” Journal of Experimental Social Psychology 69: 218–222

Grzesiak-Feldman, Monika (2013) “The Effect of High-Anxiety Situations on Conspiracy Thinking” Current Psychology 32, 1: 100–118

Hansson, Sven Ove (2008) “Science and Pseudo-Science” in Zalta, Edward N. (ed.) (2008) The Stanford Encyclopedia of Philosophy (Fall 2008 Edition)

Hübl, Philipp (2018a) “The Power of Political Emotions. On Political Camp Formation and the New Right-Wing Populism” in Tillmans, Wolfgang and Oetker, Brigitte (eds.) (2018) What is Different? Jahresring 64. Berlin: Sternberg Press

Hübl, Philipp (2018b) Bullshit-Resistenz. Berlin: Nicolai

Hübl, Philipp (2019) Die aufgeregte Gesellschaft. Wie Emotionen unsere Moral prägen und die Polarisierung verstärken. München: C. Bertelsmann

Imhoff, Roland (2015) “Beyond (Right-wing) Authoritarianism. Conspiracy Mentality as an Incremental Predictor of Prejudice” in Bilewicz, Michal et al. (eds.) (2015) The Psychology of Conspiracy. London: Routledge, pp. 122–143

Imhoff, Roland and Bruder, Martin (2014) “Speaking (Un-)truth to Power: Conspiracy Mentality as a Generalised Political Attitude” European Journal of Personality 28: 25–43

Imhoff, Roland and Lamberty, Pia Karoline (2017) “Too Special to be Duped: Need for Uniqueness Motivates Conspiracy Beliefs” European Journal of Social Psychology 47: 724–734

Kahneman, Daniel (2011) Thinking, Fast and Slow. London: Allen Lane

Kant, Immanuel (1781/1787) Kritik der reinen Vernunft

Kant, Immanuel (1784) “Idee zu einer allgemeinen Geschichte in weltbürgerlicher Absicht” Berlinische Monatsschrift, November 1784: 385–411

Lewandowsky, Stephan (2013) “NASA Faked the Moon Landing—Therefore, (Climate) Science Is a Hoax. An Anatomy of the Motivated Rejection of Science” Psychological Science 24, 5: 622–633

Matthews, Robert (2000) “Storks Deliver Babies (p = 0.008)” Teaching Statistics 22, 2

Miller, Joanne M. et al. (2016) “Conspiracy Endorsement as Motivated Reasoning: The Moderating Roles of Political Knowledge and Trust” American Journal of Political Science 60: 824–844

Mocanu, Delia et al. (2015) “Collective Attention in the Age of (Mis)information” Computers in Human Behavior 51, B: 1198–1204

Pinker, Steven (2018) Enlightenment Now. The Case for Reason, Science, Humanism, and Progress. New York: Viking

Popper, Karl (1935) Logik der Forschung. Tübingen: Mohr

Popper, Karl (1945) The Open Society and Its Enemies, Book II. London: Routledge and Kegan Paul

Quine, Willard Van Orman (1948) “On What There Is” in Quine, Willard Van Orman (1953) From a Logical Point of View. Cambridge (MA): Harvard University Press, 1–19

Quine, Willard Van Orman (1960) Word and Object. Cambridge (MA): The MIT Press

Ronson, Jon (2001) “Beset by Lizards” The Guardian, 17 March 2001. https://www.theguardian.com/books/2001/mar/17/features.weekend [accessed 28 July 2020]

Schaffer, Jonathan (2016) “The Metaphysics of Causation” in Zalta, Edward N. (ed.) (2016) The Stanford Encyclopedia of Philosophy (Fall 2016 Edition)

Shaver, John and Sosis, Richard (2018) “Costly Signaling in Human Cultures” in Callan, Hilary (ed.) (2018) International Encyclopedia of Anthropology: Evolutionary and Biosocial Perspectives in Anthropology. London: Wiley-Blackwell

Stanovich, Keith E. (2010) Decision Making and Rationality in the Modern World. Oxford: Oxford University Press

Stanovich, Keith E. et al. (2016) The Rationality Quotient. Toward a Test of Rational Thinking. Cambridge (MA): MIT Press

Thagard, Paul R. (1978) “Why Astrology Is a Pseudoscience” Philosophy of Science Association 1: 223–234

Van Prooijen, Jan-Willem and Jostmann, Nils B. (2013) “Belief in Conspiracy Theories: The Influence of Uncertainty and Perceived Morality” European Journal of Social Psychology 43: 109–115

Whitson, Jennifer A. and Galinsky, Adam D. (2008) “Lacking Control Increases Illusory Pattern Perception” Science 322, 5898: 115–117

Wood, Michael J. et al. (2012) “Dead and Alive. Beliefs in Contradictory Conspiracy Theories” Social Psychological and Personality Science 3, 6: 767–773

