Since the dawn of big-brained Homo sapiens, we have conjured up explanatory frameworks that attempt to make sense of our existence and environment. Although mythology, religion and natural theology helped us immensely in deciphering reality and bettering ourselves, science’s explanatory power has truly blown the gates wide open. Thanks to its successes and innovations, science has long been revered in society.
The scientific method aims to find the truth, yet prides itself on being wrong: it knows it is fallible and encourages those who engage in this quest to remain highly critical of themselves, their results and their conclusions.1 Ideas become testable theories, theories become paradigms, and paradigms break over time, giving rise to more accurate explanations. Nevertheless, due to the method’s complexity and lack of transparency, it is also widely distrusted. To make matters worse, pseudoscience has gained ground partly due to problems within the scientific community.2 RationalWiki describes it as: “any belief system or methodology which tries to gain legitimacy and authority by wearing the trappings of science, but fails to abide by the rigorous methodology and standards of evidence that are the marks of true science.”
Spot the Difference
Although not everybody believes there is any distinction to be made between science and pseudoscience – most notably, Austrian philosopher Paul Feyerabend – physicist Rory Coker provides 22 points on how to distinguish between the two.3 Unfortunately, for the uninformed public, pseudoscience and true science are hardly distinguishable. This difficulty is further illustrated by Oreskes & Conway in their book and documentary Merchants of Doubt, in which they demonstrate how billion-dollar companies and governments with crooked agendas readily make use of deliberate pseudoscience in order to sway or nudge the public.4 On top of that, the sensationalism and greed with which certain media outlets misreport typical scientific discoveries have certainly chiseled away at laypeople’s trust in science.5 Often, many paradoxical findings about a given subject are presented within the public domain in a short period of time: researcher X claims that drinking wine is incredibly bad for you, while researcher Y says that drinking wine is good for your heart. This is capitalized on by a multitude of clickbait articles or sponsored content that portray themselves as truthful. Ransohoff & Ransohoff explain that both science and the media are often liable for this sensationalism in reports on scientific advancements.6
Billion-dollar companies and governments with crooked agendas readily make use of deliberate pseudoscience in order to sway or nudge the public.
Singer & Benassi write that when knowledge is lacking, people (un)knowingly adopt beliefs that cannot possibly be true.7 They describe these as occult beliefs, many of which fall within the domain of pseudoscience. They argue that the rise of such beliefs on a societal level is possibly caused by media distortions, social uncertainty – such as the social shifts in religion and gender roles – and deficiencies in rational human reasoning. They explain that occult, or false, beliefs can only form under two conditions: environmental uncertainty and a low cost of the superstition. This means that people are more likely to form superstitious beliefs whenever they cannot control what they are doing due to a lack of knowledge.
However, in the 21st century, we are confronted not with a lack, but with an abundance of knowledge. In 2006 alone, an estimated 1.3 million scientific papers were published, the majority of which also appeared online in open-access databases.8 Google handles around 1.2 trillion search queries a year. In 2014, the internet contained about 1 billion websites, and more than 3 billion people have instant access to the world’s largest known encyclopaedia: Wikipedia.
If all this information is out there on the internet, why are we still so prone to false beliefs? The problem lies in the fact that laypeople, as well as experts and scientists, find these numbers staggering. Our brains are not capable of rationally interpreting such vast amounts of data. Although the information is out there, it is not knowledge. Many of these pieces of information do actually reach people, albeit in incoherent flashes, only to be misinterpreted due to our cognitive limitations or tampering by the media.
The boundaries between professional and public communication have become more porous, turning science communication inside-out.
Trench suggests that the birth and growth of the internet were mostly instigated by scientific research communities, who used it to easily share sensitive information.9 Because of its global expansion, the internet has not only been an important tool for professional and private scientific communication, but has also made communication on science and technology accessible to the public. Through these online developments, the boundaries between professional and public communication have become more porous, facilitating public access to previously private spaces, and therefore turning science communication inside-out. This is amplified by the rise of peer-to-peer knowledge sharing and user-generated content (UGC). Unlike the collectively curated Wikipedia, social platforms such as Facebook, YouTube, Twitter and Instagram let users upload their own ‘knowledge’.
Whereas the flow of ‘to be trusted’ information used to be mostly vertical (from expert scientist or journalist to layperson, after which it would spread horizontally by word of mouth), people are now more easily becoming ‘experts’ themselves, or at least have the capability to independently spread information they personally deem worthy of spreading. Many of the platforms where people create UGC have little to no quality control, and only moderate the most extreme forms of user-generated content, such as porn, violence and racism.
It seems that the advent of the internet, and the popularity and gratification users obtain from viewing or creating UGC, as described by Shao, have given birth to online echo chambers – islands of their own knowledge.10 This knowledge can originate in private spaces (online scientific journals), be picked up by the media or by someone with access to such papers, be misinterpreted, and then spread through public spaces while posing as legitimate scientific knowledge from those private spaces.11 In the long run, this can lead to parallel reality bubbles of ‘truth’ about certain subjects that flow from the echo chambers into offline conversations and spread throughout the population, lending more merit and credibility to pseudoscientific ideas.
But why are so many people gullible enough to accept the (un)obviously false beliefs of others? It might be because they do not readily recognize that these beliefs are false. The average human is not very proficient at self-knowledge or metacognition, but is very inclined to hold on to their current set of beliefs. This happens due to what psychologists call cognitive dissonance: straying too far from what you believe, or think you need to believe, may cause a great deal of distress. These psychological underpinnings of attitude change have been carefully described and summarized by Petty & Brinol.12 People generally enjoy confirmation of what they already know, and dislike being wrong or being presented with conflicting information. Since most of the subjects that people now hold (pseudo)scientific beliefs about are highly complex and carry great environmental uncertainty, it is almost impossible for a non-expert to form a justified belief based on truthful knowledge, as it would imply a long and painful trip of being wrong, whilst being wrongfully right is much more appealing.
Being wrongfully right is much more appealing than forming a justified belief based on truthful knowledge.
Although the plague of pseudoscience seems to be a modern phenomenon, Lakatos warned us about the effects of unchecked pseudoscience as early as 1970, and wished to establish legislation to control it, for it could “destroy our cultural environment even earlier than industrial and traffic pollution destroys our physical environment”.13 Gardner was on the same page when he wrote the book Fads and Fallacies in the Name of Science, which tackled 24 popular misconceptions and false beliefs.14 He also stated that pseudoscience is a problem, but not yet a dangerous one, as most pseudoscientists work in almost total isolation from their colleagues and tend to be paranoid. Through the lens of our hyper-connected 21st century, these words take on new meaning. Pseudoscientists, and those who propagate pseudoscientific ideas and false beliefs, were not dangerous because they often lived in isolation and lacked a platform to spread their attractive yet bad explanations. The internet, however, has given them just that, which means that ‘little pseudoscience’ can now evolve into ‘big pseudoscience’ by mere replication.
Holtzman claims that the only way to stop this trend is to invest in better communication between scientific institutions and the media, so that the significance of discoveries is reported accurately,15 since most of the public forms its opinion of science through mass media rather than by subscribing to journals and reading lengthy papers themselves. Furthermore, we must heavily invest in teaching our future generations the basic principles behind the scientific method, especially aiming at those who have less affinity with science.
For as Carl Sagan states,
“If we teach only the findings and products of science — no matter how useful and even inspiring they may be — without communicating its critical method, how can the average person possibly distinguish science from pseudoscience? Both then are presented as unsupported assertion.”16
 Achinstein, P. (2004). General introduction. In Science Rules: A Historical Introduction to Scientific Methods (pp. 1–5). Johns Hopkins University Press.
 Theocharis, T., & Psimopoulos, M. (1987). Where science has gone wrong. Nature, 329(6140), 595-598.
 Feyerabend, P. (1975). How to defend society against science; Coker, R. (2001). Distinguishing science and pseudoscience. Retrieved December 9, 2002.
 Oreskes, N., & Conway, E. M. (2010). Merchants of doubt.
 Moynihan, R., Bero, L., Ross-Degnan, D., Henry, D., Lee, K., Watkins, J., … & Soumerai, S. B. (2000). Coverage by the news media of the benefits and risks of medications. New England Journal of Medicine, 342(22), 1645-1650.
 Ransohoff, D. F., & Ransohoff, R. M. (2001). Sensationalism in the media: when scientists and journalists may be complicit collaborators.
 Singer, B., & Benassi, V. A. (1981). Occult beliefs: Media distortions, social uncertainty, and deficiencies of human reasoning seem to be at the basis of occult beliefs. American Scientist, 69(1), 49-55.
 Bjork, B. C., Roos, A., & Lauri, M. (2009). Scientific journal publishing: yearly volume and open access availability. Information Research: An International Electronic Journal, 14(1).
 Trench, B. (2008). Internet: turning science communication inside-out?
 Shao, G. (2009). Understanding the appeal of user-generated media: a uses and gratification perspective. Internet Research, 19(1), 7-25; Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., … & Quattrociocchi, W. (2016). The spreading of misinformation online. Proceedings of the National Academy of Sciences, 113(3), 554-559.
 Ladle, R. J., Jepson, P., & Whittaker, R. J. (2005). Scientists and the media: the struggle for legitimacy in climate change and conservation science. Interdisciplinary Science Reviews, 30(3), 231-240.
 Petty, R. E., & Brinol, P. (2010). Attitude change. Advanced social psychology: The state of the science, 217-259.
 Lakatos, I. (1970). Falsification and the methodology of scientific research programmes. In I. Lakatos and A. Musgrave (Eds.), Criticism and the Growth of Knowledge. Cambridge: Cambridge University Press, pp. 91–195.
 Gardner, M. (1957). Fads and Fallacies in the Name of Science. New York: Dover Publications.
 Holtzman, N. A., Bernhardt, B. A., Mountcastle-Shah, E., Rodgers, J. E., Tambor, E., & Geller, G. (2005). The quality of media reports on discoveries related to human genetic diseases. Public Health Genomics, 8(3), 133-144.
 Sagan, C. (1996). Does truth matter? Science, pseudoscience, and civilization. Skeptical Inquirer, 20, 28-33.