Review of Awesome Creation

July 30, 2014

“The Cosmos is all that is or ever was or ever will be.” Thus spake the late astronomer and author Carl Sagan, expressing his belief that reality consists of nothing but matter and energy. Sagan’s atheist slogan may have been borrowed from the ancient Greek philosopher Heraclitus (circa 500 BCE): “This cosmos, the same for all, was neither made by God nor man, but was, is, and always will be.” Heraclitus was conveying a notion that held sway for millennia and was endorsed by modern science until very recently – that the universe has always existed. He is quoted in Awesome Creation: A Study of the First Three Verses of the Torah, by Rabbi Yosef Bitton (Gefen Publishing House, 2013).

Rabbi Bitton eloquently presents the Jewish response to claims of the universe’s eternity: Bereshis bara! The universe was created by God out of nothing; it has not always existed. And the acceptance of Big Bang cosmology by the overwhelming majority of the scientific community involved a rather reluctant acknowledgement by many scientists that a cherished philosophical notion had to be forsaken.

In recent decades, a veritable cottage industry has arisen within the Torah community, with authors claiming to harmonise the respective viewpoints of the Torah and Science on the question of ultimate origins (of the universe and of humanity). These authors make it their business – a lucrative business at that! – to pander to readers whose point of departure is, “How do you reconcile Judaism and Science?” These readers do not realise that they are not asking a question but expressing a prejudice. It never occurs to them to ask, “Are Judaism and Science necessarily reconcilable?” Having decided at the outset that the two viewpoints must always coincide, these authors proceed to make sure – if necessary by torturing classical sources until they confess – that Torah sources submit to political correctness.

Awesome Creation, for the most part, avoids this pitfall. It does an excellent job of elucidating the key terms in the first three verses of the Torah. What does tohu really mean? And bohu? How about raqia’? Does darkness mean the mere absence of light or is it a tangible entity? Rabbi Bitton analyses these words on the Torah’s own terms, using the Hebrew text, Chazal and classical authorities. And in pursuing the legitimate meanings of obscure terms, the author is sufficiently confident to criticise well-known writers – Rabbi Aryeh Kaplan, for example – for mistranslating certain phrases. Mostly, the author succeeds in sticking to his objective that “Science is used in this work only to the extent it contributes to the understanding of the Biblical text, which is the main goal of this book.”

But not always. Rabbi Bitton, too, sometimes succumbs to the urge to show that, as he puts it, the “Biblical Creation story… is completely compatible with science’s modern discoveries.” Nu nu… At any rate, writing that Ramban anticipated a post-Newtonian conception of physics or that Rambam identified primeval darkness as “… [an] invisible form of energy” was unnecessary. Overall, however, Rabbi Bitton’s scholarship is dispassionate and focused.

Awesome Creation makes the occasional innocent mistake. The famous astrophysicist Arthur Eddington was not an uncompromising atheist (he was a committed Quaker and, because of his pacifism, faced imprisonment in 1918, when, at 35, he was still subject to conscription during WWI), and citizens of the “eternal and stationary universe of Aristotle” did not ponder elliptical orbits (nobody did that until Kepler). But I quibble. On scientific matters, Awesome Creation is almost always accurate and informative.

One of the novel features of Awesome Creation is its willingness to cite lesser-known sources, a point the author acknowledges in a recent interview (see http://www.jewishpress.com/indepth/interviews-and-profiles/modern-science-is-discovering-what-the-torah-said-thousands-of-years-ago-an-interview-with-rabbi-yosef-bitton/2013/09/17/2/). This is innocuous and even illuminating, except for the odd occasion when the practice goes overboard. Do we really need Jorge Luis Borges to tell us that Nature and the Bible are two books written by the same Author? Galileo said so 400 years ago (and Rabbi S.R. Hirsch used the same idea in his 18th Letter). But again, this is nitpicking. Mostly, Awesome Creation sticks to standard sources. And even when novel rabbinic sources are cited, they are not there to persuade the reader that radical and fringe views are legitimate Torah viewpoints on the grounds that in hashkafa anything goes. [See, however, note 4 on page 63].

Awesome Creation is stimulating, original and accessible, and I warmly recommend it to anyone who is interested in the Torah’s account of Creation.

Article on Aish.com on Free Will

June 23, 2014

My article on Darwinism, Morality and Free Choice is now (Monday, 23rd June 2014) posted on the Aish HaTorah website,

http://www.Aish.com.

This is the URL:

http://www.aish.com/sp/ph/Darwinism-Morality-and-Free-Choice.html

 

Article on Aish.com

May 25, 2014

My article on the wider significance of brain scans is now (Sunday, 25th May 2014) posted on the Aish HaTorah website,

http://www.Aish.com.

This is the URL:

http://www.aish.com/ci/sam/Brain-Scams.html

 

Vestigial Organs

December 1, 2013

 

An abridged version of my article on vestigial organs appeared in the Chanukah issue (number 29) of Kolmus, a supplement to Mishpacha Magazine. Below is the full article, with notes.

*** 

A newspaper here in Johannesburg recently published an article on the alleged imperfections of the human body. The author would not have had to search far for material: the Internet buzzes with sites that carry lists of God’s Great Mistakes[1], and Discover Magazine delights in articles that disparage miscellaneous parts of the human body.[2]

 The article is downright silly at times. In describing the human ear, it points out that its exquisite sensitivity is a liability because a sudden explosion could destroy the tiny crystalline rods in the ear whose function is to amplify the vibrations impinging on the eardrum. What next – proof that the human skeleton is shoddily constructed because of its inability to remain intact following a plunge from the tenth storey?[3] But the ear – amazing as it truly is – is not my subject.[4] In this essay, we will focus on the mascot of the vestigial organ movement, the human appendix, with which the article begins:

The human body is a wonder of nature: our brains react faster than a computer, our hearts beat without the need for rest. But it’s not perfect. The appendix, for instance, seems to have no real function yet can be the cause of appendicitis and, if left untreated, life-threatening peritonitis (inflammation of the abdominal lining) – so it could be simpler to get rid of it.

Seems to have no real function, huh? In order to make that statement, one must be sure that the appendix really does not have a function, and establishing a negative like that is often dodgy.[5] In 1981, the Canadian biologist Steven Scadding argued that although he had no objection to Darwinism, “vestigial organs provide no evidence for evolutionary theory.” The primary reason is that “it is difficult, if not impossible, to unambiguously identify organs totally lacking in function.” Scadding cited the human appendix as an organ previously thought to be vestigial but now known to have a function.[6] What function does it have?

In October 1999, a Scientific American reader submitted the following question to the Ask the Experts column: “What is the function of the human appendix? Did it once have a purpose that has since been lost?”[7] The magazine asked Loren G. Martin, professor of physiology at Oklahoma State University, to answer the query. Professor Martin began by pointing out the contribution made by the appendix before a human being is even born:

For years, the appendix was credited with very little physiological function. We now know, however, that the appendix serves an important role in the fetus and in young adults. Endocrine cells appear in the appendix of the human fetus at around the 11th week of development. These endocrine cells of the fetal appendix have been shown to produce various… compounds that assist with various biological control (homeostatic) mechanisms. There had been little prior evidence of this or any other role of the appendix in animal research, because the appendix does not exist in domestic mammals.[8]

 Professor Martin then notes that in adulthood, the appendix continues to be an important player:

Among adult humans, the appendix is now thought to be involved primarily in immune functions. Lymphoid tissue begins to accumulate in the appendix shortly after birth and reaches a peak between the second and third decades of life… During the early years of development, however, the appendix has been shown to function as a lymphoid organ, assisting with the maturation of B lymphocytes (one variety of white blood cell) and in the production of the class of antibodies known as immunoglobulin A (IgA) antibodies. Researchers have also shown that the appendix is involved in the production of molecules that help to direct the movement of lymphocytes to various other locations in the body.[9]

 Professor Martin also notes that “the appendix probably helps to suppress potentially destructive humoral (blood- and lymph-borne) antibody responses while promoting local immunity… This local immune system plays a vital role in the physiological immune response and in the control of food, drug, microbial or viral antigens.”

 Finally, Professor Martin validates the adage that if something ain’t broken, you shouldn’t try to fix it:

In the past, the appendix was often routinely removed and discarded during other abdominal surgeries to prevent any possibility of a later attack of appendicitis; the appendix is now spared in case it is needed later for reconstructive surgery if the urinary bladder is removed. In such surgery, a section of the intestine is formed into a replacement bladder, and the appendix is used to re-create a ‘sphincter muscle’ so that the patient remains continent (able to retain urine). In addition, the appendix has been successfully fashioned into a makeshift replacement for a diseased ureter, allowing urine to flow from the kidneys to the bladder. As a result, the appendix, once regarded as a nonfunctional tissue, is now regarded as an important ‘back-up’ that can be used in a variety of reconstructive surgical techniques. It is no longer routinely removed and discarded if it is healthy.[10]

So we know that the appendix most definitely does have a function. Enter William Parker.[11] Parker is a professor of surgery at Duke University School of Medicine who was sceptical of claims that the appendix is vestigial. His hypothesis was that it serves as a nature reserve of sorts for the beneficial bacteria in our guts. When the gut is struck by a severe infection such as cholera – an all-too-common scourge in human history – its beneficial bacteria are depleted. The appendix serves as a sanctuary in which beneficial bacteria can ride out a bout of diarrhoea that completely evacuates the intestines, and from which they can emerge afterwards to repopulate the gut.[12]

In October 2007, Science Daily reported on this research.[13] “While there is no smoking gun, the abundance of circumstantial evidence makes a strong case for the role of the appendix as a place where the good bacteria can live safe and undisturbed until they are needed,” said Parker, who conducted the analysis in collaboration with R. Randal Bollinger, Duke professor emeritus in general surgery.

 The gut is populated with different microbes – and there are more of them than there are human cells in a typical human body – that help the digestive system break down the food we eat. In return, the gut provides nourishment and safety to the bacteria. Parker believes that the immune system cells found in the appendix are there to protect, rather than harm, the good bacteria.

Science Daily reported that for the previous ten years, Parker had been studying the interplay of these bacteria in the bowels, and in the process had documented the existence in the bowel of what is known as a biofilm. This thin and delicate layer is an amalgamation of microbes, mucus and immune system molecules living together atop the lining of the intestines. “Our studies have indicated that the immune system protects and nourishes the colonies of microbes living in the biofilm,” Parker explained. “By protecting these good microbes, the harmful microbes have no place to locate. We have also shown that biofilms are most pronounced in the appendix and their prevalence decreases moving away from it.”

“Diseases causing severe diarrhea are endemic in countries without modern health and sanitation practices, which often results in the entire contents of the bowels, including the biofilms, being flushed from the body,” Parker said. He added that the appendix’s location and position – the appendix is a dead-end sac that hangs between the small and large intestines – is such that it is expected to be relatively difficult for anything to enter it as the contents of the bowels are emptied.

“Once the bowel contents have left the body, the good bacteria hidden away in the appendix can emerge and repopulate the lining of the intestine before more harmful bacteria can take up residence,” Parker continued. “In industrialized societies with modern medical care and sanitation practices, the maintenance of a reserve of beneficial bacteria may not be necessary. This is consistent with the observation that removing the appendix in modern societies has no discernible negative effects.”[14]

Parker’s idea implies that individuals with their appendix should be more likely to recover from severe gut infections than individuals without an appendix. His hypothesis was tested a few years after he floated it. In its December 2011 issue, the journal Clinical Gastroenterology and Hepatology published a study entitled “The Appendix May Protect Against Clostridium difficile Recurrence”.[15]

Ideally, in order to test Parker’s idea, scientists would compare the fates of individuals who suffer gut infections and have an appendix to those of individuals who suffer the same gut infections and do not have an appendix. But such a study would be easiest in developing countries where cholera and similar diseases are prevalent, and those same regions are the ones where medical records (of appendectomies, for example) tend to be least detailed.

James Grendell, chief of the division of Gastroenterology, Hepatology and Nutrition at Winthrop University Hospital in New York, solved the problem, together with his colleagues. They studied a pathogen called Clostridium difficile. This deadly organism is often encountered in hospitals, particularly when patients must be treated by prolonged courses of antibiotics. It does not appear to compete well with the native biota of patients’ guts, but when the native biota is depleted (as is the case after several courses of antibiotics) Clostridium difficile can grow quickly and take over. It is most dangerous when, after treatment, it recurs, which is to say when the native biota of the gut and the immune system cannot, together, prevent it from reinvading. If Parker was right, individuals without an appendix should be more likely to have a recurrence of Clostridium difficile than those individuals with an appendix.

The researchers were able to find 254 patients at the hospital who met the requirements of their study: there was evidence of their having been infected by Clostridium difficile, and the presence or absence of an appendix was known or discernible. The rest was easy. They then checked whether individuals without an appendix were at a higher risk of recurrence of Clostridium difficile. The results were dramatic. Individuals without an appendix were roughly four times more likely to have a recurrence of Clostridium difficile, exactly as Parker’s hypothesis predicted. Recurrence in individuals with their appendix intact occurred in 11% of cases; recurrence in individuals without their appendix occurred in 48% of cases.
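For readers who want to see where the “roughly four times” figure comes from, here is a minimal sketch of the relative-risk arithmetic, written in Python and using only the two recurrence rates quoted above (the study’s group sizes are not given here, so none are assumed):

# Relative risk of C. difficile recurrence, based only on the percentages quoted above.
recurrence_with_appendix = 0.11      # recurrence rate among patients with an intact appendix
recurrence_without_appendix = 0.48   # recurrence rate among patients without an appendix

relative_risk = recurrence_without_appendix / recurrence_with_appendix
print(f"Relative risk of recurrence without an appendix: {relative_risk:.1f}x")
# Prints about 4.4x - hence the "roughly four times more likely" in the text.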

The results do not unequivocally prove Parker’s thesis, but they do provide further strong circumstantial evidence for the hypothesis that the human appendix plays an important role in human health. 

***

So, through ingenious science, a number of important functions have been found for the appendix, and human knowledge has advanced. Still, if that were all there was to it, the topic would remain esoteric, limited to specialists in academic medicine. But that is not all there is to it.

The uselessness of the appendix is a long-standing urban legend, going back at least as far as Charles Darwin. Darwin argued in The Origin of Species that the widespread occurrence of vestigial organs – organs that may have once had a function but are now useless, mere vestiges of the past – is evidence against Creation. “On the view of each organism with all its separate parts having been specially created, how utterly inexplicable is it that organs bearing the plain stamp of inutility… should so frequently occur.” But such organs, he argued, are not only explained by his theory, but would even have been predicted by it: “On the view of descent with modification, we may conclude that the existence of organs in a rudimentary, imperfect, and useless condition, or quite aborted, far from presenting a strange difficulty, as they assuredly do on the old doctrine of creation, might even have been anticipated in accordance with the views here explained.”[16]

This argument was then bequeathed to subsequent generations of biologists. It was endorsed as recently as 2001 by Ernst Mayr, one of the leading biologists of the twentieth century: “Every shift into a new adaptive zone leaves a residue of no longer needed morphological features that then become an impediment. One only needs to think of the many weaknesses in humans that are remnants of our quadrupedal and more vegetarian past, for instance… the caecal appendix.”[17]

When making his overall argument, Darwin used some examples of apparently vestigial organs which, I think, are uncontroversial. Some cave-dwelling fish are completely blind, although they possess eyes; others have no eyes at all. The species Astyanax mexicanus is born with eyes, but, as it matures, skin grows over the eyes and they degenerate completely – there is no need for sight in the dark world of a troglodyte. The eyes of blind cavefish are vestigial, and this appears to have happened through disuse. Darwin wrote that “It appears probable that disuse has been the main agent in rendering organs rudimentary. It would at first lead by slow steps to the more and more complete reduction of a part, until at last it became rudimentary, as in the case of the eyes of animals inhabiting dark caverns…”[18]

But sightless cave-dwelling fish are weak evidence for biological evolution. Firstly, as everyone who has built towers with blocks in the company of a two-year-old knows, there is a big difference between constructing and demolishing. A toddler is quite capable of bashing down impressive edifices, but cannot pile more than two or three blocks on one another. The fact that disuse can lead to the atrophying of an organ is by no means indicative of a purposeless, natural process being capable of constructing an organ where none previously existed. Whatever process leads to the loss of an organ’s abilities must be demonstrated to be capable of creative activity, if vestigial organs are to serve as evidence for biological evolution. As the biologist Lynn Margulis put it, “Natural selection eliminates and maybe maintains, but it doesn’t create.”[19]

Secondly, in the case of eyes, there is no doubt as to their function – to provide sight – which is superfluous in pitch-black caves. Even if cave-dwelling fish had perfectly-functioning eyes, they would still be useless, because there is no light in the caves these fish inhabit. It is therefore reasonable to posit that the eyes of blind cavefish functioned in the distant past, but atrophied over time due to lack of use. But when it comes to other apparently-vestigial organs, whose function in the distant past we do not know with certainty, the conclusion does not inexorably follow that they are really vestigial.

The parathyroid gland is a good example of this last point. It was discovered in humans in 1880. All through the early twentieth century, biologists surmised that these glands were among the many useless parts they believed existed in the human body. Not anymore. The parathyroid is now known to regulate calcium-phosphorus metabolism,[20] and it also plays a role in magnesium metabolism by increasing its excretion.[21] [A similar narrative is associated with tonsils: the removal of tonsils may lead to a higher incidence of throat cancer, but for decades doctors never suspected that this “useless” tissue might actually have a use that escaped their detection.[22]] In the decades when it was claimed that the parathyroid glands are vestigial, the argument rested on ignorance. Nobody knew what the parathyroid does, and nobody knew what the parathyroid was ever supposed to have done. To have claimed that these glands are vestigial was to argue from ignorance. We now know better.

Back to the appendix. In The Descent of Man, Darwin cited the human appendix as an example of a vestigial organ.[23] As we saw earlier, Darwin was wrong. In his time, immunology and endocrinology could barely be said to exist. How wise was it, in retrospect, to label the appendix vestigial? And if we now confront an organ that appears to have no function, how wise is it to label it vestigial? And once we agree that labelling organs as vestigial is premature unless we know with certainty what their function was in the distant past, what happens to these organs’ role as evidence for biological evolution?

A final point. Not only does the appendix not provide evidence for biological evolution, it in fact constitutes a difficulty for the evolutionary paradigm. The appendix is found both in marsupials, such as the wombat, and in placental mammals, such as rats, lemmings and humans. But since, according to evolutionary theory, the last putative common ancestor of marsupials and placental mammals did not possess an appendix, evolutionists are forced to believe that the same organ evolved twice, independently.[24] Needless to say, this is fantastically improbable.[25]

Darwin of the Gaps

One of the curious twists in the story we have followed is the role reversal. In the past, proponents of scientism often threw the God of the Gaps argument in the face of believers. The argument is supposed to work like this. Believers, wallowing in ignorance, would explain sundry aspects of nature with a shrug and “God did it” or “That’s how God wanted it to be”. But as science progressed, and one dark corner after another was illuminated by the light of reason and empirical evidence, the believers had to retreat. The gaps in our understanding of the natural universe shrank, leaving less and less for God to accomplish.

The God of the Gaps argument was always silly, but in the case of vestigial organs the shoe – of a decidedly uncomfortable fit – is on the other foot. Biologists claimed of various organs that they must be products of evolution, an argument that depended on the claim that these organs had no function and thus had to be vestiges of a more useful past. But as human knowledge expands, we find that structures previously assumed to be without function in fact have important functions. With each of these discoveries, evolutionary biologists have had to retreat.

***

And what about the Jewish angle? Well, the Talmud teaches that everything that God created has a purpose.[26] And Maharal (ca. 1520-1609), one of the greatest Jewish philosophers of the past half-millennium, wrote that even if we do not understand every feature of the human body, we can be quite certain that nothing in it is superfluous.[27] Are those viewpoints compatible with the notion that our bodies are littered with useless vestiges of the past? I am not sure that these statements are sufficient to preclude the conclusion that wisdom teeth, for example, are vestiges of a past in which our diet was substantially different from what it is nowadays. But I do know this. The left atrial appendage is a small muscular ‘pouch’ in the heart. Dr. Amanda Varnava, a cardiologist quoted in the newspaper article I cited at the beginning of this essay, says that “It is completely redundant – it has no functional role.” It would be prudent to reserve judgment.[28]


[1] See http://oolon.awardspace.com/SMOGGM.htm.

Retrieved 8th April 2013.

[2] See http://discovermagazine.com/2004/jun/useless-body-parts#.UWKY5qJT6Ag.

Retrieved 8th April 2013.

[3] The Star, Wednesday, 20th March 2013, page 36, The Bits of Your Body that Nature Got Wrong, by Lucy Elkins:

The human body is a wonder of nature: our brains react faster than a computer, our hearts beat without the need for rest. But it’s not perfect. The appendix, for instance, seems to have no real function yet can be the cause of appendicitis and, if left untreated, life-threatening peritonitis (inflammation of the abdominal lining) – so it could be simpler to get rid of it.

EARS… Once the ear drum has collected the noise vibrations, they are passed into the inner ear. Here, the noise vibrations are picked up by tiny crystalline rods that look like hairs. “These are very fine and help make our hearing especially sensitive by adding volume and clarity,” [ENT specialist Tony Wright] adds. But this sensitivity also means the rods are prone to damage. “Something like an explosion can destroy them instantly – and persistent loud noise of 85 decibels or more (such as in a noisy factory or a disco) can damage them over years. The problem is that we can’t regenerate them.”

The article can be read online:

http://www.iol.co.za/lifestyle/body-parts-we-just-don-t-need-1.1476443#.Ub7GQeel4dU

Retrieved 17th June 2013.

[4] To read about some of the latest research regarding the astonishing abilities of the human ear, see http://www.evolutionnews.org/2013/03/our_ears_are_am070551.html.

Retrieved 14th June 2013.

[5] מסכת מנחות דף קג עמוד ב במשנה: אין לא ראינו ראיה. (Menachot 103b, in the Mishnah: “We did not see” is not a proof.)

[6] Steven R. Scadding, “Do ‘vestigial organs’ provide evidence for evolution?” Evolutionary Theory 5 (1981): 173-176. See this article by Dr. Jonathan Wells:

http://www.evolutionnews.org/2009/05/the_myth_of_vestigial_organs_a020111.html

Retrieved 14th June 2013.

[7] See http://www.scientificamerican.com/article.cfm?id=what-is-the-function-of-t

Retrieved 10th April 2013.

[8] A report on LiveScience dated 29th May 2006 states:

The appendix is a slimy, dead-end sac that hangs between the small and large intestines.  It’s about a half inch in diameter and three inches long.  As quickly as 11 weeks after conception, the appendix starts making endocrine cells for the developing fetus.  Endocrine cells secrete useful chemicals, such as hormones, and the appendix endocrine cells secrete amines and peptide hormones that help with biological checks and balances as the fetus grows.

See http://www.livescience.com/10489-appendix-slimy-worthless.html.

Retrieved 14th April 2013.

[9] The same LiveScience report also makes the following point:

http://www.livescience.com/10489-appendix-slimy-worthless.html

After birth, the appendix mainly helps the body stave off disease by serving as a lymphoid organ.  Lymphoid organs, with their lymphoid tissue, make white blood cells and antibodies.

The appendix, by virtue of its lymphoid tissue, is part of a complicated chain that makes B lymphocytes (one variety of white blood cell) and a class of antibodies known as immunoglobulin A antibodies.  The appendix also produces certain chemicals that help direct the white blood cells to the parts of the body where they are needed the most. 

[10] There is a sinister side to this surgical versatility. In 2001, rumours were circulating in Greek hospitals that surgery residents, eager to rack up scalpel time, were falsely diagnosing hapless Albanian immigrants with appendicitis. At the University of Ioannina medical school’s teaching hospital, a newly minted doctor named Athina Tatsioni was discussing the rumours with colleagues when a professor who had overheard asked her if she’d like to try to prove whether they were true – he seemed to be almost daring her. She accepted the challenge and, with the professor’s help, eventually produced a formal study showing that, for whatever reason, the appendices removed from patients with Albanian names in six Greek hospitals were more than three times as likely to be perfectly healthy as those removed from patients with Greek names. “It was hard to find a journal willing to publish it, but we did,” recalls Tatsioni. “I also discovered that I really liked research.” See http://www.theatlantic.com/magazine/print/2010/11/lies-damned-lies-and-medical-science/308269/.

Retrieved 6th May 2013.

[11] My account of Dr. Parker’s work is based on an article on the website of Scientific American. It can be read here: http://blogs.scientificamerican.com/guest-blog/2012/01/02/your-appendix-could-save-your-life/

Retrieved 9th April 2013.

[12] See Bollinger RR, Barbas AS, Bush EL, Lin SS, Parker W. Biofilms in the large bowel suggest an apparent function of the human vermiform appendix. Journal of Theoretical Biology, 2007 Dec 21; 249(4):826-31.

[13] See http://www.sciencedaily.com/releases/2007/10/071008102334.htm.

Retrieved 14th June 2013.

[14] About one person in twenty has the appendix removed.

[15] See the abstract here: http://www.cghjournal.org/article/S1542-3565(11)00580-5/abstract

Retrieved 9th April 2013.

[16] Darwin, The Origin of Species, Chapters XIV (p. 402) and XV (p. 420). Available online at http://darwin-online.org.uk/content/frameset?viewtype=side&itemID=F391&pageseq=430

Retrieved 14th June 2013.

[17] Ernst Mayr, What Evolution Is, Basic Books, page 143.

[18] Charles Darwin, Origin of Species (1859), sixth edition, page 401. See http://darwin-online.org.uk/content/frameset?keywords=caverns%20dark%20inhabiting%20animals&pageseq=430&itemID=F401&viewtype=text.

Retrieved 8th April 2013.

“By the time that an animal had reached, after numberless generations, the deepest recesses, disuse will on this view have more or less perfectly obliterated its eyes, and natural selection will often have affected other changes, such as an increase in the length of antennae or palpi, as a compensation for blindness.” Charles Darwin, Origin of Species (1859), sixth edition, page 111. See

http://darwin-online.org.uk/content/frameset?keywords=recesses%20the%20deepest&pageseq=139&itemID=F391&viewtype=text

Retrieved 14th June 2013.

[19] See http://discover.coverleaf.com/discovermagazine/201104?pg=68#pg70.

Retrieved 14th June 2013.

[20] See http://www.livescience.com/10489-appendix-slimy-worthless.html.

Retrieved 11th April 2013.

[21] parathyroid gland. (2009). Encyclopædia Britannica. Encyclopædia Britannica 2009 Ultimate Reference Suite.  Chicago: Encyclopædia Britannica.

[22] “Tonsillectomies have been linked to dozens of medical complications and conditions, ranging from polio to weight gain. A typical example is a study published in 2011 in the European Heart Journal. The study “found a higher risk of AMI [acute myocardial infarction] related to surgical removal of the tonsils and appendix before age 20. These results are consistent with the hypothesis that subtle alterations in immune function following these operations may alter the subsequent cardiovascular risk…” The Black Swan, Nassim Nicholas Taleb, Penguin Books, 2010, page 55.

See http://eurheartj.oxfordjournals.org/content/early/2011/05/27/eurheartj.ehr137.

Retrieved 14th June 2013.

See this article for a general overview:

http://boingboing.net/2012/01/10/tonsillectomy-confidential-do.html

[23] Darwin, Charles. The Descent of Man, First Edition (London: John Murray, 1871), Chapter I (page 27). Available at http://darwin-online.org.uk/content/frameset?viewtype=side&itemID=F937.1&pageseq=40.

With respect to the alimentary canal I have met with an account of only a single rudiment, namely the vermiform appendage of the caecum. The caecum is a branch or diverticulum of the intestine… It appears as if, in consequence of changed diet or habits, the caecum had become much shortened in various animals, the vermiform appendage being left as a rudiment of the shortened part… Not only is it useless, but it is sometimes the cause of death… this is due to small hard bodies, such as seeds, entering the passage and causing inflammation.

[24] This process is called convergent evolution.

[25] http://www.livescience.com/10571-appendix-fact-promising.html

[26] מסכת שבת דף עז עמוד ב: אמר רב יהודה אמר רב כל מה שברא הקב”ה בעולמו לא ברא דבר אחד לבטלה. (Shabbat 77b: Rav Yehudah said in the name of Rav: of all that the Holy One, blessed be He, created in His world, He did not create a single thing without purpose.)

[27] מהר”ל תפארת ישראל פרק ח: וכבר אמרנו שאף אם אין ידוע לנו טעם וסבה של כל דבר ודבר שנמצא באדם למה הוא כך, מכל מקום ידוע לנו שאין דבר אחד לבטלה… (Maharal, Tiferet Yisrael, chapter 8: even if we do not know the reason and cause of each and every feature found in man – why it is so – we nevertheless know that not a single thing is purposeless…)

[28] A paper published in November 1999 in the journal Heart states that:

The physiological properties and anatomical relations of the LAA [left atrial appendage] render it ideally suited to function as a decompression chamber during left ventricular systole and during other periods when left atrial pressure is high. These properties include the position of the LAA high in the body of the left atrium; the increased distensibility of the LAA compared with the left atrium proper; the high concentration of atrial natriuretic factor (ANF) granules contained within the LAA; and the neuronal configuration of the LAA… Obliteration or amputation of the LAA may help to reduce the risk of thromboembolism, but this may result in undesirable physiological sequelae such as reduced atrial compliance and a reduced capacity for ANF secretion in response to pressure and volume overload. See http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1760793/.

Retrieved 13th May 2013.

Origin of Life and Philosophical Outlook

June 28, 2013

In Signature in the Cell, Dr. Stephen Meyer presented a comprehensive and accessible history of research into the origin of life. In this post, we take a bird’s eye view of research into this area over the past three-quarters of a century. We shall also digress in order to get a snapshot of how ideological commitments shape the views of many scientists.

***

Let’s begin with Dr. Ernst Chain. Chain won a Nobel Prize for his contribution to the development of penicillin. I mentioned him in Genesis and Genes, in the context of the discussion about whether evolutionary theory is relevant to nuts-and-bolts research in biology. I cited an article by Philip Skell (1918-2010), who was a distinguished professor of chemistry, a member of the National Academy of Sciences in the USA, and a prominent Darwin sceptic. In a 2009 article on Forbes.com entitled The Dangers of Overselling Evolution, he made the point that evolutionary theory makes no contribution to actual research:

In 1942, Nobel Laureate Ernst Chain wrote that his discovery of penicillin (with Howard Florey and Alexander Fleming) and the development of bacterial resistance to that antibiotic owed nothing to Darwin’s and Alfred Russel Wallace’s evolutionary theories.[1]

Chain understood the immensity of the task of trying to explain life in naturalistic terms. In The Life of Ernst Chain: Penicillin and Beyond, we read that:

I have said for years that speculations about the origin of life lead to no useful purpose as even the simplest living system is far too complex to be understood in terms of the extremely primitive chemistry scientists have used in their attempts to explain the unexplainable that happened billions of years ago.[2]

In August 1954, Dr. George Wald, another Nobel Laureate, wrote in Scientific American:

There are only two possibilities as to how life arose. One is spontaneous generation arising to evolution; the other is a supernatural creative act of God. There is no third possibility… a supernatural creative act of God. I will not accept that philosophically because I do not want to believe in God, therefore I choose to believe that which I know is scientifically impossible; spontaneous generation arising to Evolution.

 This statement may seem astonishingly frank to many members of the public. Informed consumers of science, in contrast, are aware that much of the debate around the origin of life and biological evolution has precious little to do with drawing inevitable conclusions from straightforward evidence. It is far more about worldviews and ideologies, and only extremely naive observers assume that this does not apply to scientists who participate in the debate. Wald makes it perfectly clear that his direction was dictated by his philosophical leanings, and that is true of many scientists and Western intellectuals. Consider the views of Thomas Nagel. Nagel is a courageous thinker whose latest book, Mind and Cosmos, is a fierce demolition of Darwinian evolution.[3] But Nagel will only go so far. In The Last Word, which appeared in 1997, he offered a candid account of his philosophical inclinations:

I am talking about something much deeper—namely, the fear of religion itself. I speak from experience, being strongly subject to this fear myself: I want atheism to be true and am made uneasy by the fact that some of the most intelligent and well-informed people I know are religious believers… It isn’t just that I don’t believe in God and, naturally, hope that I’m right in my belief. It’s that I hope there is no God! I don’t want there to be a God; I don’t want the universe to be like that.[4]

The fact that faith – the faith of many scientists in the ability of unguided matter and energy to create life – drives much of the discussion about evolution was underscored by Dr. Gerald Kerkut, Professor Emeritus of Neuroscience at the University of Southampton, who wrote in 1960 that:

The first assumption was that non-living things gave rise to living material. This is still just an assumption… There is, however, little evidence in favor of abiogenesis and as yet we have no indication that it can be performed… it is therefore a matter of faith on the part of the biologist that abiogenesis did occur and he can choose whatever method… happens to suit him personally; the evidence for what did happen is not available.

Harold Urey won a Nobel Prize in chemistry, but is probably more famous for participating, with his graduate student Stanley Miller, in what became known as the Miller-Urey experiment. In The Christian Science Monitor of 4th January 1962, he wrote:

All of us who study the origin of life find that the more we look into it, the more we feel it is too complex to have evolved anywhere. We all believe as an article of faith that life evolved from dead matter on this planet. It is just that its complexity is so great, it is hard for us to imagine that it did.

 Hubert Yockey, the renowned information theorist, wrote in the Journal of Theoretical Biology in 1977 that:

One must conclude that… a scenario describing the genesis of life on earth by chance and natural causes which can be accepted on the basis of fact and not faith has not yet been written.

Richard Dickerson, a molecular biologist at UCLA, wrote in 1978 in Scientific American that: 

The evolution of the genetic machinery is the step for which there are no laboratory models; hence one can speculate endlessly, unfettered by inconvenient facts. The complex genetic apparatus in present-day organisms is so universal that one has few clues as to what the apparatus may have looked like in its most primitive form.[5]

 Francis Crick needs no introduction. In Life Itself, published in 1981, he wrote that: 

Every time I write a paper on the origin of life, I determine I will never write another one, because there is too much speculation running after too few facts.

 Crick’s conclusion is that:

The origin of life seems almost to be a miracle, so many are the conditions which would have had to have been satisfied to get it going.[6]

 Prominent origin-of-life researcher Leslie Orgel wrote in New Scientist in 1982 that:

Prebiotic soup is easy to obtain. We must next explain how a prebiotic soup of organic molecules, including amino acids and the organic constituents of nucleotides evolved into a self-replicating organism. While some suggestive evidence has been obtained, I must admit that attempts to reconstruct the evolutionary process are extremely tentative.[7]

The views of the eminent astrophysicist Fred Hoyle are particularly interesting. He struggled with the conflict between his ardent atheism and his knowledge of the excruciating difficulty of positing a naturalistic start to life. Writing in 1984, Hoyle stated that:

From my earliest training as a scientist I was very strongly brain-washed to believe that science cannot be consistent with any kind of deliberate creation. That notion has had to be very painfully shed. I am quite uncomfortable in the situation, the state of mind I now find myself in. But there is no logical way out of it; it is just not possible that life could have originated from a chemical accident.[8]

 The writer Andrew Scott hit the nail on the head when he wrote, in 1986, that most scientists’ adherence to naturalistic accounts of the origin of life owed little to the evidence and much to ideological commitments:

But what if the vast majority of scientists all have faith in the one unverified idea? The modern ‘standard’ scientific version of the origin of life on earth is one such idea, and we would be wise to check its real merit with great care. Has the cold blade of reason been applied with sufficient vigor in this case? Most scientists want to believe that life could have emerged spontaneously from the primeval waters, because it would confirm their belief in the explicability of Nature – the belief that all could be explained in terms of particles and energy and forces if only we had the time and the necessary intellect.[9]

 This conclusion is mirrored in the words of Paul Davies, a theoretical physicist and authority on origin-of-life studies. Writing in 2002, Davies affirms that it is scientists’ adherence to methodological naturalism that drives their agenda and conclusions:

First, I should like to say that the scientific attempt to explain the origin of life proceeds from the assumption that whatever it was that happened was a natural process: no miracles, no supernatural intervention. It was by ordinary atoms doing extraordinary things that life was brought into existence. Scientists have to start with that assumption.[10]

 In 1988, Klaus Dose, another prominent origin-of-life theorist, summed up the situation nicely when he wrote that: 

More than 30 years of experimentation on the origin of life in the fields of chemical and molecular evolution have led to a better perception of the immensity of the problem of the origin of life on Earth rather than to its solution. At present all discussions on principal theories and experiments in the field either end in stalemate or in a confession of ignorance.[11]

 Carl Woese was a pioneer in taxonomy, and one of the major figures in 20th century microbiology. His view of the origin of life: 

In one sense the origin of life remains what it was in the time of Darwin – one of the great unsolved riddles of science. Yet we have made progress… many of the early naïve assumptions have fallen or have fallen aside… while we do not have a solution, we now have an inkling of the magnitude of the problem.[12]

Paul Davies, too, writes that no substantive progress has been made in this area since Darwin’s time. In a recent short paper suggesting that life be viewed as a software package, he observes:

Darwin pointedly left out an account of how life first emerged, “One might as well speculate about the origin of matter,” he quipped. A century and a half later, scientists still remain largely in the dark about life’s origins. It would not be an exaggeration to say that the origin of life is one of the greatest unanswered questions in science.[13]

 Readers of Genesis and Genes will recall Richard Lewontin’s admission that his mathematical models of evolutionary mechanisms are a sham – they do not correspond to reality. The biologist Lynn Margulis reminisced:

Population geneticist Richard Lewontin gave a talk here at UMass [University of Massachusetts] Amherst about six years ago, and he mathematized all of it – changes in the population, random mutation, sexual selection, cost and benefit. At the end of his talk he said, “You know, we’ve tried to test these ideas in the field and the lab, and there are really no measurements that match the quantities I’ve told you about.” This just appalled me. So I said, “Richard Lewontin, you are a great lecturer to have the courage to say it’s gotten you nowhere. But then why do you continue to do this work?” And he looked around and said, “It’s the only thing I know how to do, and if I don’t do it I won’t get grant money.” So he’s an honest man, and that’s an honest answer.

 Lewontin, who is one of the most prominent geneticists in the world and a protégé of one of the founders of neo-Darwinism, Theodosius Dobzhansky, was equally forthright about the role that faith plays in moulding scientists’ approach to important issues. In his review of a book by Carl Sagan, Lewontin wrote in 1997 that:

We take the side of science in spite of the patent absurdity of some of its constructs, in spite of its failure to fulfill many of its extravagant promises of health and life, in spite of the tolerance of the scientific community for unsubstantiated just-so stories, because we have a prior commitment, a commitment to materialism. It is not that the methods and institutions of science somehow compel us to accept a material explanation of the phenomenal world, but, on the contrary, that we are forced by our a priori adherence to material causes to create an apparatus of investigation and a set of concepts that produce material explanations, no matter how counter-intuitive, no matter how mystifying to the uninitiated. Moreover, that materialism is absolute, for we cannot allow a Divine Foot in the door.[14]

Stuart Kauffman of the Santa Fe Institute is one of the world’s leading origin-of-life researchers and an expert on self-organising systems. He writes:

Anyone who tells you that he or she knows how life started on the earth some 3.45 billion years ago is a fool or a knave. Nobody knows.[15]

 In Genesis and Genes, I also quoted the biochemist Franklin Harold. In his book The Way of the Cell, Harold frankly acknowledged that “We must concede that there are presently no detailed Darwinian accounts of the evolution of any biochemical or cellular system, only a variety of wishful speculations.”[16] Regarding the origin of life, Harold writes that:

It would be agreeable to conclude this book with a cheery fanfare about science closing in, slowly but surely, on the ultimate mystery; but the time for rosy rhetoric is not yet at hand. The origin of life appears to me as incomprehensible as ever, a matter for wonder but not for explication.[17]

Massimo Pigliucci was formerly a professor of evolutionary biology and philosophy at the State University of New York at Stony Brook, and holds doctorates in genetics, botany, and the philosophy of science. He is currently the chairman of the department of philosophy at the City University of New York. He is a prominent international proponent of evolution and the author of several books. Writing in 2003, Pigliucci conceded that “[I]t has to be true that we really don’t have a clue how life originated on Earth by natural means.”[18]

In 2007, we find science writer Gregg Easterbrook writing in Wired: “What creates life out of the inanimate compounds that make up living things? No one knows. How were the first organisms assembled? Nature hasn’t given us the slightest hint. If anything, the mystery has deepened over time.”[19]

 Also in 2007, Harvard chemist George M. Whitesides, in accepting the highest award of the American Chemical Society, wrote: “The Origin of Life. This problem is one of the big ones in science. It begins to place life, and us, in the universe. Most chemists believe, as do I, that life emerged spontaneously from mixtures of molecules in the prebiotic Earth. How? I have no idea… On the basis of all the chemistry that I know, it seems to me astonishingly improbable.”[20] 

As recently as 2011, Scientific American acknowledged that origin-of-life research has gotten nowhere in the last century. In an article by John Horgan, we read that:

Dennis Overbye just wrote a status report for the New York Times on research into life’s origin, based on a conference on the topic at Arizona State University. Geologists, chemists, astronomers, and biologists are as stumped as ever by the riddle of life.[21]

 Also writing in 2011, Dr. Eugene Koonin provided a neat summary of the utter failure of this endeavour: 

The origin of life is one of the hardest problems in all of science… Origin of Life research has evolved into a lively, interdisciplinary field, but other scientists often view it with skepticism and even derision. This attitude is understandable and, in a sense, perhaps justified, given the “dirty” rarely mentioned secret: Despite many interesting results to its credit, when judged by the straightforward criterion of reaching (or even approaching) the ultimate goal, the origin of life field is a failure – we still do not have even a plausible coherent model, let alone a validated scenario, for the emergence of life on Earth. Certainly, this is due not to a lack of experimental and theoretical effort, but to the extraordinary intrinsic difficulty and complexity of the problem. A succession of exceedingly unlikely steps is essential for the origin of life… these make the final outcome seem almost like a miracle.[22]

***

The area of origin-of-life research is fascinating not only for its own sake, but also for the way it exposes a notion that many uninformed members of the public take for granted, namely, that scientists are driven by data, and data alone. I elaborated on this misconception in Genesis and Genes, demonstrating that the commitment of many scientists to methodological naturalism is a far more important factor than the scientific evidence in reaching conclusions about life on Earth.

***

 See Also:

The post Certitude and Bluff:

http://torahexplorer.com/2013/01/15/certitude-and-bluff/

References:

Some of the quotations in this post come from an article by Rabbi Moshe Averick, published in The Algemeiner. The article can be read here:

http://www.algemeiner.com/2012/09/27/speculation-faith-and-unproven-assumptions-the-history-of-origin-of-life-research-in-scientists-own-words/

Retrieved 26th June 2013.

[1] The article can be read here:

http://www.forbes.com/2009/02/23/evolution-creation-debate-biology-opinions-contributors_darwin.html.

Retrieved 2nd November 2010.

[2] Ronald W. Clark, The Life of Ernst Chain: Penicillin and Beyond, Weidenfeld and Nicolson, London (1985), page 148.

[3] To read more about Nagel and his latest book, see these reviews:

http://www.newrepublic.com/article/112481/darwinist-mob-goes-after-serious-philosopher

http://www.weeklystandard.com/articles/heretic_707692.html

[4] See http://www.jidaily.com/914e2?utm_source=Jewish+Ideas+Daily+Insider

Retrieved 27th June 2013.

[5] Richard E. Dickerson, “Chemical Evolution and the Origin of Life”, Scientific American, Vol. 239, No. 3, September 1978, page 77.

[6] Life Itself, New York, Simon and Schuster, 1981, page 88.

[7] Leslie E. Orgel, “Darwinism at the very beginning of life”, New Scientist, Vol. 94, 15 April 1982, page 150.

[8] Fred Hoyle, Evolution from Space, New York, Simon and Schuster, 1984, page 53.

[9] Andrew Scott, “The Creation of Life: Past, Future, Alien”, Basil Blackwell, 1986, page 111.

[10] Paul Davies, “In Search of Eden, Conversations with Paul Davies and Phillip Adams”.

[11] Klaus Dose, “The Origin of Life: More Questions Than Answers”, Interdisciplinary Science Reviews, Vol. 13, No. 4, 1988, page 348.

[12] Carl Woese and Gunter Wachtershauser, “Origin of Life”, in Paleobiology: A Synthesis, eds. Briggs and Crowther (Oxford: Blackwell Scientific Publications, 1989).

[13] See: http://arxiv.org/abs/1207.4803.

Retrieved 27th June 2013.

[14] Richard Lewontin, “Billions and Billions of Demons”, The New York Review of Books, 9th January 1997.

[15] At Home in the Universe, London, Viking, 1995, page 31.

[16] Franklin Harold, The Way of the Cell: Molecules, Organisms and the Order of Life, Oxford University Press, 2001, page 205.

[17] Ibid. page 251.

[18] Massimo Pigliucci, “Where Do We Come From? A Humbling Look at the Biology of Life’s Origin,” in Darwin, Design and Public Education, eds. John Angus Campbell and Stephen C. Meyer (East Lansing, MI: Michigan State University Press, 2003), page 196.

[19] Gregg Easterbrook, “Where did life come from?”, Wired, February 2007, page 108.

[20] George M. Whitesides, “Revolutions in Chemistry: Priestley Medalist George M. Whitesides’ address”, Chemical and Engineering News, 85 (March 26, 2007): pp. 12-17. See http://ismagilovlab.uchicago.edu/GMW_address_priestley_medal.pdf.

Retrieved 22nd April 2012.

[21] John Horgan, Scientific American, 28th February 2011.

[22] Eugene Koonin, The Logic of Chance: The Nature and Origin of Biological Evolution (Upper Saddle River, NJ: FT Press, 2011), page 391.

Genesis and Genes on Television

June 15, 2013

A local television station, SABC 2, recently featured Genesis and Genes. The segment, which is about seven minutes long, is now available on YouTube. Here is the link:

 

http://www.youtube.com/watch?v=SiqEDnN0aM8

Science as a Self-Correcting Mechanism

June 9, 2013

Writing in the Huffington Post recently, Karl Giberson, a prominent proponent of theistic evolution, appealed to the well-known argument that science is a self-correcting mechanism.[1] He writes,

Science – and this includes evolution – is a self-correcting enterprise. I know little of psychiatry, but I am not shocked to discover that critical voices have emerged and are being heard. This is the norm for science. Seemingly secure science is often modified – think Newtonian physics – and entire fields even disappear, like phrenology (studying personality via bumps on the skull). Anyone who understands the scientific community knows it to be full of renegade individualists only too eager to overturn the status quo. This aggressive self-examination is the reason why we now understand the world so well…

The reality is different from this idyllic description, and informed consumers of science know that, public relations aside, there are serious doubts as to the extent to which science is a self-correcting enterprise. For example, the epidemiologist John Ioannidis wrote a paper in 2012 entitled Why Science Is Not Necessarily Self-Correcting.[2] The abstract begins as follows:

The ability to self-correct is considered a hallmark of science. However, self-correction does not always happen to scientific evidence by default. The trajectory of scientific credibility can fluctuate over time, both for defined scientific fields and for science at-large. History suggests that major catastrophes in scientific credibility are unfortunately possible and the argument that “it is obvious that progress is made” is weak.

 Ioannidis proceeds to mention one mechanism which renders self-correction less than perfect:

 Efficient and unbiased replication mechanisms are essential for maintaining high levels of scientific credibility. Depending on the types of results obtained in the discovery and replication phases, there are different paradigms of research: optimal, self-correcting, false nonreplication, and perpetuated fallacy. In the absence of replication efforts, one is left with unconfirmed (genuine) discoveries and unchallenged fallacies.

 What the last sentence means is that, if replication of research results is not a ubiquitous feature of science, there will be unchallenged fallacies. They will not be corrected. And, as we have discussed several times in this forum, replicability of research is a major weakness in contemporary science.

 Ioannidis is too savvy about problems with contemporary science to swallow Karl Giberson-type propaganda:

The self-correction principle does not mean that all science is correct and credible. A more interesting issue than this eschatological promise is to understand what proportion of scientific findings are correct (i.e., the credibility of available scientific results).

 And:

 Even if we believe that properly conducted science will asymptotically trend towards perfect credibility, there is no guarantee that scientific credibility continuously improves and that there are no gap periods during which scientific credibility drops or sinks (slightly or dramatically). The credibility of new findings and the total evidence is in continuous flux. It may get better or worse.

 The paper by Ioannidis is enlightening. I was particularly pleased to discover that arguments I made in Genesis and Genes mirrored those made by Ioannidis. So I reproduce here the section of the book which deals with the issue of science as a self-correcting mechanism:

Jonathan: I’ve heard it said that the fact that new theories replace old theories only proves that science is a self-correcting enterprise. Do you agree?

YB: That’s a nice way to put a happy face on it. But there are two serious problems with this suggestion. Firstly, even if science were this gigantic super-tanker that eventually turns around, it might be too slow for the individual who lived while the old paradigm prevailed. Let’s consider the demise of the eternal universe paradigm. Until 1965, most scientists believed that the universe had never been created – it was eternal. This stood in total contrast to the Torah view that the universe was created at a specific point. By 1965, the old paradigm had collapsed, and was replaced by the Big Bang model, according to which the universe came into existence, apparently ex nihilo. Now imagine a person who died in 1950. Does it help him that science is a self-correcting mechanism? His entire life was spent in the shadow of the monolithic scientific consensus that the universe is eternal. Since he, like all of us, was not a prophet, he could not foresee that some time after his death, the scientific paradigm that dominated his life would crumble and be replaced with a radically different picture. If this person had been a Jew, he would have lived his entire life with unresolved tension between the scientific paradigm that the universe is eternal, and Jewish belief in the creation of the universe. So this business of self-correction, even if it were true, is only good for historians. It won’t help your average individual struggling with a particular issue and having only one lifetime.

 Jonathan: I see. But you mentioned that there were two problems with this suggestion.

YB: Yes. The second problem is this: Why do you believe that science is a self-correcting mechanism? It is because we know that in specific cases, certain beliefs that the scientific community subscribed to turned out to be wrong and were discarded. But there is no way to estimate in what percentage of all cases science indeed reverses its course. Oh, I know the party line about how scientists constantly scrutinise the evidence, compare their hypotheses to experimental results and the rest of it. But we saw enough in the previous chapters to appreciate that in real life, it hardly ever reaches this ideal. I described some stories that had happy endings, like the one involving Dr. Robin Warren, who established that bacteria cause some ulcers. But do you know how many stories had a sad ending? Can you estimate how often in the past a researcher had a hunch but abandoned his line of research when he was subjected to ridicule? Do you have any way of estimating which ending happens more frequently, the sad or the happy? What if for every case like Dr. Warren’s, there were a hundred scientists who had a promising insight or idea, but were deterred by the initial rejection they experienced? We only hear the stories with a happy ending. But scientists are human beings, and most human beings don’t have a thick skin.

See Also:

The post Dr. John Ioannidis and the Reality of Research:

http://torahexplorer.com/2013/05/05/dr-john-ioannidis-and-the-reality-of-research/

The post Dr. Ben Goldacre and the Reproducibility of Research:

http://torahexplorer.com/2013/04/10/dr-ben-goldacre-and-the-reproducibility-of-research/

 

References:

[1] See http://www.huffingtonpost.com/karl-giberson-phd/evolutions-refusal-to-die_b_3292734.html.

Retrieved 9th June 2013.

[2] See http://pps.sagepub.com/content/7/6/645.

Retrieved 9th June 2013.

Brain Scam

June 3, 2013

Imagine that Tom is analysing a work of literature – The Grapes of Wrath, say. He looks at the plot, characterisation, historical context, and uses the various tools of literary analysis. But now Tom takes the study further, and begins to examine the type of paper that the book was printed on. Next, he looks at the ink used, employing gas chromatography to elucidate the chemical makeup of its ingredients. What if at some point Tom insists that the book can be fully understood through this latter, scientific methodology, and that the novel is nothing more than the sum of its parts – the molecular interactions between ink droplets and the cellulose in the paper?

Science has made great progress in the last three centuries by pressing the cause of reductionism. The idea is that underneath complex phenomena and entities are simpler, more fundamental layers that can be studied in order to fully elucidate the complex conglomerate. For example, biology has benefited by exploiting the reductionist tools of biochemistry – reducing complex biological phenomena to the level of chemistry. But, as in our example above, the process can go haywire, as when claims are made that human beings are no more than a collection of biochemical responses to stimuli and neuronal interactions. [Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality (2012) disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.]

Informed consumers of science need to be aware of reductio ad absurdum in the realm of brain scans. The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And in The Invisible Gorilla (2010), Christopher Chabris and Daniel Simons advise readers to be wary of such “brain porn”. But popular magazines, science websites and books are frenzied consumers of – and proselytisers for – these scans. “This is your brain on music”, announces a caption to a set of fMRI images, and we are invited to conclude that we now understand more about the experience of listening to music. The genre is inexhaustible: “This is your brain on poker”, “This is your brain on metaphor”, “This is your brain on diet soda”, “This is your brain on God” and so on. The attempt to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies, is nothing less than an intellectual pestilence, a plague of neuroscientism, also known as neurobabkes. For years, the uninformed public has been deluged by references to innumerable studies that “explain” the most complex, subtle and ethereal phenomena on the basis of some colour-drenched picture of a sliced brain. The accompanying report, which purports to explain why human beings love, or envy, or believe in God, or prefer Coke to Pepsi, is heavy on neuro-babble. This is reductionist science run amok. The ubiquity of headlines containing phrases like brain scans show is matched only by the confusion they create in the minds of the public, uninformed about science as it is. So let’s revise some basics.

The human brain is, so far as we know, the most complex object in the universe. That a part of it “lights up” on a functional magnetic resonance imaging (fMRI) scan does not mean that the rest is inactive; it means that certain areas in the brain have an elevated oxygen consumption when a subject performs a task such as reading or reacting to stimuli such as pictures or sounds. The significance of this is not necessarily obvious. Technicolor brain scans are not anything remotely like photographs of the brain in action in real time. Scientists cannot “read” minds. Paul Fletcher, Professor of health neuroscience at Cambridge University, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he says, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

In fact, a new branch of the neuroscience-explains-everything genre may be created at any time by simply attaching the prefix “neuro” to whatever. So “neuroeconomics” is the latest in a line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”, and there is “neurotheology”, “neuromarketing” and other assorted neurononsense.

When the media conjure up stories with titles like “Brain Scans Show Vegetarians and Vegans More Empathic than Omnivores”, the content is almost entirely fictitious. It would be hilarious if not for the fact that the masses out there take this as Science – magisterial, peremptory, authoritative. Examples of this pop-science abound. Marketing consultant Martin Lindstrom tells us that people “love” their iPhones. This conclusion is based on the fact that brain scans of telephone users listening to their personal ring tones showed a “flurry of activation” in the insula, a prune-sized area of the brain. But researchers at UCLA claimed that photos of former presidential candidate John Edwards provoked feelings of “disgust” in subjects because they “lit up” the… insula. Is dopamine “the molecule of intuition”, as Jonah Lehrer suggested in The Decisive Moment (2009), or is it the basis of “the neural highway that’s responsible for generating the pleasurable emotions”, as he wrote in Imagine (2012)? Susan Cain’s Quiet: the Power of Introverts in a World That Can’t Stop Talking (2012), meanwhile, calls dopamine the “reward chemical” and postulates that extroverts are more responsive to it. Other stars of the pop literature are the hormone oxytocin (the “love chemical”) and mirror neurons, which allegedly explain empathy.

***

Informed consumers of science are aware that just about any conclusion in science – but especially in psychiatry, neurology and psychology – is possible, if you pick your evidence carefully. “Having outlined your theory,” says Professor Fletcher, “you can then cite a finding from a neuroimaging study identifying, for example, activity in a brain region such as the insula… You then select from among the many theories of insula function, choosing the one that best fits with your overall hypothesis, but neglecting to mention that nobody really knows what the insula does or that there are many ideas about its possible function.” The insula plays a role in a broad range of psychological experiences, including empathy and disgust, but also sudden insight, uncertainty, and the awareness of bodily sensations, such as pain, hunger, and thirst. With such a broad physiological portfolio, it is no surprise that the insula is activated in many fMRI studies.

Even more versatile than the insula is the infamous amygdala. Invariably described as “primitive” or even “reptilian”, the amygdala shows increased activation when one experiences fear, but it also springs to life when one encounters novel or unexpected stimuli. The multi-functionality of most brain areas makes reasoning backwards – from the neural activation depicted by a scan to the subjective experience of the brain’s owner – a dubious strategy. This approach, formally referred to as “reverse inference”, is nothing but a high-tech and expensive Rorschach test, inviting interpreters to read whatever they wish into ambiguous findings. There is strong evidence for the amygdala’s role in fear, but then fear is one of the most heavily studied emotions; popularisers downplay or ignore the amygdala’s associations with the cuddlier emotions and memory. (In The Republican Brain, Mooney suggests that “conservatives and authoritarians” might be the nasty way they are because they have a “more active amygdala”.)
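
A toy Bayesian calculation shows why reverse inference is so weak. The numbers below are invented for illustration – they are not measured activation probabilities for the insula – but the structure of the argument is general: because the region activates under many different mental states, knowing only that it activated tells you very little about which state the subject was in.

# Reverse inference as a Bayes problem -- all probabilities here are invented
# for illustration, not taken from any imaging study.

prior = {                       # assumed base rates of the subject's mental state
    "empathy": 0.10,
    "disgust": 0.10,
    "pain": 0.15,
    "uncertainty": 0.25,
    "other": 0.40,
}
p_active_given_state = {        # assumed forward probabilities P(insula active | state)
    "empathy": 0.6,
    "disgust": 0.6,
    "pain": 0.5,
    "uncertainty": 0.4,
    "other": 0.2,
}

# Bayes' theorem: P(state | active) = P(active | state) * P(state) / P(active)
p_active = sum(p_active_given_state[s] * prior[s] for s in prior)
for state in prior:
    posterior = p_active_given_state[state] * prior[state] / p_active
    print(f"P({state} | insula active) = {posterior:.2f}")

On these made-up numbers, seeing the insula “light up” raises the probability that the subject is feeling empathy from 10% to about 16% – hardly the basis for a headline. Reading “empathy” off the scan requires ignoring every other state that drives the same region.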

Brain imaging is ubiquitous in pop science mostly because the images are mediagenic. The technology lulls the hoi polloi into thinking that the most complex entities and phenomena are reducible to simple images on a screen, a perfect fit for a generation hooked on iGadgets. Pretty pictures of the brain can seduce us into drawing simplistic conclusions, leading us to ask more of these images than they can possibly deliver. And the pictures inspire uncritical devotion: a 2008 study, notes Fletcher, showed that “people – even neuroscience undergrads – are more likely to believe a brain scan than a bar graph”.

Even if brain scans were reliable indicators of brain activity, it is not straightforward to infer general lessons about life from experiments conducted under highly artificial conditions. Furthermore, let’s remember that we do not have the faintest clue about the biggest mystery of all – how a lump of grey matter produces the conscious experience we take for granted.

***

Brain scams are not the only area where scientists and science reporters overreach. The same is true of gene studies that purport to pin down the most intricate human characteristics and behaviours to this or that gene, reducing human beings to nothing but strings of DNA.

And the same is true of evolutionary biology, which purports to reduce human beings to the sum total of random mutations. Any sweeping claim about complex, high-level phenomena that rests on this sort of reductionism should be treated with suspicion.

***

References:

See the following two articles:

http://ideas.time.com/2013/05/30/dont-read-too-much-into-brain-scans/

Retrieved 3rd June 2013.

http://www.newstatesman.com/culture/books/2012/09/your-brain-pseudoscience

Retrieved 3rd June 2013.

Peer Review

May 27, 2013

One factor that clearly distinguishes informed consumers of science and the general public is the attitude these groups have towards the process of peer-review. The general public entertains unrealistic, highly-idealised visions of a process by which scientific research is assessed by peers. In theory, peer review is supposed to act as a filter, weeding out the crackpots; in practice, it often turns out to be a way to enforce orthodoxy.

Copernicus’s heliocentric cosmology, Galileo’s mechanics, Newton’s gravity and equations of motion – these ideas never appeared in journal articles. They appeared in books that were reviewed, if at all, by associates of the author. The peer-review process as we know it was instituted after the Second World War, largely due to the huge growth of the scientific enterprise and the enormous pressure on academics to publish ever more papers.

Since the 1950s, peer-review has worked as follows: a scientist wishing to publish a paper in a journal submits a copy of the paper to the editor of a journal. The editor forwards the paper to several academics whom he considers to be experts on the matter, asking whether the paper is worthy of publication. These experts – who usually remain anonymous – submit comments about the paper that constitute the “peer review”. The referees judge the content of the paper on criteria such as the validity of the claims made in the paper, the originality of the work, and whether the work, even if correct and original, is important enough to be worthy of publication. Often, the journal editor will require the author to amend his paper in accordance with the recommendations of the referees.

Prior to the War, university professors were mainly teachers, carrying a teaching load of five or six courses per semester (a typical course load nowadays is one or two courses). Professors with this onerous teaching burden were not expected to write papers. The famous philosopher of science Sir Karl Popper wrote in his autobiography that the dean of the New Zealand university where Popper taught during World War II said that he regarded Popper’s production of articles and books as a theft of time from the university.

But at some point, universities came to realise that their prestige – and with it the grants they received from governments and corporations – depended not so much on the teaching skills of their professors but rather on the scholarly reputation of these professors. And this reputation could only be enhanced through publications. Teaching loads were reduced to allow professors more time for research and the production of papers; salaries began to depend on one’s publication record. Before the War, salaries of professors of the same rank (associate professor, assistant professor, adjunct professor, full professor etc.) were the same (except for an age differential, which reflected experience). Nowadays, salaries of professors in the same department of the same age and rank can differ by more than a factor of two.

One consequence of all this is that the production of papers has increased by a factor of more than one thousand over the past fifty years. The price paid for this fecundity is a precipitous decline in quality. Before the War, when there was no financial incentive to publish papers, scientists wrote them as a labour of love. These days, papers are written mostly to further one’s career. One thus finds that nowadays, most papers are never cited by anyone except their author(s).

Philip Anderson, who won a Nobel Prize for physics, writes that

In the early part of the postwar period [a scientist’s] career was science-driven, motivated mostly by absorption with the great enterprise of discovery, and by genuine curiosity as to how nature operates. By the last decade of the century far too many, especially of the young people, were seeing science as a competitive interpersonal game, in which the winner was not the one who was objectively right as [to] the nature of scientific reality, but the one who was successful at getting grants, publishing in Physical Review Letters, and being noticed in the news pages of Nature, Science, or Physics Today… [A] general deterioration in quality, which came primarily from excessive specialization and careerist sociology, meant quite literally that more was worse.[1]

More is worse. As Nature puts it, “With more than a million papers per year and rising, nobody has time to read every paper in any but the narrowest fields, so some selection is essential. Authors naturally want visibility for their own work, but time spent reading their papers will be time taken away from reading someone else’s.” The number of physicists has increased by a factor of one thousand since the year 1900. Back then, ten percent of all physicists in the world had either won a Nobel Prize or had been nominated for it. Things are much the same in chemistry. The American Chemical Society made a list of the most significant advances in chemistry over the last 100 years. There has been no change in the rate at which breakthroughs in chemistry have been made in spite of the thousand-fold increase in the number of chemists. In the 1960s, US citizens were awarded about 50 000 patents in chemistry-related areas per year. By the 1980s, the number had dropped to 40 000. But the number of papers has exploded. One result of this publish-or-perish mentality is that groundbreaking papers are often rejected because they are submitted to referees who are incapable or unwilling to recognise novel ideas. Consider these examples.

Rosalyn Yalow won the Nobel Prize in Physiology or Medicine in 1977. She describes how her Nobel-winning paper was received: “In 1955 we submitted the paper to Science… the paper was held there for eight months before it was reviewed. It was finally rejected. We submitted it to the Journal of Clinical Investigations, which also rejected it.”[2]

Günter Blobel also won the Nobel Prize in Physiology or Medicine, in 1999. In a news conference given just after he was awarded the prize, Blobel said that the main problem one encounters in one’s research is “when your grants and papers are rejected because some stupid reviewer rejected them for dogmatic adherence to old ideas.” According to the New York Times, these comments “drew thunderous applause from the hundreds of sympathetic colleagues and younger scientists in the auditorium.”[3]

Mitchell J. Feigenbaum thus described the reception that his revolutionary papers on chaos theory received: “Both papers were rejected, the first after a half-year delay. By then, in 1977, over a thousand copies of the first preprint had been shipped. This has been my full experience. Papers on established subjects are immediately accepted. Every novel paper of mine, without exception, has been rejected by the refereeing process. The reader can easily gather that I regard this entire process as a false guardian and wastefully dishonest.”[4]

Theodore Maiman built the first working laser, an achievement whose importance is not doubted by anyone. The leading American physics journal, Physical Review Letters, rejected Maiman’s paper on constructing a laser.[5]

John Bardeen, the only person to have ever won two Nobel Prizes in physics, had difficulty publishing a theory in low-temperature solid-state physics that went against the paradigm.[6]

Stephen Hawking needs no introduction. According to his first wife Jane, when Hawking submitted to Nature what is generally regarded as his most important paper on black hole evaporation, the paper was initially rejected.[7] The physicist Frank J. Tipler writes that “I have heard from colleagues who must remain nameless that when Hawking submitted to Physical Review what I personally regard as his most important paper, his paper showing that a most fundamental law of physics called ‘unitarity’ would be violated in black hole evaporation, it, too, was initially rejected.”

Conventional wisdom in contemporary geophysics holds that the Hawaiian Islands were formed sequentially as the Pacific Plate moved over a hot spot deep inside the Earth. This idea was first developed in a paper by the geophysicist J. Tuzo Wilson. Wilson writes: “I… sent [my paper] to the Journal of Geophysical Research. They turned it down… They said my paper had no mathematics in it, no new data, and that it didn’t agree with the current views. Therefore, it must be no good. Apparently, whether one gets turned down or not depends largely on the reviewer. The editors, too, if they don’t see it your way, or if they think it’s something unusual, may turn it down. Well, this annoyed me…”[8]

There is not much incentive for referees to carefully adjudicate their fellow-scientists’ papers. As Nature puts it: “How much time do referees expend on peer review? Although referees may derive benefits from reviewing, it still represents time taken away from other activities (research, teaching and so forth) that they would have otherwise prioritized. Referees are normally unpaid but presumably their time has some monetary value, as reflected in their salaries.”

In 2006, Nature published an essay by Charles G. Jennings, a former editor with the Nature journals and former executive director of the Harvard Stem Cell Institute. As an editor, Jennings was intimately familiar with the peer-review system, and knows full well how badly misunderstood this process is by the public:

Whether there is any such thing as a paper so bad that it cannot be published in any peer reviewed journal is debatable. Nevertheless, scientists understand that peer review per se provides only a minimal assurance of quality, and that the public conception of peer review as a stamp of authentication is far from the truth.

Jennings writes that “many papers are never cited (and one suspects seldom read)”. These papers are written, to a large extent, because “To succeed in science, one must climb this pyramid [of journals]: in academia at least, publication in the more prestigious journals is the key to professional advancement.” Advancement, in this context, is measured by career rewards such as recruitment and promotion, grant funding, invitations to speak at conferences, establishment of collaborations and media coverage.

***

Many in the scientific community recognise the ills that plague the peer-review process, and experiments are being conducted to improve – or sidestep – the current dispensation. For example, some journals no longer grant referees the protection of anonymity. Instead, reviewers are identified and their critiques of papers are made available to the author of the paper being reviewed. The author is then able to defend his paper. This may ameliorate the problem of reviewers who hamper the publication of a paper for less than noble reasons (such as professional jealousy).

At any rate, informed consumers of science understand that peer review is far from perfect. Too often it is an efficient way of strangling new ideas rather than a vehicle for promoting them, allowing the reigning paradigm to squash all competition unfairly. This is especially true in controversial areas like biological evolution.

***

References:

My two main references for this post are:

  1. An essay by the physicist Frank J. Tipler entitled Refereed Journals: do they insure quality or enforce orthodoxy? The essay appeared in the volume Uncommon Dissent: Intellectuals who find Darwinism Unconvincing, William A. Dembski (editor), ISI Books, 2004.
  2. A 2006 editorial in Nature, available here: http://www.nature.com/nature/peerreview/debate/nature05032.html. Retrieved 26th May 2013.

[1] Philip Anderson, in Brown, Pais and Pippard, editors, Twentieth Century Physics, American Institute of Physics Press, 1995, page 2029.

[2] Walter Shropshire Jr., editor, The Joys of Research, Smithsonian Institution Press, 1981, page 109.

[3] New York Times, 12th October 1999, page A29.

[4] Mitchell J. Feigenbaum, in Brown, Pais and Pippard, editors, Twentieth Century Physics, American Institute of Physics Press, 1995, page 1850.

[5] Ibid. page 1426.

[6] Lillian Hoddeson, True Genius: The Life and Science of John Bardeen, Joseph Henry Press, 2002, page 300.

[7] Jane Hawking, Music to Move the Stars: A Life with Stephen Hawking, Trans-Atlantic Publications, 1999, page 239.

[8] Walter Shropshire Jr., editor, The Joys of Research, Smithsonian Institution Press, 1981, page 130.

The Science Mystique

May 20, 2013

A reader has kindly drawn my attention to an article by a physician, Jalees Rehman, which treads territory that will be familiar to readers of TorahExplorer. In this post, I reproduce some of Dr. Rehman’s points, interspersed with my comments.[1]

***

Dr. Rehman begins by discussing what he terms the doctor mystique – “Doctors had previously been seen as infallible saviors who devoted all their time to heroically saving lives and whose actions did not need to be questioned” – a notion now rapidly crumbling. Informed patients have access to an immense amount of information with which to question the decisions of their physicians – “Instead of blindly following doctors’ orders, they want to engage their doctor in a discussion and become an integral part of the decision-making process.” In addition, patients nowadays are more aware of various factors that can skew doctors’ judgement:

The recognition that gifts, free dinners and honoraria paid by pharmaceutical companies strongly influence what medications doctors prescribe has led to the establishment of important new rules at universities and academic journals to curb this influence…

I discussed related issues in posts such as Dr. John Ioannidis and the Reality of Research and Dr. Ben Goldacre and the Reproducibility of Research.

Dr. Rehman’s essay, however, is devoted to another myth, one that he calls The Science Mystique. He correctly notes that it still persists where similar notions – the feminine mystique and the doctor mystique – have disappeared or are disintegrating. But Dr. Rehman is clear that the science mystique is vulnerable:

As with other mystiques, it [i.e. The Science Mystique] consists of a collage of falsely idealized and idolized notions of what science constitutes. This mystique has many different manifestations, such as the firm belief that reported scientific findings are absolutely true beyond any doubt, scientific results obtained today are likely to remain true for all eternity and scientific research will be able to definitively solve all the major problems facing humankind.

Quite right. Readers of Genesis and Genes will be familiar with a comment made by the physicist and philosopher Sir John Polkinghorne:

Many people have in their minds a picture of how science proceeds which is altogether too simple. This misleading caricature portrays scientific discovery as resulting from the confrontation of clear and inescapable theoretical predictions by the results of unambiguous and decisive experiments… In actual fact… the reality is more complex and more interesting than that.

Science is a human – read fallible – endeavour. Informed consumers of science understand that a host of factors influence research. Beyond the technical aspects of research, there are societal factors, political factors, ideological factors, financial factors and dozens more, some of which I discussed in the first chapter of Genesis and Genes. One consequence of this is that scientific findings come in a spectrum of credibility, ranging from solid to hopelessly speculative and ideological.

Dr. Rehman:

This science mystique is often paired with an over-simplified and reductionist view of science. Some popular science books, press releases or newspaper articles refer to scientists having discovered the single gene or the molecule that is responsible for highly complex phenomena, such as a disease like cancer or philosophical constructs such as morality.

Indeed. Most members of the public are not informed consumers of science, and are easily swayed by simplistic or exaggerated claims. A common example of exaggerated claims swallowed by the public comes from palaeontology. A fossil is unearthed and proclaimed as the latest earliest ancestor of human beings. After the media frenzy subsides and the public’s attention is diverted, the claims inevitably prove to be hollow. [For several excellent examples of the genre, see the chapter entitled Human Origins and the Fossil Record in Science and Human Origins.][2] This is true with respect to complicated concepts and phenomena like cancer or morality, as Dr. Rehman writes, but it is all the more true with respect to over-arching theories that purport to explain ultimate questions about the universe or life. The gullible public is unaware of the tremendous superstructure of assumptions, ideological commitments and technical difficulties that go into scientists’ absolutist statements about such subjects.

Dr. Rehman continues:

As flattering as it may be, few scientists see science as encapsulating perfection. Even though I am a physician, most of my time is devoted to working as a cell biologist. My laboratory currently studies the biology of stem cells and the role of mitochondrial metabolism in stem cells. In the rather antiquated division of science into “hard” and “soft” sciences, where physics is considered a “hard” science and psychology or sociology are considered “soft” sciences, my field of work would be considered a middle-of-the-road, “firm” science. As cell biologists, we are able to conduct well-defined experiments, falsify hypotheses and directly test cause-effect relationships. Nevertheless, my experience with scientific results is that they are far from perfect and most good scientific work usually raises more questions than it provides answers. We scientists are motivated by our passion for exploration, and we know that even when we are able to successfully obtain definitive results, these findings usually point out even greater deficiencies and uncertainties in our knowledge.

An important qualification is needed here. Researchers like Dr. Rehman are usually aware that in their field, perfection is elusive. But they are often largely ignorant of other fields, and may harbour unrealistic views of the reliability of research in those fields.

Readers of Genesis and Genes will recall chapter 3, in which I described how scientists from half-a-dozen different disciplines were attempting to determine the age of the Earth in the latter part of the 19th century. It was frequently the case that practitioners of one discipline, aware of the limitations of their own field, failed to understand that other fields were just as vulnerable, but for different reasons. This led to a situation in which a mirage was created that there was independent confirmation, arising from several different disciplines, regarding the age of the Earth. This turned out to be completely illusory.

Dr. Rehman now turns to reproducibility of research:

One key problem of science is the issue of reproducibility. Psychology is currently undergoing a soul-searching process[3] because many questions have been raised about why published scientific findings have such poor reproducibility when other psychologists perform the same experiments. One might attribute this to the “soft” nature of psychology, because it deals with variables such as emotions that are difficult to quantify and with heterogeneous humans as their test subjects. Nevertheless, in my work as a cell biologist, I have encountered very similar problems regarding reproducibility of published scientific findings. My experience in recent years is that roughly only half the published findings in stem cell biology can be reproduced when we conduct experiments according to the scientific methods and protocols of the published paper.

Recall that earlier, Dr. Rehman characterised his field, cell biology, as a ‘firm’ science, somewhere between physics and psychology on a spectrum similar to the ‘proof continuum’ I discussed in Genesis and Genes. As he says, cell biology is an area of science where ostensibly objective parameters exist that should ensure the reproducibility of research. Alas, to a significant degree, reproducibility is elusive. Cell biology is not sociology or anthropology; nor are we talking about drug trials here (where as much as 90% of published studies may be wrong). Nonetheless, by Dr. Rehman’s own estimate, roughly half the published findings in stem cell biology cannot be reproduced. One is reminded of this passage in Genesis and Genes:

[Glenn] Begley [who served, for a decade, as head of global cancer research at Amgen] met for breakfast at a cancer conference with the lead scientist of one of the problematic studies. “We went through the paper line by line, figure by figure,” said Begley. “I explained that we re-did their experiment 50 times and never got their result. He said they’d done it six times and got this result once, but put it in the paper because it made the best story. It’s very disillusioning.”

Dr. Rehman:

On the other hand, we devote a limited amount of time and resources to replicating results, because there is no funding available for replication experiments. It is possible that if we devoted enough time and resources to replicate a published study, tinkering with the different methods, trying out different batches of stem cells and reagents, we might have a higher likelihood of being able to replicate the results. Since negative studies are difficult to publish, these failed attempts at replication are buried and the published papers that cannot be replicated are rarely retracted. When scientists meet at conferences, they often informally share their respective experiences with attempts to replicate research findings. These casual exchanges can be very helpful, because they help us ensure that we do not waste resources to build new scientific work on the shaky foundations of scientific papers that cannot be replicated.

The difficulty of publishing negative results and the lack of incentive to verify other researchers’ results are recognised as major contributors to systemic problems within contemporary science. The average member of the public labours under the illusion that mechanisms such as peer-review suffice to ensure that whatever is published in a mainstream journal is infallible. This, of course, constitutes child-like naivety. As Nature put it in a 2006 editorial, “Scientists understand that peer review per se provides only a minimal assurance of quality, and that the public conception of peer review as a stamp of authentication is far from the truth.”
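
The “file drawer” effect that Dr. Rehman describes is easy to illustrate with a toy simulation. The rates below are assumptions chosen for illustration, not estimates from any particular field: a claimed effect is in fact null, twenty labs quietly attempt to replicate it, and only positive results have a realistic chance of being published.

import random

# Toy 'file drawer' model -- all rates are illustrative assumptions.
random.seed(3)

ALPHA = 0.05               # chance a null effect still yields a positive result (assumed)
P_PUBLISH_POSITIVE = 0.9   # assumed: positive results are usually published
P_PUBLISH_NEGATIVE = 0.1   # assumed: negative results are usually buried

attempts = 20
record = {"published positive": 0, "published negative": 0, "buried": 0}
for _ in range(attempts):
    positive = random.random() < ALPHA
    published = random.random() < (P_PUBLISH_POSITIVE if positive else P_PUBLISH_NEGATIVE)
    if published:
        record["published positive" if positive else "published negative"] += 1
    else:
        record["buried"] += 1

print(record)

On average, the effect does not exist, yet almost all of the failed replications end up in file drawers, while the occasional fluke “confirmation” reaches print. A reader of the journals therefore sees a literature far friendlier to the original claim than the totality of the evidence warrants.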

Dr. Rehman:

Most scientists are currently struggling to keep up with the new scientific knowledge in their own field, let alone put it in context with the existing literature. As I have previously pointed out,[4] more than 30-40 scientific papers are published on average on any given day in the field of stem cell biology. This overwhelming wealth of scientific information inevitably leads to a short half-life of scientific knowledge… What is considered a scientific fact today may be obsolete within five years.

Quite true. As I wrote in Genesis and Genes,

A paper published in the Proceedings of the National Academy of Sciences in 2006 noted that “More than 5 million biomedical research and review articles have been published in the last 10 years.” That’s an average of 1370 papers per day. And this is just biomedical research.

This deluge of information, and the fact that “What is considered a scientific fact today may be obsolete within five years”, has important repercussions for informed consumers of science. Those who follow the evolution debate are aware of how the ephemeral nature of scientific knowledge can have an impact on what was only recently considered absolute. Whether it is Tree of Life research, Junk DNA or the discovery of numerous instances of Lamarckian heredity, there have been breathtaking turnarounds in recent years. Basic prudence dictates that when evolutionary biologists invoke ‘overwhelming evidence’ for this or that claim, the claim should be taken with a sack of salt.

Dr. Rehman:

One aspect of science that receives comparatively little attention in popular science discussions is the human factor. Scientific experiments are conducted by scientists who have human failings, and thus scientific fallibility is entwined with human frailty. Some degree of limited scientific replicability is intrinsic to the subject matter itself… At other times, researchers may make unintentional mistakes in interpreting their data or may unknowingly use contaminated samples… However, there are far more egregious errors made by scientists that have a major impact on how science is conducted. There are cases of outright fraud… [but] Such overt fraud tends to be unusual… However, what occurs far more frequently than gross fraud is the gentle fudging of scientific data, consciously or subconsciously, so that desired scientific results are obtained. Statistical outliers are excluded, especially if excluding them helps direct the data in the desired direction. Like most humans, scientists also have biases and would like to interpret their data in a manner that fits with their existing concepts and ideas.

Bravo. This is a major theme of Genesis and Genes, and it is crucial in becoming an informed consumer of science. In this short essay, Rehman obviously cannot describe all the influences that have an impact on scientific research. One of Rehman’s more important omissions is that there is an enormous amount of conditioning which influences scientists – like everyone else – long before they step into the laboratory. Take evolution. If you grew up in the West any time in the last fifty years, you will have encountered innumerable instances in which the claims of evolutionary biology were seared into your consciousness, from David Attenborough documentaries to museum dioramas to advertising campaigns named The evolution of the office to countless articles in New Scientist. Scientists do not enter their research careers with a tabula rasa. As Professor John Polkinghorne puts it,

Scientists do not look at the world with a blank gaze; they view it from a chosen perspective and bring principles of interpretation and prior expectations… to bear upon what they observe. Scientists wear (theoretical) “spectacles behind the eyes”.

Dr. Rehman:

Human fallibility not only affects how scientists interpret and present their data, but can also have a far-reaching impact on which scientific projects receive research funding or the publication of scientific results. When manuscripts are submitted to scientific journals or when grant proposals are submitted to funding agencies, they usually undergo a review by a panel of scientists who work in the same field and can ultimately decide whether or not a paper should be published or a grant funded. One would hope that these decisions are primarily based on the scientific merit of the manuscripts or the grant proposals, but anyone who has been involved in these forms of peer review knows that, unfortunately, personal connections or personal grudges can often be decisive factors.

Correct. If you happen to be conducting climate research that produces unpopular results, for example, you can be almost sure that your findings will not be published in the most prestigious journals. If you happen to suspect that the brilliant mathematician Irving Segal was right, and that the linear relationship that Edwin Hubble saw between the redshift and apparent brightness of galaxies is perhaps illusory, you are almost certain to receive very little telescope time. Exploring the natural world to your heart’s content, following your curiosity wherever it leads you – that picture of how science was done was fairly accurate up to about the middle of the 19th century. Affluent gentleman scientists could indulge their curiosity about how nature operates. These days, the confines within which research is done will be dictated, to a significant extent, by whatever is considered acceptable by the majority of the community.

***

The science mystique will eventually topple, and that will be a liberating moment for science. It will usher in an age in which scientists and the public alike will be informed consumers of science, able to accurately assess various findings of scientists and assign to them appropriate levels of credibility.

***

See also:

The post Dr. John Ioannidis and the Reality of Research:

http://torahexplorer.com/2013/05/05/dr-john-ioannidis-and-the-reality-of-research/

The post Dr. Ben Goldacre and the Reproducibility of Research:

http://torahexplorer.com/2013/04/10/dr-ben-goldacre-and-the-reproducibility-of-research/

References:

[1] See http://www.3quarksdaily.com/3quarksdaily/2013/02/the-science-mystique-.html.

Retrieved 17th May 2013.

[2] Discovery Institute Press, 2012.

[3] Dr. Rehman cites this paper at this point:

http://pps.sagepub.com/content/7/6/537.full.

Retrieved 19th May 2013.

[4] Dr. Rehman cites the following article:

http://www.scilogs.com/next_regeneration/science-journalism-and-the-inner-swine-dog/.

Retrieved 19th May 2013.

