Genesis and Genes (Feldheim, 2013) is now available in selected bookstores in Israel and the UK. A shipment is making its way to the USA, and then to South Africa and elsewhere. In the meantime, I intend to post a number of sample passages. Here is a passage from Chapter 1.
Anyone who reads science publications will periodically come across such items. Seed, an award-winning science magazine, published an article in May 2007 about the research of epidemiologist John Ioannidis. The article reports:
In a 2005 article in the Journal of the American Medical Association, epidemiologist John Ioannidis showed that among the 45 most highly cited clinical research findings of the past 15 years, 99 percent of molecular research had subsequently been refuted. Epidemiology findings had been contradicted in four-fifths of the cases he looked at, and the usually robust outcomes of clinical trials had a refutation rate of one in four. The revelations struck a chord with the scientific community at large: A recent essay by Ioannidis simply entitled “Why most published research findings are false” has been downloaded more than 100,000 times; the Boston Globe called it “an instant cult classic.”
Part of the explanation for this shocking finding is something we just discussed:
Cash-for-science practices between the nutrition and drug companies and the academics that conduct their research may also be playing a role. A survey of published results on beverages earlier this year found that research sponsored by industry is much more likely to report favorable findings than papers with other sources of funding. Although not a direct indication of bias, findings like these feed suspicion that the cherry-picking of data, hindrance of negative results, or adjustment of research is surreptitiously corrupting accuracy. In his essay, Ioannidis wrote, “The greater the financial and other interest and prejudices in a scientific field, the less likely the research findings are to be true.”
Financial prejudices are only part of the problem, as Glenn Begley’s experience in cancer research shows. During a decade as head of global cancer research at Amgen, Begley identified 53 “landmark” publications – papers in top journals, from reputable labs – for his team to reproduce. Begley sought to double-check the findings before trying to build on them for drug development. The result: 47 of the 53 studies (89%) could not be replicated. He described his findings in a commentary piece published in the journal Nature in March 2012. In a Reuters report, Begley said, “It was shocking. These are the studies the pharmaceutical industry relies on to identify new targets for drug development… As we tried to reproduce these papers we became convinced you can’t take anything at face value.”

Begley’s experience echoes a report from scientists at Bayer AG. In a 2011 paper titled “Believe it or not”, they analyzed in-house projects that built on “exciting published data” from basic science studies. “Often, key data could not be reproduced,” wrote Khusru Asadullah, vice president and head of target discovery at Bayer HealthCare in Berlin, and colleagues. Of 47 cancer projects at Bayer during 2011, less than one-quarter could reproduce previously reported findings, despite the efforts of three or four scientists working full time for up to a year. Bayer dropped the projects.
Bayer and Amgen found that the prestige of a journal was no guarantee a paper would be solid. “The scientific community assumes that the claims in a preclinical study can be taken at face value,” Begley and Lee Ellis of MD Anderson Cancer Center wrote in Nature. They and others fear the phenomenon is the product of a skewed system of incentives that has academics cutting corners to further their careers. Partway through his project to reproduce promising studies, Begley met for breakfast at a cancer conference with the lead scientist of one of the problematic studies. “We went through the paper line by line, figure by figure,” said Begley. “I explained that we re-did their experiment 50 times and never got their result. He said they’d done it six times and got this result once, but put it in the paper because it made the best story. It’s very disillusioning.”
- Seed was a finalist for two National Magazine Awards in 2007 in the categories of Design and General Excellence, is the recipient of the Utne Independent Press Award, and is included in the 2006 Best American Science and Nature Writing anthology published by Houghton Mifflin.
- The original research by Ioannidis can be read here: http://jama.ama-assn.org/content/294/2/218.full.pdf+html. Retrieved 5th June 2011.
- http://seedmagazine.com/content/article/dirty_little_secret/. Retrieved 5th June 2011.
- See http://www.newsdaily.com/stories/bre82r12p-us-cancer/. Retrieved 31st March 2012.