Probability Abuse: Why P-Values And Statistical Significance Are So Often Misunderstood

Have you read a claim this week by the Harvard School of Public Health that Food Z is linked to cancer, or one from our own U.S. National Institute of Environmental Health Sciences (NIEHS) that Chemical X is linked to Harm Y?
It's technically true, though in high-profile cases, like those from the International Agency for Research on Cancer (IARC) in France, what they leave out is that to create their statistical hazard they use studies with up to 10,000 times the normal dose. To create a hazard they torture the data, but because they were able to get a p-value below .05 they declare it statistically significant.
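To see how easy it is to manufacture a statistical hazard from noise, here is a minimal Python sketch. The 20-outcome study design and all the numbers are illustrative assumptions of mine, not anything from an actual IARC analysis. Test enough outcomes on pure random data and, at the .05 threshold, roughly one in twenty will come up "statistically significant" by chance alone:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# A hypothetical study with NO real effect: 100 exposed and 100
# control subjects measured on 20 unrelated health outcomes.
n_subjects, n_outcomes = 100, 20
exposed = rng.normal(size=(n_subjects, n_outcomes))
control = rng.normal(size=(n_subjects, n_outcomes))

# One t-test per outcome, exactly as a data-dredging analysis would do.
p_values = [stats.ttest_ind(exposed[:, i], control[:, i]).pvalue
            for i in range(n_outcomes)]

# At p < .05 we expect about 1 false positive out of 20 tests, even
# though every number here is pure noise.
hits = [i for i, p in enumerate(p_values) if p < 0.05]
print(f"Outcomes 'significant' by chance alone: {hits}")
```

Report only the outcome that cleared the threshold, and you have a publishable hazard.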
In recent years, aided by shock media desperate for new stories, the tactic has worked. It makes statistics experts crazy, though. Probability values (p-values) have been used as a way to measure the significance of research studies since the 1920s, but their misuse has become so rampant - psychologists doing surveys of college students can rightly declare their results statistically significant even when those results are meaningless to the public - that the American Statistical Association finally took a stand against it in 2016. Though a p-value declaring statistical significance is enough to impress journalists reading claims about chemicals, it won't get you published in an actual statistics journal. Or in physics. Or any place where they understand data and what "exploratory" means.
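The college-survey problem is easy to reproduce: make the sample large enough and even a trivially small difference, one with no practical meaning, sails under the .05 bar. A minimal Python sketch, with the sample size and effect size invented purely for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# A real but negligible effect: group B averages 0.05 standard
# deviations higher than group A on some survey measure.
n = 20_000  # a very large sample per group
group_a = rng.normal(loc=0.00, scale=1.0, size=n)
group_b = rng.normal(loc=0.05, scale=1.0, size=n)

result = stats.ttest_ind(group_a, group_b)
diff = group_b.mean() - group_a.mean()

print(f"p-value:    {result.pvalue:.2g}")  # far below .05
print(f"difference: {diff:.3f} SD")        # still practically meaningless
```

A tiny p-value says the difference is unlikely to be a fluke of sampling, not that it is large enough to matter - which is exactly the distinction that gets lost in the headlines.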
Statistical significance means that you had an interesting encounter, not that you had a baby.
"Significance tests and dichotomized p-values ... have turned many researchers into what I'll call 'scientific snowbirds', trying to avoid dealing with uncertainty by escaping to a happier place" says Ron Wasserstein, Executive Director of the American Statistical Association, in a new How Researchers Changed the World podcast.
With increasing use, and misuse, of p-values, statistics as a whole was starting to get a bad name. Some journals even banned the use of p-values and other inferential methods outright. So Wasserstein was tasked with leading the creation of a framework outlining how p-values should be used in research, which would be published as a statement by the American Statistical Association.
"We were challenged to do the ASA statement on p-values because of these attacks on statistics as a whole field of research," says Wasserstein
Has it worked? Not really. A paper recently claimed, with statistical significance, that something called "ultraprocessed" foods made people fat. But there was no science behind it; no biological hypothesis for why people chose to eat more, only vague claims about being addicted to an additive - yet not to the same chemical when it occurs naturally. Still, it got coverage everywhere.