Conducting scientific research is never simple, and there are often small disasters along the way. A researcher accidentally spills coffee on a keyboard, destroying data. Or one of the chemicals used in an analysis turns out to be contaminated. The list is long.
However, when we read the results of a study in a scientific article, everything looks flawless: the experiment went smoothly, without a hitch, and here are our results.
But research can contain errors that not even the independent experts, or “peer reviewers”, catch before publication.
Statistical errors can be especially hard to find, because spotting them usually requires someone trained in statistics.
When statistical errors are made, they can have a profound impact on people who may change their lifestyle as a result of an incorrect study.
These three examples of unintentional statistical errors had serious consequences for our health and purchasing habits.
1. Have you thrown out your black plastic spoons?
At the end of last year I came across a news article about how black plastic kitchen utensils were unsafe because they could potentially leach toxic flame-retardant chemicals into your food.
Being a natural skeptic, I looked at the original paper, which was published in the journal Chemosphere. The article looked genuine and the journal was reputable. So, like perhaps many other people, I threw out my black plastic kitchen utensils and replaced them with silicone ones.
In the study, the authors tested 203 household products (about half of them kitchen utensils) made of black plastic.
The authors found toxic flame retardants in 85% of the tested products, with levels approaching the maximum daily limits set by the United States Environmental Protection Agency.
Unfortunately, the authors made a mistake in their calculations: they were off by a factor of ten. This meant the level of toxic chemicals was in fact far below the safety limits.
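To see how a factor-of-ten slip flips the conclusion, here is a minimal sketch in Python. All the numbers are hypothetical stand-ins (a made-up reference dose, body weight and measured intake), not figures from the Chemosphere paper:

```python
# Hypothetical safe-limit calculation: a reference dose is scaled by body weight
# to get a maximum safe daily intake, then compared with a measured intake.
body_weight_kg = 60
reference_dose_ng_per_kg = 7_000        # hypothetical safe dose per kg of body weight, per day

correct_limit = reference_dose_ng_per_kg * body_weight_kg   # 420,000 ng/day
mistaken_limit = correct_limit / 10                         # 42,000 ng/day: the factor-of-ten slip

measured_intake = 34_700  # hypothetical measured daily intake, ng/day

fraction_of_mistaken_limit = measured_intake / mistaken_limit  # ~0.83: looks alarmingly close
fraction_of_correct_limit = measured_intake / correct_limit    # ~0.08: actually far below the limit
```

The same measurement goes from “approaching the limit” to “about a twelfth of the limit” once the multiplication is done correctly.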
In recent weeks, the authors have apologized and corrected their paper.
2. Have you avoided HRT?
A landmark trial raised concerns about the safety of hormone replacement therapy, or HRT (now also known as menopausal hormone therapy). It highlights a different type of statistical error.
The Women’s Health Initiative (WHI) trial involved 10,739 postmenopausal women aged 50-79, recruited from 40 clinical centers in the USA. It compared the health of women randomized to take HRT with that of women who took a placebo. Neither the researchers nor the women knew which treatment was given.
In their 2002 paper, the authors reported higher rates of invasive breast cancer in the HRT group. They used a unit called “person-years”. A person-year is a way to measure the total time a group of people spends in a study. For example, if 100 people are in a study for one year, that is 100 person-years. If someone leaves the trial after just six months, only those six months are counted for them.
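The person-years idea can be sketched in a few lines of Python (the follow-up times and event count below are made up for illustration):

```python
# Person-years: total follow-up time summed across participants.
# Hypothetical follow-up times (in years) for five participants;
# the third left the trial after six months, so only 0.5 is counted.
follow_up_years = [1.0, 1.0, 0.5, 2.0, 1.5]
events = 1  # hypothetical: one diagnosis observed in this tiny group

person_years = sum(follow_up_years)            # 6.0 person-years of follow-up
rate_per_10k = events / person_years * 10_000  # events per 10,000 person-years
```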
The authors reported 38 cases of invasive breast cancer per 10,000 person-years in the HRT group, compared with 30 per 10,000 person-years in the placebo group. This gives a rate ratio of 1.26 (one rate divided by the other).
This seemingly large increase in the breast cancer rate, also expressed as a 26% increase, caused widespread panic around the world and led thousands of women to stop taking HRT.
But the actual risk of breast cancer in each group is small. A rate of 38 per 10,000 person-years is equivalent to an annual rate of 0.38%. At such small rates, the authors should really have used a rate difference instead of a rate ratio. For the rate difference, one rate is subtracted from the other, not divided by it. This corresponds to an annual increase of 0.08 percentage points in breast cancer cases in the HRT group, a much smaller figure.
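A minimal sketch of the two ways of summarizing the same published rates (38 and 30 cases per 10,000 person-years; the crude division here gives roughly 1.27, close to the paper’s adjusted figure of 1.26):

```python
# The two published rates, expressed per person-year.
hrt_rate = 38 / 10_000      # invasive breast cancer, HRT group
placebo_rate = 30 / 10_000  # placebo group

rate_ratio = hrt_rate / placebo_rate       # ~1.27: sounds like a big relative increase
rate_difference = hrt_rate - placebo_rate  # 0.0008, i.e. 0.08 percentage points per year
```

Both numbers describe exactly the same data; the ratio just makes a small absolute change sound dramatic.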
The authors of the 2002 paper also noted that the 26% increase in the breast cancer rate “almost reached nominal statistical significance”. “Almost” is not statistical significance, and formally means no difference in breast cancer rates was demonstrated between the two groups. In other words, the difference between the two groups could have occurred by chance.
The authors should have been more cautious when describing their results.
3. Did Popeye’s spinach change what we eat?
Popeye is a distinctive figure: one-eyed, pipe-smoking, speaking in mangled English, and in love with the willowy Olive Oyl. He is always getting into trouble, and when he needs extra strength he opens a can of spinach and swallows the contents. His biceps immediately bulge and he goes off to solve the problem.
https://www.youtube.com/watch?v=7zrlmgebes
But why does Popeye eat spinach?
The story starts around 1870 with a German chemist, Erich von Wolf or Emil von Wolff, depending on which version of events you read.
He measured the amount of iron in various leafy vegetables. According to the legend, which some dispute, he wrote the iron content of spinach in his notebook and misplaced the decimal point, writing 35 milligrams instead of 3.5 milligrams per 100-gram serving of spinach. The error was found and corrected in 1937.
By then, the Popeye character had been created and spinach had become extremely popular among children. Apparently, spinach consumption in the USA increased by a third as a result of the cartoon.
This story has acquired legendary status, but it has one small flaw. In a cartoon from 1932, Popeye explains exactly why he eats spinach, and it has nothing to do with iron. He says, in his distorted English:
Spinach is full of vitamin A. An’ tha’s what makes hoomans strong an’ helty!