
Master of Illusion: The Irony of Using Statistics to Lie


By Carol Richardson | April 10th, 2017

When it comes to making a sound argument, logically it seems as if statistics are a safe bet. Numbers are absolute, right? If you have the numbers to back something up, then it must be correct, right?

Well, no. Not exactly. Statistics are a great tool when used properly, but by their very nature they can easily be manipulated to tell the story you want to tell. In fact, statistics have been used quite boldly to tell outright lies. It's an effective strategy that politicians, advertisers, researchers, journalists, and pretty much everyone else has used to sway others into believing a certain story. And if you haven't used statistics to tell a particular story yourself, there's a good chance you have been convinced of something by statistics that tell a false one.

Humans are hardwired to perceive numbers as absolute. A table is six feet long when measured in feet, but measure it in much smaller increments and the numbers change even though the table doesn't. The problem comes when people exploit this kind of framing to mislead or persuade others into believing something that isn't true.

To help illustrate how statistics have been used to lie over the course of history, we have pulled together some examples of statistics that seem to tell one story — but omit important factors in a well-rounded truth.

I. Selective Choosing

On the surface, the category with the most tallies would seem to be the one that happens the most, right? Not necessarily. One has to take into account the sample size and the percentages within each group to get a more accurate picture of the statistics.

Take this airline survey, for example, published in US News and World Report in 2001:

Most Complaints:
United Airlines, 252
American Airlines, 162
Delta Airlines, 119

Least Complaints:
Alaska Airlines, 13
Southwest Airlines, 22
Continental Airlines, 60

On the surface, it seems like Alaska Airlines is awesome at customer service, while United Airlines is the worst. United may have far more customer complaints on file, but one can't conclude the airline is worse from this data alone. United Airlines also flies far more people every year than Alaska Airlines. Without the percentage of passengers on each airline who actually complained, these raw counts are effectively worthless.
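The point becomes obvious once you normalize by passenger volume. The passenger totals below are hypothetical round numbers chosen purely for illustration (the original survey did not include them), but they show how a per-passenger rate can flip the raw-count ranking:

```python
# Raw complaint counts are meaningless without the size of each airline's
# customer base. Passenger totals here are HYPOTHETICAL, for illustration only.
complaints = {"United": 252, "Alaska": 13}
passengers = {"United": 150_000_000, "Alaska": 5_000_000}  # hypothetical annual riders

rates = {a: complaints[a] / passengers[a] * 100_000 for a in complaints}
for airline, rate in rates.items():
    print(f"{airline}: {rate:.2f} complaints per 100,000 passengers")
```

With these made-up passenger counts, Alaska actually has the *higher* complaint rate (0.26 vs. 0.17 per 100,000), despite having the fewest complaints in raw numbers.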

Another example comes from an ad for the weight loss drug Lipozene. You may have seen television commercials for this drug that boast its ability to spark weight loss without any changes to diet. In one particular Internet ad, Lipozene claims to have helped its sample group lose 400% more weight than the placebo group. That's a powerful number to throw around, but the fine print says the Lipozene group lost 2.75 pounds over 60 days while the placebo group gained 2.18 pounds. The 400% difference may be technically true, but most people wouldn't get too excited if their weight fluctuated by about three pounds over 60 days.
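A quick calculation shows how small the absolute effect really is, and how a percentage comparison against a tiny baseline can produce an enormous-sounding number (the second pair of figures below is hypothetical, chosen only to show the mechanics):

```python
# The ad's fine-print figures: a huge relative claim, a tiny absolute effect.
lipozene_change = -2.75  # lbs over 60 days
placebo_change = +2.18   # lbs over 60 days

absolute_gap = placebo_change - lipozene_change  # 4.93 lbs over 60 days
print(f"Absolute gap: {absolute_gap:.2f} lbs (~{absolute_gap / 60:.2f} lbs per day)")

# HYPOTHETICAL illustration of how a "400% more" claim can arise:
a, b = 2.0, 0.4  # lbs lost by two made-up groups
pct_more = (a - b) / b * 100
print(f"Group A lost {pct_more:.0f}% more weight, but only {a - b:.1f} lbs more in absolute terms")
```

Whenever you see a "% more" claim, ask what the baseline is: the smaller the baseline, the more impressive the percentage looks.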

II. Bad Sampling

One of the most common ways to mislead people with statistics is to use completely accurate numbers but omit the sample information. In a study or survey, the sample is the total group observed. It is very common for journalists to write about studies with impressive-sounding results, only for the sample to turn out to be very small or a poor representation of the population it is meant to reflect.

We’ve all heard the famous statistic that approximately 50% of all marriages end in divorce. That number does come from real data, but it is misleading. If you compare the marriage rate and the divorce rate within each individual year, the number of divorces works out to about 50% of the number of marriages that year. What’s misleading about that 50% is that it doesn’t account for when the divorcing couples got married. If 100 couples marry in 2017 and 50 couples divorce that same year, you get a 50% figure, but it doesn’t mean that 50 of those 100 newlywed couples will eventually divorce. In fact, marriages from the 1970s and 1980s have a much higher divorce rate than those of couples who tied the knot in the 1990s or 2000s.
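The confusion above is between an annual ratio and a cohort probability. A minimal sketch with hypothetical counts makes the distinction concrete:

```python
# The "50% of marriages end in divorce" figure compares one year's divorces
# to that SAME year's new marriages. Counts below are hypothetical.
marriages_this_year = 100
divorces_this_year = 50  # mostly couples who married in earlier years

annual_ratio = divorces_this_year / marriages_this_year
print(f"Annual divorce/marriage ratio: {annual_ratio:.0%}")

# The question people actually care about is a COHORT question: of the
# couples married in a given year, what fraction eventually divorce?
# Answering it requires following each year's newlyweds over decades,
# not dividing two unrelated counts from the same calendar year.
```

The two numbers can differ wildly: the annual ratio mixes couples from every past marriage cohort into a single year's divorce count.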

For more examples, look to the leader in misleading statistics: political messaging. One popular pie chart making the rounds on social media showed military spending as 57% of federal spending and the “food stamp” budget as just 1%. What the graph, which was trying to make a point about human services spending, failed to disclose was that these numbers came only from the federal government’s discretionary spending, not the total budget. It ignores mandatory spending such as Medicare, Medicaid, and Social Security. The food stamp program is actually part of the mandatory budget and has nothing to do with the numbers used in this pie chart.
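The trick here is choosing which pie to slice. The figures below are hypothetical round numbers, not actual budget data, chosen only to show how the same spending amount yields very different shares depending on the denominator:

```python
# Share of the pie depends on which pie you use. All figures HYPOTHETICAL.
discretionary = 1.2e12   # made-up discretionary budget
mandatory = 2.5e12       # made-up mandatory spending (Medicare, Social Security, ...)
military = 0.6e12        # made-up military spending (part of discretionary)

share_of_discretionary = military / discretionary
share_of_total = military / (discretionary + mandatory)
print(f"Military as share of discretionary spending: {share_of_discretionary:.0%}")
print(f"Military as share of total spending:         {share_of_total:.0%}")
```

With these made-up numbers, the same military budget is 50% of one pie and about 16% of the other, which is exactly the kind of gap the viral chart exploited.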

In a different political statistic snafu, the website TruthStreamMedia.com published an article titled “Why Have There Been More Mass Shootings Under Obama than the Four Previous Presidents Combined?” That inflammatory language, paired with very misleading and semi-inaccurate statistics, paints a skewed picture of the truth.

The numbers used to create the chart and story include data from “multiple sources.” Snopes.com found that, to arrive at those numbers, the criteria for “mass shootings” had been expanded to include domestic incidents during the Obama administration, but not for the other presidents. This is a clear example of manipulating data to tell the story you want: applying different criteria to different groups while claiming a level playing field.

Both of these political examples skew information to serve an agenda, without acknowledging that the president has very little control over homicide rates or big-picture federal spending. Lying becomes dangerous when there are real-life consequences.

III. Correlation is NOT Causation

When two actions or results seem to have a logical connection, we can sometimes place meaning where it doesn’t belong. With statistics, people are able to create compelling arguments from a correlation that seems meaningful but has no provable causal connection.

Take, for example, the increased rate of diagnosed autism in the U.S. In 2012, the Centers for Disease Control and Prevention (CDC) reported a significant increase in autism diagnoses: from 2000 to 2012, the rate grew from approximately 1 in every 150 children to 1 in every 68. This growth has been attributed to a huge range of causes, mostly debunked, including vaccines, GMO foods, and pharmaceutical drugs. The part that almost always gets left out of these reports is that the diagnostic criteria for autism have expanded significantly since 2000.

In 2000, you were unlikely to hear that someone was on the “autism spectrum” because we had, collectively, a different understanding of autism then. In 2012 and beyond, someone demonstrating autistic behavior is far more likely to be diagnosed than to be labeled mentally ill.

Correlation-as-causation can be manipulated so thoroughly that it becomes comical. One Harvard student, Tyler Vigen, set out to make this point by correlating unrelated data sets and making laughable claims such as:

– Sour cream sales increase deaths from motorbike accidents
– The higher the price of potato chips, the more deaths result from falling out of a wheelchair
– The more often Nicolas Cage stars in a movie, the more people drown in swimming pools

Vigen has said that he didn’t mean for researchers or the public to dismiss correlation entirely, but rather to use critical thinking to judge whether a given correlation makes sense.
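How do claims like Vigen's arise? Almost any two quantities that trend in the same direction over time will show a high correlation coefficient, whether or not they have anything to do with each other. Here is a small sketch with made-up data in the spirit of his examples:

```python
import numpy as np

# Two made-up series with no causal link: both simply trend upward
# over ten years, with a little noise sprinkled in.
years = np.arange(10)
sour_cream_sales = 100 + 5 * years + np.array([1, -2, 0, 3, -1, 2, 0, -1, 1, 0])
motorbike_deaths = 40 + 2 * years + np.array([0, 1, -1, 0, 2, -1, 1, 0, -2, 1])

# Pearson correlation coefficient between the two series.
r = np.corrcoef(sour_cream_sales, motorbike_deaths)[0, 1]
print(f"Pearson r = {r:.2f}")  # very close to 1, despite no causal connection
```

A shared upward trend (here, just the passage of time) is enough to produce a near-perfect correlation, which is why a high r value on its own proves nothing about cause.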

IV. Questionable Research Methods

Often, surveys and research studies are conducted with the understanding that the results must portray a certain outcome. This is quite prevalent in product studies so that advertisements can later use numbers as a convincing sales tactic.

For example, in 2007 the Colgate toothpaste brand ran an ad campaign with the slogan “80% of dentists recommend Colgate.” Phrased that way, the claim suggests that 80% of dentists recommend Colgate over other brands. It was later revealed that the survey allowed dentists to name ANY toothpaste brands they would recommend; 80% of the dentists chose Colgate as ONE of those brands, not THE brand they would recommend. That slight deception in the questioning and in the language used to report it painted a picture that wasn’t entirely true.
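The survey-design trick is easy to demonstrate. In a "choose all that apply" survey, brand shares can sum to well over 100%, so an 80% figure leaves plenty of room for competitors to score almost as high. The responses below are hypothetical:

```python
# Hypothetical "recommend any brands you like" survey: each dentist
# may name several brands, so the shares need not sum to 100%.
responses = [
    {"Colgate", "Crest"},
    {"Colgate", "Crest", "Sensodyne"},
    {"Colgate"},
    {"Crest", "Colgate"},
    {"Sensodyne"},
]
n = len(responses)
shares = {brand: sum(brand in r for r in responses) / n
          for brand in ("Colgate", "Crest", "Sensodyne")}
for brand, share in shares.items():
    print(f"{brand}: {share:.0%}")
```

Here "80% recommend Colgate" is literally true, yet 60% of the same dentists also recommend Crest, and the shares total 180%. A single-choice question ("which ONE brand do you recommend?") would tell a very different story.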

After further investigation by the UK’s Advertising Standards Authority (ASA), it was also discovered that Colgate had conducted the survey over the phone while identifying as an independent market research company. Colgate failed to make clear that it funded the research or that the results would be used in advertising.

V. Unsupported Numbers

This is where the Internet has really helped spread false statistics and lies. Who hasn’t seen a meme or two that looks official because it has statistics on it? But do those statements actually have well-researched facts behind them? In many instances, no.

Our current president, Donald Trump, has been caught on multiple occasions tweeting or quoting statistics that are outright false. While still a presidential candidate, Trump tweeted some crime statistics in the fall of 2015 that were quite inflammatory.

The tweet had an image of a man pointing a gun with his face covered by a bandana. The stats were labeled “USA Crime Statistics ~ 2015” and claimed to be delivered by the “Crime Statistics Bureau — San Francisco.” Here were the numbers:

Blacks Killed by Whites — 2%
Blacks Killed by Police — 1%
Whites Killed by Police — 3%
Whites Killed by Whites — 16%
Whites Killed by Blacks — 81%
Blacks Killed by Blacks — 97%

One can see why this is a powerful message, amid the tense political environment that has pushed Trump to the presidency and Black Lives Matter activists to the streets. But these numbers are pulled from thin air.

First of all, the “Crime Statistics Bureau” is NOT A REAL THING. It doesn’t exist; it just sounds official. The FBI would be the agency responsible for releasing this kind of statistic, and since 2015 wasn’t over yet at the time of the tweet, no full-year 2015 data existed anyway. On top of all that, the numbers don’t match past years. According to real, documented FBI data for 2014, only about 14% of white homicide victims were killed by blacks, whites were killed by other whites 82.3% of the time (not 16%), and black-on-black homicide was at about 89%.

The problem with memes and other incorrect information on the Internet is that most people won’t do the background research to find out whether the numbers are true. This clearly racist image will now be spread as fact, because it looks official, with numbers and an official-sounding (fake) institution. Moves like this (from both sides of the political aisle) go past deception and into dangerous-lie territory.

How to Tell if Statistics Are Lying to You


Unfortunately, with so many false and misleading statistical claims out there, it can be overwhelming to think about believing any of it. Please don’t lose heart, however. Just be diligent about your sources, and avoid believing a statistic simply because a media site reports it.

Statistics and numbers are still a great way to understand our world. We can all help by not spreading stats we know are false and by doing our due diligence to find the real truth before we believe or pass along information.

Use your common sense. Also, resources like Snopes.com or even a quick Google search can help you find the full picture.

Questions, Comments, or Suggestions? Email me at carol.richardson@thenewsreflection.com
