Introduction
The Backtracking Project is an interdisciplinary assignment in which we were given a week to find a fact or infographic online and trace it back to the information’s original source. The project’s goal was to help us learn to distinguish true facts from false ones online and to stop automatically believing everything we read. We also looked at how false facts come to be, which can partly be the result of faulty experiments. It is very easy for an experiment to have flaws, which is why it is important not to immediately believe something that is “backed up by” an experiment when you do not know what that experiment entails. One issue that may arise is the margin of error. Say a study showed that 60% of McGehee students enjoy doing their homework, but with a margin of error of ±15%. The true figure could then be anywhere from 45% to 75%, so a newsletter could just as easily report that only 45% of McGehee students enjoy doing their homework or that 75% do. Even the wording of results can change how they are perceived: hearing ‘3 out of 4 McGehee students enjoy doing their homework!’ sounds better than ‘one quarter of McGehee’s students do not like doing their homework!’

Incorrect graphs can also lead to misinformation. We learned about what makes some graphs misleading and even fixed our own. There are many rules that go into making a graph, like the rule of proportional ink, which states that if a value is half of another value, it should be drawn that way. With this in mind, a bar graph with values of 30 and 15 should show the bar for 15 as half the size of the bar for 30.

A bar graph I re-made to proportionally show North Dakota’s and New Jersey’s spending versus the average SAT score in 1998. The graph’s scale also runs to 800, because that is the highest possible score on either the math or verbal section of the SAT.
Another example of misinformation is when news sources use headline packaging. News sources, or any website that posts articles, want people to click on their articles as much as possible so that they earn more revenue from the ads placed on the article’s page. This incentive causes writers to focus on intriguing, or even shocking, headlines that grab their audience’s attention. Sometimes the exact same article is posted twice with a slightly different title.

We also learned about how human nature has a big effect on what is perceived to be true or false. Humans find patterns, or correlations, between two things even when there is no cause and effect at work. With these patterns, we jump to conclusions quickly, then seek out and overvalue any information that supports our argument. This makes it hard to prove a fact to someone, because the person hearing the explanation may be blocking out the factual evidence that goes against their beliefs. In government class, we played a game of ‘telephone’ to represent how news can get twisted as it passes through many sources, even just from classmate to classmate, human to human. The first time we played, the message was short, and the last girl to receive the information recited the original statement almost perfectly. The second time, however, the message the last girl received was almost incomprehensible because of its length and many details. Information can start out as an accurate fact at its original source, but as it is passed through different news articles and social media posts, it becomes skewed.
Before we started this project, we established a Pyramid of Reliability that ranked how credible certain websites are. At the bottom, we placed websites that can be edited by anyone: Wikipedia, Blogger, YouTube, Twitter, Snapchat, and other social media. These sources are unreliable because posts are not peer-reviewed and anyone (including bots) can post whatever they call ‘facts’. The third tier consists of biased news sources focused on earning revenue from their articles, and the top two tiers are the most reliable: peer-reviewed, purely educational, and unbiased.

Process
Step 1: Finding the Right Infographic
My goal was to find an infographic with a shocking ‘fact’ on it that did not seem so bizarre as to be completely unbelievable. I also wanted something that could not be too easily traced back to its original source.
Step 2: Going up the Tiers
Once I had my fact, I searched for the information on social media or in articles that were not peer-reviewed or created by a scientist. I quickly found the information in multiple places on Twitter and on an anonymous confession app called Whisper. I also found websites that mimicked the infographic and articles that went into more detail about where the information came from.
Step 3: Finding the Top Tier
Once I found the Yale experiment, I had to read through it to gain an understanding of the experiment the scientists designed, what their goal was, how they went about it, what their results were, and what made them qualified to devise an accurate experiment.
Step 4: Evaluation of Facts
After I understood the experiment as well as I could, it was up to me to decide whether or not the experiment was trustworthy and showed accurate information. There were a lot of questions to ask, and the study linking loneliness to long, warm baths and showers was very long, but I sifted through the details to the best of my ability.
My Project: Presentation to Class and Presentation Slides

Reflection
While it is a little discomforting to know that a lot of the facts you read online are untrue, I am glad that I am now aware of how easily information can be altered or portrayed in a deceiving way, so that I do not blindly believe things that affect my daily life. At the very beginning of this project, when the unit was just being introduced, some myths I had heard about were debunked. ‘Facts’ like “artificial sweeteners give you cancer” caused unnecessary stress in my life and made me paranoid about all the iced coffee I have had with a packet of sweetener in it. It is for this reason that it is important to distinguish fact from opinion or exaggeration.
When reading “facts” online, I now have some tools to judge their reliability, making it easier for me to know what to believe. I can check whether something is true by doing a quick Google search on the fact and finding a reliable website that backs it up. Even before that, I can simply think about the fact logically. I've always known that "You can't trust everything you read online!", but part of me still blindly believed the things that I heard or read (especially if they were backed up by even the smallest piece of evidence). Now, I am much more concerned about where I am getting my information from; even if my friend tells me something, I usually ask her where she read or heard it. While I am more skeptical now, I believe that I have a better appreciation for factual evidence.