Algorithms decide what we see across the internet by analyzing factors such as location, browser, age, gender and browsing history. Google is not even a standard search engine anymore; its results are heavily personalized by algorithms. Grab a friend and two different devices in different places, search the same thing on Google and watch the vastly different results displayed.
These algorithms are sweeping the internet, following our every keystroke to serve up whatever they think will keep us online, pulling us further down the rabbit hole.
All the information is still there, so why is it so dangerous? The danger lies in the fact that we constantly see exactly what we want to see. We are coddled by being served familiarity all the time.
The algorithm will not show us something totally out of the ordinary or something that makes us too uncomfortable because it wants our experience on whatever platform we’re using to be great. It wants us to feel secure so we return again and again. We see what we want, but not always what we need.
“A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa,” Facebook CEO and founder Mark Zuckerberg said when asked about Facebook’s news feed. Internet activist Eli Pariser quoted this in his TED Talk about “filter bubbles,” a term he coined.
“Your filter bubble is your own personal unique universe of information that you live in online,” Pariser said. Filter bubbles are the reason you see an ad on YouTube for the furniture website you were just browsing. It is not magic or “big brother” watching you; it is algorithms taking your electronic footprints as inputs and outputting similar content that will keep you surfing and sharing.
The problem with filter bubbles is that the majority of ideas and beliefs we are shown mirror our own. Our beliefs are confirmed and our ideas justified every single time we open an app. As Pariser says, this is a threat to democracy. The world is not just what we know and what is comfortable. There are things happening to real people just like us, and stories we need to hear whether or not they fit into a social media platform’s recipe for our viewing experience.
It isn’t just a threat to the idea of democracy where we are all given the same facts and figures to make sound decisions. It quite literally affects democracy in the way we vote.
“Given that many elections are won by small margins, our results suggest that a search engine company has the power to influence the results of a substantial number of elections with impunity,” said Jacob N. Shapiro, a Princeton University researcher who studied how the internet influences people.
There is no way to overcome this kind of suppression of information. Once the algorithm figures out which candidate it thinks you like, that candidate may well be the one you see the most. We aren’t challenged enough. Notice the surplus of posts about getting out and voting around election time? According to one study, “get out and vote” messages on Facebook drove an estimated 340,000 additional people to the polls.
Because so many people use platforms like Twitter and Facebook as their daily source of news, these algorithms are troubling. If they are going to exist, people should at least be made more aware of them.
Olivia James is a 19-year-old mass communication freshman from Baton Rouge, Louisiana.