The Filtered Internet

Introduction

When Tim Berners-Lee proposed the World Wide Web in 1989, no one could have guessed how far it would expand in 25 years. As of the end of 2013, 39% of the world’s population had consistent access to the Internet in some form (Internetworldstats.com, 2014). The Internet has evolved to the point that it shapes how we communicate; it is nearly impossible to go about your daily life without having some interaction with it. Recognizing that the Internet has become a daily interaction, Internet giants have started tracking our online movements. Many people have begun to ask: with all this tracking, has the Internet gone too far?

 

Riding the Bubble

Eli Pariser, co-founder of the Internet blog community Upworthy, among other online communities, has developed a communication theory about how the filters and trackers on sites such as Google and Facebook shape the way society views daily life and current events. He calls this theory the Filter Bubble. A Filter Bubble is the personalized Internet universe each individual lives in online. Its contents depend on who you are and what you do online; you have no direct control over your Filter Bubble.

 

As you move about the Internet, websites install an average of 111.9 cookies within one minute of browsing. Of these, 40.28 are trackers that stick with your browser; the remaining cookies relate to the functionality of the web page and are deleted when you leave the site (Beaumont, 2011). The tracking cookies can identify your computer’s operating system, your Internet browser, your location, and any information you have given to that particular website. This information is sent back to massive servers, where it is stored in databases.
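
The split between short-lived functional cookies and persistent trackers is visible in the Set-Cookie headers a site sends back. The sketch below is my own illustration (the header values are hypothetical; real sites set many more attributes): it classifies cookies by whether they carry an Expires or Max-Age attribute, which is what lets a tracking cookie outlive your visit.

```python
from http.cookies import SimpleCookie

def classify_cookies(set_cookie_headers):
    """Split cookies into session cookies (deleted when the browser
    closes) and persistent cookies (the kind trackers rely on), based
    on whether an Expires or Max-Age attribute is present."""
    session, persistent = [], []
    for header in set_cookie_headers:
        cookie = SimpleCookie()
        cookie.load(header)
        for name, morsel in cookie.items():
            if morsel["expires"] or morsel["max-age"]:
                persistent.append(name)
            else:
                session.append(name)
    return session, persistent

# Hypothetical Set-Cookie headers a page might send:
headers = [
    "sessionid=abc123; Path=/; HttpOnly",                      # functional, session-only
    "tracker_uid=xyz; Max-Age=31536000; Domain=.ads.example",  # persistent tracker
    "pref=dark; Expires=Wed, 01 Jan 2025 00:00:00 GMT",        # persistent preference
]
session, persistent = classify_cookies(headers)
```

A browser applies the same distinction automatically: session cookies vanish when you quit, while the persistent ones sit on disk until their expiry date, ready to be sent back on your next visit.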

 

Life in the Bubble

Much like George Gerbner’s Cultivation Theory, the Filter Bubble is, for most people, accepted unconsciously. Gerbner suggests that the amount and genre of television a person consumes shapes their perception of reality. The same can be said for Internet filters: if we are shown only one type of content, we are more likely to believe that is reality.

 

Wired columnist Mat Honan noticed his Facebook News Feed changing, sometimes subtly, other times blatantly. In the name of pseudo-science, Honan decided to conduct an experiment: ‘like’ everything that came across his News Feed for 48 hours and see whether anything would change. At first he saw no change and grew skeptical, but late in his first day he started to see major changes: his News Feed had been taken over by brands and messaging (Honan, 2014). The next day Honan noticed his News Feed had shifted politically to the right. He also noticed that the content shown on Facebook’s mobile app and desktop site differed: the desktop site still showed some friend updates, however few, while the mobile app showed only branded content. He suggests that Facebook’s algorithms act differently depending on the size of screen being used, the small screen’s real estate being more valuable. Near the end of his experiment, several of Honan’s friends and colleagues told him that his likes had started to overrun their News Feeds. Honan had started influencing his Facebook friends’ Filter Bubbles.

 

Contrary to Pariser’s theory, Honan believes we build our Filter Bubbles semi-consciously. For example, if you ‘like’ pizza on Facebook, the website’s algorithms will tailor the advertisements and suggested pages toward pizza, building pizza into your Filter Bubble while excluding things pertaining to salad.
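
A toy version of this like-driven tailoring can make the mechanism concrete. The ad names, tags, and scoring below are my own illustration; Facebook’s real ranking systems are vastly more complex.

```python
def rank_ads(user_likes, ads):
    # Score each ad by how many of its topic tags overlap the user's
    # likes, then show the best matches first.
    def score(ad):
        return len(set(ad["tags"]) & set(user_likes))
    return sorted(ads, key=score, reverse=True)

# Hypothetical ad inventory and a user who has 'liked' pizza:
ads = [
    {"name": "Salad Box", "tags": ["salad", "health"]},
    {"name": "Pizza Palace", "tags": ["pizza", "fast food"]},
]
ranked = rank_ads(["pizza"], ads)
```

Even in this tiny sketch the bubble effect appears: the salad ad is pushed down purely because of one recorded ‘like’, not because the user ever rejected salad.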

 

Survey Says

Pariser conducted a small study in which he asked two of his friends to capture screen shots of their Google results for the same search term, in this case Egypt. He then compared the two and noticed major differences, all relating back to each individual’s Filter Bubble: one person’s results focused on general Egypt information and tourism, while the other’s focused on the Egyptian uprising in 2011 (Pariser, 2011).

 

I was intrigued by Pariser’s research, so I decided to replicate his experiment. I put out an open invitation to my Facebook friends to send me a screen capture of their Google search results for Ebola, a topic that had been in the media enough for people to start researching it. My Facebook friend list is made up of 370 people, mostly under the age of 30 and living in Canada. Of the 370, I collected voluntary responses from 20 people. The response demographic data is below (Figures 1-4).

 


Figure 1 – Gender Responses                        Figure 2 – Operating Systems

 


       Figure 3 – Age Group                                 Figure 4 – Internet Browser

 

I divided the information this way to illustrate how the Google and Facebook filter algorithms break down each user. These are the four basic pieces of information examined: gender, age, operating system, and Internet browser. Hundreds of other data points are also collected, but those depend on the individual site and on what information you volunteer.
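
Even these four attributes are enough to start bucketing users. The sketch below is my own simplification of how such signals can be combined into a stable identifier; real browser fingerprinting draws on dozens of additional signals (fonts, plugins, canvas rendering, and so on).

```python
import hashlib

def fingerprint(profile):
    # Join the attributes in a fixed order and hash them, so the same
    # combination of attributes always yields the same identifier.
    raw = "|".join(f"{k}={profile[k]}" for k in sorted(profile))
    return hashlib.sha256(raw.encode()).hexdigest()[:16]

# Hypothetical survey-style profiles; changing any one attribute
# produces a different identifier.
alice = {"gender": "f", "age_group": "18-24",
         "os": "Windows", "browser": "Chrome"}
bob = dict(alice, browser="Firefox")
```

Two users who answer the survey identically would collide under this toy scheme, which is exactly why real trackers add many more signals until each browser is effectively unique.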

 

I quickly noticed a pattern in the results. The first link given to 13 of the 20 participants was to a news article about Ebola in the United States; the remaining seven were first shown the Wikipedia page for Ebola (Figure 5).

 


Figure 5 – Which link was seen first

 

At first I thought the experiment proved nothing; all the results were the same, either news or Wikipedia. After thinking about it for some time, I decided to compare each person with their results. It was the students, journalists, and managers who were directed towards Wikipedia, while everyone else was pointed towards news articles.

 

At first glance I thought these results would disprove Pariser’s theory, but in reality everyone is trapped in their Filter Bubble. Students, journalists, and managers want quick, easily understood facts, while everyone else has time to read a news article. Although this was not the case for everyone, it fit the majority.

 

However, there are many variables my experiment did not consider. A large number of the participants use advertisement-blocking software, which blocks a large portion of cookies and trackers, giving the algorithms little to work with. Another way people confuse and get around the algorithms is by changing their proxy. In essence, changing your proxy changes how you access a network or the Internet as a whole (e.g., it changes your IP address to reflect a different country).
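
Routing traffic through a proxy is simple to set up in code. In Python’s standard library it looks roughly like this; the proxy address is a placeholder from the documentation-reserved 203.0.113.0/24 range, not a working server.

```python
import urllib.request

# Placeholder proxy address; substitute a server you control or a
# commercial endpoint located in another country.
proxy = urllib.request.ProxyHandler({
    "http":  "http://203.0.113.5:8080",
    "https": "http://203.0.113.5:8080",
})
opener = urllib.request.build_opener(proxy)
# opener.open(url) would now reach the web via the proxy, so websites
# see the proxy's IP address and country instead of yours.
```

Because location is one of the first signals the filter algorithms use, a participant browsing through a foreign proxy can end up with a Filter Bubble built for a country they have never visited.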

 

One of the participants conducted her search on Yahoo’s search engine rather than Google. Like many others, her top results were news articles; however, the articles themselves were different, with a more negative spin. I recreated her search on my own computer and was presented with similarly negative articles. This made me realize that each search engine has unique filters and a different view of a given topic.

 

Pop the Bubble

Even though Filter Bubbles have become unavoidable, and will continue developing at an alarming rate, there are still a few things you can do to keep your view of the Internet relatively unfiltered.

 

  • Reset your Internet browser: clear your history, remove cookies, and reset the caches.

  • Install an advertisement blocking software.

  • Use a VPN, DNS, or Proxy server to browse anonymously.

  • Consider what information you share; websites only know the personal information you give them.
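
The cookie-removal step can even be done programmatically. The sketch below runs against a minimal stand-in for a browser cookie store such as Firefox’s cookies.sqlite (the real file lives in the browser profile and has many more columns, and should only be touched while the browser is closed); it deletes cookies set by hosts the user never actually visited, a rough stand-in for “tracker”.

```python
import sqlite3

# Minimal stand-in for a browser cookie store; real cookies.sqlite
# files have many more columns.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE moz_cookies (host TEXT, name TEXT)")
conn.executemany("INSERT INTO moz_cookies VALUES (?, ?)", [
    (".ads.example", "tracker_uid"),   # third-party tracker
    ("shop.example", "cart_id"),       # first-party, functional
])

def clear_third_party(conn, visited_hosts):
    # Delete cookies from hosts the user never visited directly,
    # a rough proxy for tracking cookies.
    marks = ",".join("?" * len(visited_hosts))
    conn.execute(
        f"DELETE FROM moz_cookies WHERE host NOT IN ({marks})",
        visited_hosts,
    )

clear_third_party(conn, ["shop.example"])
remaining = [host for (host,) in conn.execute("SELECT host FROM moz_cookies")]
```

Browsers expose the same operation through their settings screens; the point of the sketch is simply that “remove cookies” is, underneath, a delete on a small local database.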

 

Have the Filters Gone Too Far?

Internet personalization has existed in some form for the last two decades, and Filter Bubbles for about as long. Now that people have become aware that their online movements are being tracked, many are calling for government intervention. In the United Kingdom, a movement called The Cookie Collective has set up a service that audits websites for the types of cookies and trackers they use, and is pushing the UK Parliament to pass laws on Internet tracking, so far with only partial success.

 

It is my opinion that Internet tracking is inevitable, yet it is up to each individual how, and why, they are tracked. The Internet only knows as much as you tell it. Sandra Petronio’s Communication Privacy Management Theory looks at what we consider private information, how and with whom we share it, and how we create privacy boundaries. When we were limited to face-to-face communication, we were very selective about who received a particular piece of information. Now, with social media, we broadcast what used to be considered private information, day in and day out, to everyone we know without thinking twice. Then we blame online advertisements for suggesting anti-depressants after we post about breaking up with a boyfriend.

 

At the end of the day there is no permanent escape from Internet tracking. It is a part of life now, and will be for a long time.


References

Beaumont, R. (2011). Cookie Stats Revealed. The Cookie Collective. Retrieved 3 October 2014, from http://www.cookielaw.org/blog/2011/5/27/cookie-stats-revealed/

Griffin, E. (2012). A First Look at Communication Theory (8th ed.). Boston: McGraw-Hill.

Honan, M. (2014). I Liked Everything I Saw on Facebook for Two Days. Here's What It Did to Me. WIRED. Retrieved 3 October 2014, from http://www.wired.com/2014/08/i-liked-everything-i-saw-on-facebook-for-two-days-heres-what-it-did-to-me/

Internetworldstats.com. (2014). World Internet Users Statistics and 2014 World Population Stats. Retrieved 3 October 2014, from http://www.internetworldstats.com/stats.htm

Pariser, E. (2011). Transcript of "Beware online 'filter bubbles'". Ted.com. Retrieved 3 October 2014, from http://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles/transcript?language=en