
Filter bubble

A filter bubble – a term coined by Internet activist Eli Pariser – is a state of intellectual isolation that allegedly can result from personalized searches, when a website algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past click-behavior and search history. As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. The choices made by these algorithms are not transparent. Prime examples include Google Personalized Search results and Facebook's personalized news-stream. The bubble effect may have negative implications for civic discourse, according to Pariser, but contrasting views regard the effect as minimal and addressable. The results of the 2016 U.S. presidential election have been associated with the influence of social media platforms such as Twitter and Facebook, calling into question the effects of the filter bubble phenomenon on user exposure to fake news and echo chambers, spurring new interest in the term, with many concerned that the phenomenon may harm democracy.

Bill Gates voiced a similar concern: technology such as social media "lets you go off with like-minded people, so you're not mixing and sharing and understanding other points of view ... It's super important. It's turned out to be more of a problem than I, or many others, would have expected."

Pariser illustrates the scale of the underlying tracking in The Filter Bubble: "According to one Wall Street Journal study, the top fifty Internet sites, from CNN to Yahoo to MSN, install an average of 64 data-laden cookies and personal tracking beacons each. Search for a word like 'depression' on Dictionary.com, and the site installs up to 223 tracking cookies and beacons on your computer so that other Web sites can target you with antidepressants. Share an article about cooking on ABC News, and you may be chased around the Web by ads for Teflon-coated pots. Open—even for an instant—a page listing signs that your spouse may be cheating and prepare to be haunted with DNA paternity-test ads."

He warns of the result: "A world constructed from the familiar is a world in which there's nothing to learn ... (since there is) invisible autopropaganda, indoctrinating us with our own ideas."
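The Wall Street Journal passage above describes the mechanics of third-party tracking: a beacon from one ad network, embedded across many sites, recognizes the same browser by its cookie and accumulates a cross-site interest profile. The toy simulation below is a minimal sketch of that idea only; the AdNetwork class, its beacon method, and the topic labels are all invented for illustration and do not model any real ad network's protocol.

```python
# Toy simulation of cross-site tracking via a third-party cookie.
# Everything here (class, method, topics) is hypothetical and only
# illustrates the mechanism described above, not any real ad network.

class AdNetwork:
    def __init__(self):
        self.profiles = {}   # cookie id -> list of observed page topics
        self._next_id = 0

    def beacon(self, cookie, page_topic):
        """Called by the tracking pixel each time an embedding site loads."""
        if cookie is None:                      # first sighting: issue a cookie
            cookie = f"uid-{self._next_id}"
            self._next_id += 1
            self.profiles[cookie] = []
        self.profiles[cookie].append(page_topic)
        return cookie                           # the browser stores and resends this

network = AdNetwork()
cookie = network.beacon(None, "depression")     # e.g. a Dictionary.com lookup
cookie = network.beacon(cookie, "cooking")      # e.g. sharing an ABC News recipe
print(network.profiles[cookie])                 # ['depression', 'cooking'] -> targetable profile
```

Because the same network's beacon is present on many unrelated sites, one visit to a page about depression is enough to follow the user elsewhere with antidepressant ads, exactly the pattern the study describes.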
The term was coined by Internet activist Eli Pariser circa 2010 and discussed in his 2011 book of the same name. According to Pariser, users get less exposure to conflicting viewpoints and are isolated intellectually in their own informational bubble. He related an example in which one user searched Google for 'BP' and got investment news about British Petroleum, while another searcher got information about the Deepwater Horizon oil spill, and noted that the two search results pages were 'strikingly different'.

Pariser defined his concept of a filter bubble in more formal terms as 'that personal ecosystem of information that's been catered by these algorithms'. An Internet user's past browsing and search history is built up over time as they indicate interest in topics by 'clicking links, viewing friends, putting movies in queue, reading news stories', and so forth. An Internet firm then uses this information to target advertising to the user, or to make certain types of information appear more prominently in search results pages. This process is not random; per Pariser, it operates in three steps: 'First, you figure out who people are and what they like. Then, you provide them with content and services that best fit them. Finally, you tune to get the fit just right. Your identity shapes your media.' (A toy sketch of this loop appears below.) As of 2011, one engineer had told Pariser that Google looked at 57 different pieces of data to personally tailor a user's search results, including non-cookie data such as the type of computer being used and the user's physical location.

Other terms have been used to describe this phenomenon, including 'ideological frames' and 'the figurative sphere surrounding you as you search the Internet'. A related term, 'echo chamber', was originally applied to news media, but is now applied to social media as well.

Pariser's idea of the filter bubble was popularized after the TED talk he gave in May 2011, in which he gave examples of how filter bubbles work and where they can be seen. In a test seeking to demonstrate the filter bubble effect, Pariser asked several friends to search for the word 'Egypt' on Google and send him the results. Comparing two of the friends' first pages of results, he found that while there was overlap between them on topics like news and travel, one friend's results prominently included links to information on the then-ongoing Egyptian revolution of 2011, while the other friend's first page of results did not include such links.

In The Filter Bubble, Pariser warns that a potential downside to filtered searching is that it 'closes us off to new ideas, subjects, and important information' and 'creates the impression that our narrow self-interest is all that exists'. It is potentially harmful to both individuals and society, in his view. He criticized Google and Facebook for offering users 'too much candy, and not enough carrots', and warned that 'invisible algorithmic editing of the web' may limit our exposure to new information and narrow our outlook. According to Pariser, the detrimental effects of filter bubbles include harm to the general society in the sense that they have the possibility of 'undermining civic discourse' and making people more vulnerable to 'propaganda and manipulation'. Many people are unaware that filter bubbles even exist.
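Pariser's three-step description (profile, match, tune) maps onto a simple feedback loop. The sketch below is a minimal illustration only: the names (UserProfile, score, feed) and the crude topic-count model are invented stand-ins for the far richer signals real systems use. Its point is just that clicks on an already-filtered feed flow back into the profile, tightening the bubble.

```python
from collections import Counter

# Minimal sketch of Pariser's three-step loop (profile -> match -> tune).
# All names are hypothetical; real systems use far richer signals
# (Pariser cites 57 at Google as of 2011) and learned models.

class UserProfile:
    def __init__(self):
        self.interests = Counter()        # step 1: learn who the user is

    def record_click(self, topics):
        self.interests.update(topics)     # each click sharpens the profile

def score(profile, item_topics):
    # Step 2: rate content by how well it fits the existing profile.
    return sum(profile.interests[t] for t in item_topics)

def feed(profile, items, k=3):
    return sorted(items, key=lambda it: score(profile, it["topics"]), reverse=True)[:k]

# Step 3: the loop tunes itself -- clicks on the filtered feed flow back
# into the profile, so the bubble tightens with every iteration.
profile = UserProfile()
items = [
    {"title": "BP quarterly earnings", "topics": ["finance", "bp"]},
    {"title": "Deepwater Horizon spill update", "topics": ["environment", "bp"]},
    {"title": "Travel guide: Cairo", "topics": ["travel", "egypt"]},
]
profile.record_click(["finance"])         # one finance click...
for item in feed(profile, items):
    print(item["title"])                  # ...and finance news now ranks first
```

This mirrors Pariser's 'BP' anecdote: two users issuing the identical query see diverging results as soon as their profiles diverge.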
This can be seen in an article in The Guardian, which noted that 'more than 60% of Facebook users are entirely unaware of any curation on Facebook at all, believing instead that every single story from their friends and followed pages appeared in their news feed.' In brief, Facebook decides what goes in a user's news feed through an algorithm that takes into account 'how you have interacted with similar posts in the past.'
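Read literally, 'how you have interacted with similar posts in the past' suggests ranking candidate posts by their similarity to posts the user previously engaged with. The toy ranker below is only a sketch of that reading, with invented names and a plain Jaccard topic overlap standing in for Facebook's proprietary, far more complex model.

```python
# Hypothetical sketch of "rank by past interaction with similar posts".
# Facebook's real feed ranking is proprietary; this only illustrates the
# principle the Guardian piece describes, with invented names throughout.

def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def rank_feed(candidate_posts, engaged_posts):
    # A post scores by its best topical overlap with anything the user
    # previously liked or commented on; similar posts surface, the rest sink.
    def affinity(post):
        return max((jaccard(post["topics"], e["topics"]) for e in engaged_posts),
                   default=0.0)
    return sorted(candidate_posts, key=affinity, reverse=True)

engaged = [{"topics": ["politics", "election"]}]
candidates = [
    {"id": 1, "topics": ["cooking", "recipes"]},
    {"id": 2, "topics": ["politics", "debate"]},
]
print([p["id"] for p in rank_feed(candidates, engaged)])  # -> [2, 1]
```

Because the curation step is invisible, a user who never engages with opposing views simply stops seeing them, without ever knowing a ranking was applied.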

[ "Social media" ]
Parent Topic
Child Topic
    No Parent Topic