filter_bubble
Snippet from Wikipedia: Filter bubble

A filter bubble – a term coined by internet activist Eli Pariser – is a state of intellectual isolation that allegedly can result from personalized searches when a website algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past click-behavior and search history. As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. The choices made by these algorithms are not transparent. Prime examples include Google Personalized Search results and Facebook's personalized news-stream. The bubble effect may have negative implications for civic discourse, according to Pariser, but contrasting views regard the effect as minimal and addressable. The results of the U.S. presidential election in 2016 have been associated with the influence of social media platforms such as Twitter and Facebook, and as a result have called into question the effects of the "filter bubble" phenomenon on user exposure to fake news and echo chambers, spurring new interest in the term, with many concerned that the phenomenon may harm democracy and well-being by making the effects of misinformation worse.

(Technology such as social media) “lets you go off with like-minded people, so you're not mixing and sharing and understanding other points of view ... It's super important. It's turned out to be more of a problem than I, or many others, would have expected.”

A filter bubble is a state of intellectual isolation that can result when a website algorithm selectively guesses what information a user would like to see based on information about the user (such as location, past click behaviour and search history); as a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. Prime examples are Google's personalised search results and Facebook's personalised news stream. The term was coined by internet activist Eli Pariser in his book of the same name; according to Pariser, users get less exposure to conflicting viewpoints and are isolated intellectually in their own informational bubble. Pariser related an example in which one user searched Google for “BP” and got investment news about British Petroleum, while another searcher got information about the Deepwater Horizon oil spill, and noted that the two search results pages were “strikingly different.”<ref name=twsT43/><ref name=twsO11/><ref name=twsO14/><ref name=twsO13/> The bubble effect may have negative implications for civic discourse, according to Pariser, but there are contrasting views suggesting the effect is minimal<ref name=twsO13/> and addressable.

Concept

Pariser defined his concept of the filter bubble in more formal terms as “that personal ecosystem of information that's been catered by these algorithms”.<ref name=twsT43/> Other terms have been used to describe this phenomenon, including “ideological frames”<ref name=twsO11/> and a “figurative sphere surrounding you as you search the Internet.”<ref name=twsO15/> A user's search history is built up over time as the user indicates interest in topics by “clicking links, viewing friends, putting movies in your queue, reading news stories” and so forth.<ref name=twsO15/> An Internet firm then uses this information to target advertising to the user or to make certain kinds of content appear more prominently in search results pages.<ref name=twsO15/> Pariser's concern is somewhat similar to one raised by Tim Berners-Lee in a 2010 report in The Guardian, describing what happens when Internet social networking sites wall off content from competing sites––as a way of grabbing a greater share of all Internet users––such that the “more you enter, the more you become locked in” to the information within a specific Internet site. Each site becomes a “closed silo of content”, with the risk of fragmenting the World Wide Web, according to Berners-Lee.<ref name=twsO16/>
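
The mechanism described above can be illustrated with a toy re-ranking sketch in Python. Everything in it is hypothetical: the topics, click history, base relevance scores, and the simple blending formula are assumptions made for illustration only, not how Google, Facebook, or any real search engine actually personalises results.

  from collections import Counter

  # Hypothetical click history: topics this user has engaged with before.
  click_history = ["oil-investing", "finance", "finance", "oil-investing", "markets"]

  # Hypothetical candidate results for the query "BP", each tagged with a topic
  # and a base relevance score that a non-personalised ranker might assign.
  candidates = [
      {"title": "BP quarterly earnings beat estimates", "topic": "finance", "base_score": 0.62},
      {"title": "Deepwater Horizon spill: environmental impact", "topic": "environment", "base_score": 0.70},
      {"title": "BP share price analysis", "topic": "oil-investing", "base_score": 0.55},
  ]

  # Build an interest profile: how often the user has clicked each topic.
  profile = Counter(click_history)
  total_clicks = sum(profile.values())

  def personalised_score(result, weight=0.5):
      """Blend base relevance with the user's past interest in the result's topic."""
      interest = profile[result["topic"]] / total_clicks  # 0.0 if the topic was never clicked
      return (1 - weight) * result["base_score"] + weight * interest

  # Rank for this user: topics the user never clicks sink in the ranking,
  # which is the narrowing effect the filter-bubble argument describes.
  for result in sorted(candidates, key=personalised_score, reverse=True):
      print(f"{personalised_score(result):.2f}  {result['title']}")

In this toy ranking the Deepwater Horizon story drops to the bottom despite having the highest base relevance, mirroring Pariser's “BP” anecdote above.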

In The Filter Bubble, Pariser warns that a potential downside to filtered searching is that it “closes us off to new ideas, subjects, and important information”<ref name=twsT41/> and “creates the impression that our narrow self-interest is all that exists.”<ref name=twsO11/> It is potentially harmful to both individuals and society, in his view. He criticized Google and Facebook for offering users “too much candy, and not enough carrots”,<ref name=twsT42/> and warned that “invisible algorithmic editing of the web” may “limit our exposure to new information and narrow our outlook.” According to Pariser, the detrimental effects of filter bubbles include harm to the general society, in that they have the possibility of “undermining civic discourse” and making people more vulnerable to “propaganda and manipulation.”<ref name=twsO11/>

Reactions

There are conflicting reports about the extent to which personalised filtering is happening and whether such activity is beneficial or harmful. Analyst Jacob Weisberg, writing in Slate, did a small non-scientific experiment to test Pariser's theory: five associates with different ideological backgrounds ran exactly the same queries, and across four different searches the results for all five were nearly identical, suggesting that a filter bubble was not in effect. This led him to write that a situation in which all people are “feeding at the trough of a Daily Me” was overblown.<ref name=twsO11/> A scientific study from Wharton that analyzed personalized recommendations also found that these filters can actually create commonality, not fragmentation, in online music taste; consumers apparently use the filter to expand their taste, not limit it.<ref name="Fleder"/> Book reviewer Paul Boutin did a similar experiment among people with differing search histories and found results similar to Weisberg's, with nearly identical search results.<ref name=twsO13/> Harvard law professor Jonathan Zittrain disputed the extent to which personalisation filters distort Google search results, saying that “the effects of search personalization have been light.”<ref name=twsO11/> Further, there are reports that users can shut off personalisation features on Google if they choose,<ref name=twsO12/> for example by deleting their Web history<ref name=twsO13/> or by other methods. A spokesperson for Google suggested that algorithms were deliberately added to Google search to “limit personalization and promote variety.”<ref name=twsO11/>
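
Experiments like Weisberg's and Boutin's amount to asking how much two users' result lists for the same query overlap. As a purely illustrative sketch (the URLs below are made up, and this is not the method either reviewer used), one crude way to quantify that overlap is the Jaccard similarity of the two lists:

  def result_overlap(results_a, results_b):
      """Jaccard similarity: fraction of URLs shared between two users' result lists."""
      set_a, set_b = set(results_a), set(results_b)
      return len(set_a & set_b) / len(set_a | set_b)

  # Hypothetical top results returned to two users for the same query.
  user_1 = ["bp.com", "en.wikipedia.org/wiki/BP", "reuters.com/bp-earnings", "ft.com/bp"]
  user_2 = ["bp.com", "en.wikipedia.org/wiki/BP", "reuters.com/bp-earnings", "theguardian.com/deepwater"]

  print(f"overlap = {result_overlap(user_1, user_2):.2f}")  # 1.00 would mean identical lists

Nearly identical result lists, as both reviewers reported, would score close to 1.0 under a measure like this.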

Nevertheless, there are reports that Google and other sites hold vast amounts of information which might enable them to further personalise a user's Internet experience if they chose to do so. One account suggested that Google can keep track of users' past histories even if they don't have a personal Google account or are not logged into one.<ref name=twsO13/> One report said that Google has collected “10 years worth” of information amassed from varying sources, such as Gmail, Google Maps, and other services besides its search engine,<ref name=twsO14/> although a contrary report was that trying to personalise the Internet for each user is technically challenging for an Internet firm to achieve despite the huge amounts of available web data. Analyst Doug Gross of CNN suggested that filtered searching seemed to be more helpful for consumers than for citizens, and would help a consumer looking for “pizza” find local delivery options based on a personalised search while appropriately filtering out distant pizza stores.<ref name=twsO14/> There is agreement that sites such as the Washington Post, The New York Times, and others are pushing efforts towards creating personalised information engines, with the aim of tailoring search results to those that users are likely to like or agree with.<ref name=twsO11/>

The filter bubble concept is similar to a phenomenon in which people and organisations seek information which is initially perceived as relevant but which turns out in fact to be useless or only partially useful, and avoid information perceived as irrelevant but which turns out to be useful. The problem happens because the real relevance of a particular fact or concept in these cases is apparent only after that fact has become known. Before that, the idea of learning a particular fact may have been dismissed because of a misperception of irrelevance. Accordingly, the information seeker is trapped in a paradox and fails to learn what he or she really needs to know, and can be caught in a kind of intellectual blind spot. This phenomenon has been described as the relevance paradox;<ref name=twsT41/> it happens in many situations throughout human intellectual development, and is an important issue for science and education. A book entitled The IRG Solution predicted and analysed this problem and suggested a generalised solution.

Further reading

  • Pariser, Eli. The Filter Bubble: What the Internet Is Hiding from You, Penguin Press (New York, May 2011) ISBN 978-1-59420-300-8

References
