This is very similar to what happens with filter bubbles (Pariser, 2011) and echo chambers on social media in general and on Facebook in particular.
Indeed, the ways in which algorithms have segmented elements of society into their own "filter bubbles" have resulted in a scenario where the very essence of truth is now in question.
Even Pariser, who gave the filter bubble its name, agrees the internet isn't entirely to blame.
While the app will enable users to get "personalized" news, it will also include top stories for all readers, aiming to break the so-called filter bubble of information designed to reinforce people's biases.
Another point I bring to the debate is the concept of the filter bubble and confirmation bias.
"But we can counter both the
filter bubble and we can counter false narratives this way."
The result would be to burst the social media filter bubble by rounding out our experience around news and current issues, introducing more variety into the information flow that feeds our intuition, judgment, decisions, and behavior.
The social-media giant is being criticized for the way its algorithms conspire to cocoon each user inside a "filter bubble," surrounding people with content that rarely challenges their worldview, including too many false news reports that often fail to get debunked.
This phenomenon reduces information diversity and keeps individuals in their information bubble, which explains the term "filter bubble" (12).
You've likely been told that Facebook paves the way for a "filter bubble," or an "echo chamber." (1) This argument suggests that because Facebook tries to optimize the content you see, you'll never see opposing views.