Bursting the News Filter Bubble
- Australasian Science, 2017 (June/July)
This item is open access.
After the US presidential election, Google searches for Breitbart News peaked; friends – not secret Trump supporters – took to the right-wing source to try to understand the views it was espousing. Since then there have been frequent calls for more of us to step out of our social media echo chambers and to 'burst our filter bubble'. These echo chambers, it's alleged, combine with a filter bubble effect: social media and search engine personalization that emphasizes content similar to content you have viewed or liked before. So your Facebook feed only exposes you to views you already agree with, and to information that supports those views, leading to a general deterioration in public and political debate as we seem unable or unwilling to engage across perspectives. If we believe this argument, then your use of Facebook presents an information-access issue, insulating you from diverse perspectives, exposure to which would improve political discourse.

Empirical research on this topic is hard. Companies control their data, users typically don't state their politics explicitly, and the impact of proprietary algorithms can only be guessed at. Whether you're liberal or conservative, you're more likely to believe information that confirms your prior beliefs. The question is: what role is technology playing in this? And does this cognitive bias mean that we should indiscriminately take a more even-handed approach to sources?