You’ve no doubt heard a lot about echo chambers and filter bubbles lately.
Though Eli Pariser, who went on to co-found Upworthy, first described it in his 2011 book The Filter Bubble, the phenomenon has garnered renewed attention in the past year, thanks to its perceived impact on the way we consume information and how that affects our politics.
In case you aren’t familiar, terms like ‘online echo chamber’ or ‘filter bubble’ are used to describe the way social media websites and search engines show you the content you’re most likely to want to see first. By filtering out the stuff you probably aren’t going to click on — based on your likes, demographic information and previous clicks — these websites mirror your own preferences back to you, creating a loop of self-affirmation that keeps you online.
It has become clear recently that misinformation flourishes when people aren’t exposed to both sides of the story, making it easier for them to be manipulated by fabricated news stories. Jill Abramson, the former executive editor of the New York Times, recently told an audience at the American Academy in Berlin that “the new technology of the internet has fueled extremism and polarized the electorate”. The much-discussed phenomenon of fake news, she said, is only a symptom of our underlying isolation from one another online.
There’s little doubt that social media tends to reflect easily digestible opinions back to us, and can filter out the multiplicity of views that encourages debate in a healthy democracy. The good news is that there are various strategies you can use to not only broaden the range of information you are exposed to, but also push back against the idea that we’re better off in online neighborhoods where everyone thinks more or less the same.
Read on for five tips for breaking out of your online echo chamber:
Broaden Your Horizons
Probably the single most important thing you can do is remember that social media tends to create a bubble. Simply being conscious of this fact helps, as does making a point of stepping out of your echo chamber every so often and actively checking what people in the other camp are saying.
Forget the comment sections and hyper-partisan commentators, though: they’ll only confirm all your worst opinions about the other side. Instead, read well-respected columnists and see how other publications report the big stories.
Some newspapers and websites have started offering readers briefings from the other side. Slate’s “Today in Conservative Media” is a good place for liberals to start, for example.
Take Control of Your News Feed
According to research by the Pew Research Center, 44% of U.S. adults got news through Facebook in 2016. That might make you wonder how there can be enough cat videos, holiday photos and personality tests to populate the entire feeds of the other 56% of Americans, but it also demonstrates that Facebook is the main place where the filter bubble intersects with politics.
A simple step to combat this phenomenon is to adjust your News Feed Preferences to show you people or websites you normally disagree with first. This can be done in the menu at the top-right of your Facebook page.
Force yourself to consider material and arguments that don’t confirm your biases before getting your fix of self-affirmation. If you want to go further, the browser extension Escape Your Bubble injects a curated dose of right-wing views into liberals’ feeds, and vice versa, based on where you fall on the political spectrum.
Embrace the Fake Data Strategy
There is also an argument for pushing back against the tools that data analysts and marketers use to group internet users into clusters, so they can promote content and products to them. In a recent book, Helen Nissenbaum recommends a strategy of ‘obfuscation’: creating fake data while you browse, in order to confuse the algorithms tracking your movements.
She co-created a browser extension called AdNauseam, a controversial tool that stops ads appearing on the pages you view, while also ‘clicking’ on all of them to obscure the patterns on which online data gathering relies. This can also mean more ad revenue for the sites you are visiting, which are, presumably, reliable news sources that pay journalists for quality reporting.
Google is no fan of AdNauseam, since the extension undermines its ad-based business model. It has been blocked on Chrome since late last year, so you’ll have to either install it manually in developer mode (easier than it sounds) or use another browser.
Fake Out the Machines
AdNauseam is a follow-up in some ways to Nissenbaum’s TrackMeNot, which sent out random requests as users browsed the internet, in order to compile a false set of digital breadcrumbs that confuse algorithmic targeting. Another way to achieve a similar effect on Facebook would be to strategically ‘like’ and follow a bunch of things that people in your demographic aren’t expected to be into, even if they don’t really interest you.
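For the curious, the idea behind TrackMeNot can be sketched in a few lines of Python: periodically compose random but plausible-looking search queries, so that any log of your searches gets diluted with noise. This is only an illustration of the concept — the word pool, function names and search URL below are made-up stand-ins, not the extension’s actual code:

```python
import random
import urllib.parse

# A toy pool of innocuous topics; the real extension draws decoy terms
# from sources like RSS feeds. This list is purely illustrative.
WORD_POOL = ["gardening", "jazz", "carburetor", "orienteering",
             "sourdough", "falconry", "meteorology", "philately"]

def decoy_query(rng, min_words=1, max_words=3):
    """Compose a random search phrase to mix in with your real queries."""
    n = rng.randint(min_words, max_words)
    return " ".join(rng.sample(WORD_POOL, n))

def decoy_url(rng, engine="https://www.example-search.com/search?q="):
    """Build the URL a background decoy request would fetch."""
    return engine + urllib.parse.quote_plus(decoy_query(rng))

rng = random.Random()
for _ in range(3):
    print(decoy_url(rng))
```

Run enough of these alongside your genuine searches and the profile a tracker builds of you starts to look like someone who is into everything and nothing — which is exactly the point.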
One of the biggest stories in data analysis in the last year involved a company called Cambridge Analytica, which was reported to have worked with Republicans in America and pro-Brexit politicians in the UK to create Facebook ads micro-targeted to key voters.
The company had apparently amassed so much data through online personality tests and social media profiles that they could correlate preferences for certain products, brands or entertainment with character traits like openness or security-consciousness. The company claimed it could use this information to push the perfect message to specific voters in order to sway their decision on election day.
A series of random likes on your profile would make it harder for you to be manipulated by this kind of campaign, so feel free to start fake-‘liking’.
Make ‘Meh’ Great Again
It is easy to mistake online social networks and the communication they facilitate for real conversations. Despite all the ways they resemble each other, one crucial difference is that when you and your friends shop, eat, talk and hang out in the real world, your every comment, choice and movement isn’t being tracked and fed into a vast data set to better sell you stuff in the future.
In a blog post in February, Mark Zuckerberg said he wanted Facebook to help combat the filter bubble and expose users to a wider range of views, writing: “If we connect with people about what we have in common – sports teams, TV shows, interests – it is easier to have dialogue about what we disagree on.”
That might be true, but it still relies on an over-simplified understanding of how we get along. Social media sites encourage us to react with a binary of like or dislike – or extremes like love, laughter, outrage and tears – which are more helpful for forming data sets than for understanding complex human reactions and interactions.
The more revealing middle ground might lie in all those things we don’t care a huge amount about. Wendy Chun has suggested ‘clusters of mutual indifference’ as a way of organizing internet users into highly diverse groups which nonetheless have a lot in common. One small way of getting developers to take notice might be to make more use of this guy 😐 in your online comments (probably better suited to a story you aren’t interested in than your cousin’s engagement announcement).
Despite the incredible advances we’ve seen in our own lifetimes, the internet is still in its infancy in terms of how it will impact our choices and affect our societies. The good news, though, is that developers are constantly gathering feedback and integrating it into new iterations.
As academics like Nissenbaum and Chun show, we don’t have to passively accept technology as it changes us; we can take steps to help set its future course.
Now read about how your smartphone might be ruining your mental health.
- Lead image: Stephen Cheetham
- Words: Michael Hornsby