Eli Pariser, currently the chief executive of Upworthy, wrote a book in 2011 called The Filter Bubble. A nine-minute TED Talk on many of the same topics is embedded above, but essentially, the idea is that personalized search (i.e., the algorithmic targeting that Google, Facebook, and an increasing number of websites do) can actually make our search experience less rewarding, because we’re only exposed to information and links that some formula believes we’d like. How do we get relevant competing viewpoints so that we’re truly informed on an issue? Without them, won’t partisan rancor just keep increasing generation by generation, since we’re only being fed links, stories, and products tied to our interests? Think about this example from current events for a second: there’s a new David Remnick sitdown/profile of Barack Obama out in The New Yorker. In it, they discuss the emergence of marijuana-legalization legislation, and Remnick writes this:
Less dangerous, he said, “in terms of its impact on the individual consumer. It’s not something I encourage, and I’ve told my daughters I think it’s a bad idea, a waste of time, not very healthy.” What clearly does trouble him is the radically disproportionate arrests and incarcerations for marijuana among minorities. “Middle-class kids don’t get locked up for smoking pot, and poor kids do,” he said. “And African-American kids and Latino kids are more likely to be poor and less likely to have the resources and the support to avoid unduly harsh penalties.” But, he said, “we should not be locking up kids or individual users for long stretches of jail time when some of the folks who are writing those laws have probably done the same thing.” Accordingly, he said of the legalization of marijuana in Colorado and Washington that “it’s important for it to go forward because it’s important for society not to have a situation in which a large portion of people have at one time or another broken the law and only a select few get punished.”
As is his habit, he nimbly argued the other side. “Having said all that, those who argue that legalizing marijuana is a panacea and it solves all these social problems I think are probably overstating the case. There is a lot of hair on that policy. And the experiment that’s going to be taking place in Colorado and Washington is going to be, I think, a challenge.” He noted the slippery-slope arguments that might arise. “I also think that, when it comes to harder drugs, the harm done to the user is profound and the social costs are profound. And you do start getting into some difficult line-drawing issues. If marijuana is fully legalized and at some point folks say, Well, we can come up with a negotiated dose of cocaine that we can show is not any more harmful than vodka, are we open to that? If somebody says, We’ve got a finely calibrated dose of meth, it isn’t going to kill you or rot your teeth, are we O.K. with that?”
Look at the first line of the second paragraph: “… he nimbly argued the other side.”
Graells-Garrido and his colleagues tested this approach by focusing on the topic of abortion as discussed by people in Chile in August and September 2013. Chile has some of the most restrictive anti-abortion laws on the planet: abortion was legalised there in 1931 and then made illegal again in 1989. With presidential elections that November, a highly polarised debate was raging in the country at the time.
They found over 40,000 Twitter users who had expressed an opinion using hashtags such as #pro-life and #pro-choice. They trimmed this group by keeping only those who gave their location as Chile and excluding those who tweeted rarely, which left just over 3,000 users.
The team then computed the differences in these users’ views on abortion and other topics from the regularity with which they used certain keywords. This let them build a kind of word cloud for each user that acted as a data portrait.
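As a rough illustration, such a “data portrait” can be sketched as the user’s most frequent keywords. This is a deliberately simplified, hypothetical version — the study’s actual method likely involves more careful term weighting and stopword handling than raw counts:

```python
from collections import Counter
import re

def data_portrait(tweets, top_n=10):
    """Build a crude 'data portrait': a user's most frequent keywords.

    Simplified, hypothetical sketch -- raw term counts stand in for
    whatever weighting the actual study used.
    """
    stopwords = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "it"}
    words = []
    for tweet in tweets:
        # Keep word-like tokens and hashtags, drop stopwords and very short tokens.
        for token in re.findall(r"[#\w]+", tweet.lower()):
            if token not in stopwords and len(token) > 2:
                words.append(token)
    return Counter(words).most_common(top_n)

portrait = data_portrait([
    "Election season again in #Chile",
    "The election debate tonight was heated",
])
# portrait[0] is ("election", 2): the user's most-used keyword and its count.
```

Each portrait is then just a ranked keyword profile that can be compared against other users’ profiles.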
They then recommended tweets to each user based on similarities between word clouds, giving particular weight to pairs of users whose interests overlapped but whose views on abortion differed.
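That recommendation step can be sketched as a simple similarity ranking: among users on the opposite side of the divisive topic, prefer those whose everyday-interest profiles look most like yours. The cosine scoring and the stance filter here are assumptions for illustration, not the study’s exact ranking:

```python
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-frequency profiles."""
    shared = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in shared)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def recommend(user, others):
    """Rank other users for `user`: only opposite-stance users are candidates,
    ordered by how much their interest profiles overlap with the user's.
    (Hypothetical scoring -- the study's actual ranking may differ.)
    """
    candidates = [o for o in others if o["stance"] != user["stance"]]
    return sorted(candidates,
                  key=lambda o: cosine(user["portrait"], o["portrait"]),
                  reverse=True)

alice = {"stance": "pro-choice", "portrait": Counter({"football": 5, "cooking": 3})}
others = [
    {"name": "bob",   "stance": "pro-life",   "portrait": Counter({"football": 4, "cooking": 2})},
    {"name": "carol", "stance": "pro-life",   "portrait": Counter({"opera": 6})},
    {"name": "dave",  "stance": "pro-choice", "portrait": Counter({"football": 9})},
]
ranked = recommend(alice, others)
# bob ranks first: he disagrees with alice on the topic but shares her interests.
```

The design choice mirrors the article’s “middle ground” idea: the disagreement is on one topic, while the match is made on everything else.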
The results show that people can be more open than expected to ideas that oppose their own. It turns out that users who openly speak about sensitive issues are more open to receiving recommendations authored by people with opposing views, say Graells-Garrido and co.
So there’s an indirect way to connect dissimilar people: show them where their middle ground lies. It’s not revolutionary, but it’s important. Google’s current search algorithm is called “Hummingbird”; the company revises its algorithms every so often. Perhaps the next revision could somehow surface differing viewpoints, although that would cut against Google’s own interests, given that its ad business is built on targeting exactly what people want near the top of their results.
There’s also this aspect to the filter/algorithm bubble: conversation has slowly started to die. Go to a dinner party with mid-40-somethings and their 6-to-12-year-old children (as described in the opening anecdote of that link) and you’ll notice something interesting: the 45-year-olds can still talk to each other, but the 10-year-olds, despite other 10-year-olds being around, will focus on their screens. That’s probably less the filter bubble and more the broader impact of technology, but take the two together and it’s a bit dangerous for the next 30 years of interaction: people are getting information tailored to them (with limited exposure to opposing viewpoints), while conversation, which typically susses out differing ways of looking at things, is on the decline. I’m not saying society is doomed by any means; humans are remarkably resilient and will evolve in new and different ways. But this is certainly moving us toward a paradigm shift of sorts.