Facebook is an echo chamber. This shouldn’t be breaking news, but from November 9th until now, it appears that it is breaking news for many people. That’s awkward.
Actually, as of July, it became an even bigger echo chamber — this summer, Facebook tweaked its News Feed algorithm so that content from people most like you rises to the top. I should probably back this up one step, because I’m not sure everyone even realizes what the fuck an algorithm is. Basically, it’s driving most of the world right now — how we market, how we interact, etc. When you click stuff, the algorithm assumes you “like” that stuff, so it feeds you more of that stuff down the road. This is good for you, because you’re seeing stuff you want to buy, or stuff that underscores your worldview. It’s good for the brand running the algorithm, because you’re mostly happy and keep coming back.
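If you want to see the feedback loop in miniature, here’s a toy sketch — emphatically not Facebook’s actual News Feed code, just a hypothetical model of the “you clicked it, so you must like it” assumption described above. The `ToyFeed` class and topic labels are invented for illustration.

```python
from collections import defaultdict

class ToyFeed:
    """A toy click-driven feed ranker (hypothetical, for illustration)."""

    def __init__(self):
        # topic -> affinity score; starts flat, so everything ranks equally
        self.affinity = defaultdict(lambda: 1.0)

    def record_click(self, topic):
        # The core assumption: a click means you "like" this kind of content,
        # so nudge its weight up
        self.affinity[topic] += 1.0

    def rank(self, posts):
        # posts: list of (post_id, topic); highest-affinity topics float up
        return sorted(posts, key=lambda p: self.affinity[p[1]], reverse=True)

feed = ToyFeed()
for _ in range(5):
    feed.record_click("politics-left")  # five clicks on one kind of story

posts = [("a", "politics-right"), ("b", "politics-left"), ("c", "sports")]
print(feed.rank(posts))  # the clicked-on topic now outranks everything else
```

A few clicks in, the feed is already skewed — and since you mostly click what it shows you, the skew compounds. That self-reinforcing loop is the “algorithm bubble” in one class.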
Is it good for society? Good Lord no. Even our man Zuck is admitting Facebook may have a problem. The name of this problem is “an algorithm bubble.” Basically, how the hell can you understand “the other side” of any issue if all you see is stuff similar to you? This is the inherent echo chamber problem.
But why is Facebook being an echo chamber a problem? I barely use that site!
First off, you’re probably full of shit that you don’t use that site. About 15 percent of the globe uses it, and remember — half the globe doesn’t have Internet access. Unfortunately (yes, I feel it’s unfortunate), a lot of people — maybe as high as 62 percent — in America get their news from Facebook. Problem is: Facebook consistently brands itself as a “tech company,” when in fact it’s a massive media company — but one not beholden to the normal standards of media. We’ve all seen the “fake news problem” by now. You share something about how Hillary is dying of Parkinson’s, and to a lot of people, it looks like it came from The Washington Post. It didn’t. It came from some dude in Crimea, probably. But not everyone is that savvy, not everyone vets sources, and people are on Facebook already. Also remember: people are lazy. They see an article in front of them where they already are, and maybe they kinda sorta believe it. Even if they don’t, if they keep seeing shit like that — which happens in an algorithmic echo chamber — it seeps in.
So here’s the bouncing ball:
- Many people use Facebook
- A lot of those people get their news from it
- Facebook isn’t beholden to any real standards around that news
- Algorithms make the whole thing an echo chamber whereby one half of America barely understands the other half
Seeing how this becomes a problem?
The echo chamber: A lot of people don’t really understand the Internet, honestly
This is maybe the bigger problem. Most people really have no fucking clue what the Internet is or how it works aside from how they specifically choose to use it. At one level, that’s beautiful — the Internet is a “brand” that billions of people experience differently every day. Some CMO just openly wept.
At another level, though, it’s absolutely terrifying. For example, most people I know — friends and foes alike — totally lie about their reason for using Facebook in the first place. You use it because people you know are on there and you want those people to see cool shit about your life. Basically, Facebook got monetized because people are selfish pricks who want to rub their friends’ noses in their moments of success. Can we just admit that? You share on there because you want people to “like” your beautiful family, your new car, your dog, your new job, your party pics, whatever. It’s all rooted in selfishness to an extent.
This blows up politically, though. When you’re pounding your chest about what a fuckwad Trump is — he was/is, yes — the echo chamber is resonating that mostly to people who already believe it. So when you caught 287 likes for your baby announcement and thought you were hot shit? Well, see … those 287 people may already agree with you re: HRC vs. DT. So now it’s a giant echo chamber. Roughly half of America is seeing one set of things, and the other half is seeing another.
We tend to always think of tech as “good” and “saving the world,” but what if it’s actually more divisive than connective?
This is a common argument about tech and the echo chamber, right?
Yes, for sure. I am not original in this thought. People have been saying since the 1700s that tech could destroy us. Now that AI is closer, maybe that fear is creeping up a bit more. I’m not really sure.
Think about it in these terms, though. The final states that put Trump over the top were the Upper Midwest, right? This is a conventional and cliched narrative, but the Upper Midwest lost jobs — and they lost them in part because of automation programs from tech companies. You can’t ignore that. Silicon Valley dudes get rich, in part, by creating things that make people less relevant to guys who run companies. You can quibble with that assessment, but you can’t outright deny it. Tech helps, but it also hurts. On November 8, 2016, the Upper Midwest punched back. And now here we are.
In a way, then, this whole “Facebook is an echo chamber” debate that’s been out there for the last week is just a representation of the broader discussion about tech. Did Facebook ruin the 2016 election? Probably not. It’s much deeper and more psychological than that. Did the Facebook echo chamber play a role? Yes, of course. In the same way: is tech destroying the Rust Belt? No. Many things are destroying parts of the Rust Belt. But is some dude creating software and AI robots hurting how we traditionally “make” things? Yes. It’s a complex tapestry, but tech fucks us almost as much as it saves us.
Can the echo chamber be solved?
Probably not, because Facebook and its product team are now part of a public company. They’re chasing cheddar. Apparently, they had tools to limit the fake news problem — but didn’t use ’em because they didn’t want to piss off users. Typical corporate shit. Gotta please those stakeholders! It’s funny how a “move fast and break things” startup became a company chasing nickels in the couch cushions. Oh well!
The point is this: the echo chamber is real. The problems with tech are real. We’re seeing some of this shit writ large right now. There needs to be a greater sense of responsibility (on the company side) and a greater effort at understanding what’s happening here (on the user side). I’m not actually sure either is possible — the production side wants $$$ and the user side wants their college friends to see their daughter in a sandbox. At that intersection, the echo chamber and the algorithm bubble probably persist.
What else you got on the echo chamber and tech’s roles and responsibilities here?