At some point during the 2016 presidential election, I decided to undertake a social media experiment. I “unfollowed” every Facebook friend that posted something political, regardless of affiliation.
I had the unique opportunity of watching the election conclude from afar. (I voted weeks beforehand.) Speaking at a conference in Europe, I witnessed a cataclysmic result through a haze of jet lag, among similarly displaced people, disconnected from home by distance and time. I spent my time overseas trying to answer the questions of concerned and curious onlookers. A wardrobe attendant even questioned my political affiliation before returning my coat and bag. The experience was unsettling.
The result of my experiment: I was left with my family and a handful of friends who rarely sign into Facebook. More importantly, I was left with the realization that the platform adds very little to my life. It’s a noise box. It reverberates, it mixes, it amplifies nonsense. Through its commotion, it managed to elect a tinpot president bent on the same conspiracy theories that it fosters.
This is no judgment upon anyone who finds something positive in the noise. However, the world will not find its way when media that allow us to spew hate and falsehoods at no one in particular are allowed to advance unchecked.
The time has come to disconnect from the platform.
Facebook has no incentive to prevent fake news from spreading.
Since the election, news media has been rife with coverage of Facebook’s “fake news” problem. Facebook is inherently an advertising venture. It’s a click-machine and, let’s face it, fake news generates a boatload of clicks.
While I have no explicit evidence that Facebook sought to promote click-bait content, its News Feed feature set has evolved to promote it. As an example of this evolution, consider Facebook’s “Trending” sidebar content. It initially appeared — a product of the battle against Twitter — as a collection of links with a title, image, and summary. Today, the feature exists as a list of nothing more than one- or two-word phrases, forcing a user to hover or click through to get some sense of what each item pertains to. One can only assume that the feature evolved in this manner to funnel traffic.
Facebook is a company and needs to make its money somewhere. When content of dubious provenance began appearing as trending stories, though, it crossed a line. No effort was made even to acknowledge the issue until after the election, amidst an outcry that portended an impact on its user base.
Easy thought is easy.
The more frightening aspect of Facebook’s problem is the allure and gratification of easy thinking. Much of the content that we encounter through social media is tuned for virality — quick quips and memes. It employs humor, sensationalist statements, and affecting imagery. It’s impactful stuff. It hits us hard and it’s digestible. It’s easy to swallow deceptive inaccuracies.
I’ve fallen prey to some of this material myself and that’s frightening to me. Easy thought is deceptively alluring. It feeds the gratification centers of our brain and validates emotional cues. Studies indicate that we’re wired to eat this stuff up.
The consumption of propaganda is concerning but what concerns me more is its propensity to fuel superficial solutions.
In engineering, superficial solutions tend to treat symptoms. They ignore architectural implications. They address a narrow concern without consideration of broader applications. They overlook the root factors that may have resulted in the problem in the first place. We must appreciate this fact so as not to jeopardize the future of a project.
Superficiality in political solutions presents similar problems. They too treat symptoms and ignore societal implications. They trade the future for the present. Sadly, they are often a tool wielded by ambitious leaders as a means to an ulterior end. Have immigration concerns? Build a wall. Is an industry failing? Bail it out. Do someone’s beliefs confuse or frighten you? Ban them.
In society, just as in engineering, we must be cognizant of superficiality and remain vigilant against its temptation.
Facebook (like other media outlets) has no incentive to silence its inadvertent propaganda machine unless doing so impacts its bottom line. Some portion of its user base has to demonstrate that action will be more profitable than inaction. In leaving the platform, I hope to contribute some small amount of progress toward that cause.