Wherever humans gather, news spreads. This has been the case throughout history, from the stone age fire pit to the cubicle age water cooler. Today, Facebook is the most efficient news conduit: One billion people check Facebook eight times each day, on average. They read declarations from friends, from family and from organizations alongside a masterfully curated selection of paid content.
Liking, clicking or even lingering on a post teaches Facebook exactly what moves you. Over time its algorithm locks into your rhythm and dispenses dopamine-triggering headlines and videos. Before long your communication with the world becomes a well-fortified silo of ideology, consistently reinforced by the content Facebook chooses for you. This process mimics the offline world, where we tend to surround ourselves with people whose core beliefs align with our own. Unlike the offline world, though, a single entity (Facebook) in some ways knows more about us than we know about ourselves, and that insight is for sale.
Enter the marketers, who know that whatever your current perspective, a slightly more extreme take on what you believe can be tantalizing. If you believe some politician is bad, content that promises "wait till you read this recently uncovered email, recording, memo, etc." that exposes some "grave lie, conspiracy, hypocrisy, etc." is hard not to click. Even if you don't click, the daily stream of salacious headlines that transfers from your Facebook news feed to your brain shapes your reality.
A reality that can be manipulated.
A team of Russians at the Internet Research Agency built a massive propaganda machine. The latest fruit of Robert Mueller’s investigation asserts this about the modest-sized group:
“Defendants, posing as U.S. persons and creating false U.S. personas, operated social media pages and groups designed to attract U.S. audiences. These groups and pages, which addressed divisive U.S. political and social issues, falsely claimed to be controlled by U.S. activists when, in fact, they were controlled by Defendants. Defendants also used the stolen identities of real U.S. persons to post on ORGANIZATION-controlled social media accounts. Over time, these social media accounts became Defendants’ means to reach significant numbers of Americans for purposes of interfering with the U.S. political system, including the presidential election of 2016.”
This discovery reveals the layered yet uncomplicated methods by which social media can be manipulated to rewire our perspectives. Before the internet age, we received information largely through person-to-person communication. One person claimed something, and the receiver weighed that claim against the respect they had for the claimer and the believability of the claim itself. Coordinated lies were rare and hard to pull off. Not any longer.
Triangulated falsehoods are the “advancements in communication” that our minds haven’t caught up with.
Now any claim can come at a person from a dozen different sources, completely overwhelming their suddenly antiquated critical-thinking skills. The very communication fabric that holds the modern tribe together – social media content – has become a marketing tool for those who want to influence your spend and your vote.
As we see with the Internet Research Agency strategy, what can't be bought can be stolen. What can't be stolen can be traded for dopamine (likes and positive comments) in exchange for retelling the narrative. The most active and influential users on Facebook are the most likely to be invested in their silos. They are the storytellers who drive the narrative and shape the perspectives in their online communities.
Injected fake news feeds the influencer by reinforcing their world view, and they are fed again when they retell the news, through the validating positive feedback (likes, attention) they receive from the tribe they influence. The Russian-based Internet Research Agency effectively injected fake news, and it did so at scale.
Facebook knows how massive the problem is. Until it can find a solution, the company is doubling its information security staff, from 10,000 to 20,000 employees. Founder Mark Zuckerberg also recently announced significant changes to the Facebook algorithm. In short: you’ll see fewer business posts and more family and friends content. This decrease in business posts means a decrease in ad revenue. The announcement caused Facebook’s stock to drop 4.4%, which dented Zuckerberg’s personal net worth by $3.3 billion.
These moves will constrict the flow of fake news – but they won’t solve the underlying problem.
Our current methods of truth scoring have been beaten by marketing that trades in affirmation. What must come next is a move from the need to be right to the need to be dupe-proof. If the platforms that shape our narrative, like Facebook, can make objectivity and critical thinking the superior position, rather than liberal or conservative alignment, then we'll take an important step in mitigating the sway that news manipulators hold.