My initial reaction to Facebook having personal information about me was more “whatever” than “OMG!”
I suspect I wasn’t alone in not caring much that Facebook knew my birthday, my political leanings, and what kind of music I like. After all, I readily volunteered all of that information when I chose to share it on social media. I’ve never made a secret of it, and life’s too short to worry about it anyway.
But now I do worry. And the more I learn, the more I worry.
It’s not about me personally. I still couldn’t care less what somebody might glean from Facebook about my likes and dislikes. And truth to tell, I don’t lose any sleep over the fact that companies use that data to target me for advertising of goods and services they think I might like.
What worries me is how all of the data we hand over to Facebook and other social media platforms is harvested, brokered, mined, analyzed and weaponized to manipulate elections.
Every single thing we enter on Facebook, from travel photos to “likes,” contributes to our personal profiles. The cumulative effect is staggering. A 2015 research report by experts at the University of Cambridge and Stanford found that “Your Facebook activity data alone could indicate your psychological make-up more accurately than your friends, your family – better even than your partner, given enough info.”
Information about the psychological makeup of large groups of people has enormous commercial and political value:
“Facebook can learn almost anything about you by using artificial intelligence to analyze your behavior,” said Peter Eckersley, the chief computer scientist for the Electronic Frontier Foundation, a digital rights nonprofit. “That knowledge turns out to be perfect both for advertising and propaganda.”
In December of 2015, The Guardian first reported that Cambridge Analytica, then a little-known data company financed by right-wing billionaire Robert Mercer, was embedded in Ted Cruz’s presidential campaign. The company was coordinating an aggressive voter-targeting campaign using “psychographic profiles” based upon data harvested from Facebook users, largely without their permission.
Following that report, The Guardian’s Carole Cadwalladr and others blew the lid off what became the Cambridge Analytica scandal. They detailed how CA harvested and used data obtained from Facebook and other social media platforms to manipulate elections, including but not limited to the 2016 U.S. presidential election.
CA obtained its Facebook data from a young Cambridge University lecturer named Aleksandr Kogan. Kogan, claiming that he was conducting purely academic research, created an app that used Facebook’s login procedures to obtain personal profile data from some 270,000 Facebook users. He then leveraged the data from those users by obtaining information about their friends. None of this was prohibited by Facebook at the time.
In the end, Kogan had accessed data from as many as 50 million Facebook users, information he subsequently shared with Cambridge Analytica.
Once they got their hooks into all of this data, Cambridge Analytica and the Trump campaign mined it to identify persuadable voters in crucial swing states and analyzed the data to determine what kind of messaging might influence their votes.
Then they saturated social media with ads designed to appeal to the fears and prejudices they had uncovered by mining the Facebook data.
The problem with this propaganda campaign isn’t just the dishonest and often vile content of the ads.
It’s the volume, more like saturation bombing than an advertising campaign. Trump’s 2016 digital media director, Brad Parscale, bragged that he ran 50,000 to 60,000 variations of Facebook ads each day during the campaign. For this, Parscale was rewarded by being named Trump’s 2020 re-election campaign manager.
And it’s the lack of attribution. It was impossible to tell who was behind the phony ads. Or how much they cost. Or where the money came from.
And it’s the narrow distribution. Since the ads were directed only to small but crucial targeted audiences, they were mostly invisible to regulators and the general public. Most people didn’t even know it was happening.
And it’s the lack of any comprehensive record of what the Trump campaign was placing on social media. The ads were written in the modern-day equivalent of disappearing ink. Like mercenaries, once they completed their mission, they disappeared.
And it’s not just U.S. elections that have been corrupted by dark, data-driven propaganda campaigns.
The Netflix documentary film “The Great Hack” shows how Cambridge Analytica developed a phony grass-roots campaign called “Do-So” to suppress the Afro-Trinidadian vote in order to favor Indo-Trinidadian candidates in the 2010 general election in Trinidad and Tobago. And CA also played an important, although perhaps uncompensated, role in Brexit. New Yorker reporter Jane Mayer put it this way:
“For two years, observers have speculated that the June, 2016, Brexit campaign in the U.K. served as a petri dish for Donald Trump’s Presidential campaign in the United States. Now there is new evidence that it did. Newly surfaced e-mails show that the former Trump adviser Steve Bannon, and Cambridge Analytica, the Big Data company that he worked for at the time, were simultaneously incubating both nationalist political movements in 2015.”
No country is immune. Even progressive, egalitarian Sweden increasingly finds itself under the sway of nativist, made-in-Russia (and elsewhere) propaganda designed to convince Swedes that “immigration has brought crime, chaos and a fraying of the cherished social safety net, not to mention a withering away of national culture and tradition.” Investigative reporting by the New York Times links the propaganda campaign in Sweden to “the workings of an international disinformation machine, devoted to the cultivation, provocation and amplification of far-right, anti-immigrant passions and political forces. Indeed, that machine, most influentially rooted in Vladimir V. Putin’s Russia and the American far right, underscores a fundamental irony of this political moment: the globalization of nationalism.”
And it’s still happening right before our eyes in the U.S. The Trump 2020 re-election campaign spent over $11.1 million on Facebook ads alone in the first six months of 2019. Over 2,000 of those ads used the word “invasion” to describe migrants seeking entry at the U.S.-Mexico border.
Unfortunately, there’s no quick fix for this.
Absent an unprecedented outbreak of self-control, people are not going to stop expressing themselves on social media in ways that add revealing personal information about themselves, their families and their friends. Consulting firms and data experts are not going to stop analyzing that data in search of persuadable target audiences. And political campaigns are not going to stop bombarding persuadable voters with nasty, misleading ads.
Senator Amy Klobuchar has introduced a bill in the Senate called the Honest Ads Act. Lindsey Graham is a co-sponsor of the Act, so it has some bipartisan support. The Act would close a loophole in existing election laws by subjecting paid internet and digital advertisements to government regulation. It would also require online platforms to make all reasonable efforts to ensure that foreign individuals and entities are not purchasing political advertisements, and require disclosure of target audiences, the number of views generated, the dates and times of publication, the rates charged, and information about the purchaser.
That’s a start, but increased transparency alone is not going to solve the problem.
A study by the Brookings Institution suggests other ways to combat fake news and disinformation in political campaigns. While it’s nice to know that somebody is on the case, the breadth of the problem, the First Amendment risks of regulation, and the fundamental changes that would be required from government, the news industry, technology companies, educational institutions, and the public at large make the effort look more like a utopian fantasy than a real-world fix.
Until somebody comes up with something better, it looks like this problem isn’t going away. Not any time soon. Certainly not before the 2020 presidential election.
And that’s something to worry about.