Newsweek

Freedom From Choice

Political operatives used fake news, Big Data and Facebook to suppress the vote and rile up racists in 2016. It’s going to be even uglier next time ’round.

- by Nina Burleigh


THE OPENING CHORDS of Creedence Clearwater Revival’s “Bad Moon Rising” rocked a hotel ballroom in New York City as a nattily dressed British man strode onstage several weeks before last fall’s U.S. election. “I see the bad moon rising, I see trouble on the way.” The speaker, Alexander Nix, an Eton man, was very much among his own kind—global elites with names like Buffett, Soros, Brokaw, Pickens, Petraeus and Blair. Trouble was indeed on the way for some of the attendees at the annual summit of policymakers and philanthropists whose world order was about to be wrecked by the American election. But for Nix, chief executive officer of a company working for the Trump campaign, that mayhem was a very good thing.

He didn’t mention it that day, but his company, Cambridge Analytica, had been selling its services to the Trump campaign, which was building a massive database of information on Americans. The company’s capabilities included, among other things, “psychographic profiling” of the electorate. And while Trump’s win was in no way assured on that afternoon, Nix was there to give a cocky sales pitch for his cool new product.

“It’s my privilege to speak to you today about the power of Big Data and psychographics in the electoral process,” he began. As he clicked through slides, he explained how Cambridge Analytica can appeal directly to people’s emotions, bypassing cognitive roadblocks, thanks to the oceans of data it can access on every man and woman in the country.

After describing Big Data, Nix talked about how Cambridge was mining it for political purposes, to identify “mean personality” and then segment personality types into yet more specific subgroups, using other variables, to create ever smaller groups susceptible to precisely targeted messages.

To illustrate, he walked the audience through what he called “a real-life example” taken from the company’s data on the American electorate, starting with a large anonymous group with a general set of personality types and moving down to the most specific—one man, it turned out, who was easily identifiable.

Nix started with a group of 45,000 likely Republican Iowa caucusgoers who needed a little push—what he calls a “persuasion message”—to get out and vote for Ted Cruz (who used Cambridge Analytica early in the 2016 primaries). That group’s specifics had been fished out of the data stream by an algorithm sifting the thousands of digital data points of their lives. Nix was focusing on a personality subset the company’s algorithms determined to be “very low in neuroticism, quite low in openness and slightly conscientious.” Click. A screen of graphs and pie charts. “But we can segment further. We can look at what issue they care about. Gun rights I’ve selected. That narrows the field slightly more.”

Click. Another screen of graphs and pie charts, but with some circled specifics.

“And now we know we need a message on gun rights. It needs to be a persuasion message, and it needs to be nuanced according to the certain personality type we are interested in.”

Click. Another screen, the state of Iowa dotted with tiny reds and blues—individual voters.

“If we wanted to drill down further, we could resolve the data to an individual level, where we have somewhere close to 4- or 5,000 data points on every adult in the United States.”

Click. Another screenshot with a single circled name—Jeffrey Jay Ruest, gender: male, and his GPS coordinates.
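Strip away the stagecraft and the drill-down Nix demonstrated amounts to a series of filters over a very wide table. Here is a minimal sketch of that logic, assuming a hypothetical voter file; the column names, scores and thresholds are invented for illustration and are not Cambridge Analytica’s actual schema or code:

```python
# A toy version of the segment-then-drill-down demo Nix walked through.
# Everything here is invented for illustration.
import pandas as pd

voters = pd.DataFrame([
    {"full_name": "Voter A", "neuroticism": 0.12, "openness": 0.22,
     "conscientiousness": 0.55, "top_issue": "gun rights",
     "lat": 41.59, "lon": -93.62},
    {"full_name": "Voter B", "neuroticism": 0.81, "openness": 0.67,
     "conscientiousness": 0.40, "top_issue": "health care",
     "lat": 41.68, "lon": -93.77},
    # ...a real voter file would have thousands of rows and columns
])

# Step 1: the personality segment -- "very low in neuroticism, quite low
# in openness and slightly conscientious."
segment = voters[(voters["neuroticism"] < 0.2)
                 & (voters["openness"] < 0.3)
                 & (voters["conscientiousness"].between(0.5, 0.6))]

# Step 2: narrow further by the issue the message will lean on.
segment = segment[segment["top_issue"] == "gun rights"]

# Step 3: "resolve the data to an individual level" -- each remaining
# row is one named, geolocated person to receive a tailored message.
for person in segment.itertuples(index=False):
    print(person.full_name, person.lat, person.lon)
```

Each click in the presentation is just another filter; the unnerving part is not the code but the breadth of the table it runs against.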

The American voter whose psychological tendencies Nix had just paraded before global elites like a zoo animal was easy to find. Cambridge researchers would have known much more about him than his address. They probably had access to his Facebook likes—heavy metal band Iron Maiden, a news site called Hot Rods and Guns, and membership in Facebook groups called My Daily Carry Gun and Mopar Drag Racing.

“Likes” like those are the sine qua non of the psychographic profile.

And like every other one of the hundreds of millions of Americans now caught in Cambridge Analytica’s slicing and dicing machine, Ruest was never asked if he wanted a large swath of his most personal data scrutinized so that he might receive a message tailored just for him from Trump.

Big Data, artificial intelligence and algorithms designed and manipulated by strategists like the folks at Cambridge have turned our world into a Panopticon, the 18th-century circular prison designed so that guards, without moving, could observe every inmate every minute of every day. Our 21st-century watchers are not just trying to sell us vacations in Tuscany because they know we have Googled Italy or bought books about Florence on Amazon. They exploit decades of behavioral science research into the flawed, often irrational ways human beings make decisions to subtly “nudge” us—without our noticing it—toward one candidate. Of all the horror-movie twists and scary-monster turns in real life these days, from murderous religious warriors to the Antarctic melting to the rise of little Hitlers all over the world, one of the creepiest is the certainty that machines know more about us than we do and that they could, in the near future, deliver the first AI president—if they haven’t already.

UNKNOW THYSELF

WHEN CIA OFFICER Frank Wisner created Operation Mockingbird in 1948—the CIA’s first media manipulation effort—he boasted that his network was “a mighty Wurlitzer” capable of manipulating facts and public opinion at home and around the world. The power and stress of managing that virtual machine soon drove Wisner bug-eyed mad, and he killed himself.

But far mightier versions of that propaganda Wurlitzer exist today, powered by a gusher of raw, online personal information that is fed into machines and then analyzed by algorithms that personalize political messages for ever-smaller groups of like-minded people. Vast and growing databases compiled for commerce and policing are also for sale to politicians and their strategists, who can now know more about you than your spouse or parents. The KGB and the Stasi, limited to informants, phone tapping and peepholes, could only have dreamed of such snooping superpowers.

Anyone can try it out: Cambridge University, where the Cambridge Analytica research method was conceived, is not commercially connected to the company, but the school’s website allows you to see how Facebook-powered online psychography works. At Applymagicsauce.com, the algorithm (after obtaining Facebook user consent) does what Cambridge Analytica did before the last U.S. election cycle, when it made tens of millions of “friends” by first paying low-wage tech workers to hand over their Facebook profiles: It spiders through Facebook posts, friends and likes and, within a matter of seconds, spits out a personality profile, including the so-called OCEAN psychological tendencies test score (openness, conscientiousness, extraversion, agreeableness and neuroticism). (This reporter’s profile was eerily accurate: It knew I was slightly more “liberal and artistic” than “conservative and traditional,” that I have “healthy skepticism,” and that I am “calm and emotionally stable.” It got my age wrong by a decade or so, and while I’d like to think that’s because I’m preternaturally youthful, it could also be because I didn’t put my birth year in Facebook.)
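The mechanics behind that instant profile can be sketched simply. The published academic work this grew out of compressed a giant user-by-like matrix into a few dozen latent dimensions, then fit a simple model per trait against users who had actually taken the OCEAN test. The toy version below follows that recipe with invented data; the real features, models and coefficients are not public:

```python
# Toy illustration of like-based trait prediction: reduce a user-by-like
# matrix with SVD, then regress a trait score on the latent features.
# All data and numbers here are invented.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Rows = users, columns = pages; 1 means the user "liked" that page.
likes = rng.integers(0, 2, size=(500, 2000))

# Ground-truth openness scores for users who actually took the OCEAN test.
openness = rng.normal(0.5, 0.15, size=500)

# Compress the sparse like matrix into a few dozen latent dimensions.
svd = TruncatedSVD(n_components=40, random_state=0)
features = svd.fit_transform(likes)

# Fit one regression per trait; here, just openness.
model = Ridge().fit(features, openness)

# Score a new person from nothing but their likes.
new_user = rng.integers(0, 2, size=(1, 2000))
print("predicted openness:", model.predict(svd.transform(new_user))[0])
```

With random data the prediction is meaningless, of course; the published research found that with enough real likes per person, such models rival the judgments of friends and family.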

Cambridge Analytica, with its mass psychographic profiling, is in the same cartoonishly dark-arts genre as several other Trump campaign operators, including state-smashing nationalist Steve Bannon, now a top White House adviser, and political strategist Roger Stone, a longtime Republican black-ops guy. Bannon sat on the board of Cambridge, and his patron, conservative billionaire Robert Mercer, whose name is rarely published without the adjective “shadowy” nearby, reportedly owns 90 percent of it.

But Cambridge was just one cog in the Trump campaign’s large data mining machine. Facebook was even more useful for Trump, with its online behavioral data on nearly 2 billion people around the world, each of whom is precisely accessible to strategists and marketers who can afford to pay for the peek. Team Trump created a 220 million–person database, nicknamed Project Alamo, using voter registration records, gun ownership records, credit card purchase histories and the monolithic data vaults of Experian PLC, Datalogix, Epsilon and Acxiom Corporation. First son-in-law Jared Kushner saw the power of Facebook long before Trump was named the Republican candidate for president. By the end of the 2016 campaign, the social media giant was so key to Trump’s efforts that his data team designated a Facebook employee named James Barnes the digital campaign’s MVP.

They were hardly the first national campaign to do something like that: The Democratic National Committee has used Catalist, a 240 million–strong storehouse of voter data, containing hundreds of data points per person, pulled from commercial and public records.

But that was back in ancient times, before Facebook had Lookalike Audiences and before AI and algorithms were able to parse the electorate into 25-person interest groups. And by 2020, you can bet the digital advances of 2016 will look like the horse and buggy of political strategy.

DR. SPECTRE’S ECHO CHAMBERS

AMONG THE MANY services Facebook offers advertisers is its Lookalike Audiences program. An advertiser (or a political campaign manager) can come to Facebook with a small group of known customers or supporters and ask Facebook to expand it. Using its access to billions of posts and pictures, likes and contacts, Facebook can create groups of people who are “like” that initial group and then target advertising made specifically to influence it.
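Facebook has not said how Lookalike Audiences works internally, but the general idea can be illustrated as a similarity search: score every user against the average behavioral profile of the advertiser’s seed group and keep the closest matches. A minimal sketch with invented data:

```python
# A sketch of the general idea behind seed-audience expansion. This is
# not Facebook's actual (unpublished) method; it simply ranks a
# population by similarity to a seed group's average profile.
import numpy as np

rng = np.random.default_rng(1)

# Behavioral feature vectors (likes, pages, purchases...) for all users.
population = rng.random((10_000, 64))

seed_ids = {3, 17, 42, 99, 256}  # the advertiser's known supporters
centroid = population[list(seed_ids)].mean(axis=0)

# Cosine similarity of every user to the seed group's centroid.
similarity = population @ centroid / (
    np.linalg.norm(population, axis=1) * np.linalg.norm(centroid))

# The "lookalikes": the most similar users not already in the seed list.
ranked = [i for i in np.argsort(similarity)[::-1] if i not in seed_ids]
print(ranked[:10])
```

The advertiser never needs to know who the new people are or why they resemble the seed group; the platform hands back an audience, and the ads follow.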

The marriage of psychographic microtargeting and Facebook’s Lookalike program was the next logical step in a tactic that goes back at least to 2004, when Karl Rove initiated electoral microtargeting by doing things like identifying Amish people in Ohio, then getting them so riled up about gay marriage that they raced their buggies to the polls to vote for the first time ever.

Since then, the ability of machines and algorithms to analyze and sort the American electorate has increased dramatically. Now, with the help of Big Data, strategists can, with a click of a mouse or keypad, call up your approximate OCEAN score. Psychographic analyses don’t even require Facebook; computers can sort people psychologically using thousands of commercially available data points and then run their profiles against people who have actually taken the tests.

When Barack Obama ran for president in 2008, his campaign was credited with mastering social media and data mining. Four years later, in 2012, the Obama campaign tested new possibilities when it ranked the “persuadability” of specific groups and conducted experiments combining phone calls and demographic analysis of how well messages worked on them.

By 2012, there had been huge advances in what Big Data, social media and AI could do together. That year, Facebook conducted a happy-sad emotional manipulation experiment, splitting nearly 700,000 people into two groups and manipulating the posts so that one group received happy updates from friends and another received sad ones. It then ran the effects through algorithms and proved—surprise—that it was able to affect people’s moods. (Facebook, which has the greatest storehouse of personal behavior data ever amassed, is still conducting behavioral research, mostly, again, in the service of advertising and making money. In early May, leaked documents from Facebook’s Australia office showed Facebook telling advertisers how it could identify emotional states, including “insecure teens,” to better target products.)

By 2013, scientists at Cambridge University were experimenting with how Facebook could be used for psychographic profiling—a methodology that eventually went commercial with Cambridge Analytica. One of the scientists involved in commercializing the research, American researcher Aleksandr Kogan, eventually gained access to 30 million Facebook profiles for what became Cambridge Analytica. No longer affiliated with the company, he has moved to California, legally changed his name to Aleksandr Spectre (which had nothing to do with James Bond, but was about finding a “non-patriarchal” name to share with his new wife) and set up a Delaware corporation selling data from his online questionnaires and surveys—another, slightly more transparent method of hoovering up personal information online.

The 2016 election, sometimes now called the Facebook election, saw entirely new capabilities applied by Facebook, beyond Cambridge Analytica’s experiments. Trump might well have been elected even if social media had never existed. However, advances in data collection, as well as the relative lawlessness regarding privacy in the United States (more on that later), enabled the most aggressive microtargeting in political history—pulling “low-information” new voters into the body politic and expanding the boundaries of racist, anti-Semitic and misogynistic political speech.

Christoph Bornschein is a German IT consultant who advises German Chancellor Angela Merkel on online privacy and other internet issues. He says the difference between the Obama election strategies and Trump’s is in the algorithms and today’s advanced AI. The same tools that enable marketers to identify and create groups of “statistical twins,” or like-minded people, and then to target ads to sell them shoes, trips and washing machines also enable political strategists to create “echo chambers” filled with slogans and stories that people want to hear, aka fake news.

The segmentation by Facebook’s advertising tools of very small, like-minded groups of people who might not otherwise have been grouped together helped break the so-called Overton window—the outer limit of acceptable speech in American public discourse. For example, voters known privately—on Facebook—to favor racist or anti-Semitic ideas can also be grouped together and targeted with so-called dark ads, whose source can remain anonymous. In 2016, racist sentiment, white supremacy, resentment of refugees, anti-Semitism and virulent misogyny flooded social media and then leaked out into campus posters and public rallies.

Psychographic algorithms allow strategists to target not just angry racists but also the most intellectually gullible individuals, people who make decisions emotionally rather than cognitively. For Trump, such voters were the equivalent of diamonds in a dark mine. Cambridge apparently helped with that too. A few weeks before the election, in a Sky News report on the company, an employee was actually shown on camera poring over a paper on “The Need for Cognition Scale,” which, like the OCEAN test, can be applied to personal data, and which measures the relative importance of thinking versus feeling in an individual’s decision-making.

The Trump campaign used Facebook’s targeted advertising to identify ever smaller audiences—“fireplaces” in IT talk—receptive to very precisely targeted messages. This targeting is increasingly based on behavioral science that has found people resist information that contradicts their viewpoints but are more susceptible when the information comes from familiar or like-minded people.

Finding and provoking people who hate immigrants, women, blacks and Jews is not hard to do with Facebook’s various tools, and Facebook, while aware of the danger, has, so far, not created barriers to prevent that. It has, however, acknowledged the potential. “We have had to expand our security focus from traditional abusive behavior, such as account hacking, malware, spam and financial scams, to include more subtle and insidious forms of misuse, including attempts to manipulate civic discourse and deceive people,” Facebook stated in an April report.

On any given day, Team Trump was placing up to 70,000 ad variants, and around the third debate with Hillary Clinton, it pumped out 175,000 ad variants. Trump’s digital advertising chief, Gary Coby, has said the ad variants were not precisely targeted to speak to, say, “Bob Smith in Ohio,” but were aimed at increasing donations from disparate small segments of voters. He compared the process to “high-frequency trading” and said Trump used Facebook “like no one else in politics has ever done.”
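Producing variants at that volume is mechanical: permuting a handful of creative elements multiplies quickly into thousands of distinct, separately trackable ads. The sketch below uses invented copy, not the campaign’s actual creative pipeline:

```python
# Toy sketch of programmatic ad-variant generation. Four short lists of
# creative elements already yield 81 distinct ads; add a few more
# elements and the count reaches tens of thousands. All copy invented.
from itertools import product

headlines = ["Drain the Swamp", "Secure the Border", "Bring Jobs Back"]
images = ["flag.jpg", "rally.jpg", "family.jpg"]
buttons = ["Donate", "Learn More", "Join Us"]
colors = ["red", "blue", "white"]

variants = [
    {"headline": h, "image": i, "button": b, "color": c}
    for h, i, b, c in product(headlines, images, buttons, colors)
]
print(len(variants), "variants")  # 81, from just four small lists
```

Each variant gets its own performance numbers, and the losers are culled hourly, which is what makes the high-frequency trading comparison apt.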

He denied the Trump camp ever used Cambridge Analytica’s psychographics—although clearly, based on the individual Nix outed in his New York City speech, Cambridge had applied the special sauce to Trump voters.

Coby also denied that the campaign was behind the barrage of anti-Clinton ads and propaganda, made in Eastern Europe and in the U.S., precisely targeted to the disaffected people identified by Facebook’s tools as being like known Trump voters, in an attempt to suppress the vote among minorities and women. Research suggests that suppression ads and fake news were more effective in determining the outcome of the election than Trump’s push ads.

Facebook ad targeting by race and gender is not new and is legal, although there have been scandals. Last fall, journalist Julia Angwin, author of Dragnet Nation: A Quest for Privacy, Security, and Freedom in a World of Relentless Surveillance, revealed that housing advertisers were using Facebook’s “ethnic affinity” marketing tool to exclude blacks from ads. Facebook promised to build tools to prevent that, but the social media giant has said nothing about using the tool on racially targeted political messages.


A spokesman for Facebook refused to speak on the record about the various allegations and said founder and CEO Mark Zuckerberg would not comment for this article. “Misleading people or misusing their information is a direct violation of our policies and we will take swift action against companies that do, including banning those companies from Facebook and requiring them to destroy all improperly collected data,” the spokesman wrote in an emailed message. Facebook’s definition of misusing data, the spokesman said, is laid out in its terms, which are lengthy and broadly divided into issues of safety and identity. Nothing in the terms appears to explicitly bar the kind of psychographic analysis Cambridge was doing.

Zuckerberg’s last public pronouncement on the Facebook election was in March, when he said, at North Carolina A&T University: “There have been some accusations that say that we actually want this kind of content on our service because it’s more content and people click on it, but that’s crap. No one in our community wants fake information.” He has not spoken publicly about psychographic microtargeting, but as criticism mounted after the election, Facebook hired 3,000 people to monitor reports of hate speech.

Democratic campaign strategists who spoke with Newsweek acknowledge that Trump’s digital strategy was effective, but they don’t think it won him the election. “Ultimately, in my opinion, Trump’s overall strategy was less sophisticated, not more” than in prior years, says Marie Danzig, deputy director for Obama’s digital operation in 2012, now with Blue State Digital, a political strategy firm that works for progressive causes. “He focused on large-scale, mass fear-mongering. Social media has become a powerful political platform. That didn’t exist two [election] cycles ago, and you can’t ignore that. Social media is a perfect vehicle for outlandish statements to rally the base or for disseminating fake news. When you use available psychographic or behavioral data and use it to mislead or make people fear, that is a dangerous game with dangerous results.”

Danzig and other Democratic strategists say Facebook’s microtargeting abilities, behavioral science and the stores of data held by other social media platforms like Twitter and Snapchat are tools that won’t go back inside Pandora’s box. They, of course, insist they won’t be looking for low-cognition voters high in neuroticism who are susceptible to fear-based messages. But Big Data plus behavioral science plus Facebook plus microtargeting is the political formula to beat. They will use it, and they won’t talk about how they will refine and improve it.

Coby predicted that by 2020, more platforms like Google and Facebook will likely come online, and the creation of tens or hundreds of thousands of ad variants will become more programmatic and mechanized. Finally, he predicted that “messenger bots” will become more prevalent and more targeted, so that voters in, say, Ohio could get answers from a Trump bot about questions specific to them and their communities.

There’s a reason Zuckerberg, Cambridge and even Democratic consultants don’t want to delve too deeply into the implications of what they are up to, says Eli Pariser, author of The Filter Bubble: What the Internet Is Hiding From You. “There are several dangers here,” he says. “One is that when we stop hearing what political arguments are being made to whom, we stop being able to have a dialogue at all—and we’re quite close to that microtargeted world already. The other is that it’s not hard to imagine a world where we move past making specific intentional arguments to specific psychographic subgroups—where political campaigns just apply a million different machine-generated messages to a million different statistically significant clusters of people and amplify the ones that measurably increase candidate support, without an understanding of what is working or even what is being argued.”

WHACK-A-MOLE PRIVACY

“PEOPLE DON’T understand data,” says Travis Jarae, a former Google executive who specializes in securing people’s online identities, mainly to protect large companies from hackers and thieves. “People don’t understand what bread crumbs they leave around the internet. And our representatives in government don’t understand how analytics work.” Jarae founded a consulting firm that advises corporations on online identity and security, and he finds the ignorance extends even to officials at financial firms, where trillions of dollars are at stake. “If they don’t understand, do you think governments and regular citizens do?”

Big Data technology has so far outpaced legal and regulatory frameworks that discussions about the ethics of its use for political purposes are still rare. No senior member of Congress or administration official in Washington has placed a high priority on asking what psychographic data mining means for privacy, on the ethics of political messaging designed to evade cognition and rational thinking, or on the role of AI in mainstreaming racist and other previously verboten speech.

Activists in Europe are asking those questions. Swiss mathematician and data protection advocate Paul-Olivier Dehaye, founder of Personaldata.io, which helps people get access to data about them, has initiated arbitration with Facebook 10 times for information it collects on him and others. He has written extensively on both Facebook and Cambridge, including instructions on how to apply for the data they collect. “It’s a whack-a-mole thing,” he says. “It would be wrong to think these platforms are separate. There are companies whose role is to link you across all platforms and companies whose product is exactly to link those things together.”

Even industry insiders concede that the implications of data-driven psychographics are creepy. “The possibilities are terrifying,” said Greg Jones, a vice president at Equifax, one of the biggest data collectors, who participated in a recent panel discussion in Washington, D.C., on regulating Big Data. “When you look at what [Cambridge] did with the microtargeting, that’s kind of a marketer’s dream, right? Having the kind of intimacy with your customer that allows you to give them the perfect offer at the perfect time. But their usage of it? Legal, yes. Is it ethical? I don’t know. Should some regulation be applied for political purposes, where you can’t do microsegmentation and offer people the best offer based on that, whether it be a credit card or the best political party? I think some of these things have to mature, and I think people will decide.”

But when? There have been no post-election changes in American privacy law and policy, and very little public outrage. On the contrary, Trump moved decisively in April toward less privacy and even more commercial data, repealing Obama-era privacy rules that would have required broadband and wireless companies to get permission before sharing sensitive information. Now, companies like Verizon and AT&T can start to monetize data about online activity inside people’s homes and on their phones.

After the U.S. election in November, Cambridge Analytica’s parent company, defense contractor Strategic Communications Laboratories (SCL), quickly set up an office just a few blocks from the White House and finalized a $500,000 contract with the U.S. State Department to help assess the impact of foreign propaganda, according to The Washington Post. But while the money rolls in, a small but persistent core of outrage has forced the formerly self-promoting Nix and company to turn shy and self-effacing. Publicist Nick Fievet tells Newsweek that Cambridge Analytica doesn’t use data from Facebook and that when it mines information from Facebook users, it is via quizzes “with the express consent of each person who wishes to participate.” He also says Cambridge did not have the time to apply psychographics to its Trump work.

After months of investigations and increasingly critical articles in the British press (especially by The Guardian’s Carole Cadwalladr, who has called Cambridge Analytica’s work the framework for an authoritarian surveillance state, and whose reporting Cambridge has since legally challenged), the British Information Commissioner’s Office (ICO), an independent agency that monitors privacy rights and adherence to the U.K.’s strict laws, announced May 17 that it is looking into Cambridge and SCL regarding their work leading up to the Brexit vote and other elections.

Lawyers in London are also trying to mount a class-action suit against Cambridge and SCL. Because of the scale of the data collection involved, astronomical damages could be assessed.

In the U.S., congressional investigations are reportedly looking into whether Cambridge Analytica had ties to the right-wing Eastern European web bots that flooded the internet with negative and sometimes false Clinton stories whenever Trump’s poll numbers sagged during the fall campaign.

American venture capitalists and entrepreneurs are hustling to build websites and apps that can stem the flow of fake news. A Knight Prototype Fund on Misinformation and a small group of venture capitalists are putting up seed money for entrepreneurs with ideas about how to do that. Hundreds of developers attended the first MisinfoCon at MIT earlier this year, and more such conferences are planned. Facebook and Google have been scrambling since November to devise ways to filter the rivers of fake news.

‘THEY’RE NOT BULLSHITTING’

AFTER THE election, as the scale of the microtargeting and fake news operation became clear, Cambridge and Facebook stopped boasting and went on defense. One of the founders of Cambridge even denies to Newsweek that its method works, claiming that psychographics have an accuracy rate of around 1 percent. Nix, the source says, is selling snake oil.

Before journalists started poking around, before privacy activists in Europe started preparing to file suit, before the British ICO launched its investigation and before a Senate committee started looking into Cambridge Analytica’s possible connections to Russian activities on behalf of Trump during the election, Cambridge was openly boasting about how its psychographic capacities were being applied to the American presidential race. SCL still advertises its work influencing elections in developing nations and even mentions on its website its links to U.S. defense contractors like Sandia National Laboratories, where computer scientists found a way to hack into supposedly secure Apple products long before anyone knew that was possible.

New media professor David Carroll of New York City’s New School believes Cambridge was telling the truth then, not now. “They are not bullshitting when they say they have thousands of data points.”

Speaking to a Big Data industry conference in Washington May 15, fugitive National Security Agency whistleblower Edward Snowden implored his audience to consider how the mass collection and preservation of records on every online interaction and activity threatens our society. “When we have people that can be tracked and no way to live outside this chain of records,” he said, “what we have become is a quantified spiderweb. That is a very negative thing for a free and open society.”

Facebook has announced no plans to dispense with any of its lucrative slicing, dicing and segmenting ad tools, even in the face of growing criticism. But in the past few weeks, the company has been fighting off denunciations of how its advertising tools have turned it into, as Engadget writer Violet Blue put it, “a hate-group incubator” and “a clean, well-lit place for fascism.” Blue published an article headlined “The Facebook President and Zuck’s Racist Rulebook,” accusing the company of encouraging Holocaust denial, among other offenses, because of its focus on money over social responsibility. The Guardian accused it of participating in “a shadowy global operation [behind] The Great British Brexit Robbery” and has just published a massive trove of anti-Facebook revelations called “The Facebook Files.” Two recent books are highly critical of how Facebook tools have been used in recent elections: Prototype Politics: Technology-Intensive Campaigning and the Data of Democracy by Daniel Kreiss, and Hacking the Electorate: How Campaigns Perceive Voters by Eitan Hersh.

When Trump, the first true social media president, appointed his son-in-law, Jared Kushner, as an unofficial campaign aide, Kushner went to Silicon Valley, got a crash course in Facebook’s ad tools and initiated the campaign’s Facebook strategy. He and a digital team then oversaw the building of Trump’s database on the shopping, credit, driving and thinking habits of 220 million people. Now in the White House, Kushner heads the administration’s Office of Technology and Innovation. It will focus on “technology and data,” the administration stated. Kushner said he plans to use it to help run government like a business, and to treat American citizens “like customers.”

The word customers is crass but key. The White House and political strategists on both sides have access to the same tools that marketers use to sell products. By 2020, behavioral science, advanced algorithms and AI applied to ever more individualized data will help politicians sell themselves with ever more subtle and precise pitches.

German IT consultant Bornschein says the evolution of using more data points to more precisely predict human behavior will continue unless and until society and lawmakers demand restrictions: “Do we really want to use all capabilities that we have in order to influence the voter? Or will we make rules at some point that all of that data-magic needs to be transparent and public? Whether this is playing out as utopia or a dystopian future is a matter of our discussion on data and democracy from now on.”


During and after the past U.S. election, Jeffrey Jay Ruest—a Trump supporter “very low in neuroticism, quite low in openness and slightly conscientious,” according to Cambridge Analytica’s psychographics, and a man who does indeed care very much about guns—was going about his business, unaware that Alexander Nix had flashed his GPS coordinates and political and emotional tendencies on a screen to impress a ballroom filled with global elites.

Nix displayed Ruest’s full name and coordinates at that event in September 2016 on his big screen, although they have since been blacked out on YouTube. I found him in May 2017, with the help of Swiss privacy activist Dehaye and some of Ruest’s friends on Facebook, and emailed him a link to the YouTube video of Nix’s talk. The Navy veteran and grandfather says he signed up for Facebook only to see pictures of his grandchildren, and he is disturbed by the amount of information about him the strategists seem to have. “They had the latitude and longitude to my house,” says Ruest, who lives in a Southern state. “And that kind of bothers me. There’s all sorts of wackos in the world, and I’m out in the middle of nowhere. When I pulled it up on a GPS locator, it actually showed the little stream going by my house. You could walk right up to my house with that data.”

Ruest, who works in operations for a power company, had never heard of psychographic political microtargeting until he was contacted by Newsweek. “I don’t quite know how I feel about that,” he says. “They could use it in advertising to convince you to buy things that you don’t need or want. Or they can use it to target you. I lean conservative, but I’m very diplomatic in the way I look at things. And I definitely don’t want to appear otherwise. I believe everyone has a right to their opinion.” He adds that he is careful about not answering quizzes or responding to anonymous mailers, but he says, “I might try to be a little more careful than I am now. I am really not comfortable with them publishing that kind of data on me.”

Too late. Ruest, like almost every other American, has left thousands of data crumbs for machines to devour and for strategists to analyze. He has no place to hide. And neither do you.

+HIS BAG OF NIX: Before the election, Cambridge Analytica CEO Nix was boasting that his company could appeal directly to voters’ emotions, bypassing “cognitive roadblocks,” which is a fancy term for facts.

+WHAT’S TO LIKE? Some call the recent contest the Facebook election because its ad tools were vital for microtargeting, but Zuckerberg actively disputes the claim that his company turned the vote for Trump.

+MAKE AMERICA HATE AGAIN: The use of electronic microtargeting to play on prejudices dates back to 2004, when Karl Rove got the Amish in Ohio riled up over gay marriage to boost the campaign of George W. Bush.

+TEXAS TWO-STEP: Cambridge Analytica also did work for Ted Cruz during the Republican primary, proving that its methods are not a guaranteed win for any candidate.

+BOTCHA! Experts predict that by the 2020 U.S. election, there will be many more online tools to create thousands of tailored ads and even “messenger bots” that will answer questions from voters.

+BIG DATA CHECK: Robert Mercer, with his daughter Rebekah, has funded alt-right beacon Breitbart News, is a patron of presidential adviser Steve Bannon and reportedly owns 90 percent of Cambridge Analytica.

+LEAD WITH YOUR GUT: First son-in-law Jared Kushner pushed the Trump campaign to use Facebook ad tools to precisely target individuals susceptible to emotional rather than fact-based appeals for their vote.
