Facebook has a problem it just can’t kick: People keep exploiting it in ways that could sway elections, and in the worst cases even undermine democracy.
News reports that Facebook let the Trump-affiliated data mining firm Cambridge Analytica abscond with data from tens of millions of users mark the third time in roughly a year the company appears to have been outfoxed by crafty outsiders in this way.
Before the Cambridge imbroglio, there were Russian agents running election-related propaganda campaigns through targeted ads and fake political events. And before the Russians took center stage, there were purveyors of fake news who spread false stories to rile up hyperpartisan audiences and profit from the resulting ad revenue.
In the previous cases, Facebook initially downplayed the risks posed by these activities. It only seriously grappled with fake news and Russian influence after sustained criticism from users, experts and politicians. In the case of Cambridge, Facebook says the main problem involved the transfer of data to a third party, not its collection in the first place.
Each new issue has also raised the same enduring questions about Facebook’s conflicting priorities: to protect its users, but also to ensure that it can exploit their personal details to fuel its hugely lucrative, and precisely targeted, advertising business.
Facebook may say its business model is to connect the world, but it’s really “to collect psychosocial data on users and sell that to advertisers,” said Mike Caulfield, a faculty trainer at Washington State University who directs a multi-university effort focused on digital literacy.
Late Friday, Facebook announced it was banning Cambridge, an outfit that helped Donald Trump win the White House, saying the company improperly obtained information from 270,000 people who downloaded a purported research app described as a personality test. Facebook first learned of this breach of privacy more than two years ago, but hadn’t mentioned it publicly until now.
And the company may still be playing down its scope. Christopher Wylie, a former Cambridge employee who served as a key source for detailed investigative reports published Saturday in The New York Times and The Guardian, said the firm was actually able to pull in data from roughly 50 million profiles by extending its tentacles to the unwitting friends of app users. (Facebook has since barred such second-hand data collection by apps.)
Wylie said he regrets the role he played in what he called “a full service propaganda machine.” Cambridge’s goal, he told the Guardian in a video interview, was to use the Facebook data to build detailed profiles that could be used to identify and then to target individual voters with personalized political messages calculated to sway their opinions.
“It was a grossly unethical experiment,” Wylie said. “Because you are playing with an entire country. The psychology of an entire country without their consent or awareness.”
Cambridge has denied wrongdoing and calls Wylie a disgruntled former employee. It acknowledged obtaining user data in violation of Facebook policies, but blamed a middleman contractor for the problem. The company said it never used the data and deleted it all once it learned of the infraction, an assertion contradicted by Wylie and now under investigation by Facebook.
Jonathan Albright, research director at the Tow Center for Digital Journalism at Columbia University, said Facebook badly needs to embrace the transparency it has essentially forced on its users by sharing their habits, likes and dislikes with advertisers. Albright has previously noted cases in which Facebook deleted thousands of posts detailing Russian influence on its service and underreported the audience for Russian posts by failing to mention millions of followers on Instagram, which Facebook owns.
Facebook is “withholding information to the point of negligence,” he said Saturday. “How many times can you keep doing that before it gets to the point where you’re not going to be able to wrangle your way out?”
The Cambridge imbroglio also revealed what appear to be loopholes in Facebookâs privacy assurances, particularly regarding third-party apps. Facebook appears to have no technical way to enforce privacy promises made by app developers, leaving users little choice but to simply trust them.
In fact, the enforcement actions outlined in Facebook’s statement don’t address prevention at all, just ways to respond to violations after they’ve occurred.
On Saturday, Facebook continued to insist that the Cambridge data collection was not a “data breach” because “everyone involved gave their consent” to share their data. The purported research app followed Facebook’s existing privacy rules, no systems were surreptitiously infiltrated and no one stole passwords or sensitive information without permission. (To Facebook, the only real violation was the transfer of information collected for “research” to a third party such as Cambridge.)
Experts say that argument only makes sense if every user fully understands Facebook’s obscure privacy settings, which often default to maximal data sharing.
“It’s a disgusting abuse of privacy,” said Larry Ponemon, founder of the privacy research firm Ponemon Institute. “In general, most of these privacy settings are superficial,” he said. “Companies need to do more to make sure commitments are actually met.”
___
Jesdanun reported from New York.