Former manager says he warned Facebook about potential privacy risks in 2012


Facebook CEO Mark Zuckerberg will give testimony to three Congressional committees in two hearings this week. He'll address how Facebook's user data was improperly accessed by Cambridge Analytica.

The consulting firm used the information for politically targeted advertising during the 2016 presidential election.

Around 87 million Facebook users had their information mined.

This all comes as Facebook suspended another data analytics firm, CubeYou, for allegedly collecting user data improperly. Facebook only took action after a news outlet alerted the company.

CubeYou told CBS News in a statement that it "takes great care to collect data in compliance with all relevant privacy provisions and laws."

Sandy Parakilas, who led the team at Facebook responsible for policing data privacy violations on the company's app platform, says he warned Facebook in 2012 about potential risks like the Cambridge Analytica scandal.

Sandy Parakilas led the team at Facebook responsible for policing data privacy violations on the company's app platform from 2010 to 2012. CBS News

When asked on "CBS This Morning" Monday what he wants to hear from Zuckerberg, Parakilas said, "He needs to address the corporate governance of Facebook. He is unaccountable at the moment. He controls the company 100 percent. And so it's not a good situation to have someone who has that much power over a company of this much importance."

"You say he's unaccountable, but he's on the hook right now and being talked about every second," said co-host John Dickerson. "He's going in front of Congress, and they've just lost millions of dollars in the market. That's a lot of pressure on him; he seems pretty accountable."

"That's a lot of pressure, but keep in mind, no one can fire him," Parakilas said. "There is no independent board that can step in and say, 'Mark, you're not doing a good enough job, we need you to step aside.'"

Parakilas explained how outside app developers have been able to access Facebook users' private information:

"It's important to remember that apps on Facebook, when you use them, they ask you for permission to access specific kinds of data, whether it's your name or your e-mail address or your friends list or photos or other information. And once you click 'Allow' or tap 'Allow,' all that information passes from Facebook to the application developer.

"And the problem is that, once the data goes to the developer, there is no insight into what the developer is doing with the data, and there is no control by Facebook as to what they do. This has been a known problem since 2010."

"Why does Facebook allow all these apps to have access to our private data?" asked co-host Norah O'Donnell.

"It's a good question," Parakilas replied. "The reason they wanted to do that to begin with is they wanted developers to build really rich, full-featured applications that were social for the platform."

"That would then help Facebook make money," O'Donnell said.


Parakilas left Facebook in 2012 and spent the next two years at CitizenNet, a social advertising company that worked closely with Facebook, including as a partner. "I continued to work with the company, and I continued to believe in the company. It's only recently that I've started to become really concerned about some of the implications of what this data can be used for," he said.

Like what? "What's new is that there are companies like Cambridge Analytica that are able to use this data to understand and predict how you're going to vote. They can predict your personality type. That's why they wanted this data from Facebook.

"And according to Chris Wylie, who is the whistleblower of Cambridge Analytica, they're building all these models that can predict your behavior. And then they can use that information to go back into Facebook and buy ads that target you and try to manipulate you based on your personality and your voting preference."

Co-host Gayle King said, "Facebook says they have taken steps to prevent this from happening again, including banning app developers who don't agree to audits, making privacy tools easier to find, and showing users how to revoke data permissions. Is that enough?"

"They've taken some steps to address some of the most obvious concerns here," Parakilas replied. "Frankly, the way the platform was built to begin with wasn't built with the safety of users in mind, and that's a huge problem. They could have built it in a much different way where they had a much more limited set of features and they controlled the data, and they just allowed developers to access modules that would let you see friends and the like."

King asked, "Do you think you could have done more? Do you take any responsibility? If you saw this coming, do you think you should have shouted a little louder?"

"I probably could have done more, to be totally frank," he replied. "I saw some of the risks coming, and other people at the company obviously understood this risk as well."

When asked if he brought his concerns directly to Zuckerberg or to Facebook COO Sheryl Sandberg, Parakilas said, "I don't remember having a specific conversation with Mark or Sheryl, but this was well known at the company. There was a Wall Street Journal article in 2010 about this issue, about a company called Rapleaf that was taking application data, personally identifiable data, from Facebook and selling it to ad networks."

So when he heard about the Cambridge Analytica breach, what did he think? "I thought, oh, no. This is something that they've known about, that I've known about, that I tried to raise some alarm about, and now it's being used for a really devastating purpose."

  • David Morgan

    David Morgan is a senior editor.