Lawmakers had tough questions after Zuckerberg's last hearing. Here's how he responded.

The CEOs of the country's four largest tech companies will testify Wednesday before the House Judiciary's Antitrust Subcommittee, which is examining the market dominance of Amazon, Apple, Facebook, and Google. 

For Facebook's CEO, Mark Zuckerberg, appearances before Congress are becoming old hat: This will be his third such hearing since 2018.

The last time, on October 23, 2019, he left with a homework assignment: to answer more than 100 questions that Democratic representatives believed weren't sufficiently answered during his testimony. CBS News obtained his written responses, dated December 30, 2019, which have not been previously published. Here are five of the questions and answers followed by the entire document.


Questions about discrimination and civil rights

Representative Maxine Waters cited a study that found extreme disparities in how advertisers for job listings targeted their audiences, noting that "in one extreme case, advertisements for jobs in the lumber industry reached 72% White and 90% male audience."  

Waters: "What is Facebook doing to ensure that ads are not used to discriminate? Will you commit to making Facebook's ads algorithms transparent?"

Zuckerberg: Concerns about bias and fairness in algorithms are important — not just for Facebook, but for the industry at large. We take this issue seriously, and we want to be a leader in this space. We've built and continue to expand teams working on this issue. As part of the National Fair Housing Alliance ("NFHA") settlement, we've committed to studying the potential for unintended bias, including in our ad delivery algorithms, with input from the civil rights community, industry experts, and academics, and we've similarly committed to working toward the right solutions.

Waters: What assurances can Facebook give this Committee that you will not allow African Americans to be targeted for voter exclusion and voter suppression? Can Facebook promise this Committee that you will give the exact same protections in your community standards to guarding against census interference that you give to voting interference? 

Zuckerberg: On both the US 2020 election and the census, we fully understand the stakes.

Since 2016, we have prohibited misrepresentations about the dates, locations, times, and qualifications for voting and—ahead of the 2018 midterm elections in the US—we also banned misrepresentations about who can vote, qualifications for voting, and materials required to vote.

Our Community Standards also address other types of content about which civil rights groups have previously expressed concerns. For example, our policy on hate speech bans efforts to exclude people from political participation based on their protected characteristics such as race, ethnicity, or religion (e.g., telling people not to vote for a candidate because of the candidate's ethnicity, or indicating that people of a certain religion should not be allowed to hold office). We also prohibit threats of violence relating to voting, voter registration, or the outcome of an election. And more recently, we have updated our policies to prohibit calls to action or statements of intent to bring weapons to polling places.

With respect to the census, we recently announced a new census interference policy that bans misleading information about when and how to participate in the census and the consequences of participating. We are also introducing a new advertising policy that prohibits ads that portray census participation as useless or meaningless or advise people not to participate in the census. These policies are due in large part to the work being done with the civil rights community through our civil rights audit and represent the culmination of a months-long process between Facebook, the US Census Bureau, and experts with diverse backgrounds to develop thoughtful rules around prohibiting census interference on our platforms and making sure people can use their voice to be counted.

We look forward to continuing to meet with, listen to, and learn from the civil rights community as we work toward the same end goals of protecting the integrity of our elections and the census and preventing discrimination against and targeting of communities of color.

Representative Emanuel Cleaver: If Facebook is serious about addressing civil rights, why does the company not have any senior leadership with extensive experience in civil rights? Will Facebook commit to including a civil rights focus in its hiring of senior leadership?

Zuckerberg: By formalizing the (Civil Rights) Task Force, our goal is to create a long-term accountability structure at the company whereby we continue making progress on civil rights issues beyond the completion of the audit—and to embed civil rights considerations in the early stages of developing relevant products and policies.

The Task Force is made up of senior leaders across key areas of the company, including Product, US Policy/External Affairs, Operations, Advertising, Marketing, Diversity & Inclusion, Human Resources, Communications, Partnerships, and Legal. As such, the Task Force is in a position to ensure that we are effective in addressing civil rights issues that pertain to content policy, fairness in artificial intelligence, privacy, and elections.

We're also introducing civil rights training for all senior leaders on the Task Force and key employees who work in the early stages of developing relevant products and policies. We know these are the first steps to developing long-term accountability. We plan on making further changes to build a culture that explicitly protects and promotes civil rights on Facebook.

When it comes to hiring, we are dedicated to prioritizing diverse hiring and are committed to our goal of having a company where, in the next five years, at least 50 percent of our workforce is made up of women, people of color, and other underrepresented groups.

We have a diverse slate approach modeled after the Rooney Rule. 

This ensures that recruiters present qualified candidates from underrepresented groups to hiring managers looking to fill open roles, and it sets the expectation that hiring managers will consider candidates from underrepresented backgrounds when interviewing for an open position. We've seen steady increases in hiring rates for underrepresented people since we started testing this approach in 2015. We're also focused on increasing the diversity and inclusion capabilities of managers and leaders to build inclusive teams, departments, and organizations so that our products and community will benefit from the diverse perspectives of our people.

Questions about data collection and Cambridge Analytica

Representative Cindy Axne: How many data points does Facebook have on the average Facebook user?

Zuckerberg: As explained in our Data Policy, we collect three basic categories of data about people: 

  1. data about things people do and share (and who they connect with) on our services; 
  2. data about the devices people use to access our services; and
  3. data we receive from partners, including the websites and apps that use our business tools. Our Data Policy provides more detail about each of the three categories.

As far as the amount of data we collect about people, the answer depends on the person. People who have only recently signed up for Facebook have usually shared only a few things—such as name, contact information, age, and gender. 

Over time, as people use our products and interact with our services, we receive more data from them, and this data helps us provide more relevant content and services. That data will fall into the categories noted above, but the specific data we receive will, in large part, depend on how the person chooses to use Facebook. For example, some people use Facebook to share photos, so we receive and store photos for those people. 

Some people enjoy watching videos on Facebook; when they do, we receive information about the video they watched, and we can use that information to help show other videos in their News Feeds. Other people seldom or never watch videos, so we do not receive the same kind of information from them, and their News Feeds are likely to feature fewer videos.

The data we have about people also depends on how they have used our controls. For example, people who share photos can easily delete those photos. The same is true of any other kind of content that people post on our services. Through Facebook's Activity Log tool, people can also control information about their engagement—i.e., their likes, shares, and comments—with other people's posts. The use of these controls affects the data we have about people.

We also offer a variety of tools to help users understand the data Facebook has about them. These include the Access Your Information and Download Your Information tools available to Facebook users in their account settings. And to provide more transparency and control around these practices, we have been rolling out a new way to view and control your off-Facebook activity. Off-Facebook Activity lets you see a summary of apps and websites that send us information about your activity and allows you to disconnect this information from your account if you want to. For more information about this tool, please see our Help Center. We also participate in the Data Transfer Project, a collaborative effort with Apple, Google, Microsoft, and Twitter to build a common way for people to transfer their data between online services. The goal of this project has been to make it easier for services of any size to securely make direct transfers for data portability from one service to another and to make the process simpler for the people who use these services.

Representative Alexandria Ocasio-Cortez: When, exact date including month and year, did you and Facebook COO Sheryl Sandberg first become aware of Cambridge Analytica's actions to harvest the data of millions of Facebook users without their consent?

a. Follow-Up: Did anyone on your leadership team, including board members, know about Cambridge Analytica prior to the initial report by the Guardian on December 11, 2015? If so, who and when (exact date including month and year)?

b. Follow-Up: Did senior leadership, including board members, sell off Facebook stock ahead of the initial report by the Guardian on December 11, 2015? If so, who and when?

Zuckerberg: Facebook first became aware that Aleksandr Kogan may have sold data to Cambridge Analytica on December 11, 2015, when The Guardian published an article reporting that Dr. Kogan and his company, GSR, may have passed information the app had obtained from Facebook users to SCL Elections Ltd. ("SCL")/Cambridge Analytica. 

Facebook then banned Kogan's app from our platform and investigated what happened and what further action Facebook should take to enforce our Platform Policies. Facebook considered the matter closed after obtaining written certifications and confirmations from Kogan, GSR, Cambridge Analytica, and SCL declaring that all such data they had obtained had been accounted for and destroyed. 

(Editor's note: CBS News has previously reported that Facebook employees discussed "speculation" that Cambridge Analytica was scraping Facebook data in September 2015, three months before publication of The Guardian article described above.)

Facebook and Mark Zuckerberg became aware from media reporting in March 2018 that the certifications we received may not have been accurate. Facebook immediately banned Cambridge Analytica and other potentially related parties from distributing advertising on Facebook or from using other aspects of our service. We also removed the personal accounts of some of their officers.

All Facebook stock sales by our executive officers and members of our Board of Directors are publicly reported in SEC filings, which are available on the SEC website and our investor relations website. Facebook also requires that our executive officers and directors conduct all Facebook stock sales under a trading plan established pursuant to Rule 10b5-1 under the Securities Exchange Act of 1934 (subject to a limited exception for non-discretionary sales to cover tax withholding obligations in connection with restricted stock unit vesting). 

Through a Rule 10b5-1 trading plan, the executive officer or director contracts with a broker to sell shares of stock on a periodic basis, and the broker then executes trades pursuant to written parameters established by the executive officer or director when entering into the plan, without further direction by them. Facebook policy mandates that Rule 10b5-1 trading plans may only be entered into in an "open trading window" and are subject to a 90-day "cooling off period" before any trades may be commenced by the broker pursuant to the parameters set forth in the trading plan.

Read Zuckerberg's full set of responses here:

