Columbia Law School

Acting FTC Chair Slaughter Speaks on Protecting Privacy and Data Security

Thank you for inviting me to speak with you all today. The President named me as Acting Chair just a few weeks ago, and I’m incredibly excited about the opportunity to lead the FTC in these challenging times. It is fitting that one of my first speeches in this new capacity will be about privacy and data security, because these issues are so important to consumers, to the FTC, and to today’s information economy.

I want to direct my remarks today to some questions you may be asking—where is the FTC headed? What’s new and different at the FTC under the new administration? With the reminder that my colleagues on the Commission have their own strong opinions on these questions, let me talk a bit, at a high level, about my own priorities. I’ll start by talking about my interest in more effectively deterring problematic privacy and data security practices generally. Then, I’ll say a few words about two substantive priority areas: responding to COVID and combatting racism.

I. Efficient and Effective Enforcement

Back in August—so, approximately six pandemic years ago—I testified before our Senate oversight committee in support of comprehensive data-privacy legislation.[1] The pandemic has only amplified the need for strong legislation at the federal level as it has pushed more of our work, our children’s education, and even our personal interactions online, exacerbating data risks.[2] In the meantime, I want to think creatively about how to make our current enforcement efforts even more effective. The FTC has worked hard to curb abuses in this space without the benefit of a federal privacy law, and, for most of the statutes we enforce, without civil penalty authority. But the questions I care most about as we move forward are:

  1. Are we doing everything we can to deter future violations, both by the particular company at issue and by others in the market?
  2. Are we doing everything we can to help wronged consumers?
  3. Are we using all the tools in the FTC’s toolbox to fully charge offenses and pursue misconduct?

I’ve supported many of the Commission’s privacy and security cases, like Equifax and TikTok, but for those of you who have followed the FTC’s privacy and security work closely, you’ll know that I dissented in cases like Facebook,[3] YouTube,[4] and Zoom.[5] When I dissented, in most instances it was because I believed that the Commission should have obtained stronger relief for consumers, including by pursuing litigation if we were unable to negotiate sufficient relief in settlement.

Two types of relief I want us to seek and believe we can achieve are meaningful disgorgement and effective consumer notice. The Commission achieved an innovative disgorgement remedy in the settlement with photo app Everalbum announced last month. In that case, we alleged that the company violated its promises about the circumstances under which it would use facial recognition technology.[6] As part of the settlement, the Commission required the company to delete facial recognition models or algorithms developed with users’ photos or videos.

We routinely obtain disgorgement of ill-gotten monetary gains when consumers pay for a product that is marketed deceptively. Everalbum shows how we can apply this principle to privacy cases where companies collect and use consumers’ data in unlawful ways: we should require violators to disgorge not only the ill-gotten data, but also the benefits—here, the algorithms—generated from that data.

A good example of effective notice is the Commission’s recent fem-tech case involving the Flo menstruation and fertility app. We alleged that Flo violated its promises not to share consumers’ sensitive information with third parties by sharing the information with Facebook, Google, and others.[7] An important remedy the Commission achieved in this case was to require the company to notify consumers of its false promises.

Notice lets consumers “vote with their feet” and helps them better decide whether to recommend the service to others. Finally, and crucially, notice accords consumers the dignity of knowing what happened. There’s a fundamental equity issue here: many people—including those who most need to know—won’t hear about the FTC’s action against a company they deal with unless the company tells them. So, I’ll be pushing staff to include provisions requiring notice in privacy and data security orders as a matter of course.

The other lesson we can take from Flo is the need to fully plead all law violations. As I mentioned in my joint statement with Commissioner Chopra on that case,[8] I believe we also should have applied the Health Breach Notification Rule, which requires vendors of personal health records to notify consumers of breaches, to those facts, and I’m glad we are conducting a review of that Rule.[9] In other cases, I have argued that we should have included unfairness counts.[10] In all of our cases, I want to make sure that we are analyzing all of the relevant laws and pleading all the violations that are applicable.

Finally, I think we need to think carefully about the overlap between our work in data privacy and in competition. Many of the largest players in digital markets are as powerful as they are because of the breadth of their access to and control over consumer data. The FTC has a structural advantage over our counterparts in other jurisdictions that focus exclusively on antitrust or on data protection. Our dual missions can and should be complementary, and we need to make sure we are looking with both privacy and competition lenses at problems that arise in digital markets.

So, on all these fronts, I am encouraging staff to be innovative and creative to ensure we are using the full panoply of tools available to the FTC in order to bring about the best results in our cases. We welcome your insights as researchers, including feedback as to which remedies best address particular types of harm and feedback on the effectiveness of our existing orders. To gather this feedback, I’ve asked the staff to plan a workshop aimed at increasing our understanding of the incentives in the marketplace and how best to ensure market players do a better job of protecting privacy and securing consumer data.

II. Protecting Privacy During the Pandemic

In addition to ensuring we are being as effective and efficient as possible across the board, I want to highlight two substantive areas of priority for me.

The first priority is the pandemic. It’s been almost a year since many of us stopped commuting to work, going to restaurants, traveling, or seeing loved ones in person. Some 27 million Americans have been diagnosed with COVID, and more than 450,000 have died, even as the roll-out of effective vaccines gives us new hope for the future.

Responding to COVID requires an all-hands approach nationally, and the FTC has several important roles to play as part of the solution. Obviously, health concerns are paramount, but the pandemic is also fundamentally tied to a host of other challenges Americans are facing, ranging from COVID-related scams to privacy and security issues to an economic crisis. Let me identify a few areas that I am working closely with staff to pursue related to the pandemic.

The first is ed-tech. With the ubiquity of distance learning during the pandemic, the ed-tech industry has exploded. In 2020, U.S. ed-tech startups raised over $2.2 billion in venture and private equity capital—a nearly 30 percent increase from 2019.[11] Speaking from experience—I’ve got two kids in “Zoom school” downstairs—parents and children are relying on ed-tech more than ever. So, what can the FTC do in this space?

We’ve put out guidance for parents, schools, and ed-tech providers on protecting privacy. We’re conducting an industry-wide study of social media and video streaming platforms in which we’ve asked recipients questions about ed-tech services they provide. And we’re currently in the process of reviewing the COPPA Rule, where we received numerous public comments asking us to clarify how COPPA applies in the ed-tech space. We don’t need to complete our rulemaking to say that COPPA absolutely applies to ed-tech, and companies collecting information from children need to abide by it. Finally, we have to remember that there is an important equity angle to ed-tech too, exacerbated by the pandemic, which I will discuss in more detail in a bit.

Second, I’d like staff to take a close look at health apps, including telehealth and contact tracing apps. As in-person doctor visits have become rarer during the pandemic, more consumers are turning to telehealth apps and other apps to help them manage their health issues. A recent U.K. survey found that usage of health apps has increased by 37% since the pandemic began.[12] I already mentioned Flo, which happens to be our first health app case, but I’d like to see the FTC pursue more of these types of cases.

Finally, in 2019, we embarked on an industry-wide study of broadband privacy practices. As businesses, schools, governments, and communities have struggled to find new models for staying open, providing critical services, and keeping in touch, the importance of reliable Internet has grown. The largest ISPs added over 1.5 million customers in the third quarter of 2020, the last quarter for which statistics are available.[13] Given the urgent need to provide the public with some transparency regarding the privacy practices of these companies, I’d like the Commission to issue a report on this subject this year.

III. Racial Equity

The second—and related—priority issue I want to emphasize is racial equity: how can we at the FTC engage in the ongoing nationwide work of righting the wrongs of four hundred years of racial injustice? I have been speaking frequently about ways to attack systemic racism through antitrust law, but of course there is a lot we can do on the consumer protection side as well.

There is an overlap between racial equity and the COVID-related privacy issues I mentioned above.[14] Among the things we know about COVID is that the pandemic is exacerbating the equity gaps in this country. One example is the “digital divide”: as many as one in six kids lack the equipment necessary to participate in distance learning,[15] and nearly one quarter of kids lack reliable internet access—conditions that particularly affect rural, urban, and low-income families.[16] We also know that digital services can sometimes target vulnerable communities with unwanted content and that vulnerable communities suffer outsized consequences from data privacy violations.[17] We need to be wary about the ways in which lower-income communities are asked to pay with their data for expensive services they cannot afford.

There are also several other ways we can focus on closing the equity gap. One is algorithmic discrimination. Kate Crawford at Microsoft Research wrote an article a few years back with the memorable title, “Artificial Intelligence’s White Guy Problem,” in which she wrote: “Histories of discrimination can live on in digital platforms, and if they go unquestioned, they become part of the logic of everyday algorithmic systems.”[18] And research published in Science in 2019 demonstrated that an algorithm used with good intentions—to target medical interventions to the sickest patients—ended up funneling resources to a healthier, white population, to the detriment of sicker, Black patients.[19] Again, you can imagine that this is an area where the pandemic has made inequity worse; although this research was published pre-COVID, these potential effects will be worse in the pandemic-driven reality, where communities of color have been disproportionately disadvantaged.

As sophisticated algorithms are deployed in ways that impact people’s lives, it is vital to make sure that they are not used in discriminatory ways. The Commission has begun to engage in this area—for example, we issued a report a few years ago with guidance for businesses on illegal uses of big data algorithms, including discrimination.[20] Just over a year ago, I gave a speech at UCLA Law discussing the myriad harms of AI and algorithmic decision-making,[21] and a few months later, BCP staff issued excellent new guidance on best practices in using algorithms, including transparency, fairness, and accountability.[22] Going forward, I have asked staff to actively investigate biased and discriminatory algorithms, and I am interested in further exploring the best ways to address AI-generated consumer harms.

Another challenge is the development and deployment of facial recognition technologies, which can exacerbate existing racial disparities. The privacy implications of technologies that allow the identification of previously unknown individuals are obvious, and I know this Forum has done a lot of work on this issue. There’s clear and disturbing evidence that these technologies are not as accurate in identifying non-white individuals,[23] and on at least three separate occasions, Black men have been wrongfully arrested based on faulty facial recognition matches.[24] The Commission has challenged illegal practices relating to the use of facial recognition technology in the Facebook and Everalbum cases, and we’ll redouble our efforts to identify law violations in this area.

Finally, last summer, several news articles emerged about mobile apps’ use of location data to identify characteristics of Black Lives Matter protesters. This raised a firestorm on the Hill, with several members of Congress asking questions about this practice.[25] I’m concerned about misuse of location data generally, but in particular, as it applies to tracking Americans engaged in constitutionally protected speech.


These are just some of the privacy-related priorities about which I’m passionate.[26] I am deeply honored and grateful to lead the FTC at this critical time, and I’m excited to continue and expand upon the agency’s great work. Let me acknowledge, too, that the FTC is by no means working on these issues alone. In particular, I want to thank the researchers whose work will be highlighted in today’s program. Your work and that of privacy and security researchers like you is critical to the FTC’s mission. For many years now, the FTC has hosted our own annual conference, PrivacyCon, to shine a spotlight on the latest research, and this year’s event will take place—virtually—on July 27. Privacy scholarship is tremendously valuable to us in myriad ways, bringing transparency to an all-too-opaque ecosystem and shedding light on policy alternatives or solutions. Thank you for the work that you do, and I look forward to more partnership going forward.


[1] Hearing on Oversight of the Fed. Trade Comm’n: Before the S. Comm. on Commerce, Science, and Transportation, Opening Statement of Commissioner Rebecca Kelly Slaughter, Fed. Trade Comm’n (Aug. 5, 2020), cca_slaughter_senate_commerce_oversight_hearing.pdf.

[2] See, e.g., Christine Wilson, Coronavirus Demands a Privacy Law, Wall Street Journal (May 13, 2020).

[3] See Dissenting Statement of Commissioner Rebecca Kelly Slaughter Regarding the Matter of FTC vs. Facebook, Fed. Trade Comm’n (July 24, 2019), ok_7-24-19.pdf.

[4] See Dissenting Statement of Commissioner Rebecca Kelly Slaughter in the Matter of Google LLC and YouTube, LLC, Fed. Trade Comm’n (Sept. 4, 2019).

[5] See Dissenting Statement of Commissioner Rebecca Kelly Slaughter Regarding the Matter of Zoom Video Communications Inc., Fed. Trade Comm’n (Nov. 9, 2020); Dissenting Statement of Commissioner Rebecca Kelly Slaughter Regarding Final Approval of the Settlement with Zoom Video Communications Inc., Fed. Trade Comm’n (Jan. 19, 2021), ughter_regarding_final_approval_of_the_settlement_with_zoom_2.pdf.

[6] Complaint ¶¶ 9, 23, 24, In the Matter of Everalbum, Inc. (Jan. 11, 2021).

[7] See Press Release, Fed. Trade Comm’n, Developer of Popular Women’s Fertility-Tracking App Settles FTC Allegations that It Misled Consumers About the Disclosure of their Health Data (Jan. 13, 2021).

[8] Joint Statement of Commissioner Rohit Chopra and Commissioner Rebecca Kelly Slaughter Concurring in Part, Dissenting in Part, In the Matter of Flo Health, Inc., Fed. Trade Comm’n (Jan. 13, 2021), flo.pdf.

[9] See Press Release, Fed. Trade Comm’n, FTC Seeks Comment as Part of Review of Health Breach Notification Rule (May 8, 2020).

[10] See, e.g., Concurring Statement of Commissioner Rebecca Kelly Slaughter In the Matter of FTC and State of New York v. Vyera Pharmaceuticals, LLC, Phoenixus AG; Martin Shkreli; and Kevin Mulleady, Fed. Trade Comm’n (Jan. 27, 2020), g_statement.pdf.

[11] Tony Wan, “A Record Year Amid a Pandemic: US Edtech Raises $2.2 Billion in 2020,” EdSurge (Jan. 13, 2021).

[12] “Digital Health Habits in the UK: a Quin nationwide survey,” Quin (Oct. 2, 2020).

[13] Press Release, “About 1,530,000 Added Broadband in 3Q 2020,” Leichtman Research Group (Nov. 18, 2020).

[14] See Remarks of Commissioner Rebecca Kelly Slaughter, The Near Future of U.S. Privacy Law, Silicon Flatirons—University of Colorado Law School (Sept. 6, 2019), 19.pdf.

[15] Catherine E. Shoichet, “These Kids are Getting Left Behind When Schools Go Online,” CNN (July 31, 2020).

[16] See, e.g., Emily A. Vogels, Andrew Perrin, Lee Rainie, and Monica Anderson, “53% of Americans Say the Internet Has Been Essential During the COVID-19 Outbreak,” Pew Research Center (Apr. 30, 2020).

[17] See, e.g., Fed. Trade Comm’n, Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues (2016).

[18] Kate Crawford, “Artificial Intelligence’s White Guy Problem,” N.Y. Times (June 25, 2016).

[19] Ziad Obermeyer, Brian Powers, Christine Vogeli, Sendhil Mullainathan, “Dissecting Racial Bias in an Algorithm Used to Manage the Health of Populations,” 366 Science 447 (Oct. 25, 2019).

[20] Fed. Trade Comm’n, Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues (2016),

[21] See Remarks of Commissioner Rebecca Kelly Slaughter, Algorithms and Economic Justice, UCLA School of Law (Jan. 24, 2020), slaughter_on_algorithmic_and_economic_justice_01-24-2020.pdf.

[22] See Andrew Smith, Director of the FTC Bureau of Consumer Protection, “Using Artificial Intelligence and Algorithms,” Business Blog (Apr. 8, 2020).

[23] See, e.g., Brian Fung, “Facial recognition systems show rampant racial bias, government study finds,” CNN Business (Dec. 19, 2019); Joy Buolamwini and Timnit Gebru, Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification (2018).

[24] See Kashmir Hill, “Another Arrest, and Jail Time, Due to a Bad Facial Recognition Match,” N.Y. Times (Jan. 6, 2021).

[25] See, e.g., Press Release, Reps. Eshoo, Rush, House Colleagues Demand Federal Agencies Cease Surveillance of Protests, Office of Rep. Anna G. Eshoo (June 9, 2020); Press Release, Warren, Maloney, Wyden, DeSaulnier Probe Data Broker’s Collection of Data on Black Lives Matter Demonstrators, H. Comm. on Oversight and Reform (Aug. 4, 2020).

[26] See also Remarks of Commissioner Rebecca Kelly Slaughter, The Near Future of U.S. Privacy Law, Silicon Flatirons—University of Colorado Law School (Sept. 6, 2019), 19.pdf; Remarks of Commissioner Rebecca Kelly Slaughter, FTC Data Privacy Enforcement: A Time of Change, NYU School of Law (Oct. 16, 2020), _remarks_on_ftc_data_privacy_enforcement_-_a_time_of_change.pdf.

These remarks were delivered on February 10, 2021, by Rebecca Kelly Slaughter, acting chairwoman of the U.S. Federal Trade Commission, at the Future of Privacy Forum.