When you give an organization your data, and then that data gets exposed or stolen, you probably want to know about it. Seems simple enough. If a friend lost your sweater, you’d expect him to tell you. But a seemingly endless parade of massive data exposures—including, most recently, at Facebook and Google—reveals just how complicated the practice of disclosure can be.
Take Facebook’s massive data breach at the end of last month, which served as the first major test run of disclosure requirements in the European Union’s General Data Protection Regulation. Facebook could face more than $1.5 billion in fines under GDPR just for allowing the breach in the first place. But the company reduced the possibility of an even larger fine by disclosing the incident to regulators within 72 hours of discovering it—a GDPR requirement.
Network security and digital forensic practitioners note, though, that 72 hours isn’t very much time to investigate the scale and scope of an intrusion. That narrow window could also push breach victims to wildly overestimate the impact of a breach, or report unsupported findings to simply meet the requirement and hedge for later. Rapid public disclosure can also complicate active investigations and law enforcement inquiries.
“For GDPR, they want to know things like what categories of information were exposed and how many people were affected, but at 72 hours you almost never will know that definitively,” says Mark Thibodeaux, an attorney specializing in data privacy at the corporate law firm Eversheds Sutherland. “I think a lot of this legislation was designed in terms of databases where you’ve got tables that have customer names and addresses and credit card numbers and things like that stored in one monolithic kind of system. But what happens in most of these breaches is the bad guys get into email and other non-structured data, and so figuring out what they got is an exercise in looking through everything.”
“I think it’s possible for regulation to be done well, but it’s a dilemma.”
Mark Thibodeaux, Eversheds Sutherland
The Facebook incident illustrates that very dynamic. Its initial disclosure states that 50 million users were likely impacted by the breach, but the number could be as high as 90 million. Facebook also had incomplete information about specifics like the impact of the breach on third-party services that share user login infrastructure with Facebook. “The investigation is still early,” said Nathaniel Gleicher, Facebook’s head of cybersecurity policy, on September 28, the day of the disclosure. “[It’s] proceeding now so we can understand access or what types of activities were taken. As with any investigation in this space, it can be challenging to understand the full scope of activity.”
GDPR was conceived to be a broad and flexible framework, but its prescriptive elements can seem impractical or unreasonable. And this hints at the larger tension between the need for codified disclosure requirements, and the difficulty of making rules that account for all situations.
Those nuances came into sharp relief earlier this week, when Google announced that it would shutter its social network, Google+, following a vulnerability that exposed account details from as many as 500,000 Google+ users before the company found and patched the bug in March. The company had decided not to publicly disclose the flaw—and was under no legal obligation to, since there was no indication of data theft—but came forward because of a report in The Wall Street Journal.
“Our Privacy & Data Protection Office reviewed this issue, looking at the type of data involved, whether we could accurately identify the users to inform, whether there was any evidence of misuse, and whether there were any actions a developer or user could take in response. None of these thresholds were met in this instance,” Ben Smith, Google’s vice president of engineering, wrote of the company’s decision not to inform affected users.
Google’s choice not to disclose sparked debate. Institutions regularly find and fix flaws in their systems—a positive practice that helps strengthen data protections. Reporting every tiny remediation to a regulator could be impractical, and might discourage organizations from looking for bugs in the first place. But some data exposures do rise to the level of disclosure even when there isn’t evidence that data was actually stolen.
But who decides where that line is? Some legislators have proposed a rolling registry of events and remediations that everyone contributes to, so that no company gets singled out. But policy analysts fear information overload and practical issues with evaluating so many incidents.
“I think it’s possible for regulation to be done well, but it’s a dilemma,” Eversheds Sutherland’s Thibodeaux says. “In Europe you’re going to see a lot more notices based on incidents that would not require notice in the US, because of GDPR. Whether that’s a positive or negative thing for people we have to wait and see. And I think the regulatory agencies are a little overwhelmed with the number of investigations that have already come to them in the early days.”
“Different people can have different definitions of privacy and what data should remain private, and that can all be perfectly valid.”
Beau Woods, Atlantic Council
For now, the United States has a patchwork of state data breach disclosure laws and guidance from federal agencies without an overarching law like GDPR. California passed a statewide data privacy bill in June, but lobbyists have launched a bitter fight to revise (and potentially neuter) it before it takes effect in January 2020. The idea of developing a framework for managing responsibility and motivating proactive security defense is appealing, especially given the reality of the damaging data breaches that occur all the time, but developing the right approach has proved nearly impossible in practice.
GDPR is still in its early days, but some problems and unintended consequences of the legislation have already surfaced. This makes the idea of developing a similar type of law in US Congress particularly daunting. Though legislators have already expressed outrage at damaging data breaches, and proposed various potential approaches to dealing with them, policy analysts caution that even the most hands-off strategies have downsides.
“You can take the approach that ‘look, we’re lawmakers, we don’t know what’s going to be reasonable tomorrow let alone 10 years from now, but we expect you to apply reasonable security protections,'” says Beau Woods, an Atlantic Council fellow who studies cybersecurity policy. “This makes it more flexible, so courts can interpret what reasonable means and at least it’s dynamic not static and rigid. But then again, different people can have different definitions of privacy and what data should remain private, and that can all be perfectly valid. Which makes it hard to define what is ‘reasonable.’ It’s hard to say which approach is better.”
GDPR’s maturation, for better or worse, will be instructive for legislators around the world. But the crucial element in any effort to mandate disclosure is an understanding that when and how disclosure happens has serious implications.