
The Supreme Court’s Say on Surveillance?

Big national security news yesterday: a federal judge has ruled that the NSA’s Section 215 metadata collection program is an unconstitutional violation of the Fourth Amendment.  TechDirt has a great wrap-up of Judge Leon’s opinion, but more than the excellent legal analysis on display, the case is one of the first big demonstrations of how the federal judiciary is being brought into the surveillance discussion post-Snowden.  The secretive structure of the FISA Court, and the difficulty, if not impossibility, of getting those cases into the Supreme Court or out into the sunshine, made it very easy for the courts to avoid judging the constitutionality of broad government surveillance.

Just last year in Clapper v. Amnesty International, the Supreme Court was able to sidestep today’s question by holding that a group of international lawyers and journalists had no standing to challenge the FISA Amendments Act of 2008 because they could prove no harm.  The narrow majority deferred to the FISA Court’s ability to enforce the Fourth Amendment’s privacy guarantees, an assertion that has since proven ridiculous.  Snowden’s revelations have changed Clapper’s standing equation, and this may force the Supreme Court’s hand.

After this ruling, it appears all three branches of government may have a say in the future of the Fourth Amendment, and it seems likely they won’t be in agreement.  Involving the Third Branch in an active dialogue about surveillance is essential not only because the courts can clarify the scope of the Fourth Amendment but also because they may be in a position to break a separation-of-powers stalemate between Congress and the President.  In the end, the steady stream of lawsuits challenging the NSA’s activities may have a bigger legal impact than any congressional theatrics.


Future of Privacy Forum Releases US-EU Safe Harbor Report

Today, some four months after we first announced it, my organization put out our Safe Harbor Report on the effectiveness of the U.S.-EU Safe Harbor in protecting EU citizen privacy and promoting trans-Atlantic data transfers.  That’s something of a mouthful, but I’m proud of my contributions to the report, which include the paper’s discussions of enforcement, government access to information (e.g., NSA activity), and some of the recommendations and case studies.  I now know entirely too much about trans-Atlantic data transfers under the program, so here’s hoping the European Union doesn’t go and suspend the Safe Harbor now!

Europe Misdirects Rage on the US Safe Harbor

This morning, the European Commission released its report on the state of the US-EU Safe Harbor, a mechanism that provides for international data transfers, proposing a series of recommendations designed “to restore trust in data flows between the EU and the U.S.”  Europeans have long been critical of the Safe Harbor — and America’s free-wheeling attitude toward privacy in general — but the Summer of Snowden provided a perfect pretext to “reconsider” the efficacy of the Safe Harbor.

America’s hodgepodge or “sectoral” approach to privacy has increasingly placed U.S. officials on the defensive, and there’s no question the Safe Harbor can be improved.  However, conflating Safe Harbor reform with justified anger about expansive NSA snooping is counterproductive.  First and foremost, while public and private data sharing is increasingly intermingled, government access to data is not the same as commercial data use.  The Safe Harbor was explicitly designed to protect the commercial privacy interests of EU citizens.

It was not created to address national security issues, and the Safe Harbor specifically provides an exception from its requirements “to the extent necessary to meet national security, public interest, or law enforcement requirements.”  As FTC Commissioner Julie Brill has noted, national security exceptions to legal regimes are not unusual.  For example, the HIPAA Privacy Rule permits the disclosure of private health information in the interest of national security, and even the EU’s stringent Data Protection Directive includes an exception for state security or defense.


From Collected Criticism to “Slamming” an Attorney General

Last Friday, I helped draft a few thoughts on behalf of the Future of Privacy Forum regarding the New York Attorney General’s efforts to subpoena information from 15,000 Airbnb users in New York City.  We wondered about the breadth of the AG’s request, and suggested only that “wide grabs of consumer data by well-meaning regulators can have a serious impact on consumer privacy.”

Later that day, Kaja Whitehouse of the New York Post declared that FPF had “slammed” the AG, pulling a few lines from our “open letter” to suggest FPF was far more critical of the AG than it intended, or certainly than I intended.  Another victory for overheated rhetoric against even-keeled moderation!

Sen. Markey’s Drone Aircraft Privacy and Transparency Act Summarized

On Monday, Sen. Markey introduced legislation designed to expand legal safeguards to protect individual privacy from invasion by commercial and government use of drones. The bill amends the FAA Modernization and Reform Act of 2012, which directed the FAA to integrate unmanned aircraft systems (UAS) into U.S. airspace by October 2015. The law, however, was silent as to the transparency and privacy implications of domestic drone use. Under pressure from advocacy groups and Congress, the FAA solicited public comment about potential privacy and civil liberties issues during its UAS test site selection process, ultimately suggesting only that UAS privacy policies “should be informed by the Fair Information Practice Principles.”

This section-by-section summary looks at how Sen. Markey’s bill would amend current law to establish national guidelines for domestic drone use.

Sec. 1 – Short Title

Drone Aircraft Privacy and Transparency Act of 2013

Sec. 2 –  Findings

The bill notes that the FAA projects that 30,000 drones could be in the sky above the United States by 2020, and further, that current law provides no explicit privacy protections or public transparency measures with regard to drone use by public or private entities.

Sec. 3 –  Guidance and Limitations for UAS

The major substance of this section details new requirements for data collection statements by commercial drone operators and data minimization statements by law enforcement. The bill’s provisions with regard to law enforcement appear to significantly bolster Fourth Amendment privacy protections. Agencies would be subject to a warrant requirement for any generalized drone surveillance absent exigent circumstances, such as (1) imminent danger of death or serious injury or (2) a determination by DHS that credible intelligence points to a high risk of terrorist attack. Moreover, any collected information unrelated to a potential exigency must be destroyed.

While these provide practical, procedural limitations on surveillance, the bill also forces law enforcement to consider how it plans to use drones prior to deploying them. Law enforcement offices would be required to file a statement explaining any policies adopted to minimize the collection of data unrelated to a warrant, how excess data will be destroyed, and any audit or oversight mechanisms. By making licenses contingent on these statements, the bill may encourage careful consideration of privacy challenges before law enforcement begins broad use of drones.

For commercial operators, the bill would prohibit the FAA from issuing licenses without a statement that provides information about who will operate the drone, where the drone will be flown, what data will be collected and how that data will be used, whether any information will be sold to third parties, the period for which information will be retained, and contact information for complaints. Depending upon how onerous these statement requirements become, this section may present some First Amendment challenges, particularly for efforts to advance newsgathering and the free flow of information.

The FAA would be charged with creating a publicly searchable website that would list all approved drone licenses, including copies of data collection or minimization statements, any data security breaches, and details about the time and location of all drone flights.

This section also calls for the Departments of Homeland Security, Commerce, and Transportation and the FTC to conduct a study to identify any potential challenges presented by drones to the OECD privacy guidelines. It would also require the current UAS rulemaking underway to take those privacy guidelines into consideration.

Sec. 4 – Enforcement

The section provides for concurrent enforcement by state authorities and the Federal Trade Commission under its Section 5 authority. It also allows a private right of action for violations of either an entity’s data collection or data minimization statement. Remedies include equitable relief and the greater of actual monetary damages or statutory damages of up to $1,000 per violation.

Sec. 5 – Model Aircraft Provision

Finally, the bill provides for an exception for model aircraft.

***

Sen. Markey introduced a largely identical version of the Drone Aircraft Privacy and Transparency Act earlier this year as a member of the House of Representatives, and last year as well.

Buying and Selling Privacy Essay Published by Stanford Law Review Online

My essay on how “Big Data” is transforming our notions of individual privacy in unequal ways has been published by the Stanford Law Review Online.  Here’s how they summed up my piece:

We are increasingly dependent upon technologies, which in turn need our personal information in order to function. This reciprocal relationship has made it incredibly difficult for individuals to make informed decisions about what to keep private. Perhaps more important, the privacy considerations at stake will not be the same for everyone: they will vary depending upon one’s socioeconomic status. It is essential for society and particularly policymakers to recognize the different burdens placed on individuals to protect their data.

Framing Big Data Debates

If finding the proper balance between privacy risks and Big Data rewards is the big public policy challenge of the day, we can start by having a serious discussion about what that policy debate should look like. In advance of my organization’s workshop on “Big Data and Privacy,” we received a number of paper submissions that attempted to frame the debate between Big Data and privacy. Is Big Data “new”?  What threats exist?  And what conceptual tools exist to address any concerns?

As part of my attempt to digest the material, I wanted to look at how several scholars attempted to think about this debate.

This question is especially timely in light of FTC Chairwoman Edith Ramirez’s recent remarks on the privacy challenge of Big Data at the Aspen Forum this week. Chairwoman Ramirez argued that “the fact that ‘big data’ may be transformative does not mean that the challenges it poses are, as some claim, novel or beyond the ability of our legal institutions to respond.” Indeed, a number of privacy scholars have suggested that Big Data does not so much present new challenges but rather has made old concerns ever more pressing.


From Cyberspace to Big Data Podcast

In the run-up to the Future of Privacy Forum’s “Big Data and Privacy” workshop with the Stanford Center for Internet & Society, I’ve taken to podcasting again, speaking with scholars who couldn’t attend the conference.  First up was Professor Bill McGeveran, who prepared an essay looking over lessons from the 2000 Stanford symposium on “Cyberspace and Privacy: A New Legal Paradigm?”

Of course, the buzzword has now moved from cyberspace to big data.  McGeveran suggests big data is really seeing a replay of the same debates cyberspace saw a decade ago.  Among the parallels he highlights are (1) the propertization of data, (2) technological solutions like P3P, (3) First Amendment questions, and (4) the challenges posed by the privacy myopia.

The Toobin Principle as a Corollary to the Snowden Effect

Jay Rosen has a fantastic piece today on PressThink on what he calls the “Toobin principle.”  In effect, Jeffrey Toobin and a number of media figures have criticized Edward Snowden as a criminal or, worse, a traitor, even as they admit that his revelations have led to a worthwhile and, more importantly, newsworthy debate. For his part, Rosen asks whether there can “even be an informed public and consent-of-the-governed for decisions about electronic surveillance”?

I would add only the following observations. First, an informed public may well be the only real mechanism for preserving individual privacy over the long term. As we’ve seen, the NSA has gone to great lengths to explain that it was acting under appropriate legal authority, and the President himself stressed that all three branches of government approved of these programs. But that hasn’t stopped abuses (as identified in currently classified FISC opinions) or, and I think this is key, stopped government entities from expanding these programs.

This also raises the bigger, looming concern of what all of this “Big Data” means. One of the big challenges surrounding Big Data today is that companies aren’t doing a very good job communicating with consumers about what they’re doing with all this data.  Innovation becomes a buzzword to disguise a better way to market things to us. Like “innovation,” national security has long been used to legitimize many projects. However, with headlines like “The NSA is giving your phone records to the DEA. And the DEA is covering it up,” I believe it is safe to say that the government now faces the same communications dilemma as private industry.

In a recent speech at Fordham Law School, FTC Commissioner Julie Brill cautioned that Big Data will require industry to “engage in an honest discussion about its collection and use practices in order to instill consumer trust in the online and mobile marketplace.”  That’s good advice — and the government ought to take it.

MOOCs and My Future Employment Prospects?

Massive open online courses are a new, rapidly evolving platform for delivering educational instruction. Since MOOCs appeared just a half-decade ago, multiple platforms have emerged offering dozens of free courses from leading universities across the country. However, as MOOCs work to transform education, they are also searching for ways to turn innovative educational experiences into viable business models. In many respects, this is the same challenge facing many Internet services today. Yet while many “free” Internet services rely upon users giving up control of their personal data in exchange, this bargain becomes strained when we enter the field of education.

Education aims to encourage free thought and expression.  At a basic level, a successful learning experience requires individuals to push their mental abilities, often expressing their innermost thoughts and reasoning. A sphere of educational privacy is thus necessary to ensure students feel free to try out new ideas, to take risks, and to fail without fear of embarrassment or social derision. As data platforms, MOOCs by their very nature collect vast stores of educational data.

As MOOCs look for ways to turn a profit, they will be tempted to turn to the vast stores of personal data that they are currently sitting upon.  It will be essential to consider the privacy harms that could result if this personal educational data is treated carelessly.

There is already some evidence that MOOC organizers recognize this challenge.  In January, a dozen educators worked to draft a “Bill of Rights” for learning in the digital age.  The group, which included Sebastian Thrun, founder of the MOOC Udacity, declared that educational privacy was “an inalienable right.” The framework called for MOOCs to explain how student data is collected, used by the MOOC, and, more importantly, made available to others.  “[MOOCs] should offer clear explanations of the privacy implications of students’ choices,” the document declared.

In addition to Udacity, the leading MOOCs, Coursera and edX, can improve how they approach student privacy.  Most MOOCs have incredibly easy sign-up processes, but they are much less clear about what data they collect and use.  At the moment, the major MOOCs rely on the usual long, cumbersome privacy policies to get this information across to users.

These policies are both broad and unclear.  For example, Coursera states in its Privacy Policy that it “will not disclose any Personally Identifiable Information we gather from you.”  However, it follows this very clear statement by giving itself broad permission to use student data: “In addition to the other uses set forth in this Privacy Policy, we may disclose and otherwise use Personally Identifiable Information as described below. We may also use it for research and business purposes.”  More can be done to offer clear privacy guidelines.

Beyond providing clearer privacy guidelines, however, MOOCs also should consider how their use of user-generated content can impair privacy.  A potential privacy challenge exists where a MOOC’s terms of service grant it such a broad license to re-use students’ content that they effectively have the right to do whatever they wish. EdX, a project started by educational heavyweights Harvard and MIT, states in its Terms of Service that students grant edX “a worldwide, non-exclusive, transferable, assignable, sublicensable, fully paid-up, royalty-free, perpetual, irrevocable right and license to host, transfer, display, perform, reproduce, modify, distribute, re-distribute, relicense and otherwise use, make available and exploit your User Postings, in whole or in part, in any form and in any media formats and through any media channels (now known or hereafter developed).” Coursera and Udacity have similar policies.

Under such broad licenses, students “own” their exam records, forum posts, and classroom submissions in name only. The implications of a MOOC “otherwise using” my poor grasp of a history-of-the-Internet course I sampled for fun are unclear. This information could be harnessed to help me learn better, but as MOOCs become talent pools for corporate human resources departments, it could bode ill for my future employment prospects.

At the moment, these are unresolved issues.  Still, as MOOCs move to spearhead a revolution in how students are taught, providing students of all ages with a safe space to try out new ideas and learn beyond their comfort zone will require both educators and technology providers to think about educational privacy.

Buying and Selling Privacy Paper

Judge Alex Kozinski has offered to pay $2,400 a year to protect his privacy. Meanwhile, Federico Zannier started a Kickstarter to “data mine” himself and ended up making $2,700. One’s rich and can pay to protect his privacy; the other’s not and is selling every bit of his info. I’ve posted my paper on this subject to SSRN.

Privacy Protections from FISA Court May Not Compute

This is cross-posted on the American Constitution Society’s blog.

After the events of the past few weeks, a discussion presented by the American Constitution Center on the search for privacy and security on the Internet posed many questions but offered few answers. In an article on The Daily Beast, Harvard Law Professor Lawrence Lessig noted that “‘trust us’ does not compute,” but after a contentious, technical discussion of both the NSA’s PRISM program and the cellular metadata orders, a panel of privacy law scholars was forced to concede that “trust us” is today’s status quo when it comes to programmatic government surveillance.

It wasn’t supposed to be this way. When the Foreign Intelligence Surveillance Act was first passed in 1978, the law was designed to “put the rule of law back into things,” explained Professor Peter Swire, co-chair of the Tracking Protection Working Group at the W3C and the first Chief Counselor for Privacy at OMB. The emergence of the Internet, however, changed everything. Intelligence agencies were faced with a legal framework that could not account for situations where “games like World of Warcraft [could be] a global terrorist communication network,” he said.

But even as communications technology has been made to serve bad actors, it has also ushered in a Golden Age of surveillance. Modern technology can easily determine an individual’s geolocation, learn about an individual’s closest associates, and connect it all together via vast databases. Within the federal government, without strong champions for civil liberties, the availability of these technologies encouraged the bureaucracy to take advantage of them to the fullest extent possible. Absent outside pressure from either Congress or the public, “stasis sets in,” Swire said.

Yet while service providers collect vast amounts of data about individuals, a combination of business practicalities and the Fair Information Practice Principles, which stress retention limits and data minimization, means that businesses simply do not keep all of their data for very long. As a result, the government has used Section 215 of the PATRIOT Act to collect and store as much information as possible in the “digital equivalent of the warehouse at the end of Indiana Jones,” said Professor Nathan Sales, who largely defended the government’s intelligence-gathering efforts.

The difficulty is that these sorts of data collection projects present important Fourth Amendment considerations.  In his passionate dissent in the recent Maryland DNA collection case, Justice Antonin Scalia joined three of his liberal colleagues to explain that the Fourth Amendment specifically protects against general searches and demands a particularity requirement.  However, a general search is exactly what an order permitting the collection of anyone and everyone’s cellular metadata appears to be.

Professor Susan Freiwald pointed out that the plain language of Section 215 is incredibly broad.  50 U.S.C. Sec. 1861 permits surveillance wherever “reasonable grounds” exist that surveillance could be “relevant . . . to protect against international terrorism or clandestine intelligence activities” where any individual, American citizen or otherwise, is “in contact with, or known to, a suspected agent of a foreign power.”  According to Freiwald, the plain language of the statute “doesn’t limit government investigations in any meaningful way.” What checks exist are limited: Congress appears at best half-informed, and the ISPs hauled before the Foreign Intelligence Surveillance Court (FISC) have been incentivized not to fight via the carrot of immunity and the stick of contempt sanctions.

“We’re waiting on the courts,” Freiwald said, suggesting that these programs “cannot survive review if the court does its job.”

Professor Sales countered that the FISC was already placing minimization requirements into its orders, though he conceded he couldn’t know for sure whether this was accurate.

Former U.S. District Judge Nancy Gertner interjected:

As a former Article III judge, I can tell you that your faith in the FISA Court is dramatically misplaced. . . . Fourth Amendment frameworks have been substantially diluted in the ordinary police case. One can only imagine what the dilution is in a national security setting.

What little we do know about the FISC suggests that it, too, is wary of the government’s behavior.  In a letter to Sen. Ron Wyden (D-Ore.) last fall, the Director of National Intelligence conceded that on at least one occasion the FISC found that the government’s information collection was unreasonable under the Fourth Amendment, and moreover, that the government’s behavior had “sometimes circumvented the spirit of the law.”

Unfortunately, the FISC’s full legal opinion remains classified, and the Department of Justice continues to contest its release.  This fact reveals the core challenge facing any sensible debate about the merits of government surveillance: our current understanding rests on incomplete information, from secret court decisions to the “least untruthful” testimony of government officials.

Louis Brandeis, who along with Samuel Warren “invented” the right to privacy in 1890, also wrote that “[s]unlight is said to be the best of disinfectants.”  A discussion about the future of privacy online that forces our best privacy scholars to repeatedly profess their ignorance and rests on placing our trust in the government simply does not compute.

What Could Facial Recognition Privacy Protections Look Like?

Concerns about facial recognition technology have appeared in the context of “tagging” images on Facebook or how the technology transforms marketing, but these interactions are largely between users and service providers. Facial recognition on the scale offered by wearable technology such as Google Glass changes how we navigate the outside world.  Traditional notice and consent mechanisms can protect Glass users, but not use of the technology by the user himself.  // More on the Future of Privacy Forum Blog.

What’s Scary About Big Data, and How to Confront It

Today, the data deluge that Big Data presents encourages passivity and misguided efforts to get off the grid.  With an “Internet of Things” ranging from our cars to our appliances, even to our carpets, retreating to our homes and turning off our phones will do little to stem the datafication tide. Transparency for transparency’s sake is meaningless; we need mechanisms to achieve transparency’s benefits. // More on the Future of Privacy Forum Blog.

Domestic Drones Should Embrace Privacy by Design

On Wednesday, the FAA held a forum to seek input from members of the public on the agency’s development of a privacy policy for domestic civilian drones. If the unmanned aircraft industry wishes to encourage widespread societal embrace of this technology, suggesting that drones do not present privacy challenges, and moreover, arguing that our current legal and policy framework can adequately address any concerns, is counterproductive. // More on the Future of Privacy Forum Blog.
