News

Joining PBS NewsHour to Discuss Facial Recognition Apps

An invitation to join PBS NewsHour to discuss the privacy implications of facial recognition apps was enough for me to push aside a lingering illness. I spoke with Amna Nawaz about public concerns with FaceApp, a photo filter app that allows users to transform their features by adding or removing wrinkles. I addressed questions about how images of people’s faces could be used and the (non-)implications of the app being based in Russia, and encouraged viewers to reach out to allegedly concerned lawmakers to push for federal privacy legislation.

Big Data: Catalyst for a Privacy Conversation

This week, the Indiana Law Review released my short article on privacy and big data, which I prepared after the journal’s spring symposium. Law and policy appear to be on the verge of redefining how they understand privacy, and data collectors and privacy advocates alike are trying to chart a path forward. The article discusses the rise of big data and the role of privacy in both the Fourth Amendment and consumer contexts. It explores how the dominant conceptions of privacy as secrecy and as control are increasingly untenable, leading to calls to focus on data use or to respect the context of collection. I argue, briefly, that the future of privacy will have to be built upon a foundation of trust—between individuals and the technologies that will be watching and listening. I was especially thrilled to see the article highlighted by The New York Times’ Technology Section Scuttlebot.

No Privacy/No Control

This week, the Pew Research Center released a new report detailing Americans’ attitudes about their privacy. I wrote up a few thoughts, but my big takeaway is that Americans both want and need more control over their personal information. Of course, the challenge is helping users engage with their privacy, i.e., making privacy “fun,” which anyone will tell you is easier said than done. Then again, considering we’ve found ways to make everything from budgeting to health tracking “fun,” I’m unsure what’s stopping industry from finding some way to do it. // More on the Future of Privacy Forum blog.

Playing Cupid: All’s Fair in Love in the Age of Big Data?

After a three-year dry spell, OkCupid’s fascinating OkTrends blog roared back to life on Monday with a post by Christian Rudder, co-founder of the dating site. Rudder boldly declared that his matchmaking website “experiment[s] on human beings.” His comments are likely to reignite the controversy surrounding A/B testing on users in the wake of Facebook’s “emotional manipulation” study. That seems to be Rudder’s intention; he writes that “if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.”
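
To make Rudder’s point concrete, here is a minimal sketch of how a typical site might sort users into experiment buckets. This is my own illustration, not OkCupid’s or Facebook’s actual system; the function and experiment names are hypothetical:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants: list[str]) -> str:
    """Deterministically assign a user to an experiment variant.

    Hashing (experiment, user_id) together means the same user always
    sees the same variant, while each new experiment reshuffles users
    independently of every other experiment running on the site.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# A hypothetical test in the spirit of OkCupid's: hide profile photos
# for half of all users and compare how their conversations change.
print(assign_variant("user-12345", "hide-photos", ["control", "no_photos"]))
```

Because assignment like this is cheap, deterministic, and invisible to the user, a site can run hundreds of such tests side by side, which is exactly the state of affairs Rudder describes.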

Rudder’s announcement detailed a number of the fascinating ways that OkCupid “plays” with its users’ information. From removing text and photos from people’s profiles to duping mismatches into thinking they’re excellent matches for one another, OkCupid has tried a lot of different methods to help users find love. Curiously, my gut reaction to this news was that it was much less problematic than the similar sorts of tests being run by Facebook – and basically everyone else involved in the Internet ecosystem.

After all, OkCupid is quite literally playing Cupid. Playing God. There’s an expectation that there’s some magic to romance, even if it’s been reduced to numbers. Plus, there’s the hope that these experiments are designed to better connect users with eligible dates, while most website experiments aim only to improve user engagement with the service itself. Perhaps all is fair in love, even if it requires users to divulge some of the most sensitive personal information imaginable.

Whatever the ultimate value of OkCupid’s, or Facebook’s, or really any organization’s user experiments, critics are quick to suggest these studies reveal how much control users have ceded over their personal information. But I think the real issue is broader than any concern over “individual control.” Instead, these studies raise the question of how much technology – fueled by our own data – can shape and mediate interpersonal interactions.

OkCupid’s news immediately brought to mind a talk by Patrick Tucker just last week at the Center for Democracy & Technology’s first “Always On” forum. Tucker, an editor at The Futurist magazine and author of The Naked Future, gave a firestarter talk detailing some of the potential of big data to reshape how we live and interact with each other. At a similar TEDx talk last year, he posited that all of this technology and all of this data can be used to give individuals an unprecedented amount of power. He began by discussing prevailing concerns about targeted marketing: “We’re all going to be faced with much more aggressive and effective mobile advertising,” he conceded, “. . . but what if you answered a push notification on your phone that you have a 60% probability of regretting a purchase you’re about to make – this is the antidote to advertising!”

But he quickly moved beyond this debate. He proposed a hypothetical where individuals could be notified (by push notification, of course) that they were about to alienate their spouse. Data can be used not just to set up dates, but to manage marriages! Improve friendships! For an introvert such as myself, there’s a lot of appeal to these sorts of applications, but I also wonder when all of this information becomes a crutch. As OkCupid explains, when its service tells people they’re a good match, they act as if they are, “[e]ven when they should be wrong for each other.”

Occasionally our reliance on technology not only crosses some illusory creepy line but fundamentally changes our behavior. Last year, at the IAPP’s Navigate conference, I met Lauren McCarthy, an artist and researcher in residence at NYU, who discussed how she used technology to augment her ability to communicate. For example, she demoed a “happy hat” that monitored the muscles in the wearer’s face and delivered a jolt of physical pain if the wearer stopped smiling. She also explained how she used technology and crowdsourcing to make her way through dates. She would secretly videotape her interactions with men in order to provide a livestream for viewers to give her real-time feedback on the situation. “He likes you.” “Lean in.” “Act more aloof,” she’d be told. As part of the experiment, she’d follow whatever directions were being beamed to her.

I asked her later whether she’d ever faced the situation of feeling one thing, e.g., actually liking a guy, and being directed to “go home” by her string-pullers, and she conceded she had. “I wanted to stay true to the experiment,” she said. On the surface, that struck me as ridiculous, but as I reflect on her presentation now, I wonder if she was forecasting our social future.

Echoing OkCupid’s results, McCarthy also discussed a Magic 8 Ball device that a dating pair could figuratively shake to direct their conversation. Smile. Compliment. Laugh. According to McCarthy, people reported that the device had actually “freed” their conversations, helping liberate them from the pro forma routines of dating.

Obviously, we are free to ignore the advice of Magic 8 Balls, just as we can ignore push notifications on our phones. But what if those push notifications work? If the algorithmic special sauce works? If data provides “better dates” and fewer alienated spouses, why wouldn’t we use it? Why wouldn’t we harness it all the time? From one perspective, this is the ultimate form of individual control, where our devices help us tailor our behavior to better accommodate the rest of the world. Where, then, does the data end and the humanity begin? Privacy, as a value system, pushes up against this question, not because it’s about user control but because part of the value of privacy lies in the right to fail, to be able to make mistakes, and to have secret spaces where push notifications cannot intrude. What that space looks like when OkCupid is pulling our heartstrings, however, remains an open question.

Future of Privacy Forum Releases US-EU Safe Harbor Report

Today, some four months after we first announced it, my organization put out our Safe Harbor Report on the effectiveness of the U.S.-EU Safe Harbor in protecting EU citizens’ privacy and promoting trans-Atlantic data transfers.  That’s something of a mouthful, but I’m proud of my contributions to the report, which include the paper’s discussions of enforcement, government access to information (e.g., NSA activity), and some of the recommendations and case studies.  I now know entirely too much about trans-Atlantic data transfers under the program, so here’s hoping the European Union doesn’t go and suspend the Safe Harbor now!

Europe Misdirects Rage on the US Safe Harbor

This morning, the European Commission released its report on the state of the US-EU Safe Harbor, a mechanism that provides for international data transfers, proposing a series of recommendations designed “to restore trust in data flows between the EU and the U.S.”  Europeans have long been critical of the Safe Harbor — and America’s free-wheeling attitude toward privacy in general — but the Summer of Snowden provided a perfect pretext to “reconsider” the efficacy of the Safe Harbor.

America’s hodgepodge or “sectoral” approach to privacy has increasingly placed U.S. officials on the defensive, and there’s no question the Safe Harbor can be improved.  However, conflating Safe Harbor reform with justified anger about expansive NSA snooping is counterproductive.  First and foremost, while public and private data sharing is increasingly intermingled, government access to data is not the same as commercial data use.  The Safe Harbor was explicitly designed to protect the commercial privacy interests of EU citizens.

It was not created to address national security issues, and the Safe Harbor specifically provides an exception from its requirements “to the extent necessary to meet national security, public interest, or law enforcement requirements.”  As FTC Commissioner Julie Brill has noted, national security exceptions to legal regimes are not unusual.  For example, the HIPAA Privacy Rule permits the disclosure of private health information in the interest of national security, and even the EU’s stringent Data Protection Directive includes an exception for state security or defense.


From Collected Criticism to “Slamming” an Attorney General

Last Friday, I helped draft a few thoughts on behalf of the Future of Privacy Forum regarding the New York Attorney General’s efforts to subpoena information from 15,000 Airbnb users in New York City.  We wondered about the breadth of the AG’s request, and suggested only that “wide grabs of consumer data by well-meaning regulators can have a serious impact on consumer privacy.”

Later that day, Kaja Whitehouse of the New York Post declared that FPF had “slammed” the AG, proceeding to pull a line from our “open letter” to suggest FPF was far more critical of the AG than it intended – or certainly than I intended. Another victory for overstrong rhetoric against even-keeled moderation!

Sen. Markey’s Drone Aircraft Privacy and Transparency Act Summarized

On Monday, Sen. Markey introduced legislation designed to expand legal safeguards to protect individual privacy from invasion by commercial and government use of drones. The bill amends the FAA Modernization and Reform Act of 2012, which directed the FAA to integrate unmanned aircraft systems (UAS) into U.S. airspace by October 2015. The law, however, was silent as to the transparency and privacy implications of domestic drone use. Under pressure from advocacy groups and Congress, the FAA solicited public comment about potential privacy and civil liberties issues during its UAS test site selection process, ultimately suggesting only that UAS privacy policies “should be informed by the Fair Information Practice Principles.”

This section-by-section summary looks at how Sen. Markey’s bill would amend current law to establish national guidelines for domestic drone use.

Sec. 1 – Short Title

Drone Aircraft Privacy and Transparency Act of 2013

Sec. 2 – Findings

The bill notes that the FAA projects that 30,000 drones could be in the skies above the United States by 2020, and further, that current law provides no explicit privacy protections or public transparency measures with regard to drone use by public or private entities.

Sec. 3 – Guidance and Limitations for UAS

The major substance of this section details new requirements for data collection statements by commercial drone operators and data minimization statements by law enforcement. The bill’s provisions with regard to law enforcement appear to significantly bolster Fourth Amendment privacy protections. Agencies would be subject to a warrant requirement for any generalized drone surveillance absent exigent circumstances, such as (1) an imminent danger of death or serious injury or (2) a DHS determination that credible intelligence points to a high risk of terrorist attack. Moreover, any information collected that is unrelated to a potential exigency must be destroyed.

While these provide practical, procedural limitations on surveillance, the bill also forces law enforcement to consider how it plans to use drones before deploying them. Law enforcement offices would be required to file a statement explaining any policies adopted to minimize the collection of data unrelated to a warrant requirement, describing how excess data will be destroyed, and detailing any audit or oversight mechanisms. By making licenses contingent on these statements, the bill may encourage careful consideration of privacy challenges before law enforcement begins broad use of drones.

For commercial operators, the bill would prohibit the FAA from issuing licenses without a statement that provides information about who will operate the drone, where the drone will be flown, what data will be collected and how that data will be used, including whether any information will be sold to third parties, the period for which information will be retained, and contact information for receiving complaints. Depending upon how onerous these statement requirements become, this section may present some First Amendment challenges, particularly for efforts to advance newsgathering and the free flow of information.

The FAA would be charged with creating a publicly searchable website that would list all approved drone licenses, including copies of data collection or minimization statements, any data security breaches, and details about the time and location of all drone flights.

This section also calls for the Departments of Homeland Security, Commerce, and Transportation and the FTC to conduct a study to identify any potential challenges presented by drones to the OECD privacy guidelines. It would also require the current UAS rulemaking underway to take those privacy guidelines into consideration.

Sec. 4 – Enforcement

The section provides for concurrent enforcement by state authorities and the Federal Trade Commission under its Section 5 authority. It also allows for a private right of action for violations of either an entity’s data collection or data minimization statement. Remedies include equitable relief, and the greater of actual monetary damages or statutory damages of up to $1,000 for each violation.

Sec. 5 – Model Aircraft Provision

Finally, the bill provides for an exception for model aircraft.

***

Sen. Markey introduced a largely identical version of the Drone Aircraft Privacy and Transparency Act of 2013 earlier this year, while still a member of the House of Representatives, and a similar bill last year as well.

Buying and Selling Privacy Essay Published by Stanford Law Review Online

My essay on how “Big Data” is transforming our notions of individual privacy in unequal ways has been published by the Stanford Law Review Online.  Here’s how they summed up my piece:

We are increasingly dependent upon technologies, which in turn need our personal information in order to function. This reciprocal relationship has made it incredibly difficult for individuals to make informed decisions about what to keep private. Perhaps more important, the privacy considerations at stake will not be the same for everyone: they will vary depending upon one’s socioeconomic status. It is essential for society and particularly policymakers to recognize the different burdens placed on individuals to protect their data.

Buying and Selling Privacy Paper

Judge Alex Kozinski has offered to pay $2,400 a year to protect his privacy. Meanwhile, Federico Zannier started a Kickstarter to “data mine” himself and ended up making $2,700. One’s rich and can pay to protect his privacy; the other’s not and is selling every bit of his info. I’ve posted my paper on this subject to SSRN.
