Heading to Boulder (YES!!!) for the annual all-day Silicon Flatirons privacy conference on Thursday, Dec 4th.
Four panels, inspired by recent controversial research endeavors, findings, and disclosures by private companies like Facebook and Target, will be full of dynamic discussants and presenters. This post is my prep – the best collection of background reading was put together by James Grimmelmann here.
The first panel will debate whether we should be alarmed by the type of human social science currently taking place on online services. Moderated by Paul Ohm, the panel features the two main academic voices in the debate. Zeynep Tufekci has posted one of the most read (by academics at least) responses to the Facebook study, arguing that information asymmetry, ubiquity, opacity, and lack of choice create a dangerous research environment, and suggesting how to frame the issue properly (I also think all the law profs in attendance would find her post on research methods interesting and valuable). Tal Yarkoni's response is presented here (and the disagreement is expanded upon in Zeynep's comments). He argues that nudges are, and have long been, part of the world (subliminal advertising was investigated by the FCC in the 1970s after a widespread freak-out, and it's apparently banned on television in the UK) and that these types of projects are not inherently good or bad – nudges can help us give to charities, vote, and make healthier choices.

I'm not sure what Matthew Boeckman, Vice President of DevOps at Craftsy, is going to say, but Craftsy looks awesome. Kashmir Hill, Senior Online Editor at Forbes (and my favorite privacy writer), will hopefully discuss her thoughts on packaging this research under "improving services." Another Forbes author wrote that you should leave FB because of the emotional manipulation research. Rob Sherman, Deputy Chief Privacy Officer at Facebook, will surely be defending these pursuits at FB and quelling fears by describing internal safeguards. A previous explanation from the researchers, here, ends with: "While we've always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices. The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we've learned from the reaction to this paper." Looking forward to getting the updates.
The second panel (moderated by YT) will focus on the changing nature of science and research, particularly the public versus private divide. Ed Felten will be presenting (I think) on the impact of the gap between industry and research practices – namely, that it will drive a wedge that prevents company researchers from publishing their work and will lead academic researchers to evade the cumbersome IRB process by collaborating with companies. Chris Calabrese, newly minted Senior Policy Director at the CDT, will be responding. It looks like the CDT's take on this is that when user testing rises to the level of changing a user's experience, or engages in analysis of private information that users don't reasonably expect a service provider to examine, users should be actively notified. Long ago (2008), Aaron Burstein, now Attorney Advisor for Commissioner Brill at the FTC, argued that ECPA needed a research exception to permit cybersecurity research and that corporate research should be vetted internally. One of my favorite humans, Jill Dupre, Associate Director of ATLAS in CU Engineering, will offer insight into innovative technology-studies research collaborations and models.
The third panel will be moderated by Harry Surden and consider whether "informed consent" is a viable concept in the big data age. James Grimmelmann told the Washington Post, "What Facebook and OkCupid did wasn't just unethical. It was illegal." I hope that's what he'll be presenting on. In Wired, Michelle Meyer criticized the way the findings were presented, explaining that they overstate what the researchers could have learned from the study: "The fact that someone exposed to positive words very slightly increased the amount of positive words that she then used in her Facebook posts does not necessarily mean that this change in her News Feed content caused any change in her mood." Michelle will also be presenting – bringing in unique expertise as a bioethicist. Personally, I want to know whether not being a FB user is even an effective form of denying consent – just because I've never had a FB account doesn't mean I haven't been affected by the company's practices. Discussants include Claire Dunne, the IRB Program Director at CU, and Janice Tsai, Global Privacy Manager at Microsoft Research.
The fourth panel will be moderated by my favorite Colorado attorney, Nicole Day, and look at institutions for ethical review, with a particular focus on the history and current status of IRBs. If consent is out, then IRBs all over the place make sense – provided they can do any better than individuals at assessing possible harms and outcomes. Omer Tene will be presenting (I think) on corporate IRB-like processes that are already in place and will spread, as well as what those processes should look like. Attorneys like panelist Jason Haislmaier (who is also a CU Law adjunct) are likely the people companies do and will consult on these issues. Ryan Calo, who was writing about this exact subject at least a year ago, provided the NYTimes the following:
“This is a company whose lifeblood is consumer data. So mistrust by the public, were it to reach too critical a point, would pose an existential threat to the company,” said Ryan Calo, an assistant professor at the University of Washington School of Law, who had urged Facebook to create a review panel for research. “Facebook needs to reassure its users they can trust them.”
Also contributing to this last panel will be FTC Commissioner Julie Brill, who is also doing a chat with Paul between the third and fourth panels.