Joseph Jerome on FTC IoT Privacy Report
Listen Live Today at 10AM on WebmasterRadio.fm.
FTC Report on the IoT
The Internet of Things (“IoT”) refers to the ability of everyday objects to connect to the Internet and to send and receive data. It includes, for example, Internet-connected cameras that allow you to post pictures online with a single click; home automation systems that turn on your front porch light when you leave work; and bracelets that share with your friends how far you have biked or run during the day. Six years ago, for the first time, the number of “things” connected to the Internet surpassed the number of people. Yet we are still at the beginning of this technology trend. Experts estimate that, as of this year, there will be 25 billion connected devices, and by 2020, 50 billion.
Last year, the Federal Trade Commission held a workshop on the privacy implications of the Internet of Things and has just released a report of its findings. The report included recommendations in the following areas:
- Security – including security by design (building security into devices at the outset and not as an afterthought);
- Data Minimization – retaining information only as long as needed; and
- Notice and Choice for Consumers.
The Report, which appropriately does not call for IoT-specific legislation, reflects the fact that the Internet of Things is in its infancy and strongly supports context as a way to assess appropriate uses. The staff recognized concerns that a notice-and-choice approach could restrict unexpected new uses of data with potential societal benefits. They sensibly incorporated certain elements of the use-based model into their approach, keying consumer choices to context so that they take into account how the data will be used.
However, the report is overly cautious: it recognizes that there are beneficial uses that can strain data minimization or warrant out-of-context uses, yet worries that, absent legislation, allowing companies alone to judge the bounds of such uses would lead to mistakes. In many cases, the FTC already has the ability to use its deception or unfairness authority to take action when a company creates risk to consumers without countervailing benefit. We hope the Administration’s soon-to-be-released Consumer Bill of Rights charts options that can frame the parameters for out-of-context uses and data retention by looking to codes of conduct and consumer subject review boards.
The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. The forum is led by Internet privacy experts Jules Polonetsky and Christopher Wolf and includes an advisory board comprised of leading figures from industry, academia, law, and advocacy groups.
Joseph Jerome serves as Policy Counsel at the Future of Privacy Forum. At FPF, Joseph’s issue portfolio focuses on big data and the Internet of Things, where he works on educational privacy questions, smart technologies, and ethical uses of information. He also assists TeachPrivacy, a company founded by Professor Dan Solove that provides privacy and data security training programs.
Prior to working at FPF, Joseph served as a national law fellow at the American Constitution Society, where he organized events discussing human rights in the United States and a ten-year retrospective on the Department of Homeland Security. He is a graduate of New York University School of Law, where he was an International Law and Human Rights Student Fellow in 2010, and of Boston University. He is a native of Davenport, Iowa.
In a recent Indiana Law Journal article entitled “Big Data: Catalyst for a Privacy Conversation,” he notes:
The big data privacy bogeyman will only be excised through a combination of accountability, transparency, and ultimately, public debate. Yet this is bigger than a mere privacy conversation. The fundamental problem posed by big data may be less a question of how it impacts our privacy and more that it upsets our society’s sense of fairness. The debate around big data is often couched as something that implicates traditional privacy principles and that the uses and inferences drawn from our data invade our privacy, but this obscures the larger public policy challenge. We are increasingly threatened by abstract or inchoate risks to our autonomy and the state of our society, and no one has established the necessary trust to lead the way forward.