2017 Assembly Projects
THE 2017 ASSEMBLY COHORT came together around the challenge of digital security: how do we move beyond a world where virtually every computing device and network is insecure?
The four-month program opened with a rigorous two-week seminar class, Internet & Society: The Technologies and Politics of Control, co-taught by Berkman Klein co-founder Jonathan Zittrain and MIT Media Lab Director Joi Ito.
For the rest of the program, the cohort members developed and iterated on projects that responded to the original prompt. Below, we are thrilled to showcase the hard work the Assemblers produced: Clean Insights, 3D Steganography, Information Fiduciaries and Data Transparency, Sherlock, and Undershare.
Clean Insights
Data collection has become the default for most companies. Unfortunately, the same data that keeps a company running can carry harmful side effects: security expert Bruce Schneier calls data a "toxic asset". The most privacy-conscious companies have proposed a simple solution to the problem: stop collecting data altogether. But abandoning data collection often means companies are "flying blind", an unacceptable trade-off for most CEOs. So, how do companies strike the right balance between preserving user privacy and driving successful product development?
Clean Insights is a privacy-oriented analytics tool that enables measurement of digital interactions in a safe, secure, and sustainable way. With Clean Insights, developers, data scientists, and others can extract value from data while preserving user privacy by discarding the "toxic" by-products typically generated by commercial analytics platforms. The project's open-source Android SDK leverages techniques such as differential privacy, onion routing, and certificate pinning to facilitate this new level of control. Since its development, Clean Insights has been deployed in several applications.
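The Clean Insights SDK itself targets Android; the short Python sketch below is only an illustration of one of the techniques named above, differential privacy, showing how an analytics client might add calibrated noise to an event count before reporting it. The function name, parameters, and example event are assumptions for illustration, not the Clean Insights API.

```python
import random

def privatize_count(true_count: int, epsilon: float = 0.5) -> float:
    """Return a differentially private version of an event count.

    Uses the Laplace mechanism: for a counting query with sensitivity 1
    (adding or removing one event changes the count by at most 1), noise
    drawn from Laplace(0, 1/epsilon) gives epsilon-differential privacy.
    """
    # The difference of two iid Exponential(rate=epsilon) draws is
    # Laplace-distributed with scale 1/epsilon.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Example: report a noisy count of "settings screen opened" events
# instead of the exact figure.
print(round(privatize_count(true_count=128, epsilon=0.5), 1))
```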
3D Steganography
Digital fabrication enables distributed manufacturing, which in turn gives rise to distributed supply chains. Manufacturers can source parts with greater speed and flexibility than ever before. Unfortunately, this new paradigm has problematic implications for traditional quality control protocols. How can manufacturers verify and trust distributed production? For example, how can parts be traced back to their origins when they fail in service?
To facilitate trustworthy distributed manufacturing, 3D Steganography proposes incorporating information about an object's manufacturing process into the object itself. 3D-printing steganography embeds hidden messages in digitally fabricated objects by making machine-resolution perturbations to toolpaths that encode information; the hidden message can then be read back by scanning the printed object. If widely deployed, "fingerprinting" parts in this manner could bring a greater degree of accountability and trust to distributed supply chains, allowing manufacturers to quickly identify the source of faulty parts and adjust accordingly. Messages could be encoded in hexadecimal, binary, or any other digitally readable scheme and protected via encryption.
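To make the encoding step concrete, here is a minimal Python sketch, not the project's implementation, of how a short message might be mapped to machine-resolution toolpath offsets and read back by comparing a scanned path against the nominal one. The offset magnitude, the bit-to-offset mapping, and the stand-in toolpath are illustrative assumptions.

```python
# Illustrative only: maps message bits to tiny toolpath offsets and back.
# A real system would perturb actual toolpath coordinates and recover the
# bits from a 3D scan; here the "toolpath" is just a list of nominal values.

MACHINE_RESOLUTION_MM = 0.05  # assumed printer resolution

def message_to_bits(message: str) -> list[int]:
    return [int(b) for ch in message.encode("utf-8") for b in f"{ch:08b}"]

def bits_to_message(bits: list[int]) -> str:
    chars = [int("".join(map(str, bits[i:i + 8])), 2) for i in range(0, len(bits), 8)]
    return bytes(chars).decode("utf-8")

def embed(toolpath: list[float], message: str) -> list[float]:
    """Perturb one coordinate per bit: +resolution for a 1, -resolution for a 0."""
    bits = message_to_bits(message)
    if len(bits) > len(toolpath):
        raise ValueError("toolpath too short to carry the message")
    out = list(toolpath)
    for i, bit in enumerate(bits):
        out[i] += MACHINE_RESOLUTION_MM if bit else -MACHINE_RESOLUTION_MM
    return out

def extract(scanned: list[float], nominal: list[float], n_bits: int) -> str:
    """Recover bits by comparing the scanned path against the nominal one."""
    bits = [1 if scanned[i] > nominal[i] else 0 for i in range(n_bits)]
    return bits_to_message(bits)

nominal = [float(i) for i in range(64)]   # stand-in for real coordinates
stamped = embed(nominal, "LOT42")         # 5 bytes -> 40 bits
print(extract(stamped, nominal, 40))      # -> "LOT42"
```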
Information Fiduciaries and Data Transparency
"Should we treat certain online businesses, because of their importance to people's lives, and the degree of trust and confidence that people inevitably must place in these businesses, in the same way [we treat doctors, lawyers, and accountants]?" (Jack Balkin, Yale Law School]
In an environment where consumers are increasingly concerned about the privacy and security of their personal data, corporations have an opportunity to proactively reassure current and potential customers of their commitment to protecting user data. To that end, the Information Fiduciary project aims to create a forum in which actors across the private sector and academia can work collaboratively toward an Information Fiduciary Consortium (IFC), a joint effort that advocates a tier-based approach and seeks a balance between corporate and public interests. What distinguishes the consortium from other security compliance regimes is its commitment to providing value directly to consumers in a transparent and intuitive manner, building trust between companies and their customers. Member companies must commit to fair security and privacy practices around the collection, analysis, use, disclosure, and sale of user data. Ultimately, the IFC would advocate for additional legal and financial incentives for compliant member companies.
Data Transparency is an in-progress visualization tool intended to complement the ideas and initiatives of the IFC. Fully developed, the tool would enable companies to update a decentralized database to document their collection and use of consumer data. Consumers could then use the tool to monitor when, how, and why their data is being accessed and disseminated. A demo of the tool is available here along with more information about the Information Fiduciary project.
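As a rough illustration of the kind of record such a log might hold, the Python sketch below shows a hypothetical access record that a company could append and a consumer-facing query over it. The field names and the in-memory list standing in for the decentralized database are assumptions, not the tool's actual schema.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class AccessRecord:
    company: str    # who accessed the data
    user_id: str    # whose data was accessed
    category: str   # e.g. "location", "purchase history"
    purpose: str    # why: use, disclosure, sale, ...
    timestamp: str  # when

# In-memory stand-in for the envisioned decentralized database.
log: list[AccessRecord] = []

def record_access(company: str, user_id: str, category: str, purpose: str) -> None:
    log.append(AccessRecord(company, user_id, category, purpose,
                            datetime.now(timezone.utc).isoformat()))

def accesses_for_user(user_id: str) -> list[dict]:
    """What a consumer-facing view might show: every touch of their data."""
    return [asdict(r) for r in log if r.user_id == user_id]

record_access("ExampleCo", "user-123", "location", "ad targeting")
record_access("ExampleCo", "user-123", "purchase history", "recommendations")
print(accesses_for_user("user-123"))
```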
Sherlock
Undershare
There is currently no mechanism that lets a non-technical user easily share a dataset's non-sensitive attributes, which would make the dataset discoverable without compromising data protections. For the average user, sharing data thus remains a binary choice: share everything or share nothing. As a result, dataset owners and the qualified data users with whom they might want to share are often unable to find one another, and data discovery relies on personal contacts and relationships that do not scale.
Undershare allows users to share and discover datasets without exposing their sensitive contents, escaping this intractable "all or nothing" paradigm. Data owners can expose a limited set of dataset properties (e.g., row headers or summary statistics) behind a "discovery" API. Those in search of data can query this API, follow up to negotiate deeper access, and exchange reputation-building feedback scores. This flexibility and control for data owners have obvious applications in enterprise, academia, and healthcare, among other sectors.
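As a sketch of what a "discovery" payload might contain, the Python example below derives only column names and coarse summary statistics from a tabular dataset, the sort of metadata an owner could safely serve behind such an API. The function name and field choices are illustrative assumptions, not Undershare's interface.

```python
import statistics

def discovery_metadata(name: str, columns: dict[str, list]) -> dict:
    """Build a shareable description of a tabular dataset without exposing its rows."""
    summary = {}
    for col, values in columns.items():
        if all(isinstance(v, (int, float)) for v in values):
            summary[col] = {
                "type": "numeric",
                "count": len(values),
                "mean": round(statistics.mean(values), 2),
                "min": min(values),
                "max": max(values),
            }
        else:
            summary[col] = {"type": "categorical", "count": len(values),
                            "distinct": len(set(values))}
    return {"dataset": name, "columns": summary}

# A data seeker would query an endpoint returning this structure, then
# negotiate deeper access with the owner out of band.
dataset = {"age": [34, 29, 51, 43], "diagnosis": ["A", "B", "A", "C"]}
print(discovery_metadata("clinic-visits-2017", dataset))
```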
WRITING FROM THE INAUGURAL COHORT
Assembly is a meta-project, a grand experiment to bridge industry and academia. What are its strengths? What are its weaknesses? How can we make Assembly even better next year?
COMING IN FROM THE COLD: A SAFE HARBOR FROM THE CFAA AND THE DMCA §1201 FOR SECURITY RESEARCHERS
Summary of the Piece
This paper addresses the chilling effects that overbroad computer crime laws, namely the Computer Fraud and Abuse Act and §1201 of the Digital Millennium Copyright Act, have on security research. To protect researchers from vague interpretations of these laws, the paper proposes a safe harbor for researchers who follow a particular implementation of responsible disclosure. After the researcher discloses a vulnerability to the relevant vendor, the two parties engage in a specified vulnerability classification protocol; based on the outcome, the vulnerability is assigned a remediation timeline, and once that timeline has elapsed, the researcher can publish the details without risk of legal consequences.
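As a rough sketch of the timeline logic the proposal describes, the Python snippet below models how a disclosure date plus a classification-dependent remediation window would determine the earliest date a researcher could publish inside the safe harbor. The classification labels and window lengths are placeholder values, not figures from the paper.

```python
from datetime import date, timedelta

# Assumed remediation windows per classification outcome (illustrative only).
REMEDIATION_DAYS = {"critical": 90, "moderate": 60, "low": 30}

def publication_date(disclosed_on: date, classification: str) -> date:
    """Earliest date the researcher could publish under the proposed safe harbor."""
    return disclosed_on + timedelta(days=REMEDIATION_DAYS[classification])

def within_safe_harbor(disclosed_on: date, classification: str, publish_on: date) -> bool:
    return publish_on >= publication_date(disclosed_on, classification)

print(publication_date(date(2017, 3, 1), "critical"))                       # 2017-05-30
print(within_safe_harbor(date(2017, 3, 1), "critical", date(2017, 5, 1)))   # False
```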
Motivation and Process
This paper, co-written by a cryptography Ph.D. candidate in the Assembly cohort and a student at Harvard Law School, arose out of conversations during the Internet & Society class about how security researchers often worry about the specter of lawsuits without a clear picture of the actual legal landscape. The interdisciplinary nature of the paper made it especially interesting to pursue, pushing each author to think in a manner different from that demanded by their usual specialty. The law student learned how to use protocol diagrams to make better arguments, and the cryptographer was pushed to consider the practical (and human) realities of lawmaking when designing solutions.