The workshop on Uses and Misuses of Connected Devices took place on 3rd–4th April 2019 at The Alan Turing Institute, London. A detailed agenda for the workshop is available.
Within the context of the Internet of Things (IoT), emerging technologies based on networks of sensors bring both opportunities and threats. New use cases are being explored across the public and commercial realms, but new risks of privacy invasion and other individual and social harms are also being created and debated. Connected devices are being deployed in and across a rapidly increasing range of ‘smart’ contexts, including the physical body, the home, the office, service infrastructures, transport, urban spaces, farms, forests, rivers and seas. Traditional notions of public and private space are disrupted. Data, including personal data, from these contexts are increasingly important for commerce, public services and law enforcement. Regulatory recognition of ‘privacy (or data protection) by design’ points towards making IoT compatible with ethical and legal principles and norms, and has implications for data science.
We believe that a combination of perspectives and disciplines is required to address key questions:
- What new challenges do these developments pose for policy-related issues such as governance, ethics, privacy and other rights and freedoms, accountability, trust and transparency?
- How does the increasing availability of low-cost sensors affect the relationship between citizen and state in determining what is monitored and how the data is evaluated and used?
- What new possibilities and limitations are there for engaging the public as agents, subjects and critics of IoT?
- What light do ubiquitous sensing devices cast on the relationship between humans and material objects, on conceptions of space, on the self, and on autonomy?
The workshop was organised around four themes:
1. Capturing Data in Public and Private Spaces
The proliferation of networked devices across both public and private spaces has triggered concerns about surveillance and loss of privacy. Monitoring devices in public spaces, such as CCTV cameras, have usually been justified in terms of security and crime reduction, but ‘people tracking’ is increasingly a priority for city-centre retailers and transport operators. Often it is hard to disentangle whether ‘smart city’ data capture is being carried out for public good or private profit. At the same time, a whole range of new techniques for tracking urban movement and attention are being deployed, such as signal strength from smartphones, cameras hidden in advertising panels, and intelligent street lights as sensor platforms.
While we have relatively little control over the deployment of tracking devices in public spaces, we are also inviting monitoring devices into our homes, and even our bedrooms, in the form of smart heating controls, voice-controlled assistants such as Amazon Alexa and Google Home, and internet-connected toys. Are normative boundaries between the public and private undergoing a fundamental change, or are we about to experience a move in which the thresholds of privacy are more firmly established?
2. Human Interfaces for IoT Systems
While some connected devices are intended to be undetectable in normal life, others are designed with human interaction in mind. Smartphones are the most obvious example, but there is a range of other devices, particularly in the realm of home automation, which serve both as sensors and actuators and are often underpinned by data processing in the cloud. We are becoming familiar with voice-controlled TVs and smart speakers but even garden parasols can now be opened via Amazon Alexa.
There is a paradoxical aspect to the interfaces that such devices expose. On the one hand, they are carefully designed to provide appropriate contextual feedback as part of the interaction; on the other hand, they typically conceal the data that is flowing out of the local context into a remote server. It could be argued that this duality is essential for usability. Yet it can also undermine trust and be viewed as deliberately misleading the user about the extent to which private data is being shared with unknown third parties. Can the interfaces of interactive devices become more transparent about how data is collected and processed?
3. Legibility of IoT-generated Data
We borrow the notion of data legibility from the core themes of Human-Data Interaction (HDI), where it is characterised as “making data and analytics algorithms both transparent and comprehensible to the people the data and processing concerns”. In particular, we are interested in how data that has already been collected about people can be made accessible and interpretable to them (as opposed to making the data collection process itself more transparent). There are a variety of use cases to consider, such as: the routes and modes of travel that I use in my daily life; my Quantified Self data collection and use; the patterns of electricity consumption by appliances in my home; the length of time I spend watching different genres of streaming content. Of course, simply giving people access to datasets does not mean that they will find them useful; nevertheless, there is considerable scope for developing techniques to make the experience of ‘reading’ data relevant and engaging.
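As a toy illustration of the electricity-consumption use case above, the sketch below turns raw per-appliance readings into a short ranked summary that a householder could actually read. The reading format, appliance names and the `daily_summary` function are invented for illustration and do not correspond to any particular smart-metering system.

```python
from collections import defaultdict

def daily_summary(readings):
    """Turn raw (appliance, day, kwh) readings into a short,
    human-readable account of where the energy went."""
    totals = defaultdict(float)
    for appliance, day, kwh in readings:
        totals[appliance] += kwh
    grand_total = sum(totals.values())
    lines = [f"Total: {grand_total:.1f} kWh"]
    # Rank appliances so the biggest consumers are read first.
    for appliance, kwh in sorted(totals.items(), key=lambda kv: -kv[1]):
        share = 100 * kwh / grand_total if grand_total else 0
        lines.append(f"  {appliance}: {kwh:.1f} kWh ({share:.0f}%)")
    return "\n".join(lines)

# Hypothetical readings for two days of heating and kettle use.
readings = [("heating", "2019-04-03", 6.2),
            ("kettle", "2019-04-03", 0.4),
            ("heating", "2019-04-04", 5.8)]
print(daily_summary(readings))
```

The point is not the aggregation itself but the translation step: the same totals presented as ranked shares of a whole are far more legible than a table of half-hourly values.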
4. Data Protection by Design for Connected Devices
With the enactment of the GDPR, ‘data protection by design and by default’ (DPbDD) has become a legal obligation for the manufacture and deployment of connected devices. Particularly problematic is the notion of informed consent, since this raises questions of the sort mentioned in the preceding paragraphs and is further complicated when data flows across the boundaries of multiple processing systems and organisations.
A key issue is how DPbDD relates to the processes, stages and roles involved in the making and deployment of devices. Who should be responsible for ‘designing privacy (or data protection) in’? To what extent do innovation processes, and training for enacting them, prepare actors for the new obligation, and what changes are required in the education of designers and producers to enable them to fulfil it?
One challenging design factor lies in possible disparities between what a sensor is in principle capable of observing and what it is constrained (by hardware or software) to measure in a particular instance. For example, a microphone may be capable of recording all sounds in the human-audible spectrum but be configured so that it only captures noise levels in decibels. In other words, the device would be designed to be incapable of collecting personal data; nevertheless, it would be difficult for an observer to verify that this was the case. Conversely, a relatively dumb device could be supplemented with other data sources and machine learning to infer much more information about individuals than appears on the surface.
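The microphone example above can be sketched as a minimal data-minimisation step: raw audio samples never leave the function, and only an aggregate noise level in decibels is emitted. The function name, full-scale reference value and dBFS convention are illustrative assumptions, not drawn from any specific device.

```python
import math

def noise_level_db(samples, full_scale=32768.0):
    """Reduce a raw audio buffer to a single noise level in dBFS.

    The raw samples are discarded after aggregation, so no speech
    content leaves this function -- only a scalar level does.
    """
    if not samples:
        return -math.inf
    # Root-mean-square amplitude of the buffer.
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    if rms == 0:
        return -math.inf
    # Level relative to full scale (0 dBFS = maximum amplitude).
    return 20 * math.log10(rms / full_scale)
```

As the paragraph notes, the hard part is not writing such a constraint but letting an outside observer verify that only this reduced value, and never the raw buffer, is ever transmitted.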
The workshop included the following talks:

|Speaker|Talk|
|---|---|
|Gilad Rosner (IoT Privacy Forum, Spain)|Permission / Permissioning / Permissionless: Three Faces of IoT Evolution|
|Alison Powell (LSE, UK)|Doing, Postponing and Evading Ethics: the politics and economics of ethics in IoT startups|
|Lachlan Urquhart (Edinburgh Law School, UK)|Designing and Regulating Smart Buildings|
|Alexandra Dechamps-Sonsino (Designswarm, UK)|Better IoT: It’s not just about the data|
|Phillip Stanley-Marbell (University of Cambridge, UK)|Hardware Privacy Guards for Integrated Sensor Systems|
|Grace Annan-Callcott and Cath Richardson (Projects by If, UK)|Connected devices and designing for safety|
|Adriana Lukas (London Quantified Self, UK)|Does it make sense to talk about personal data when individuals don’t have a way of managing them?|
|Richard Mortier (University of Cambridge, UK)|On the Edge of Human-Data Interaction with the Databox|
|Paul Comerford (ICO)|IoT Regulation: Security and Privacy|
|Linnet Taylor (Tilburg University, NL)|Global Data Justice Project|
|Ewa Luger (University of Edinburgh, UK)|Human Data Interaction Network Plus|
|Carsten Maple (University of Warwick, UK)|PETRAS IoT Research Hub|
|Paul Coulton (Lancaster University, UK)|Prototyping Alternate Presents and Plausible Futures for IoT using Design Fiction|
|Ewa Luger (University of Edinburgh, UK)|Ethical Systems by Design?|
|Antti Silvast (Durham University, UK)|Who ‘Uses’ Smart Grids?|