This post, on Part 3 of the Digital Innovation chapter, ran long, so I’m cutting it into two posts.
Part 3: Creating a Trusted Process for Responsible Data Use (pp. 414-441)
a third core condition of digital innovation is instilling community trust that information collected in cities will preserve the privacy of individuals and be used for the greater good — while promoting the growth of new businesses and the rise of new tools to improve urban life. (p. 414)
In which Sidewalk Labs makes the case for a category called “urban data”
I tried going through these sections in chronological order, but they’re such a mess that I’m just going to summarize their points. See Dr. Natasha Tusikov’s comprehensive discussion of urban data and the Urban Data Trust for the final word on both.
This part of the report has two objectives: To describe and legitimate Sidewalk Labs’, um, novel data classifications.
“Some private spaces”
Sidewalk Labs sets the table with a couple of odd turns of phrase, such as:
Torontonians are also concerned about the collection and use of data gathered in the city’s public realm, publicly accessible spaces, and even some private spaces — whether or not that data identifies specific individuals. (p. 416)
What odd phrasing: “even some private spaces.” Are there other private spaces where people weren’t concerned about data collection? I would imagine that by definition people would be most concerned about data collection from private spaces, full stop. But, as we’ll see in a moment, this phrasing is key to setting the groundwork for the overall data-collection plan.
Turns out there are a couple of private spaces at play, such as “Private spaces accessible to the public, such as building lobbies, courtyards, ground-floor markets, and retail stores” (Digital Governance Proposals for DSAP Consultation, p. 13). So don’t set up shop in Quayside if you don’t want your customers, and yourself, being monitored.
Oh, and private spaces can also refer to “Private spaces not controlled by those who occupy them (e.g. apartment tenants)” (Digital Governance Proposals for DSAP Consultation, p. 13), which has a real creepy ring to it, especially when it comes to surveillance. Just imagine your access to affordable housing being contingent on agreeing to have Sidewalk Labs and who knows what other companies watching your every move. That type of deal would come uncomfortably close to coercion.
Which brings us to the second odd phrase in this section:
A second big theme heard during public consultation was that, in addition to personal and collective privacy, Torontonians are concerned with the ownership and stewardship of urban data. (p. 418)
Here, the odd phrase is “urban data,” a term that Sidewalk Labs invented and that only really hit anybody’s consciousness in October 2018 when Sidewalk Labs released its “Digital Governance Proposals for DSAP Consultation” document. Who was calling for this protection of “urban data,” and when? I don’t know.
What I do know is that from the very beginning Torontonians have expressed very strong concerns about the protection of personal data. That’s always what this has been about. Insisting that Torontonians were “concerned with the ownership and stewardship of urban data” smacks of Sidewalk Labs trying to fix the narrative to fit its interests.
Because here’s the thing: The protections Sidewalk Labs proposes for “urban data” only muddy the waters around this issue, and would effectively reduce the degree of individual consent sought.
What is urban data?
Sidewalk Labs does not make it easy for people to figure out how any of these terms work, because they present everything in tidbits and out of order. So, let’s try to make some sense of it, based only on this document.
Sidewalk Labs does not distinguish between public and private data, or personal and non-personal data. Rather, they distinguish between “transaction data” and “urban data.”
Urban data includes personal and non-personal data, aggregated data and de-identified data (although research increasingly demonstrates that no data can be permanently de-identified) (p. 417). It is data that is collected from the following spaces:
- Public spaces, such as streets, squares, plazas, parks, and open spaces
- Private spaces accessible to the public, such as building lobbies, courtyards, ground-floor markets, and retail stores
- Private spaces not controlled by those who occupy them (e.g. apartment tenants) (Digital Governance Proposals for DSAP Consultation, p. 13; creepiness quotient for the third category having been duly noted).
Note that the only spaces not covered by urban data are private spaces controlled by those who occupy them.
Transaction data, meanwhile, covers data “in which individuals affirmatively — albeit with varying levels of understanding — provide information about themselves through websites, mobile phones, or paper documents” (p. 415). Such data could be generated in public or private spaces.
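To make the taxonomy concrete, here’s a minimal sketch of how the urban/transaction split appears to work, as I read the plan. The class and category names are my own, not Sidewalk Labs’:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Space(Enum):
    """Collection contexts, paraphrasing the plan's categories (my naming)."""
    PUBLIC = auto()                       # streets, squares, plazas, parks
    PUBLICLY_ACCESSIBLE_PRIVATE = auto()  # lobbies, courtyards, retail stores
    OCCUPANT_UNCONTROLLED_PRIVATE = auto()  # e.g. rented apartments
    OCCUPANT_CONTROLLED_PRIVATE = auto()    # e.g. owner-occupied homes

@dataclass
class Record:
    space: Space
    affirmatively_provided: bool  # did the individual actively hand it over?

def classify(record: Record) -> str:
    """Sort a record into Sidewalk Labs' two buckets, as I read the plan."""
    # "Transaction data": affirmatively provided via websites, phones,
    # or paper documents, regardless of where that happens (p. 415).
    if record.affirmatively_provided:
        return "transaction data"  # outside the Urban Data Trust's remit
    # "Urban data": everything else collected in public, publicly
    # accessible, or occupant-uncontrolled private spaces (p. 417).
    if record.space is not Space.OCCUPANT_CONTROLLED_PRIVATE:
        return "urban data"        # the Urban Data Trust would govern this
    return "unaddressed"           # the plan doesn't say

# A sensor counting passers-by in a plaza:
print(classify(Record(Space.PUBLIC, affirmatively_provided=False)))
# → urban data
# A form filled out on a website, even from home:
print(classify(Record(Space.OCCUPANT_CONTROLLED_PRIVATE, affirmatively_provided=True)))
# → transaction data
```

Note what the sketch makes visible: the split turns entirely on where and how data is collected, not on whether it identifies a person. Personal and non-personal data land in the same bucket.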
Sidewalk Labs argues against the Urban Data Trust regulating “transaction data” because “the data collector is already accountable under applicable privacy laws …”
Then again, so is some personal “urban data,” to use Sidewalk Labs’ terminology.
Also, such data “arguably is not uniquely connected to public spaces, nor is it generally considered a public asset requiring additional protections within the public interest.” (p. 426)
The question here is whether the location of data collection is really the most relevant characteristic of data. Also, urban data, by Sidewalk Labs’ own definitions, covers data collected in both semi-private and private spaces. Sidewalk Labs’ argument makes little sense to me.
This redefinition of personal and non-personal data, or public and private data, into urban and transaction data, accomplishes two important objectives.
First (assuming it’s legal under Canadian privacy law), it gets around the need for explicit individual consent from people moving through the smart city (in public, semi-private, and some private places) for the collection of personal data, as opposed to non-personal data like a count of people passing over a bridge. I’m not a lawyer, and this document is anything but clear, so I would welcome any clarifications.
This move seems to be essential to the functioning of this type of smart city, which depends on ubiquitous surveillance and data collection.
This mass consent would instead be provided by a central agency, in this case an Urban Data Trust. It would also probably depend on claims that this data can be de-identified, although as I’ve previously noted, it’s increasingly clear that de-identification is not a one-way street.
Although it only hints at it in this document (a single reference to “signage” on page 456, as far as I can tell), in an April 19 Medium post Sidewalk Labs suggests that this “community consent” (Digital Governance Proposals for DSAP Consultation, p. 38) could be obtained by placing a variety of signs in the area to let people know how they are being monitored and for what purpose. Otherwise, this 1,500-page “plan” makes no mention of how such consent could be obtained, sticking instead to musings about “being transparent” and “providing clarity” (p. 457). Neither of these goals necessarily has anything to do with consent.
Second, this urban-transaction distinction would leave “transaction data” wholly unregulated by the proposed data authority. Had Sidewalk Labs concentrated on protecting personal data, they could have proposed a truly gold-standard policy that did just that. Instead, they’ve left the door open for mass individual surveillance while ensuring that Google’s personal-data pipeline stays wide open.
To be continued….