Surveillance City

Cameras outside your front door. Subways asking you to tap your card. Public Wi-Fi asking you to accept its cookies. More and more, it’s becoming harder to live in the city without your data being collected. In conversation with Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project, Pratt Center’s Scholar-in-Residence Lisa Berglund explores how to make sense of living in a Surveillance City.

In Conversation: Lisa Berglund and Albert Fox Cahn

Lisa Berglund: As planners, we love data. We love information. And we have this growing interest in new ways to collect data—whether it be through sensors, apps, or data logs. But maybe what planners aren’t as good at is understanding the legal implications, and the potential threats to everyday people, when we collect data in this way.

Today, we are joined by attorney Albert Fox Cahn, the executive director of the Surveillance Technology Oversight Project, to help us parse these legal and ethical concerns. The Surveillance Technology Oversight Project (or S.T.O.P.) is a New York-based anti-surveillance group that pushes back on state and local surveillance, with an emphasis on the NYPD. S.T.O.P. also looks at other government agencies and private companies that are increasingly transforming local communities here in New York, and by extension around the world, with these surveillance technologies.

Albert, do you want to begin by telling us how you got into this work?

Albert Fox Cahn: I’m a born and raised New Yorker, and basically I got into this because it was the intersection of all of these different interests I had. I grew up as an activist being targeted by police surveillance, but I also grew up building computers in my spare time.

And as far as the planning side of this: my uncle’s an art historian who writes a lot about urban planning, so that’s always been something I’ve thought about as part of surveillance as well. It taught me that surveillance isn’t simply this facet of government power. It’s not simply a corporate tool. It is part of how we design modern cities and decide who is welcome, who is included, and who is constantly put under threat.

LB: Imagine you, Albert Fox Cahn, are going from home to your office. Can you walk us through what forms of surveillance and data collection you encounter on that trip?

AFC: Well, it starts the moment I walk out the front door of my apartment, where there’s a camera monitoring the hall. I call for the elevator, I go down, and there’s a log of that trip. I leave the front door and the smart entry system is cataloging that I left. You see biometric systems, like facial recognition, monitoring when we come in and out of these buildings. I walk down the street, and everywhere there are going to be more cameras. Some of those are owned by the NYPD, some by other city agencies, others by private companies that share those camera feeds in real time with the police department. More than 33,000 cameras are shared that way today. And some of them are private cameras, like Ring doorbells and other consumer systems, which are just one court order away from being turned into a policing tool.

I go to the subway, and if I’ve forgotten to reload my MetroCard, I’ll have to use OMNY, which will now track my identity via my credit card and the fact that I’m entering the subway system at that entry point. As soon as I leave the subway, there are going to be more of these cameras along the way. I get to the office building, and more and more office buildings will require ID or require you to tap in when you get to the lobby. Another log of where I come and go. I go to the elevator and it wants me to swipe to show which floors I can go to before I can get to my specific office.

And that’s just the surveillance which is visible to me. The thing that is so terrifying is that for every one of the dozens of surveillance devices that I could see in that morning routine, there are tens—maybe hundreds—of other forms of data collection that are able to track me without my knowledge or consent. Whether it’s apps pulling my location data off my phone, whether it’s third-party biometric tracking using those camera systems, whether it’s other forms of data collection based off of payment patterns and other digital trails.

We’ve created a world where, unless you go to extreme, antisocial lengths to engage only in analog activities (to use only cash, to mask your face at all times... and I’m not talking about an N95 mask, I mean something out of a horror movie) and to hide your identity every moment of every day, we’ve lost the ability to navigate the city unseen and turned what I call the once-open city into newly constrained corridors of control.

LB: And I think there’s a contrast that planners might be interested in here, which is this classic idea of the city as a place you go to be anonymous. It’s completely turned on its head at this point.

AFC: We think about, in 19th century literature in particular, how the city was where you went to escape the prying eyes of small town America. But we have those prying eyes in every corner of our cities now in digital form.

LB: I heard you describe the other day that there’s been a shift in social control between things that are physically manifested—like barriers, bollards, things like that—to something that is maybe more about surveillance technology, something that is everywhere and nowhere at the same time.

I think you really see those shifts in a lot of high-investment areas. You now have private companies playing a bigger role in policing, and these technologies have become common as a way of defending their investments in an area. These kinds of investments in changing neighborhoods are the way that I’ve been thinking about these things.

AFC: Yeah. And I think there’s an interesting case study when you look at bank branches. 

So, there are these two banks I know in Brooklyn Heights. One is this ornate bank that dates back to the late 19th century. It’s imposing: it’s built out of grand columns, thick bars, giant stones, all of these things that use physical force to keep an attacker from accessing the bank’s money. But what we see at the newer bank around the corner is open windows, customer service desks, all of these things that create a much more physically accessible space. And then we see every inch of that space covered by cameras, covered by other forms of surveillance. And so they are both trying to create spaces that say to the person who holds an account there, “You are welcome,” and say to countless other people, “You are not welcome.” But one is doing it through analog protections and one is doing it through digital surveillance.

Map of every NYPD-owned CCTV camera in New York City. Provided by the Ban The Scan Coalition.

LB: I’m wondering if you could explain what you mean when you say that we’re being surveilled. What is the definition of surveillance for you?

AFC: Surveillance really just means any form of tracking. It can mean visual surveillance, like cameras or police on the street. It can mean internet surveillance, where you monitor my internet activity and who I communicate with. It can be behavioral surveillance, where you use artificial intelligence to try to monitor my inner emotional life. Basically, surveillance is this all-encompassing catchall of the age of mass tracking. And what we then focus on are the different constituent types of surveillance, trying to figure out which ones are most harmful and, well, how we can outlaw them.

LB: So a related question is: so what? Why do we care? What’s at stake in being watched? Is it bad that we’re being watched? What would you say to someone who makes the argument I often hear in response to my research: "So what? If you’re not doing anything bad, why do you care?"

AFC: I’ve been there when an FBI SWAT team went to my client’s house, raided it, and put the family through hell because they were considered suspects. I’ve been there when people were held for hours at JFK and other airports by customs officials who were willing to brazenly discriminate against Muslim travelers. There are times when, essentially, people have been targeted as suspected supporters of ISIS because they went overseas to get medical treatment. I’ve seen what the cost can be to people whose lives can be completely upended simply because of what some database got wrong or what some artificial intelligence program misidentified. The thread by which people can be connected to and charged with the most egregious crimes is often so gossamer-thin that we can’t even see it. And yet that’s enough to completely tie up someone’s life in knots.

But even as dangerous as the technology can be when it gets it wrong (and it gets it wrong a lot), and even as dangerous as it is when it blatantly discriminates (which it also does a lot), there’s an existential threat to democracy and an open society even when it gets it right. Because the technology can become a tool to target reproductive health, to target gender-affirming care, to target religious communities, political activism, and more.

This type of mass surveillance is a threat to democracy and the most foundational autonomy upon which any viable political state is created. And so when I think about the threats here, I think it’s creepy, sure. I think a lot of people are making money based off my information, that’s not fun. But the thing that really drives me to do this work is the fear that we are building the technology that will undo our very political system.

LB: From a planning perspective, we, again, love data, and we think, in a more positivist sense, that if we collect relevant data, we can make optimal decisions. Planning trends like smart cities have inspired technologies and research methods in the name of efficiency, even environmental sustainability. We’re now seeing climate change monitoring become part of this surveillance apparatus. So, coming from a profession that just loves its data, there are more questions that need to be asked here. But I think what a legal perspective brings to this is that it’s not just about making optimal decisions from a technical standpoint. Number one, you cannot get perfect data this way. You absolutely can’t. And number two, there’s this question of: just because we can, should we?

AFC: Yeah. I think when it comes to data and decision making, I always ask: who’s getting the data? Who’s making the decision? Because in a lot of smart cities planning, we see consolidated and centralized data collection for some individual or small group of officials to make decisions. Where I think smart cities are at their best is where they’re actually pushing data out and democratizing decision making. One of my favorite smart cities innovations is the little sign by the bus stop that tells me how long I’m going to have to wait for the bus or how long I’m going to have to wait for the train. Now that doesn’t require collecting my individual data. It doesn’t impose a privacy threat. It makes my life appreciably better. And part of the reason is that it allows me to make better decisions as someone navigating the city, rather than taking my data and then holding it hostage for someone else to make a decision about how I should best navigate the world.
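To make that design distinction concrete, here is a minimal sketch in Python of the privacy-preserving pattern Cahn describes. The names (VehiclePing, arrival_estimates) and numbers are invented for illustration; this is not any transit agency’s actual system. The point is that the arrival sign can be computed entirely from the agency’s own vehicle telemetry, so no rider identity, payment, or location data is ever collected:

```python
from dataclasses import dataclass

# A privacy-preserving arrival sign: the only input is the transit
# agency's own vehicle telemetry. Data flows out to the public rather
# than in from individuals. (Hypothetical sketch; names are invented.)

@dataclass
class VehiclePing:
    route_id: str                 # which route this vehicle serves
    stops_away: int               # stops remaining before it reaches this stop
    avg_seconds_per_stop: float   # rolling average from this vehicle's own trip

def arrival_estimates(pings: list[VehiclePing], route_id: str) -> list[int]:
    """Estimated wait times in minutes for one route at this stop."""
    return sorted(
        round(p.stops_away * p.avg_seconds_per_stop / 60)
        for p in pings
        if p.route_id == route_id
    )

# Example: two buses on a route approaching this stop.
pings = [
    VehiclePing("B61", stops_away=3, avg_seconds_per_stop=100.0),
    VehiclePing("B61", stops_away=9, avg_seconds_per_stop=80.0),
]
print(arrival_estimates(pings, "B61"))  # -> [5, 12]
```

The design choice is the point: the sign needs aggregate operational data about vehicles, not any data about the riders reading it.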

LB: Do you think this is just a product of the times right now? We’re in this moment where we think that we need more of these kinds of technologies to make our cities work better. You just listed a couple of examples of when this is and isn’t true. So are these technologies needed to make our cities work better?

AFC: So at S.T.O.P. we developed the Just Cities Technology Principles, in partnership with a large cohort of civil society groups. It’s our framework for smart cities deployment. We look at when the technology is valuable, when it can be safely used, and when it’s actually a threat.

And what we’ve seen historically with the adoption pattern of smart cities tech is that it’s often driven by private companies, not by civic officials or public need. You’ll see tech companies that develop surveillance technology and then reverse engineer a justification for it to the public officials who are their clients. One of the most egregious examples of this was during COVID-19. There was this tech company that developed location-tracking police technology to identify the location of students during a mass shooter incident. This exact same technology was then rebranded as a contact tracing tool, the claim being that if you used these exact same badges, you would be able to contact trace COVID-19 exposures, despite the fact that the technology was actually really poorly designed for that particular purpose. Then we saw them pivot yet again after the pandemic. And that’s just one of countless examples of times when these companies have tried to sell the product along very different lines depending on what’s in vogue, what they think will get the most traction, and really without any of the design principles that would center the public in the question of what technology we should use.

The Just Cities Technology Principles developed by S.T.O.P.:

1. Preventing police co-option: Transit data should be used to meet transit needs only and should be kept safe from the police.
2. Needs first, data second: Cities should collect and store only the data they need to meet identified transit needs.
3. Community engagement: Data should only be collected after consulting the community about its transit needs.
4. Transparency and public oversight: Cities should publicly announce how they collect, use, and protect transit data.
5. Equity: Transit data should be used to make public transit more accessible: for example, to improve access to jobs and to healthcare.
6. Data minimization and privacy-protective data aggregation: Cities should collect only the transit data they need and take steps to prevent the misuse of that data.
7. Weighing opportunity costs: Before purchasing expensive “smart” tech, cities should consider investing those funds in better transit infrastructure, such as road repairs or newer buses.
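Principle 6, data minimization and privacy-protective aggregation, has a simple concrete form. The sketch below is a hypothetical illustration, not S.T.O.P.’s or any agency’s actual method; the threshold K_THRESHOLD and the function publishable_counts are invented for this example. The idea: before publishing station-level ridership counts, withhold any bucket small enough to single out individual riders.

```python
# Hypothetical sketch of privacy-protective aggregation (principle 6):
# publish hourly ridership counts per station, but withhold any count
# below a threshold, since very small buckets can identify individuals.

K_THRESHOLD = 10  # minimum bucket size treated as safe to publish (assumed value)

def publishable_counts(hourly_counts):
    """Return counts with small buckets replaced by None (withheld)."""
    return {
        station: (count if count >= K_THRESHOLD else None)
        for station, count in hourly_counts.items()
    }

raw = {"Borough Hall": 842, "Clark St": 7, "High St": 126}
print(publishable_counts(raw))
# -> {'Borough Hall': 842, 'Clark St': None, 'High St': 126}
```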

LB: I’m wondering how you would like to see planners and lawyers work together to advocate for surveillance transparency in civic life generally?

AFC: When you look at a lot of the surveillance architecture, historically, planners have all too often accepted data collection as a public good or, at a minimum, a neutral act. And I think there needs to be a recognition that data collection can be harmful and that anytime you are collecting data as part of your work, you need to understand the full scope of risk. And I think lawyers can be helpful in working with planners to come up with the policies, the procedures, and oftentimes the new state and local laws that we need to protect the different types of data being collected. For example, it is a choice that, today, all of the automated license plate readers that are used in New York City to monitor traffic and collect toll data are also accessible to the police department.

That is an affirmative choice that we made as a city through our legal system. But we could just as easily have made the choice to say that our people’s information is a protected right. “No, that information should only be used for toll collection. No, that data should not be turned into a weapon for ICE and the NYPD.” Lawyers need to be part of this conversation in thinking about how to turn those design principles about effective, proportional, and protective uses of data collection into an enforceable reality.

LB: I wonder if you can talk a little bit about how legal mechanisms and pathways allow both publicly and privately collected data, say from public spaces, to get into the hands of the NYPD, for example?

AFC: So one of the things I’ve been pushing for a little while now is something I call legal firewalls. These are legal protections that prevent law enforcement access to data that normally they would be able to get. And the classic example of this is census data. Following the internment of Americans of Japanese descent during World War II, where more than 120,000 Americans were jailed for the crime of their ancestry, we had a deep fear of the census because that data was used by the FBI and military officials to imprison Japanese Americans. And so we passed this law that said, “You cannot use census data for any purpose but the census. Full stop. And if you do, it’s a felony, it’s a federal offense.” And that was needed to restore confidence in the census and allow for the census to work, because you can’t have the census be a viable and accurate reflection of the population if people are scared of giving their information to the government.

That is the principle we need to expand to more and more areas of life. Because today, generally, if surveillance takes place in public places, there’s no protection whatsoever. Not only is the NYPD free to install cameras in public streets, and to get camera feeds from private businesses, but anytime an individual or a corporation records someone, the police can obtain that footage by a subpoena or a warrant. And under the American legal system, the presumption is that police have the legal right to access all information.

Sometimes they can get it without any court process, sometimes a subpoena, sometimes a warrant, sometimes even a more sophisticated type of warrant called a Title III warrant for wiretaps. But we’ve never historically recognized that there should be pools of data that police never can get their hands on. And I think that, given the power of these systems to track us in public places and the power they give the police to engage in mass surveillance, we need to start walling off a lot of these data systems from law enforcement through new state and local laws and one day even federal statutes.

LB: If you could make sure that planners and development professionals knew one thing about the possible implications of all this, what would you want them to know?

AFC: I would want them to spend an hour talking to someone who has been raided by the FBI or who has been put into deportation proceedings because of one of these distributed data systems. It’s easy to think of these as far-flung or hypothetical concerns until the moment you see the tears rolling down someone’s face, and hear the quiver in their voice, and the rage, and the anguish at having their life upended because of how the government was able to weaponize the data that was collected from them, against them and their family.

I think we really need to have a culture of pronounced empathy when it comes to the way we collect data and use data. We need to recognize that those of us who may not fear this data being taken from us are really lucky and privileged to be in that situation and that we have to design cities that are protective of everyone who lives in them. Especially those who are put most at risk by the systems that would track their movements and allow police and other agencies to deconstruct their lives.

I would say technology doesn’t get to control the trajectory of our cities and our civilization. The same technologies are available in so many different countries, and yet we see the impact vary tremendously based on the culture surrounding their deployment and the laws that control their use. And we get to decide, democratically and collectively, whether we want these technologies to create new tools of centralized monitoring and control, or whether we want to prevent that Black Mirror dystopia from coming true.

And there are a lot of people who are going to profit from the proliferation of these systems without the guardrails I’m talking about and who are going to urge this to be part of the cities we build. But I think that we all have a moral duty in the work we do to push back where we can see these harms unfolding because of our work and to really push towards the sort of civilization we would want to pass on to future generations, and not something that looks like it’s out of a half-baked science fiction movie.

LB: Yeah, absolutely. That sounds like a great and scary note to end on.

Date: 15 June 2023