winter2014

Seeing Surveillance: Kazys Varnelis & Trevor Paglen in conversation with The Straddler

Kazys Varnelis and Trevor Paglen, New York, 2013. Photo: Jeremy Lee

On September 26th, The Straddler sat down with Kazys Varnelis and Trevor Paglen at Studio X, a Columbia University facility in SoHo, for a conversation about surveillance, network culture, and secrecy.

Varnelis, an architectural historian, is Director of the Network Architecture Lab at the Columbia University Graduate School of Architecture, Planning, and Preservation. He is co-author (with Robert Sumrell) of Blue Monday: Stories of Absurd Realities and Natural Philosophies (2007) and edited The Infrastructural City: Networked Ecologies in Los Angeles (2009). We previously spoke with Varnelis on network culture in our fall2011 issue.

Paglen is an artist and geographer who has worked through photography to make visible the world of military secrecy in the context of the War on Terror. His books include Torture Taxi (2006), I Could Tell You But Then You Would Have to Be Destroyed by Me (2007), and Blank Spots on the Map: The Dark Geography of the Pentagon’s Secret World (2009). Recent exhibitions of his work have appeared in Istanbul, Cologne, and Brazil. We previously spoke with Paglen about his work in our winter 2013 issue.

Varnelis & Paglen, September 26, 2013
THE STRADDLER: To what extent has accession to corporate surveillance in the form of online data collection paved the way for acceptance of state surveillance? That is, to what extent do you see public knowledge of the existence of surveillance in the corporate consumer sphere with apparently benign results having a conditioning effect on the population that makes it more willing to accept state surveillance? What if the Snowden revelations had come out before there was any awareness of corporate online surveillance?

KAZYS VARNELIS: I think they condition each other, and I wouldn’t necessarily put one in front of the other. I think with the USA PATRIOT Act, which was passed in 2001 without a whole hell of a lot of discussion, we moved much further into a world of government surveillance, and every year we continue to move deeper. At the same time, people have been well aware of corporations’ data-gathering capacities and interests for a while. I don’t think there is necessarily anything specific about any kind of awareness that people are coming to now. Think of the example of the pervasiveness and power of the credit score, which rests on the capacity of corporations to gather information on us for their purposes. This sort of thing is increasing all the time, and—even after the debates sparked by the NSA revelations—I continue to note how as a culture we are not protesting this to the degree we should be. That’s something that disturbs me quite a bit, and I wonder what we are ever going to do to shake ourselves out of that situation.

THE STRADDLER: One of the themes in your work, Trevor, is the frustration experienced by not quite seeing, and not quite knowing. Related to this theme is your exploration of the contradictions produced by secrecy—your point that you cannot have secrets without a constellation of entities like buildings, courts, and juridical arrangements that must have some level of visibility.

When it comes to online surveillance, which is basically comprehensive and searchable digital record keeping—a collection of servers existing somewhere, searchable by algorithms available to people with “appropriate” clearance—how can we think about trying to see online surveillance, and how can we think about trying to know online surveillance?

TREVOR PAGLEN: That is a big question that I’ve been thinking a lot about, and a lot of other people have been thinking about, too. I think that’s one of the interesting things about the Snowden documents—especially with the first few releases that came out. The first release was the FISA court order showing the collection of all the Verizon telephone data. Well, we sort of all knew that already, but it was important to have that document. That really sparked a big debate, and became a big story in a way it hadn’t been up until that point. So I think that’s an example of that program becoming visible in a way that was intelligible to a lot of people. It wasn’t abstract—it was a concrete piece of evidence. And I think that’s why that particular story resonated so much.

But beyond that, when you want to look at how the guts of these programs work, it’s very, very difficult because you need to have very specialized kinds of knowledge in order to recognize what’s going on. For example, there was a story that came out in early September about the fact that the NSA was deliberately compromising algorithms used for public encryption. A couple of researchers from Microsoft had discovered that something fishy was going on with these algorithms several years ago, and they gave a paper at the 2007 CRYPTO Conference at UC Santa Barbara asking if the NSA had put in a backdoor because there were some obvious weaknesses. Well, the Snowden documents reveal that that’s exactly what the NSA had done. That’s the sort of thing I mean by having to have very specialized eyeballs in order to see the technical details of what some of these programs look like. That’s something that’s very tricky, and it’s something that those of us who are politically and critically minded are going to have to think about. We’re going to have to develop new techniques of seeing because more and more of the world consists of machines talking to other machines using a language that we are not familiar with. How do you see algorithms and how do you see networks? People are working on these kinds of things—you know, developing protocols for detecting surveillance, or detecting whether or not parts of the Internet are censored from one place to the next—but it’s difficult because it is very much at odds with our object-oriented perception.

THE STRADDLER: Kazys, there is a passage in your book Blue Monday which bears on just this. You’re talking about One Wilshire, which is a 39-story building in Los Angeles constructed in the modernist style in 1966. It originally served as an ordinary office building, but now more than half its floors are filled with only cables and servers. You and your co-author, Robert Sumrell, write, “just as the facades of International Style modernism dissimulated invisible dealings within, the fiber optic net’s presence does not reveal what passes through it.” And then you go on to say, “The promise of a world of clarity and transparency is undone by the reality that modern telecommunications constitute a web that refuses to become visible. Yet again the modern tries for the transparent through technology, yet again it fails.” As more of the world’s activity is routed through places like One Wilshire—so many floors of which look like a lot of nothing to the naked eye—what are the potentials for seeing and knowing what’s going on?

VARNELIS: I think the kind of work that Trevor is doing, or the sort of work done by the Center for Land Use Interpretation—out of which the whole One Wilshire investigation came—is really crucial. It resists a drive towards much greater secrecy, and much greater invisibility, by the very same organizations, institutions, and corporations that wish us to be completely transparent. You asked Trevor about the physical aspects of some of these places, in terms of their visibility. Well, just outside our door, catty-corner across the street from us, is the Varick Street Detention Center, which is a facility being used to detain illegal immigrants. Previously it was also used as part of the extraordinary rendition program. Now this is a building in which you can go to the post office, and get your passport, and do other things that are ostensibly public. It presents itself as a place in which you think you know what goes on. But then, a few years ago, one prisoner died of pneumonia amidst talk of overcrowding and unhygienic conditions. Lawyers from the New York State Bar Association have raised concerns about what is going on there, and yet here it is—very much present, but very much hidden, right here in SoHo, in the realm of media and Silicon Alley.[1]

201 Varick Street, Manhattan. Photo: Jeremy Lee

THE STRADDLER: This idea that you can discover horrors in otherwise banal or ordinary settings is a big part of work you’ve done, Trevor. One image of yours in particular that comes to mind in this regard is a photo you shot out on Long Island that looked like it could have been of a small, local insurance or real estate office.

PAGLEN: That’s part of a project that’s called “An Everyday Landscape.” There are a series of them, and each one was structured by picking a topic and then finding as many documents as I could having to do with that topic. So the one you’re mentioning out on Long Island was Sportsflight Airways, Richmor Aviation, Dyncorp, Central Intelligence Agency. It dealt with two front companies—hybrid companies that on the one hand are real companies, but on the other hand also do secret work for the CIA. The project was about finding receipts, court documents—whatever kinds of paper trails a company generates in the world—and going to and photographing all the places that are mentioned, and even trying to find and photograph all of the people who are mentioned.

An Everyday Landscape: Sportsflight Airways, Richmor Aviation, Dyncorp, Central Intelligence Agency, Trevor Paglen, 1996-2006, 2013
56 pigment prints, 11 x 15 in. each
Courtesy of Metro Pictures, New York; Altman Siegel, San Francisco; Galerie Thomas Zander, Cologne.

THE STRADDLER: And so rather than having something like, say, a military test center or proving ground, as in your “Limit Telephotography” series, where there is a clear boundary between the public and the secret—and you are on the perimeter trying to see in—this project was eerie in a different way because it felt like you were documenting a general tendency towards invisibility intertwined with the everyday whereby what is secret is present and also hidden everywhere around you.

PAGLEN: Yes, that was one of the points of that project. You follow these paper trails and what you end up with are a bunch of photographs that look like you’ve just taken ordinary snapshots on the street.

An Everyday Landscape: Sportsflight Airways, Richmor Aviation, Dyncorp, Central Intelligence Agency, Trevor Paglen, 1996-2006, 2013
56 pigment prints, 11 x 15 in. each
Courtesy of Metro Pictures, New York; Altman Siegel, San Francisco; Galerie Thomas Zander, Cologne.

THE STRADDLER: What about the relationship between the methodological logic underlying something like the identifications made by the drone program and those made by domestic corporate and government surveillance? For example, in the drone program, there are what are called “signature” strikes—targeted assassinations based not on positive identifications, but on a connection of data points that becomes the basis of actionable inferences about people. With domestic surveillance, you have a situation in which there is a ceaseless recording of data points which can be connected in any way, now or in the future, to draw an—and, perhaps, any—inference about someone. And yet, the sheer volume of data will necessarily strip many of these data points of their meanings because interpretation will require wresting them from adjacent, contextualizing points. Is there a relationship between the logic of “signature” strikes and the logic of domestic surveillance in this way?

PAGLEN: Well, I mean, in terms of a direct connection, I don’t think that domestically you’re going to have a scenario where the NSA is going to create a profile about you and then a drone is going to come to your house and blow you up. But I do think what is going to happen, and what is happening already, is something like Kazys mentioned: the expansion of the logic behind something like your credit score—the quantification of one’s behavior, or one’s personality, in order for corporations to do risk assessment. You know, is the amount of your car payment going to be dependent upon whether or not there are pictures of you drinking beer on Facebook? Just as an example. This is the kind of logic that I absolutely see expanding going forward. And anyone who’s had a problem with their credit score knows how difficult it is to fight those algorithms. And that’s not something that’s happening in the future—that’s something that’s already happened, that is happening more each day, and that we haven’t yet learned how to see.

VARNELIS: I think you’re absolutely right, and I think they are related in the sense that they are both products of the same kinds of mindsets, the same kinds of technologies. And both have that same kind of inevitable capacity for normal failures, based on the fact that we’re using data that is inevitably a reduction, that inevitably has noise in it, and that is fed into algorithms that will inevitably be imperfect and yet are exceedingly difficult to correct.

THE STRADDLER: Are there two aspects of failure in this context that are related? You have the credit-score type of problem, by which an algorithm creates a category of people who are “risks,” and within this category are people who have been miscategorized—even according to the algorithm’s own criteria—and who have to assume the burden of proving they belong in another category—or, if we are talking about drone strikes, of trying to evade the crosshairs. As a company, or as a government, you probably don’t care about those people—in the sense that the overall risk you are exposing yourself to is presumably lower if you err on the side of mistakes that overestimate the perceived threat. But related to this problem is the inevitable failure of the algorithm, with all of its inputs, to predict in every case. For example, someone with a good credit score is going to fail to make car, mortgage, or credit card payments. Or, in the context of government surveillance or the drone program, whose stated aim is to prevent terrorism, someone is going to evade detection or targeting in spite of the fact that data points exist on them. And don’t those failures then create a compulsion to expand the algorithm, which increases both the aspects of life that are fed into predictive technologies and the number of people who are mistakenly put into, say, the wrong tranche?

VARNELIS: That’s very possible, and for me the most immediate concern in this context is the mindset behind something like Obama’s response when he was asked about the Snowden revelations. He basically said he knew what data was being gathered, and he thought it should be enough that he knew, that the American people could trust him. That’s a frightening mindset, and one we should not accept when it comes to these technologies, and to creating checks on these technologies.

Still so many of us think, “Hey, I’m not doing anything wrong, I have nothing to hide,” and so we end up acceding to some form of the logic that these programs and these technologies are beyond the purview of anything but the most pro forma accountability and scrutiny.

THE STRADDLER: Now that the public knows more about corporate surveillance—and, perhaps more saliently, about the broad reach of state surveillance—what are the potential long-term effects on society of a broad public awareness that so much of life is now under surveillance? That is, what will the effect be—not necessarily of the surveillance per se, but of the awareness of surveillance?

PAGLEN: The obvious thing that happens in a surveillance society is that it becomes very conformist. People check themselves. You are sixteen and you want to go check out a radical book from the library, or something like that—and maybe you won’t do that in a society in which you know you’re under constant surveillance. And the cumulative effect of that is to create a society that is very conservative, and very conformist. So it has a huge cultural effect right there.

But I actually think that the larger issue is that the mechanisms of surveillance are quite dramatic, and that a lot of people haven’t thought through what they mean. A lot of the conversation that happens around surveillance is still really individualized in terms of this question Kazys just mentioned: “If you have nothing to hide, what do you have to worry about?” It still comes back to this relationship between the individual and the state. You know, how much control does the state exert over an individual? But there’s much more to how the surveillance society plays out over the long term than that. For example, if you build a surveillance society, you create a giant power imbalance between the people and the state. That’s kind of an abstract thing to point out, but the whole point of a democratic society is that people have more power than the state. When you create mechanisms where that is not true, then you are creating a situation in which a democratic society is increasingly difficult to approach and maintain.

The second part of that is that we have to look at what the erection of the surveillance state means in the context of other things in society contributing to inequality, unrest, and insecurity. On the one hand, this has to do with the growing disparities of income and wealth. As numerous people have pointed out, we are living in a society that is unequal in a way that it hasn’t been in almost a century. In societies like that, unrest gets fomented and anti-democratic tendencies are encouraged. On the other hand, we are facing an environmental crisis the likes of which human experience on the planet has never seen. Wildfires, Sandy-like events—these are increasingly becoming the norm. So the question is, looking towards a future in which these forces are major sources of insecurity and instability, what are the tools that the state will have available to address those insecurities and instabilities? We’re living in a moment where the welfare state has been badly damaged, and in which there is tremendous pressure to dismantle institutions that provide economic security and food security. What tools will the state have available to quell unrest and try to manage a population under these circumstances? I’m concerned that if you build this button that says “the surveillance state,” and that’s the only button that’s available, that’s the button that’s going to get pressed over and over again in the future.

THE STRADDLER: Kazys, protecting or preserving what is left of privacy—and reclaiming some of what has been lost—would presumably now require the support of new institutions and infrastructures. What form might those infrastructures take—or what form do we already see them taking?

VARNELIS: I wonder if it wouldn’t be possible to build a coalition of interests between the left and the right—or even the liberals and the right—around the problems of the state. In the sixties, the state was criticized by the new left for being a monolithic entity that was a bureaucratic, militaristic, and suppressive force in the context of the Cold War. And on the right, there was the opposition to big government and the welfare state. Under Clinton, these unfortunately, but productively, came together to eliminate some of the good things about the welfare state. So could a similar unholy alliance form in which you include people who are believers in, say, the right to bear arms, and who might also believe that Obama is exceedingly dangerous because he’s going to try to take away your guns, force Obamacare down your throat, and impose UN government on the United States? Pair these people with individuals who might be more Democratic, or outright leftists, who feel that these issues of privacy and corporate and government surveillance are exceedingly urgent and important. These alliances have been made before. Could we find them now?

It may be that if we’re to fight this thing, we need to create such alliances. Couldn’t we put aside our concerns about broader political agendas in order to focus on a single, pressing issue? I think of the Electronic Frontier Foundation as an example. They were very influential in the nineties, and now they are working on issues related to surveillance and patent trolls. Theirs is a voice I would much prefer to hear over the voice of Wikileaks, for example. Theirs is a voice that was deliberately designed to embrace liberal and libertarian concerns about the power of government. Of course, there are many problems with the Silicon Valley libertarian position—but, as a single-issue organization, they are very valuable, albeit not big enough or loud enough to really be heard. One problem, of course, is that the current generation of Silicon Valley wealth, from Apple to Google, is based on practices that involve massive amounts of data harvesting, so I very much doubt we’d see these individuals contribute to causes for privacy unless they somehow felt it hurt their bottom line.

From the standpoint of art and culture, there are also things that can be done. Whether it is the Center for Land Use Interpretation or other organizations that support individuals who resist the culture of secrecy and say that it is our right to find these things out, or organizations that seek to build audiences around these issues, or someone like Trevor, who is very clearly investigating political secrets and trying to reveal what’s there—I think that these kinds of things are very positive and necessary. It may be a kind of long war, if you will, of making these moves and engaging in these resistances. But it’s a necessary one.

[1] See Markowitz, Peter. “Barriers to Representation for Detained Immigrants Facing Deportation: Varick Street Detention Facility, A Case Study.” Fordham Law Review 78 (2009): 241-274; and Bernstein, Nina. “Immigrant Jail Tests U.S. View of Legal Access.” New York Times, 1 November 2009.
