The NSA and Tech Change, Part II: The Dialectic of Strategy and Counter-Strategy

Nathan Andrew Fain's comment on my last post was so interesting that I thought I would respond to it here. In that post, I briefly explored—and mostly asked questions about—how the NSA's programs, like PRISM, may be shaping technological change. As many know, there is a centuries-long history of defense spending and priorities influencing science and technology, and I wanted to ask how government surveillance programs might do the same.

In his comment, Fain considered the flip side of my point, namely how the Snowden Affair might encourage others to change technologies. He wrote, "The NSA programs, or more accurately the revelation of them, will push in earnest the development of subversive technologies." He went on to talk about John Gilmore and the cypherpunk movement, which sees cryptography and the avoidance of surveillance as potential loci for social change. I knew nothing about this movement and know little more now, but I am hoping to learn, first by reading this book. Fain's comment is fascinating, and I encourage everyone to read it as well as to check out his website, deadhacker.com.

I'd like to examine Fain's comments through the lens of technology studies by thinking for a moment about strategy and counter-strategy and how this dynamic shapes technologies and the practices that surround them.


When I was in grad school, I spent a good bit of time wondering how hacking influenced technology. This was during the time I was reading Cyril Stanley Smith, who talked about how inventors and innovators often have a tacit connection to their medium, won through a great deal of experience. This connection leads to a sense of "play"; invention becomes a kind of second nature. Smith's account reminded me of hackers I knew, who seemed to have an easy and fluid relationship with computing and who enjoyed nothing more than the thrill of doing what was not to be done. But did hacking do anything (technologically) more than stress out systems managers and induce better security programs? Before Fain's comment, I had not considered the inverse of this dynamic: that ever-expanding surveillance systems foster technologies of concealment, and that it isn't only criminals and terrorists who want to escape detection but also techno-libertarians, cyber-anarchists, and the like.

The dialectic of strategy and counter-strategy is an essential part both of technological change and of changes in how we use technologies. The phenomenon is as true of business as it is of war, but I will give a few examples from the latter. In the Vietnam War, the United States found new strategies for the helicopter, especially through the famous 1st Cavalry Division. Helicopters enabled novel kinds of troop movements and air support during battles, but the Viet Cong quickly adapted to the technology. They would sit and wait for helicopters to come in, then light them up as they neared the ground, turning the vehicles' inhabitants into sitting ducks. In another, perhaps apocryphal, example, the M1 Garand rifle that US soldiers used in WWII made a loud 'ping' sound when it ejected its empty clip after the last round was fired. In close-range combat, Japanese soldiers would wait to hear that sound before rushing the US troops. The US soldiers developed a counter-strategy, however. Working in two-man teams—a sniper and an assistant—one soldier would eject an empty clip to make the 'ping' sound. When the Japanese soldiers began their charge, the sniper would already have their position lined up in his sights. While these two examples focus on changes in practices, there are plenty of examples of strategy and counter-strategy shaping technological systems themselves, as when, during WWII, scientists at Harvard realized that radar was under development at the MIT RadLab and playfully jammed the signal from across the Charles River.


It seems that Fain is almost certainly right. The revelation of the NSA's programs will be a watershed moment for many people, some of whom will actually work to produce new technologies for maintaining privacy. I think the real question is whether people will adopt these systems of cryptography and use them in everyday life. In a long theoretical essay that I finished recently and will probably never publish, I spend a lot of time discussing how scholars in technology studies have concentrated for too long on how technologies are "constructed," or achieve their final form. Often the more important issue is whether technologies are adopted, especially whether they are adopted on a massive scale. At this point in time, sadly, economics is more helpful here than history or sociology (because people in the latter fields have talked too much about construction). One significant exception is the work of the rural sociologist and communications scholar Everett Rogers, whose Diffusion of Innovations (1962 and many subsequent editions) is still the gold standard for studies of technological adoption. Price and effort are always important factors in whether potential users adopt a technology, but other factors can also play a role.

Being pissed off could be one such factor that trumps cost and effort, but keeping information secret takes time, discipline, and at least a modicum of technical know-how. We live in a society where barely anyone reads terms of service, and we dwell in a land of flashing DVD player/microwave oven/cable box clocks. I have recently seen tech-savvy people, such as the members of the mailing list Interesting People (see also the interesting account here), sharing public keys. What percentage of the population will be willing to go to such lengths to protect their privacy? (I foresee a study, if one hasn't been done yet, in which economists push people to put a monetary value on their privacy and find that the value is $0. Not that such studies tell us much of anything at all.) Also, as one friend put it after reading Fain's comment, "The NSA has 50 nerds for every one of the cypherpunk nerds."

Yet these last thoughts are getting me off track. The question of my last two posts has been this: How are the NSA programs influencing technological change? Fain must be right to point out that, to answer this question, we should look not only at the NSA programs themselves but also at how people are reacting to those programs. To add one final thought: my last post argued that we should think about how knowledge produced through the NSA's programs spills over into other sectors of society. To be perfectly symmetrical, we should also attend to spillover from the efforts of cypherpunks and other such dissidents. How will the technologies and practices they produce come to influence even those in society who are too apathetic and lazy to work for their own privacy?
