Wednesday, December 1, 2010

Backscatter machines and the management of risk

...in a completely administered world, superelevated safety standards and bureaucratic claims to perfection turn hazards that pass through the finest technological sieves into an internal threat to social rationality and systems. On the one hand, economy, law, science, policy with their present constitution and aims are not in a position really to dam up and forestall the hazards; on the other, the institutionalized safety pledge they furnish constitutes the embodiment, as it were, of the non-existence of hazards. Thus proof of the hazards becomes a proof of institutional failure. -- Ulrich Beck, Ecological Politics in an Age of Risk
Beck is, of course, speaking here about ecological risks, but when I first read this paragraph, my mind turned immediately to the current uproar over the implementation of new airport security measures.

Although I've expressed my opinions on the backscatter machines (as well as the enhanced pat-downs) privately, I've been trying to think through the airport as a site where multiple risks are subjected to different types of management. The most obvious of these risks is that of terrorist attack, which is subject to management by the Transportation Security Administration. But, as we've seen over the past few weeks, airports are also sites where other risks -- medical, sexual, and personal -- are managed. And I think one of the best ways to approach the issue is by looking at the backscatter machine as a risky artifact.

So what are the backscatter machines, exactly? Beyond being a piece of technology that emits x-rays and then produces an image from the radiation that bounces off the object to be imaged, what does a backscatter machine actually do? Reading the machine as an artifact in the context provided by Beck, we can think of it as a device that embodies both the presence and management of risk.

In Beck's formulation, the institutions of modernity are ill-equipped to manage both hazards (which are naturally-occurring) and risks (which are man-made). The technologies of governance are not designed to eliminate risk -- and even if they were, they would be incapable of doing so. The discourses of contemporary science and politics necessitate a constant moving-forward, and it is this momentum that generates situations of risk. But a part of this moving-forward is a promise -- sometimes implicit, but often explicit -- on the part of these institutions that they will make the world safer, that they will guarantee safety. That they will eliminate risk. The problem, says Beck, is that when risks and hazards are proven to be real -- when the underwear bomber sets his pants on fire, or a terrorist plan is stopped before it even begins -- this is evidence not of an averted catastrophe, but rather of an institutional failure to live up to that promise. Risk still exists. Therefore the institution has failed.

The backscatter machine is designed as a technical solution to this problem. Design a machine that will catch terrorists, and the risk is eliminated. But everyone knows that the backscatter machine cannot possibly catch all terrorists. It manages the risk, but does not eliminate it. And in doing so, it becomes a problematic artifact.

The backscatter machine also negotiates other risks, ones that are not directly under the purview of institutionalized protection from terrorist attack. It is at once a manifestation of fears about radiation and health, as well as fears about privacy and embodiment. The arrival of the machine in airports, meant to alleviate the risk of a terror attack, brings to center stage the risk of cancer, the risk of sexual assault, the risks of surveillance and the sacrifice of privacy. The enhanced pat-downs accomplish a similar feat, working to manage terrorism while at the same time creating the circumstances under which women, children, people with disabilities, and other vulnerable populations are forced to decide which interventions they are willing to accept. In this sense, the airport becomes a site of the management of multiple, intersecting, and competing risks. A pregnant woman trying to manage the risk of radiation exposure and sexual assault is confronted by an institution attempting to manage the risk of terrorism. This problem can only be solved one way -- by deciding which risk is more important and privileging its management over all others. This should be, and is (I think), an unsatisfying conclusion.

What Beck provides us, I think, is not so much a solution to the problem as an alternative way of thinking. The management of risk by technoscientific means has been a consistent (and consistently problematic) feature of modernity. By shifting our focus away from technological approaches to the management of risk and towards a more reflexive understanding about how risks are generated and how we can talk about them, we might be able to move away from always reaching for the scientific fix. Of course, this would require not only a rethinking of airport security, but also an overhaul of our entire institutional framework to drag it into a century where our problems are perhaps not best dealt with under the rubric of the nation-state.

Or we can just keep yelling at each other about porno-scanners and underwear-bombers. It's not up to me.
