At the bank. At the mall. They are watching you. The question is not, Am I being watched? It is, Who is doing the watching?

In the wake of the terrorist attacks, we are hearing more about biometric technologies that can potentially pick a terrorist out of a crowded room, thus providing a measure of security in a climate of fear. Biometrics refers to advanced identity verification techniques that use personal characteristics such as fingerprints, facial patterns, and so forth to identify individuals. These features of the technology cause some to raise the cry of Big Brother.

In the months before the attacks on the World Trade Center and the Pentagon, privacy, particularly as it applied to commercial transactions on the Internet, was an issue of national concern. In fact, an outcry against the mammoth bookseller Borders over plans to implement a biometric identification system (to control shoplifting) led the company to shelve the effort.

Fast forward to the United States, post-9/11: Security and safety from terrorists have become the number one concern, and biometric companies are scrambling to get their security technologies into the hands of the FBI and airports. The biometric technologies that casinos across the nation have used to catch cheating card players are now being deployed in airports and jails nationwide.

Many people are rightly suspicious of the new role that technology will play in preventing future terrorist attacks. Innocent individuals might find themselves under the stare of the "eye in the sky." But we should not eliminate the technology already in place, a technology that promises beneficial applications such as locating a lost child, preventing fraud, and ferreting out terrorists.

Biometrics has rapidly moved from a technology of the future to the forefront of the battle to ferret out terrorists and proactively protect America. Visionics, one of the leaders in the field of biometrics, has made a name for itself by offering its face-scanning technology free to the FBI. While both the ACLU and House Majority Leader Dick Armey (R-Tex.) contend that the technology provides little payoff in actually apprehending criminals, it stands to reason that the kinks will be worked out if there is a societal demand for the technology. In fact, Visionics contends that its surveillance system successfully identified an individual wanted by authorities in the UK.

While many civil libertarians balk at the idea of this technology, some realize that the problem is not with the technology itself but with the people who control it. As Prof. Dorothy Denning points out in her piece "Why I Love Biometrics," because biometric identifiers are "live" (faces, eyes, voices, and so on), users will not have to remember the many secret passwords that most of us juggle on a daily basis.

More important than the implementation of the technology itself may be the simultaneous introduction of guidelines to limit government's exploitation of the technology, particularly with respect to halting impulses to broaden surveillance of ordinary individuals. Of immediate concern is the desire of various state governments to incorporate biometric technology into Department of Motor Vehicles identification, in effect turning driver's licenses into a national identification system. The dangers inherent in a national identification card are explained in detail in a recent op-ed by my colleague Robert Levy, but suffice it to say that this is one of the risks of putting biometric technologies in the hands of overzealous government officials.

Technology is a tool to assist us in restoring safety to a society that has been violated. But it is just that: a tool. Just as a hammer will not pound a nail unless swung by a human hand, technology alone is not the antidote to our ills. But with a modicum of restraint on the part of legislators and a degree of understanding of the post-9/11 world by civil libertarians, technology can be an important component in countering the terror that has gripped our society.