The Washington Post has a defense of the FISA bill that shows a breathtaking level of naivete. Consider this sentence, for example:
The measure requires an individualized, court-approved warrant to conduct surveillance targeted at Americans’ communications with those overseas and — in an expansion of existing FISA protections — at Americans abroad.
It’s true that the bill contains language nominally prohibiting surveillance “targeted at” a particular American. If the NSA wants to spy specifically on Tim Lee in St. Louis, it will need to get an individualized FISA warrant to do so. But what the Post fails to mention is that while an individual warrant would be required to intercept just my communications, no warrant would be required to intercept all international calls by St. Louis residents. As long as no particular St. Louisans were the “target” of the surveillance, and as long as foreign intelligence was “a significant purpose” of this surveillance program—an easy standard to meet—nothing would prevent the government from also using the information intercepted for a variety of other purposes, such as catching people engaged in tax evasion or online gambling.
Moreover, precisely because of the lack of judicial oversight of such dragnet surveillance programs, it’s not clear that the prohibition on “targeting” Americans will have any teeth. Here’s what’s likely to happen: the NSA will develop a variety of sophisticated software algorithms to scan all the intercepted traffic for various patterns of interest to the NSA and other federal agencies. The NSA could conceivably use hundreds of different filters that single out particular communications based on a variety of criteria—keywords, unusual patterns of calls or emails, communications with current suspects, and so forth. The judge reviewing the “certification” for such a program would be required to wade through hundreds of pages of documentation describing what the software did—probably written in dense, technical language and then translated into lawyer-speak. I’ve got a computer science degree, and I doubt I could tell whether the algorithms so described “target” Americans; certainly no 70-year-old judge is going to be able to do so.
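To see why a judge would have such a hard time, consider how simple even a toy version of this kind of filter pipeline looks in code. Everything below is invented for illustration—the filter names, the keywords, the watchlist, the message format—but notice that nothing anywhere in the logic ever asks whether the people communicating are Americans:

```python
# A hypothetical sketch of a dragnet filter pipeline. All names,
# criteria, and data here are invented for illustration only.

KEYWORDS = {"detonator", "wire transfer"}          # hypothetical watched terms
WATCHLIST = {"suspect@example.org"}                # hypothetical current suspect

def keyword_filter(msg):
    """Flag any message whose body contains a watched keyword."""
    body = msg["body"].lower()
    return any(keyword in body for keyword in KEYWORDS)

def watchlist_filter(msg):
    """Flag any message to or from an address on the watchlist."""
    return msg["sender"] in WATCHLIST or msg["recipient"] in WATCHLIST

# In the scenario described above there could be hundreds of these.
FILTERS = [keyword_filter, watchlist_filter]

def scan(traffic):
    """Return every intercepted message matched by at least one filter.

    Nothing in this function—or in any filter—checks the nationality
    or location of either party. Whether this "targets" Americans is
    simply not visible in the code.
    """
    return [msg for msg in traffic if any(f(msg) for f in FILTERS)]

intercepted = [
    {"sender": "alice@stl.example", "recipient": "cousin@abroad.example",
     "body": "Grandma sends her love"},
    {"sender": "bob@stl.example", "recipient": "suspect@example.org",
     "body": "see you tuesday"},
]

flagged = scan(intercepted)
```

Multiply this two-filter toy by a few hundred filters, each documented in technical and then legal prose rather than shown as code, and the difficulty of the judge’s task becomes obvious.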