March 2004


Brad and I are in New York, visiting friends, working, speaking at conferences, and going to a wedding. Last night, we went with a group of his college friends to karaoke. I’d never done it before. Addictive! I think I peaked with my solo rendition of It Takes Two.

Today we had a Markle Foundation Task Force presentation on “Creating a Trusted Network for Homeland Security”. Essentially, they want to build a computer architecture for sharing information across local, state, and federal law enforcement and foreign intelligence. They demo’ed an application that the FBI would use to communicate between field agents in different offices, and up and down the ladder of authority, using a scenario of informants reporting information related to a possible bioterrorist attack.

As presented, this is pretty difficult to get worked up about, either as a civil liberties issue or as something that the Markle Foundation needs to spend its charitable dollars on. Regarding civil liberties, everyone thinks that the FBI should use the information it has more efficiently and that local and state authorities are valuable in law enforcement and prevention efforts. There are, though, serious questions about information sharing between foreign intelligence services and domestic law enforcement. As many people know, the 1975–76 Church Committee hearings documented extraordinary federal government abuses of surveillance powers, including the NSA’s Operation Shamrock and Operation Minaret, the CIA’s Operation CHAOS, and the FBI’s COINTELPRO domestic harassment of dissenters and anti-war protesters, which included illegal wiretapping. Congress reacted by establishing different standards of surveillance for domestic and foreign intelligence purposes, and by preventing end runs around the higher domestic standards by limiting information sharing. See also the Attorney General’s Guidelines for Information Sharing and EPIC’s resources on foreign intelligence and domestic law enforcement information sharing.

But other than that, the technology looks very useful.

Which is why I wonder why the private sector hasn’t approached the FBI to sell it something like this. Clearly, there’s a lot of money to be made from a big client like the Federal Government, and a lot of businesses would want to use a similar service as well. Why does the Government need the Markle Foundation to point something like this out?

It certainly isn’t because, as demonstrated to us, the system is any more privacy-friendly than a system that would be developed by a business. The only indication of any privacy protections encoded into the system is that video and IM chats conducted through the system are logged, as, I imagine, emails and searches are.

While logging enables you to go back and see what was done during an investigation, e.g. whether improper searches were made, it’s also a privacy problem, because then you have all those chats and searches about Richard Jewell or Steven Hatfill lying around. They remain in the system as evidence of suspiciousness, even if the suspect is later cleared. Worse, if, as the demo showed, the software will aggregate field officers’ interests to determine which people or threats deserve additional attention, a sort of worry aggregator, then we’re in danger of a self-reinforcing feedback loop making more Jewells or Hatfills than we would otherwise have.
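Here’s a toy sketch, in Python, of the feedback loop I’m worried about. The model and numbers are entirely mine, not anything from the demo; the only assumption is that the system ranks names by aggregate agent interest and that the top-ranked name draws the most additional scrutiny.

```python
# Toy model of a "worry aggregator" feedback loop. My own sketch, not the
# demoed system: names are ranked by aggregate interest, the top name gets
# extra scrutiny, and that scrutiny reads back into the system as more interest.

scores = {"A": 1.00, "B": 1.00, "C": 1.05, "D": 1.00}  # C got one early, possibly mistaken, flag

for week in range(10):
    # The dashboard surfaces the highest-scoring name each week; the resulting
    # queries and chatter feed back into that name's score.
    top = max(scores, key=scores.get)
    scores[top] += 0.3

for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.2f}")
# C ends up at 4.05 while everyone else stays at 1.00, even though nothing
# new was ever learned about C after that first flag.
```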

This then raises more questions than it answers. First and foremost is how comfortable we should be with a technology that makes it easier and cheaper to investigate more people, without additional limitations, either in the form of policy, or better, in the form of technological constraints that enforce a privacy-friendly policy.

Second, the technology doesn’t distinguish between the types of information that might go into the system. It’s one thing to make a giant distributed database of informant leads, and another to add in all the other transactional data on innocent citizens that private companies collect and are selling or giving to the government: shopping records, educational records, flight patterns, credit histories and the like. Will this information go in the database, and if it does, will we treat it differently depending on the situation, the sensitivity of the information, and so on? The program doesn’t appear to touch this issue.

Third, what do we use this information for? The Task Force assured us that the program was built to enable information sharing about existing suspects (subject-based queries, in Jeff Jonas’ words), not to do some kind of terrorist profiling (pattern-based queries). This would be good if it were true, since I think it’s near impossible to create an accurate terrorist profile from the small sample that we have (false negatives), and the risk of false positives is huge. But an effective system could easily be used for profiling, and there are no safeguards built in to monitor or prevent that.
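Some back-of-the-envelope arithmetic shows why pattern-based profiling worries me. All of the numbers below are made-up assumptions of mine, and generous ones at that; even so, nearly everyone the profile flags is innocent.

```python
# Base-rate arithmetic for pattern-based queries. Every number here is an
# illustrative assumption, not anything the Task Force presented.

population = 300_000_000      # roughly the U.S. population
true_terrorists = 3_000       # assumed, and almost certainly far too high
hit_rate = 0.99               # assumed: the profile flags 99% of actual terrorists
false_positive_rate = 0.01    # assumed: it also flags 1% of innocent people

flagged_guilty = true_terrorists * hit_rate
flagged_innocent = (population - true_terrorists) * false_positive_rate
precision = flagged_guilty / (flagged_guilty + flagged_innocent)

print(f"innocent people flagged: {flagged_innocent:,.0f}")         # ~3,000,000
print(f"odds a flagged person is a real threat: {precision:.2%}")  # ~0.10%
```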

Some members of the Task Force seemed to be saying that this was not a political debate, and that policy would guide the use of the technology. But our relationship with technology is old enough that we should know better than that. In the 1960s, Jacques Ellul identified a “technological imperative,” and his insights haunted me throughout the nuclear Reagan years. In an era where Code is Law, policy constraints are weak against an unfettered, unlimited technology: just look at copyright law and peer-to-peer. Once we make policy choices about information sharing, privacy and civil liberties, the technologies we build and adopt must promote, not undermine, those choices. I fear an information aggregation technology with no constraints other than a paper trail.

I don’t think that the S*NRC is going to put the audio or video up, but if they do, I’ll post it here and on the CIS blog.

I’m off to speak at a panel on the law of wireless open access points for the Stanford Networking Research Center.

S*NRC Special Seminar
“Open Access or Legal Trespass? – Rights Clashes Regarding Wireless Network Hotspots”
March 18, 2004, 3:15 p.m. – 5:15 p.m.
(Reception to follow)
David Packard Electrical Engineering Bldg.
Packard Auditorium
Stanford University

If there’s audio, I’ll post it.

It’s slow going….

Daniel Gervais says that social norms empowered by technology are stronger than legal regulation. What happens when technology, supported by legal regulation, results in a regime contrary to social norms, e.g. when the DMCA means we can’t buy third-party toner cartridges or garage door openers?

The second panel deals with ways that we could improve the current interaction between law, privacy and security.

Lance Hoffman thinks that we’re moving towards a megadata database “fishbowl” society. His response is a technological privacy oversight mechanism that watches the watchers, encrypts data, etc., in accordance with policy choices about who should be able to access what information, and when. On a similar theme, see Steve Mann on sousveillance.
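For what it’s worth, here’s roughly the shape of that idea as I understood it, sketched in a few lines of Python. The roles, policy rules and field names are my own invention, not Hoffman’s.

```python
# A minimal sketch of a "watch the watchers" mechanism: access to records is
# mediated by an explicit policy, and every access attempt, allowed or denied,
# is itself logged. Roles, rules and field names are invented for illustration.
from datetime import datetime, timezone

POLICY = {
    # (requester role, record sensitivity) -> allowed?
    ("field_agent", "public"): True,
    ("field_agent", "sensitive"): False,
    ("supervisor", "sensitive"): True,
}

AUDIT_LOG = []  # in a real system this would be append-only and tamper-evident

def request_access(role, record_id, sensitivity, purpose):
    allowed = POLICY.get((role, sensitivity), False)
    AUDIT_LOG.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "role": role,
        "record": record_id,
        "sensitivity": sensitivity,
        "purpose": purpose,
        "allowed": allowed,
    })
    return allowed

request_access("field_agent", "lead-42", "sensitive", "tip follow-up")
request_access("supervisor", "lead-42", "sensitive", "quarterly review")
for entry in AUDIT_LOG:
    print(entry)
```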

Forgetting: at what point should criminal records, unwise website posts, and bad reputations be erased, so that we’re allowed to move beyond who we were in the past? The fishbowl society/internet archive never forgets.

I’m at our CIS conference on security and privacy today (and tomorrow). What follows are random thoughts and comments provoked by the presentations. For a more thorough blog of the conference, click here.

The morning panel is discussing whether and how to apply tort liability to promote security. Solove’s insight is that it’s human error that leads to insecurity, and that we should be smarter than to use readily obtainable identifiers like the SSN as both ID and password, which makes identity theft a lot easier to pull off. The panelists agree that negligence liability should be imposed on software vendors, but they are assuming that such liability won’t overly deter innovation and that it will be possible to set an efficient standard of care. A gentleman from Microsoft suggested that it’s the people who write viruses who should be targeted, not the software companies. Froomkin suggests that there’s a benefit to imposing liability on the end users who purchase the crappy software in the first place and then won’t patch it.

Clearly, there should be some liability for insecure software, if only because the software companies are in the best position to do something about insecurity. But we’re unlikely to get there, with EULAs, UCITA and USAPA-type laws that enable vendors to escape legal responsibility for vulnerabilities.

I can’t explain the feeling I had seeing the pictures from the Madrid bombing on the front page of the New York Times today. Someone has done a terrible, terrible thing.

Who that is, we still don’t know, and certainly we need to. But I have a feeling that Americans will breathe a sigh of relief if it’s ETA and not Al Qaeda. That would be wrong, if relief comes from the feeling that we won’t be next.

Certainly it matters who is behind this, if it’s the difference between an attack on the Spanish government and an attack on the Western world. That must play into how we deal with the problem of terrorism. We can neither ignore terrorists, nor strike back blindly against them, nor give in. But we, by which I mean me and most of the people reading this, will not be the people making that choice.

San Francisco’s Mayor Newsom is working on the problems of our City’s poorest neighborhoods. He’s going into Hunter’s Point, and talking to families of people killed there, and encouraging witnesses to those murders to finally come forward, with both rewards and promises of safety in the witness protection program. This is one of those 100th Monkey situations. People murder with impunity and no one will testify against them because if they do, they’ll get murdered with impunity. But once people start to come forward, and there are ramifications for these crimes, then killing witnesses won’t really be an option, because (1) you’d be in jail and (2) you’d get caught for the murder of the witness. But it takes a lot of bravery for the first few people to step forward and turn the tide around. I hope that they will do so, because too many people are living in danger in Hunter’s Point.
No one has come for the bird,
despite posting signs at the pet store and with the SPCA. Nonetheless, that little lost bird has found a home with friend/neighbor Catherine and me. We are keeping him. Lil Greenie has two mommies.
March 11th,
we woke up to the story of the Madrid bombings. A small bird was adopted today in a difficult and sometimes terrible world.
