Independence Day @ ILaw!

(entry last updated: 2003-07-04 13:43:31)

Happy July 4th! Today’s topic, appropriately, is privacy – particularly since we have learned that there are 5 cameras in this classroom, and it’s not clear who/what is connected to them.

Everyone is trickling in a little more slowly than usual – it’s clear that everyone had a good time last night – and the early arrivers got to hear Charlie reveal another secret of Jonathan’s past: his dancing partner at the Brazil ILaw.

It looks like Jonathan will be starting us off today, after some announcements from Larry: there’s a lunch barbecue today, and the wrap-up session is devoted to your questions. Surveys will be available, and we hope you can take time to fill them out.

Jonathan and Molly Van Houweling will be speaking about privacy.

Molly and I have been talking about how to do this, in that it’s something people feel strongly about, but when you get into it, it becomes “a big pile of moosh.” (Term of art? <G>)

So, the topic is going to be why we don’t like teaching about privacy.

So, what is privacy – what do we mean?

We start with things that we want control over: collection of personal data; use of personal data; personal environment; vital personal decisions. (Note a return to notions of control, and who has it.)

Classical concerns are the government, industry, and other people, all of whom might collect information or undertake actions that challenge these domains of control. This generates a matrix of issues, with special topics within that framework – for example, the state’s involvement in birth control issues, or identity theft. But there are particular ones we want to tackle in this session; then we’ll move on to other issues in the discussion.

So, a classic issue – the government collecting information about you. First, Carnivore – a tool for, essentially, wiretapping at ISPs. It’s not just collecting communications, but also filtering on specific characteristics under a court order.

The issue is that Carnivore grabs everything first, then spits out the messages that meet the filters set in the court order. So it “reads” everything, rather than only the specified elements. Now, there are some legal remedies, potentially.
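
(To make the structural point concrete, here’s a toy Python sketch of capture-then-filter. It’s purely illustrative – nothing to do with Carnivore’s actual code – but it shows why “filter after capture” means everything gets read:)

    # Toy illustration of the worry: every message is inspected first,
    # and only the non-matching ones are discarded afterward.
    def court_order_filter(packets, target_address):
        kept = []
        for pkt in packets:  # the tap "reads" every packet...
            if target_address in (pkt["to"], pkt["from"]):
                kept.append(pkt)  # ...but keeps only those named in the order
        return kept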

Suppose, for example, the filter erroneously belches out your message: a Fourth Amendment violation, yet with no easy way to prosecute. Kyllo is a case looking at surveillance via thermal imaging of a suspected marijuana grower. At the Supreme Court, this was found to be a Fourth Amendment violation.

In Scalia’s majority opinion, this was agreed to be an excludable search because the technology was not generally available, meaning it was a special tool of law enforcement – but, as technology evolves, that means the characteristics of, and limitations on, legitimate searches will change.

So, we see a see-sawing of the technology/counter-technology fight in this area, just as we see in copyright/circumvention/etc. The maintenance of an equilibrium, upset by technology, leads to a strange kind of cat and mouse game, where first one side, then another, cries “foul.”

A term of art – “reasonable expectation of privacy” – a context that is always in motion. The term emerged to articulate notions of privacy outside the home, where the notion of privacy starts. Kyllo now means that technology can erode notions of privacy even in your home, a traditional domain of presumptive privacy.

Chris Kelly, chief privacy officer of the firm Excite, speaks up.

There’s a feedback loop between our individual relations and the rules of law. The law is usually the instrument of response to violations of privacy, leading to exclusion of the information at trial. When it’s a question of individuals rather than governments, other forms of recovery are necessary.

So, let’s speak of cookies. As we have been told, cookies are a technical solution to the fact that HTTP is a stateless protocol – there is no memory from event to event. A cookie is a data deposit on a client machine by the server. The server can put any datum onto that machine in the form of a cookie, labeled as belonging to this particular server (breadcrumbs left in the woods, for example, or blazes on a trail). Up to this point, no particular privacy problem.
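
(For the technically curious, a minimal sketch of those mechanics using Python’s standard library – the visitor_id name is just illustrative:)

    # A server deposits a labeled datum on the client; the client sends
    # it back on later visits, giving the stateless protocol a memory.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from http.cookies import SimpleCookie
    import uuid

    class CookieDemo(BaseHTTPRequestHandler):
        def do_GET(self):
            cookies = SimpleCookie(self.headers.get("Cookie", ""))
            self.send_response(200)
            if "visitor_id" in cookies:
                body = f"Welcome back, visitor {cookies['visitor_id'].value}"
            else:
                body = "First visit; leaving a breadcrumb."
                self.send_header("Set-Cookie", f"visitor_id={uuid.uuid4()}")
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(body.encode())

    HTTPServer(("localhost", 8080), CookieDemo).serve_forever()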

Now, let’s look at how cookies are actually used. Say I go to a dog page at About.com. I get a cookie that says I visited a dog site. Then I go somewhere else, where I might fill out a form; then somewhere else, where I query some specific things. And my dog preferences seem to come along with me. A so-called cookie consortium agrees to pool cookies (or DoubleClick, a banner-ad company, compiles information about me as I run into DoubleClick banners on all my sites). So we get targeted ads – a privacy problem?
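
(Here’s a hypothetical sketch of that pooling trick: one ad server, embedded via banners on many unrelated pages, sees its own cookie plus the Referer header on every request, and so can stitch my travels into a single profile. The hostnames and names are invented:)

    # One tracker cookie, many sites: the Referer header names the page
    # that embedded the banner, so visits accumulate under one id.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from http.cookies import SimpleCookie
    import uuid

    PROFILES = {}  # tracker id -> pages where this visitor saw our banners

    class AdServer(BaseHTTPRequestHandler):
        def do_GET(self):
            cookies = SimpleCookie(self.headers.get("Cookie", ""))
            tid = cookies["tid"].value if "tid" in cookies else str(uuid.uuid4())
            PROFILES.setdefault(tid, []).append(self.headers.get("Referer", "unknown"))
            self.send_response(200)
            self.send_header("Set-Cookie", f"tid={tid}")  # same id on every site
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"(banner image would be served here)")

    HTTPServer(("localhost", 8081), AdServer).serve_forever()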

Other tracking instruments are possible – IP addresses or MAC addresses. And thus your computer gets an identity.

It still may not be a problem – can’t I turn off cookies, or wipe them? But sites are designed to throw cookies, so the advisory dialogs are inescapable and unending.

There are two ways this is attacked – the US way and the European way. The US approach is to say that it’s all about consent and setting your expectations – so the FTC’s focus is on making sure the privacy policy is publicized, and the consumer makes the choice to visit the site or not. This leads to the opt-in vs. opt-out debate.

We see that privacy is terribly important – unless I can get frequent flier miles <G>. This seems to indicate that there’s nothing really troubling here.

Maybe not – here’s a possible story. A surfer goes to a web site and considers a record to buy. He suggests it to someone else, who goes to the site and finds the same album offered at a different price. Your mouse droppings make it possible for data miners to develop a stunningly focused description of your preferences and your behavior. Is this a problem?

Not obviously – it seems like a good thing. But with a certain degree of information, noxious discrimination in treatment might emerge.
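
(A toy, entirely hypothetical, illustration of how a profile could nudge a quoted price:)

    # Purely invented: a shopper who viewed the album repeatedly signals
    # willingness to pay, so the data miner quotes a higher price.
    def quote_price(base_price: float, profile: dict) -> float:
        views = profile.get("album_page_views", 0)
        return round(base_price * (1 + min(views, 5) * 0.02), 2)

    print(quote_price(14.99, {}))                       # 14.99
    print(quote_price(14.99, {"album_page_views": 3}))  # 15.89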

What about the idea of selling my privacy? Like selling a kidney?

Let’s try this. The IRS doesn’t like tax cheats, but it doesn’t want to spend money. So it did a statistical analysis of tax returns and the incidence of tax cheating, leading to the DIF formula, which becomes the basis for deciding whether or not to audit. The formula was requested under the Freedom of Information Act, but its release was blocked on national security grounds.
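
(Presumably something like this, structurally – a secret weighted formula over line items; the weights and features below are pure invention:)

    # A DIF-style score: weighted features of a return, weights kept secret.
    SECRET_WEIGHTS = {"deduction_to_income_ratio": 3.0,
                      "cash_business": 2.5,
                      "home_office": 1.2}

    def dif_score(tax_return: dict) -> float:
        return sum(w * tax_return.get(feature, 0.0)
                   for feature, w in SECRET_WEIGHTS.items())

    # Returns scoring above some cutoff get flagged for audit.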

Total Information Awareness: with the collection of a host of innocuous data, combined with analysis and pattern data, judgments can be made – and a result emerges that cannot be explained. So someone can be damned, without recourse to questions of due process or probable cause – “how did this happen?” is a question that cannot exactly be answered. A Minority Report world. Moreover, a host of things we have said shouldn’t be a basis for decisions (gender, age, race, etc.) suddenly are taken into consideration.

What are the alternatives? Investigations by individuals have their own problems.

Information wants to be free, right? So privacy becomes a counter example to Barlow’s thesis – some information should not be free.

A comment crystallizes my thought – it’s the fact that I can interrogate and query a human process, rather than rely upon a computer program based on statistics; the due process issues are the thing that is troubling.

Jonathan points out – what if we just use the tool as a screen, and then go on to do a formal investigation using traditional methods?

And, we are fundamentally confronted with a politically determined problem – how to balance the desires for certain government functions (e.g., security) in exchange for giving up privacy.

So what tools might be used? (a) Stop the collection; (b) limit the uses of data; (c) audit the uses of the data – the sort of rights in the EU Data Directive. So people have the opportunity to check, but most people don’t.

Q: Transparency is the thing I care about; as long as I can verify that they are doing what they ought to be doing, it’s not a problem. Z: So what if the government asked you to fill out a form about everything? A: Well, no. But reasonable collection is OK with me.

Ray: I’m concerned about Big Brother, but it’s little brother that worries me more – recourse and accountability for data collection by firms. Without that, there’s no reason to expect that industry will do the right thing in this space.

Commenter: People seem to be willing to give up amazing things – 30% are willing to give out their physical location in exchange for a free sandwich. If this is the situation, is it possible that the idea of privacy insurance could develop?

Z: The theory of these sorts of markets is the ability to construct a financial/risk instrument that can compensate for certain kinds of losses. We might be able to set up identity theft insurance, since we can quantify the costs; but loss of privacy? What’s the monetary damage to be compensated, and how do you set it?

Molly: OK – I’m getting a little more scared. But there’s more out there still. Let’s talk about RFID chips.

Z: The next cookies – how the internet is becoming part of the real world. RFIDs are devices that function as physical cookies. Jonathan has an RFID in his dog – the dog runs away often – and now the dog can be recovered. If it’s good enough for a dog, isn’t it OK for my child? Why not RFID my kid? We have them in razor-blade packages now, for example. So now I can point an antenna at a home and ID the products that are in there. Or, as you walk into a car dealership, they can scan your clothes’ RFIDs and know whether you’re wearing Armani or Gap slacks – and make appropriate judgments.

Comment: It seems like this is Napster inside out – Fred von Lohmann as the Scott McNealy of file sharing. Is there a functional equivalent of a Creative Commons license?

Let’s talk about P3P – a technical tool for expressing your personal privacy preferences – a standardized set of questions that allow you to specify those preferences – and it will screen your web access activities. So this helps with the opt-in/opt-out transaction costs. But I can still elect to give up my data in dribs and drabs. Plus, P3P is just a way of automating the contract – enforcement is still an issue.

Larry: There is also a part of the story where Microsoft plays the good guy. IE6 rolled P3P into the browser, with the default being to reject cookies from sites without a P3P facility – leading to a dramatic increase in web sites that implement P3P. So tech plus market power does offer up a little more facility.
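
(For the curious, the piece of P3P that IE6 actually checks is just an HTTP response header carrying a “compact policy” – a string of tokens summarizing the site’s full machine-readable policy. A sketch; the token string here is illustrative, not a recommendation:)

    # A server advertising a P3P compact policy alongside its response.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class P3PDemo(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            # Tokens abbreviate policy statements, e.g. DSP = disputes are
            # addressed in the full policy, COR = violations are remedied.
            self.send_header("P3P", 'CP="NOI DSP COR NID"')
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"hello")

    HTTPServer(("localhost", 8082), P3PDemo).serve_forever()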

One more topic – so now I should worry about my own failings and my accidental mouse droppings. What about the things that I need to make available, like my name?

Like the Nuremberg Files – a web site that includes names, addresses, and other public listings, all of people with whom this group has differences. A host of noxious things, done in public.

Note that the Nuremberg Files prosecution was pursued under a theory of limiting threats, rather than as a privacy case.

Two other cases: the Spamhaus Project – a list of known spam operations, the Nuremberg Files for spammers (spamhaus.org); and the NC sex offender and public protection registry, where convicted offenders, released after doing their time, are identified, located, and characterized.

Comment: Don’t forget web bugs – little graphics in pages that track certain kinds of actions without the user knowing.
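
(A web bug is just an invisible image fetched from a tracking server, so the fetch itself – with cookies and Referer attached – records the view. A tiny sketch, with a made-up hostname:)

    # Illustrative only: append a 1x1 invisible tracking image to a page.
    WEB_BUG = ('<img src="https://tracker.example/bug.gif?page=newsletter"'
               ' width="1" height="1" alt="">')

    def add_web_bug(html_page: str) -> str:
        """Slip an invisible tracking pixel in just before </body>."""
        return html_page.replace("</body>", WEB_BUG + "</body>")

    print(add_web_bug("<html><body>Hello</body></html>"))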

Comment: Maybe we have started in the wrong place – shouldn’t the first thing be a discussion of what we mean by privacy, and what policy objective we are trying to devise instruments to serve?
Molly: This is why this is hard – the issues are both wide and deep, and articulating all these elements is terribly complicated.

Comment: This has shown why this is complex and difficult. It seems like being given a choice of what to give up makes a difference. The difference between the US and EU approaches is that ALL personal data is presumptively regulated in the EU, while that is not the case in the US – only certain data is treated that way. It makes for a huge difference, particularly the regulatory overhang.

Lisa Rein: A clarification on IE6 and P3P – yes, it did raise the bar; the bad news is that it was a platform-specific implementation that other programs have a hard time working with.

Z: Note, this is the state of things after 10 years of internet technology. Already the world of privacy has dramatically changed, as has our perception of our visibility in the world. So far, the law has proven to be unwieldy in this space, and the technology being given to consumers is not actually serving the objectives it is ostensibly set up to serve.

Molly: The 9/11 dimension has changed the notion of what represents reasonable expectations of privacy – another key domain.

(close)