EKR: November 2005 Archives

 

November 30, 2005

The New York Times reports that TSA is planning to make some significant changes in airport screening procedures:
  • Allow scissors with blades under 4 inches and tools with blades under 7 inches (but not knives).
  • Randomized secondary screening. (Hey, didn't they used to tell people that it was randomized before?)
  • Varying the screening procedure. This would allow you to do intensive screenings of, say, carry-ons one day and shoes on another.
  • Pat-downs of the lower body.

All of these seem like sensible moves, though of course, I tend to think they could go further and allow knives. Then I could bring my Leatherman.

 

November 29, 2005

Today's Slate has a somewhat confusing and confused article about Internet governance and ICANN.
ICANN hasn't been doing a bad job. For one thing, there have been no major outages in its seven years as cyber traffic cop. Nevertheless, in the months leading up to the summit, a group of countries (most notably Brazil, Cuba, Iran, and Zimbabwe) pressed the United Nations to assume ICANN's functions, while members of the European Union clamored to dilute American control.

This fundamentally misunderstands ICANN's role, which is administrative, not operational. It's true that ICANN has responsibility for who runs the root servers, but the only one they actually operate is L. So, even if they wanted to cause an outage, it's hard to see how they'd actually manage it, unless you assume that the root server operators are singularly careless when they take updates from ICANN.

The Web has become just another front in the battle between the United States and the rest of the world, and Tunisia was a convenient time and place to vent strong anti-American feelings. Although the United States government has not meddled in ICANN's operations yet, our U.N. brethren fear that an America with a unilateral foreign policy will eventually become an America with a unilateral Internet policy. Other countries have every right to be suspicious. If it wanted, the U.S. government could take over ICANN and block Internet traffic to a nation that harbors terrorists. It could access the databases that house domain names and use the information to take down computers serving up anti-American rhetoric or locate state enemies.

Well, sort of. In principle the US government could instruct ICANN to reassign, say, Iran's TLD to some other organization, but this wouldn't block traffic to addresses in Iran, just stop people from resolving addresses in .ir. And, of course, that assumes that the roots would actually accept that redelegation, which isn't actually clear. Now, the US Government could instruct ICANN to instruct IANA to redelegate netblocks in Iran to somewhere else, but as I've noted before, IANA has only limited latitude in how it does those delegations, and it's not clear that operators would accept route advertisements that were clearly in violation of policy.
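To make the resolution/routing distinction concrete, here's a minimal sketch (the hostname and address are placeholders, not real Iranian hosts): killing the .ir delegation breaks the first step but has no effect on the second.

import socket

HOST = "www.example.ir"   # hypothetical hostname under .ir
KNOWN_IP = "192.0.2.10"   # placeholder (TEST-NET) address, for illustration

# Step 1: name resolution. A redelegation of .ir could only break this
# part: the walk from the root to the .ir servers fails, so the name
# never maps to an address.
try:
    ip = socket.gethostbyname(HOST)
    print(f"{HOST} resolves to {ip}")
except socket.gaierror:
    print(f"{HOST} does not resolve (delegation broken)")

# Step 2: IP connectivity. Packets are routed by BGP, not by the DNS,
# so a host whose address you already know stays reachable either way.
try:
    with socket.create_connection((KNOWN_IP, 80), timeout=5):
        print(f"{KNOWN_IP} is still reachable over IP")
except OSError:
    print(f"{KNOWN_IP} unreachable (a routing problem, not a DNS one)")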

Now, as for the question of privacy violations: first, many registrations are available through whois, so there's no need to do anything special to get information. Second, remember that ICANN doesn't actually have anyone's personal information: the registrars have that. So, in order to get that information, ICANN would have to threaten the registrars with losing their delegations. That's a pretty blunt instrument which would clearly strike at the heart of ICANN's legitimacy. And, of course, it's not clear why you would bother with any of this. Finally, most of the major registrars are based in the United States, so the US government could simply cut out the middleman and subpoena the information from them directly.
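To see how little special access is involved, here's a sketch that just shells out to the standard whois(1) command-line tool (assuming you have it installed; the domain is a placeholder):

import subprocess

def whois_lookup(domain: str) -> str:
    """Return the raw whois record for a domain via the whois CLI."""
    result = subprocess.run(["whois", domain],
                            capture_output=True, text=True, timeout=30)
    return result.stdout

# Print only the lines that typically carry registrant contact data.
for line in whois_lookup("example.org").splitlines():
    if line.lower().startswith(("registrant", "admin", "tech")):
        print(line)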

The best solution might simply be to allow any country that wants the job to host the DNS system. How? Peer-to-peer networks like BitTorrent.

Here's how it could work, according to computer security researcher Robert G. Ferrell, a former at-large member of ICANN. Countries that choose to house Torrent servers would receive a random piece of the DNS pie over a closed P2P network, with mirrors set up to correct data by consensus in the case of corruption or unauthorized modification. No one country would actually physically host the entire database. In essence, everybody would be in charge, but no one would be in control. Isn't that how the United Nations functions anyway?

I'm starting to hear a lot more about people being interested in P2P naming, but I don't think it's something we're even close to ready for. For now, let me just say that it's a lot more complicated than it sounds, especially in the area of establishing who has the rights to which name. It's definitely not what one would call a solved problem.

 

November 28, 2005

Paul Martin has just lost a no-confidence motion, which means an election in mid to late January.

For those of us in the US, this is clearly a critical moment. We cannot afford for Canada to become a failed state, with Joual-speaking, hockey stick-wielding hooligans firing moose-dung across the border into Detroit, let alone the unthinkable: another fishing blockade. And yet with so much of the US military deployed in Iraq, a full-scale invasion and peacekeeping mission is out of the question. We may be forced to fall back on... shudder... diplomacy.

 

November 27, 2005

While watching Van Helsing on Friday night, I kept picturing Stephen Sommers planning this film:
So, there's this guy Van Helsing, played by Hugh Jackman, and he kills vampires. But it's not just a vampire movie, see. We'll put in a werewolf, and Frankenstein's monster, and that Igor guy from Young Frankenstein, and uh... who else, that Mr. Hyde guy from the League of Extraordinary Gentlemen and we've got these naked brides of Dracula, and Kate Beckinsale in a corset, but we'll put some ruffles on top so it looks Transylvanian.

Oh, yeah, the plot. Yeah, we'll need one of those. Lemme think. Well, Van Helsing is a monster hunter, like a supernatural Dirty Harry, but sexier and more mysterious because he's lost his memory. And he's sent to Transylvania to save Kate Beckinsale, who's from a family of monster hunters. And, there's this guy who's like Q from James Bond who's got these gadgets for Van Helsing to use, like an automatic crossbow and a light bomb for killing vampires. And then Dracula has these eggs, like in Alien, and he wants to hatch them and make a master race of vamp... Uh, dude can you pass the bong?

As an added bonus when the horse and carriage crashes, it bursts into flame.

The bit about the horse and carriage was pointed out to me by Andrew Houghton.

 

November 26, 2005

In case you haven't noticed, the 9th Circuit is hearing Gilmore v. Gonzales on Dec 8th in San Francisco. Anyone else planning on going?
 

November 25, 2005

What's the one piece of information you would expect to have in an article entitled "Report: Cocaine use up in Europe"? That's right, how much it's up. Unfortunately, that's the one piece of information you don't get in this article.
About 9 million people in the European Union, 3 percent of all adults, have tried cocaine, while up to 3.5 million are likely to have used it in the last year and 1.5 million took the drug in the past month, the report found.

"Historically, cocaine was a fairly rare drug in Europe," said Paul Griffiths, scientific coordinator for the European Monitoring Center for Drugs and Drug Addiction, which published the survey. "Then in Britain, the Netherlands and Spain it became increasingly available in big cities and now it is very visible in national statistics in these countries and our concern is there could be further diffusion in Europe," he said.

In Britain and Spain, more than 4 percent of 15-34 year-olds consumed cocaine in the past 12 months, close to the level in the United States, where cocaine has been a problem for longer than in Europe, the report said. In those two countries, the number of young adults taking cocaine, often described as a trendy recreational drug, exceeded those using ecstasy and amphetamines.

Despite the rising trend in Britain, the growth in cocaine use there appears to be stabilizing at historic high levels while other EU members such as France show signs of catching up, Griffiths said.

The impact of increasing cocaine use is also showing up in health statistics. Although deaths attributed to cocaine use alone are rare, the drug played a role in 10 percent of all drug-related deaths, meaning there could be several hundred deaths per year linked to cocaine in the 25-nation bloc, the report said.

Worse yet, it's not in the EU's press release, which leads with the same "cocaine use is up" message.

However, if you read the actual report, you get a quite different picture of the situation. Here's the relevant summary:

Clear-cut European trends in cocaine use, based on population studies, are still difficult to identify (see section on cannabis trends). However, warnings about increases in cocaine use in Europe have come from several sources, including local reports, focused studies conducted in dance settings, reports of increases in seizures indicators and some increases in indicators related to problems (deaths, emergencies). Recent cocaine use among young people increased substantially in the United Kingdom from 1996 until 2000, but has remained relatively stable since then, although moderate increases have been observed in recent years, and in Spain (91) from 1999 to 2001. Less marked increases were observed in Denmark, Italy, Hungary, the Netherlands and Austria (in local surveys) and, with oscillations over the 1990s, in Germany (Figure 11).

And turning to Figure 11 (last-year usage for 15-34), we see a figure that doesn't even remotely show an overall increase:

At best, this figure would let you claim that cocaine use is up from the mid 1990s in some countries. The big, clear trend is in the UK, but even there it's pretty misleading to say that cocaine use is "up". More accurate would be to say that there was a big jump from 1994-2000 and then things have been pretty flat since then. Indeed, lacking error bars on these figures, it's pretty hard to draw any real conclusions about the size or even existence of any increase since 2000.

Figuring that out requires going straight to the data set. Let's take the UK data. The sample size in 2004 was 8590 with 4.9% of users claiming to have used cocaine in the previous year. Working backwards, this tells us that they had 416 positive responses. Using standard formulas for the accuracy of a point estimate of population proportion, we get a point estimate of 4.9% (as expected) with a 95% confidence interval of (4.45%-5.35%). The rest of the data points (4.5% in 2000, 4.0% in 2001, and 4.3% in 2003) are from similar sample sizes, so this shouldn't give you a real warm feeling about claiming that use is "up" in the 2000-2003 time frame.

The summary of the French data in the Reuters report is even more misleading. The national French estimate for 1999 and 2000 (the only years for which there is data on cocaine) was .5%. The Metropolitan data is up from .4% in 1995 to .7% in 2002, but this is with a tiny sample size of 724, so the 95% CI is (.1%-1.3%) and you clearly can't draw any conclusions about a trend, and certainly not that France is "catching up" to the UK.1
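If you want to check the arithmetic, here's the standard normal-approximation interval applied to both data sets, using the sample sizes and proportions given above:

from math import sqrt

def proportion_ci(p_hat: float, n: int, z: float = 1.96):
    """Normal-approximation 95% confidence interval for a proportion."""
    se = sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# UK 2004: 4.9% of 8,590 respondents reported past-year use.
lo, hi = proportion_ci(0.049, 8590)
print(f"UK 2004: 4.9%, 95% CI ({lo:.2%}, {hi:.2%})")  # about (4.4%, 5.4%)

# France (metropolitan) 2002: 0.7% of 724 respondents.
lo, hi = proportion_ci(0.007, 724)
print(f"FR 2002: 0.7%, 95% CI ({lo:.2%}, {hi:.2%})")  # about (0.1%, 1.3%)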

As usual, there are two failures here. The first is that the researchers presented a not-that-accurate bottom line summary of their results without appropriate caveats about uncertainty or inter-country variation. The second is that the press accepted these results uncritically without bothering to go read the report itself. None of this is to say, of course, that cocaine use isn't up in the EU. Rather, we just can't tell that from this data.

1. There seems to be something funny about the France line, because Fig 11 appears to show totally different data points (and for different years) than are listed in the relevant table. I'm working from the table, but I may have missed something here. Not that the "trend line" in the graph is very convincing either.

 

November 24, 2005

Last night I went to an interesting lecture at Berkeley by Stephen Maurer (slides here). One of Maurer's topics was how to distinguish between acceptable and unacceptable interrogation practices. Clearly there's some consensus that certain practices (e.g., detention, just asking questions) are OK and that others (the rack, electric shock) aren't. The tricky part is where to draw the line. But what struck me was that drugs were on Maurer's list of methods that there's a consensus aren't OK. I agree with that assessment, but I think it's interesting to ask why that should be.

In theory, an interrogation drug that simply got you to tell the truth seems like a really humane form of interrogation. You just give the subject the drug, he tells you what you want to know, and there's no need for any of those tiresome (and tiring) beatings, electrocutions, etc. So, what's wrong with drugs?

Fundamentally, it seems that there are three problems: The first, pointed out by Maurer, is that this isn't how interrogation under drugs actually works, because we don't have some magic "truth serum". Instead, the conventional practice--at least when such things were being studied--was to attempt to induce a state of artificial psychosis and hope you could destroy enough of the victim's personality that he would tell you what you wanted to know. This obviously doesn't sound super-humane and it's easy to understand why you would want to prohibit it, if it's the state of the art.

The second problem, also pointed out by Maurer, is that bodily autonomy makes a convenient bright line, and interrogation under drugs clearly crosses that line. However, just because a line is easy to draw doesn't mean it's the right one. A secondary problem with this explanation is that this particular bright line doesn't actually match up well with people's other intuitions. In particular, many people seem to be comfortable with the kind of low-end physical abuse (slapping, etc.) that the CIA has apparently approved even though it clearly violates bodily autonomy, yet not to be comfortable with drugs.

The final reason, which is sort of the inverse of the first, is that drugs seem to cross another line: towards mind control. It's one thing to temporarily drug someone in order to get information out of them, but it was a clear objective of a lot of the CIA work to permanently reprogram their personality--a la the Manchurian Candidate. That's something that strikes people--rightly in my opinion--as well over the line, and one might imagine that if you had effective non-mind-control drug-based interrogation techniques, they could be converted to mind control techniques. So, maybe better to keep a bright line against all such techniques.

None of this is really that convincing of course, and like many debates about interrogation--or ethics in general--it's not clear that there really is any set of principles that leads to intuitively acceptable results.

 

November 23, 2005

One of the most difficult parts of designing a mechanical timepiece is the escapement, the interface between the regulation system (e.g., the pendulum) and the drive system, and to a great degree the history of clock design is really the history of escapement design. Unfortunately, I always find it pretty hard to visualize the operation of an escapement from the static diagrams in books. I was therefore very happy to find Matt Headrick's Horology Page, which not only has an 80+ page eBook on escapement design but also animated simulations of the operations of a number of major escapements.
 
Nagendra Modadugu pointed me to a story about a French woman who tried to open an aircraft door during flight:
Sandrine Helene Sellies, 34, was placed on a good behavior bond after pleading guilty in Brisbane Magistrates Court to endangering the safety of an aircraft.

...

She walked toward one of the aircraft's emergency exits with an unlit cigarette and a lighter in her hand and began tampering with the door, prosecutors said. But a flight attendant intervened and took Sellies back to her seat.

Note that there really isn't any danger to the aircraft here, because you can't open the door on an airliner while the cabin is pressurized. Good thing, too, since this would be a fairly attractive terrorism target.

 

November 22, 2005

Of course, one potential interpretation of the Bush/Cheney "we won't cut and run" message is that it's simply posturing. One classic game theoretic strategy is to signal that you're committed to some strategy--or even to publicly commit yourself to it (how do you win at chicken? Throw your steering wheel out the window). The problem is that that strategy only works if your opponent would be better off conceding than fighting and losing. It's not at all clear that that's the case in Iraq: do the insurgents have anything better to do than continue to fight? And if it's not, then this strategy doesn't buy you anything. It's particularly problematic in this case because the threat is only semi-credible. True, Bush can refuse to withdraw, but if the war is going really badly in 3 years, then the next president and Congress will likely be forced to.
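If you want the logic spelled out, here's a toy payoff table for chicken; the payoff numbers are made up, but the structure is the standard one:

# Payoffs as (us, them); strategies are "fight" and "concede".
payoffs = {
    ("fight", "fight"):     (-10, -10),  # mutual disaster
    ("fight", "concede"):   (5, -1),
    ("concede", "fight"):   (-1, 5),
    ("concede", "concede"): (1, 1),
}

def their_best_reply(our_move: str) -> str:
    """The opponent's best response once we're visibly committed."""
    return max(("fight", "concede"),
               key=lambda theirs: payoffs[(our_move, theirs)][1])

print(their_best_reply("fight"))  # "concede": the commitment works

# But if the opponent loses little by fighting on (nothing better to
# do than keep fighting), the committed threat buys nothing:
payoffs[("fight", "fight")] = (-10, 0)
print(their_best_reply("fight"))  # "fight"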
 
Cryptome has posted some pictures of suicide bomber vests, some constructed by the army for training purposes. The top set of pictures is especially interesting because it has some close-up pictures of pockets that appear to contain drill bits and nuts, presumably designed to generate shrapnel when the bomb goes off.
 
The Bush Administration has reacted badly to the suggestion that the US withdraw from Iraq. The standard line, of course, is that we can't "cut and run", but we need to be realistic. Even if you don't believe Fred Kaplan's argument that the US will need to withdraw sooner rather than later, the US's resources aren't unlimited, so there need to be some decision criteria for when you decide that it's a lost cause and are willing to withdraw. This doesn't apply just to Iraq, of course. You need go/no-go points for any project. This is just the one under discussion.

However, the problem in this case is that this is an adversarial environment, and so publishing your decision criteria gives an advantage to the adversary. To make matters worse, we don't have a similar kind of insight into our adversary's strategic position, so it's very hard to determine whether the US is making progress. All we have is the observables. This differs from a conventional war, where you have reasonable intelligence about the number of remaining enemy forces.

Imagine that you create some simple benchmark, e.g., the number of bombings per day has to be below X by December 2006 or we declare defeat. Now, come December next year, this goal hasn't been met. This could either mean that the adversary has lots of remaining capabilities or that they've concentrated all their efforts to meet this goal and force us out. (You'll recall the arguments that the Tet Offensive actually represented a major depletion of North Vietnam's capabilities, and that the US could have prevailed if it had just been willing to stay the course.) The above analysis assumes that the adversary's objective is to drive the US out. If they want to keep the US in and continue to bleed our resources, it's even easier to control the situation. Just back off a bit and let the US meet whatever milestones they have set, then start up again.

In general, it's an incredible disadvantage to have your enemy know your strategic position when you don't know theirs. This is always a problem in democracies, and the standard solution is to have a detailed strategy known to the leaders but only tell the public the broad outlines. The problem, of course, is that this relies on the public being willing to trust that the leadership knows what it's doing, and now that the Bush administration has self-evidently lost that trust, it badly compromises the US's position.

 

November 21, 2005

EFF has filed a class action lawsuit against Sony BMG. Texas is also suing them. As I noted previously, this whole incident has been really good news for DRM-haters. With any luck, Sony will take a big financial hit to go with the PR hit they've already taken, which should give future DRM vendors substantial pause.
 

November 19, 2005

According to ABC News, here are the six CIA interrogation techniques:
1. The Attention Grab: The interrogator forcefully grabs the shirt front of the prisoner and shakes him.

2. Attention Slap: An open-handed slap aimed at causing pain and triggering fear.

3. The Belly Slap: A hard open-handed slap to the stomach. The aim is to cause pain, but not internal injury. Doctors consulted advised against using a punch, which could cause lasting internal damage.

4. Long Time Standing: This technique is described as among the most effective. Prisoners are forced to stand, handcuffed and with their feet shackled to an eye bolt in the floor for more than 40 hours. Exhaustion and sleep deprivation are effective in yielding confessions.

5. The Cold Cell: The prisoner is left to stand naked in a cell kept near 50 degrees. Throughout the time in the cell the prisoner is doused with cold water.

6. Water Boarding: The prisoner is bound to an inclined board, feet raised and head slightly below the feet. Cellophane is wrapped over the prisoner's face and water is poured over him. Unavoidably, the gag reflex kicks in and a terrifying fear of drowning leads to almost instant pleas to bring the treatment to a halt.

According to the sources, CIA officers who subjected themselves to the water boarding technique lasted an average of 14 seconds before caving in. They said al Qaeda's toughest prisoner, Khalid Sheik Mohammed, won the admiration of interrogators when he was able to last between two and two-and-a-half minutes before begging to confess.

Apparently as long as we're not applying electric shock to their genitals it isn't really torture.

 

November 18, 2005

As you've no doubt heard, the music industry wants Apple to replace their $.99/song flat rate pricing at the iTunes Music Store with variable pricing.
Virgin and HMV music retailers both recently opened digital music stores in the U.K. that sell downloads using variable pricing.

"Not all music is worth the same," Mulligan said. "Madonna's latest album is worth more in the early weeks of release than an old Pink Floyd album. Though consumers will have to pay more for some songs, they will also get many much cheaper. Online stores can thus run sales in the same way that traditional music retailers do."

It's obvious why the music companies think they want this: they think they can extract more money from users. I'm not so sure. Remember that because the marginal cost of production is essentially zero, any particular price point is basically arbitrary from the consumer's perspective. $.99 has the advantage of being a focal point, and so fair-seeming (note that the price is also .99 in both Canadian dollars and Euros). If customers start thinking too much about the arbitrariness of the prices, they may decide that those prices are basically unfair and record companies may find that they aren't able to sell any music at $.99.
 

November 17, 2005

This morning I woke up with a cold. That's the bad news. The good news (for both me and you) is that I have a highly evolved strategy for managing it using a complex and finely tuned array of prescription and nonprescription pharmaceuticals, and I'm now motivated to write that up. But first a warning: I've developed this strategy via a combination of reading the literature and trial and error, which means that (1) it's suited to the cluster of symptoms that bug me and (2) while I'm generally satisfied, it's not something I'd be willing to write up in the NEJM.

Basic principles
I want to start with some basic principles which it's important to keep in mind throughout the discussion. The first thing you need to do is accept that you're going to be sick for a while. Your goals for cold management are two-fold: minimize the duration of the time you're sick and suppress as many of the symptoms as possible. The good news is that we have drugs that will suppress most of your symptoms. The bad news is that they have side effects. The result is that cold management is to a great degree about managing the side effects of the various meds you're taking to suppress your cold symptoms.

Because of the side effect issue, you generally want to stay away from all-in-one cold preparations, like NyQuil. These generally consist of a bunch of common cold drugs shoved into a single pill. They're designed to suppress all the symptoms everyone has, which means they probably suppress symptoms you don't have, which means you're getting side effects you don't want. (The big one here is that a lot of them contain antihistamines, which make you drowsy). Accordingly, I generally advise sticking with individual drugs to get the effects you want. It's more work, but you get finer control.

Managing Congestion
My major cold complaint is nasal congestion. If I just feel general malaise, I can lie around and watch TV, read, or sleep, but when I'm congested, I'm constantly miserable and it makes it hard for me to sleep--which makes it hard to get better. So, getting uncongested is job number one. I have a three-pronged strategy here.

  1. Flonase is a corticosteroid nasal spray that is used to treat nose-related allergies (I hear that Beconase works well too). If you're taking it on a daily basis anyway, then you're probably in good shape here, but you might want to up the dosage to 100 mcg (two sprays)/nostril/day if you're not on that already. If you're not taking it on a daily basis (I only need it during allergy season) you want to start on it as soon as you feel the cold symptoms. I find that Flonase does a pretty good job of suppressing congestion (and sneezing and runny nose), but the problem is that it takes 2-3 days to take effect, and even then you're probably not 100% clear. That's where prongs two and three come in. One note here: Flonase is prescription only, so you may have trouble getting a prescription for cold management.
  2. Nasal spray (I use oxymetazoline HCl) is by far the most effective decongestant I've ever found. The problem is that if you use it regularly you can get rebound congestion when you stop. I deal with this by only using nasal spray at night and using the minimum possible dose. I find that one spray per nostril is typically fine, and as I get better and not quite so stuffed up, I alternate nostrils every other night. But if you're only using nasal spray at night, then you're still congested during the day. This is where pseudoephedrine comes in.
  3. Pseudoephedrine is an oral decongestant. You can get either an immediate release form (30 mg tabs) or an extended release (120 mg over 12 hours or 240 mg over 24 hours). I generally prefer the extended release form for two reasons. First, I find that I get more consistent relief throughout the day. Second, pseudoephedrine is a mild stimulant and I find that the extended release form makes me less jittery. The stimulant effect also means that pseudoephedrine is not so great to use at night, so that's where I use the nasal spray.

Cough
After a couple of days of nasal congestion and post-nasal drip, you're likely to be coughing pretty badly. The state of the art in non-prescription cough relief is dextromethorphan (the stuff in Robitussin DM). I recommend Delsym, which is just dextromethorphan in an extended release package, so you get 12 (well, really about 8) hours of cough relief, which allows a good night's sleep.

Delsym works pretty well, but the gold standard for cough relief is opiates. The standard drug seems to be this stuff called Tussionex, which is an antihistamine combined with hydrocodone (the active ingredient in Vicodin). The truth of the matter is that it's the opiate that's really doing the job, and pretty much any opiate will do. If you have some Vicodin (hydrocodone + acetaminophen) hanging around, that will work just as well. As an added bonus, the opiate will help you sleep, which, as I've said, is crucial for getting healthy.

Aside from being hard to get your hands on, the other problem with the opiates is that they're sedating, so you're not going to be getting much done. Stick with Delsym for the day and save the opiates for nighttime use.

Aching
This is more of a flu symptom, but people do get it with colds too. One word: ibuprofen. On the other hand, if you're popping Vicodin to deal with your cough, you probably aren't going to have much of an aching problem.

Insomnia
Of course, if you feel like crap, especially if you're congested (see above), you may have trouble falling asleep. I generally avoid antihistamines for cold treatment. I don't find that they work that well and they're pretty sedating, which interferes with getting stuff done. Actually, I suspect these problems are related: for a long time there was probably a drowsiness/effectiveness tradeoff (until Seldane and the rest of the non-drowsy antihistamines came out), so the all-in-one cold med makers had to compromise.

In this case, however, drowsiness is a feature, not a bug, so there's no need to compromise. What I recommend here is diphenhydramine (Benadryl). Diphenhydramine is much more sedating than the standard antihistamines that go into cold meds (typically chlorpheniramine or brompheniramine maleate)--it's actually used as a nonprescription sleep aid. 50 mg (two tablets) is a pretty good dose.

Another option, of course, is to use a prescription sleeping drug such as Halcion or Ambien, but my general experience is that for this application diphenhydramine is a better choice, probably because it's helping suppress the cold symptoms as well. The one downside is that it can leave you a bit drowsy in the mornings, so use sparingly.

Minimizing the duration
So far I've mostly been focusing on symptom suppression, because it's not clear that you can do much to actually shorten the course of a cold. The only treatment that's shown any real promise at all is zinc nasal spray, but there's only one small study that shows it works, and there's the whole anosmia issue to consider. I've been known to use zinc nasal spray, but I can't strongly recommend it.

However, even if there's nothing you can do to shorten the duration of a cold, you can definitely lengthen it, mainly by not taking care of yourself, not getting enough sleep, etc. This is a particular problem if you're doing a good job of symptom suppression, since you may feel good enough to almost forget that you're sick.

A note for athletes on training while sick
One question athletes often have is whether they can train when they're sick. The standard rule of thumb is that if the symptoms are above the neck it's safe to exercise, but if they're below the neck you shouldn't. I don't know of any actual scientific basis for this, however. What I generally do is follow this rule but cut back to mostly easy cardio workouts until I feel better. Actually, I generally find that you have to take it easy even after you feel better, since your athletic performance tends to stay suppressed even after the rest of you has started to feel better. If you take morning heart rates, they can be a good indicator of when you're ready to train again.

 

November 16, 2005

Robert Charles Wilson is one of my favorite SF authors. His specialty is wildly speculative science fiction (often of a religious nature) that concentrates on the human element. His latest novel, Spin, is another excellent outing. One day, the stars just go out, and upon investigation it's found that the Earth is surrounded by a semi-permeable barrier that's opaque to light. So far, the premise feels a lot like Greg Egan's Quarantine, but Wilson takes things in a completely different direction. Upon some investigation it turns out that everything inside the "Spin Membrane" is running at a radically different time scale from the rest of the universe: one year inside is 100 million years outside, so the remaining lifetime of the Sun is about 50 years.

Wilson tells the story from the perspective of Tyler Dupree, who's sort of a bit player in the whole thing. He's the childhood friend of someone who eventually becomes extremely important in humanity's response to the Spin. Tyler's a bit important himself, but mostly we get to see how other people respond through his eyes. And unlike in many science fiction novels, the other people mostly have real, plausible personalities--at least enough that you're interested in their stories.

Most of Wilson's other stuff is good. I particularly recommend The Divide and A Bridge of Years.

 

November 15, 2005

Ed Felten has been busy ably documenting the Sony DRM rootkit and the even bigger debacle that is their uninstaller. Now, it's true that DRM is inherently problematic to write, but there's no reason why it had to be as clumsy and intrusive as Sony's is. This is obviously bad news if you've got this stuff on your computer, but it's potentially good news for the world at large. In general, the public has trouble getting worked up about DRM or even the more aggressive anti-copying measures the content providers want to use. But something this offensive has the potential to get ordinary users annoyed, so this incident may turn out to be a net win.
 

November 14, 2005

Kenneth Neil Cukier has an article in the Nov/Dec issue of Foreign Affairs about control of the Internet. Unfortunately, while the broad message (the EU and China want to take control away from ICANN) is right, it's full of errors, which in many cases are seriously misleading. Here's what I've noticed so far:

Any network requires some centralized control in order to function. The global phone system, for example, is administered by the world's oldest international treaty organization, the International Telecommunication Union, founded in 1865 and now a part of the UN family. The Internet is different. It is coordinated by a private-sector nonprofit organization called the Internet Corporation for Assigned Names and Numbers (ICANN), which was set up by the United States in 1998 to take over the activities performed for 30 years, amazingly, by a single ponytailed professor in California.

...

One of the most cherished myths of cyberspace is that the Internet is totally decentralized and inherently uncontrollable. Like all myths, this one is based on a bit of truth and a heavy dose of wishful thinking. It is true that compared with the century-old telephone system, the Internet is a paragon of deregulation and decentralization. In four critical areas, however, it requires oversight and coordination in order to operate smoothly. Together, these areas constitute the "domain name system" of addresses, with which users navigate the Internet and send e-mail.

The four areas Cukier is talking about are (1) domain names, (2) IP addresses, (3) root servers, and (4) standards. First of all, only domain names and root servers are part of the "domain name system". Second, ICANN's control of the DNS really devolves to their control (partly by contract and partly by moral suasion) of the roots. Third, you might get the impression from this that ICANN controls IP addresses and standards, which is only sort of true. ICANN does have the contract to do IP address allocation, but they do so under the direction of the IETF and as documented in RFC 2050. And of course, ICANN doesn't control standards at all. They're made by IETF and to a lesser degree W3C and a host of other standards organizations.

Third are what are called root servers. Some form of control is needed in the actual machines that make the domain name system work. When users visit Web sites or send e-mail, big computers known as root servers match the domain names with their corresponding Internet Protocol numbers in a matter of milliseconds. The database is the world's most important Rolodex. Yet due to a technical hiccup that occurred when the network was young, there can be only 13 root servers, some of which provide data to mirror sites around the world. As a result, somebody must decide who will operate the root servers and where those operators will be based. Because the system evolved informally, the root servers' administrators are diverse, including NASA, a Dutch nonprofit organization, universities, the U.S. military, and private companies. Today, all told, ten root servers are operated from the United States and one each from Amsterdam, Stockholm, and Tokyo.

This paragraph is really confusing because it gives you the impression that DNS service is somehow basically centralized and that somehow the root servers know the address of, e.g., www.educatedguesswork.org. But as I've written before, this isn't how things work. DNS is a distributed system. The root servers just delegate to the TLDs (.com, .net, etc.), and that delegation database is actually quite small. The actual resolution of the next level of domains is done by entirely different servers.

By the way, the 13 root servers thing is kind of interesting. When you contact the root servers, the response contains NS records and A records telling you which name servers are authoritative for the root zone (.). The maximum DNS response size guaranteed to work over UDP (without fragmentation or EDNS0) is 512 bytes, which is just large enough to fit 13 results. This limits the number of root nameservers to 13, though many of them are anycast.
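You can check this yourself. Here's a sketch using the dnspython package (an assumption on my part; any DNS library would do), querying a.root-servers.net directly:

import dns.message
import dns.query

# Ask a root server which name servers are authoritative for ".".
query = dns.message.make_query(".", "NS")
response = dns.query.udp(query, "198.41.0.4", timeout=5)

names = sorted(rr.target.to_text() for rr in response.answer[0])
print(len(names), "root server names:", names)

# The answer is engineered to fit the classic 512-byte DNS/UDP limit.
print(f"{len(response.to_wire())} bytes on the wire (limit: 512)")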

Until 1998, the Internet was overseen almost exclusively by one man: Jon Postel, a computer science professor at the University of Southern California. As a graduate student in the 1960s, he was among the handful of engineers who built the Internet. For the next 30 years, he managed it on behalf of the Department of Defense's Advanced Research Projects Agency, which funded the Internet's initial development.

This bit about Postel running the Internet is oft-repeated but basically a serious overstatement. It's true that Postel did a lot of important stuff, in particular running IANA and the RFC Editor, but many of these tasks were fundamentally administrative and done under direction of the IETF. In particular, Postel certainly never wrote all the Internet technical standards himself, though he did have a hand in some of the key ones. Postel did have a pretty free hand with name assignment (again, of the TLDs) but even then, his control was pretty limited. When Postel tried to move the root, the Clinton administration came down on him pretty hard and he had to back down. It's important to remember that Postel's control depended in large part on the cooperation of others who respected his judgement. People didn't have to do what he said, but they did in many cases because they trusted him.

ICANN's private-sector status, moreover, has helped keep the Internet free from political interference. When in 2002 members of the Federal Communications Commission were asked by their counterparts at China's Ministry of Information Industry why Taiwan had been allocated its own two-letter domain (".tw"), the commissioners could pass the buck to ICANN and breathe a sigh of relief.

Actually ICANN didn't have discretion about whether to allocate .tw either. RFC 1591 (another IETF document, of course) defines the criteria for allocation of ccTLDs and ties them closely to ISO 3166. ISO 3166 defines .tw and so ICANN allocated it.

Watching the United States go to war in Iraq despite global opposition, these diplomats saw ICANN as yet another example of American unilateralism. What would prevent Washington, they argued, from one day choosing, say, to knock Iran off the Internet by simply deleting its two-letter moniker, ".ir," from the domain name system?

I hear this stuff a lot, but realistically I don't believe that it's going to happen, because the root server operators would revolt and if they didn't the big network operators would arrange to replace them because disrupting the network is bad for business.

At the end of the day, what's wrong with this article--and most of the other coverage I've read of the Internet governance issue--is that it assumes that the Internet is basically a command and control system, and so whoever has nominal control has real control. The truth of the matter is that it's a collaborative effort and that common practice is more important than top-down direction, especially in the area of standards, which are almost entirely dependent on vendor and user deployment. Failure to realize this--and the bogus assumption that somebody has to be in charge--is what leads to absurdities like the notion that somehow Jon Postel controlled the entire Internet. Until people realize this, they'll be in the same position as Seabright's Communist official who asked who was in charge of the bread supply to London.

 

November 12, 2005

LA transit planners are considering building an 11 mile tunnel between Orange and Riverside counties (presentation here). The Times article I link to above focuses on people's concerns about earthquake safety--which does seem like a plausible concern:
Litschi said engineers were waiting to see if the committee chooses the tunnel option before doing more studies on the Lake Elsinore fault system, but acknowledged that seismic activity is a ''major concern.''

Local officials have worked closely with a British engineering company that has helped build some of the largest tunnels in the world and has concluded that the tunnel is ''viable and feasible,'' said H. Tony Rahimian, a consultant who helped devise the tunnel proposal.

''A tunnel is actually a very safe place. We don't want to run it through the faults and we're going to avoid that,'' he said.

Orange County traffic is really appalling, so clearly something needs to be done. Unfortunately, neither the Times article nor the project web site provides any clear indication of the level of concern one should have over seismic stability.

 

November 11, 2005

My presentations from IETF 64:

Also worth checking out: the Minutes of the TLS Working Group.

 

November 10, 2005

For some unknown reason, here on the IETF network any attempt to go to www.united.com feeds you into a redirect cycle with the following response (line breaks inserted for clarity).
HTTP/1.1 200 OK
Date: Fri, 11 Nov 2005 06:26:41 GMT
Server: Apache/1.3.26 (Unix)
Connection: close
Content-Type: text/html

<HTML><HEAD><META HTTP-EQUIV=Refresh CONTENT="0;
URL=http://www.united.com"><META NAME="keywords"
CONTENT="UAL,United,United AirLines,Untied AirLines,untied airlines,
United Air Lines,Untied Air Lines, untied air lines,Mileage Plus,Milage
Plus,mileage plus,milage plus, United Mileage Plus,Untied Mileage Plus,
Untied Milage Plus,untied connection on the web,United vacations,
United Vacations,Untied vacations,Untied Vacations,United Express,
Untied Express,united express,untied express,United Groundlink,
United Ground Link,united groundlink,united ground link,frequent
flyer, Business One,Busness One,business one,busness one,
Business 1,Busness 1,Silver Wings,Silver Wings Plus,Sliver
Wings,Sliver Wings Plus, silver wings,silver wings plus,sliver
wings,sliver wings plus,United Shuttle,Untied Shuttle,united shuttle">

Strangely, it works fine from my home network. I wonder if this is somehow an artifact of being on a Canadian network and it's crazily trying to forward me to some Canadian web site. Another possible explanation is that we're under attack or some gateway is trying to be helpful.
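For the curious, here's a quick standard-library sketch of what the browser is fighting: fetch the page, find the meta refresh target, and check whether it points back at the URL you started from.

import re
import urllib.request

META_REFRESH = re.compile(
    r'http-equiv=["\']?refresh["\']?[^>]*url=([^"\'>]+)', re.IGNORECASE)

def refresh_target(url):
    """Return the meta-refresh target of a page, if any."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        body = resp.read(65536).decode("latin-1", errors="replace")
    match = META_REFRESH.search(body)
    return match.group(1).strip() if match else None

start = "http://www.united.com"
target = refresh_target(start)
if target and target.rstrip("/") == start.rstrip("/"):
    print("page refreshes to itself: redirect loop")
else:
    print("refresh target:", target)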

Anyway, if you want to connect to United from the IETF network. https://www.ua2go.com/ci/Login.jsp?return_to=eug_pur will get you there. For added fun, go to: https://www.ua2go.com/ and check out the iPlanet Web Server default page.

 
For some time now, RPSEC has been trying to develop requirements for routing security. That work is somewhat stalled, and SIDR appears to be an attempt to acknowledge that and move forward with developing IDR security technology in the absence of a complete requirements analysis.

There were presentations by Russ White (soBGP), Steve Kent (sBGP), and Marcus Leech (psBGP) on their respective proposals. There wasn't too much discussion of the relative merits of each proposal.

There seemed to be general enthusiasm for forming some WG, but there was contention about what should be on the charter. The two big possible work items are (1) setting up the infrastructure for authorizing who is currently responsible for a given AS and (2) protocols for actually carrying authenticated routing path information. A number of the routing types (Sue Hares in particular) want to focus on (1) and put off (2) in deference to RPSEC completing. The counterpoint concern is that RPSEC is making very slow progress and that gating progress on that work is problematic.

I don't have a good recommendation here... It does seem like RPSEC is kind of bogged down, but I'm also not sure we can do a good job on this project without some kind of clear idea of what we're trying to accomplish, especially in view of the confusing state of RIR record keeping.
 

November 9, 2005

I'm currently sitting in the IETF plenary and I've only been able to be on the wireless network about 50% of the time--pretty much par for the course at this IETF, and probably a bit worse than average. Even at a good IETF meeting, the network generally doesn't work the first day. I'm not a radio engineer, but it's never been clear to me why it's so hard to get this technology working and keep it working. It's true that the IETF is a particularly challenging environment, but you'd think that after 5 years or so of experience with wireless networks this size, we would have figured it out. Anyone understand what's up? Is it that the radio characteristics of each venue are really different? That they use different equipment each time? All of the above?
 

November 7, 2005

As I noted previously, the basic redistricting algorithm that Prop 77 requires appears to be fundamentally sound. On the other hand, it's clearly being promulgated for political advantage. How to reconcile these two? Check out Mark Kleiman's post from Oct 17, where he reposts a reader's argument that it's the contiguity guidelines for the redistricting that are biased.
My "no" vote is predicated on the following section of the proposed law:

(f) District boundaries shall conform to the geographic boundaries of a county, city, or city and county to the greatest extent practicable. In this regard, a redistricting plan shall comply with these criteria in the following order of importance:

(1) create the most whole counties possible,

(2) create the fewest county fragments possible,

(3) create the most whole cities possible, and

(4) create the fewest city fragments possible, except as necessary to comply with the requirements of the preceding subdivisions of this section.

(g) Every district shall be as compact as practicable except to the extent necessary to comply with the requirements of the preceding subdivisions of this section. With regard to compactness, to the extent practicable a contiguous area of population shall not be bypassed to incorporate an area of population more distant.

This is a recipe not for competitive districts, but for neutralizing Democratic votes as much as possible by concentrating them in urban-county bantustans.

Just look at this table if the effect isn't immediately clear. LA, Santa Clara, Alameda and SF would be around 20 CDs with 65/35 D/R splits. SD, Orange, San Berdoo, Riverside, Sacto and Contra Costa would be around 18 CDs with 45/55 D/R splits.

As an under-the-radar attempt to defang California's urban Democratic voters, it's crass partisan, cultural and class warfare of the worst kind. Ugh, ptui.

Uggabugga has a nice graphic showing how "compact districting" may not be as "neutral" as it sounds. Steven Hill at Mother Jones looks at the issue more systematically: with Democrats concentrated in urban areas, Republicans can win a majority of the seats with a minority of the votes.

This actually raises an interesting policy question about what "fair" means. The reason we have geographic districting rather than, e.g., completely at-large elections or random assignment is that we imagine that people who live close together have aligned interests and so should end up voting for the same representatives. It's not clear (at least to me) that any geographically oriented districting scheme can simultaneously serve that purpose and produce electoral outcomes that mirror the population at large.
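As a back-of-the-envelope check on the numbers in the comment above (treating the 20 urban and 18 suburban districts as uniform blocks):

# (number of districts, Democratic share of the two-party vote)
blocks = [(20, 0.65), (18, 0.45)]

seats = sum(n for n, _ in blocks)
dem_vote = sum(n * share for n, share in blocks) / seats
dem_seats = sum(n for n, share in blocks if share > 0.5)

print(f"Democratic share of votes: {dem_vote:.1%}")  # ~55.5%
print(f"Democratic share of seats: {dem_seats}/{seats} = {dem_seats/seats:.1%}")  # ~52.6%

A ten-point statewide edge in votes translates into barely more than half the seats: every Democratic vote past 50%-plus-one in a packed urban district is wasted.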

 

November 6, 2005

On my trip through SFO on my way to Vancouver, I once again refused to take off my shoes and was rewarded with a trip through the GE Entryscan explosive scanner. Basically, you step into this pod and there are glass doors right in front of you. There's a bunch of red LEDs in the form of a hand telling you to stop. These big air puffs shoot out at you and then you wait for the machine to process your sample and display the green OK signal telling you that you're not in imminent danger of exploding. The processing time is about 15 seconds, which gives you plenty of time to wonder whether the residual cocaine on the money in your pocket (you did know this also scans for narcotics, right?) is about to set off the scanner and give you a free strip search. That's it, though. Clearly there's lots of science there but it's all on the inside. And I doubt the TSA types are going to let me take the back off the unit and look inside.
 

November 5, 2005

News.com reports that Westchester County NY is considering requiring a firewall on all wireless networks operated by "commercial businesses":
Politicians in Westchester County are urging adoption of the law--which appears to be the first such legislation in the U.S.--because without it, "somebody parked in the street or sitting in a neighboring building could hack into the network and steal your most confidential data," County Executive Andy Spano said in a statement.

The draft proposal offered this week would compel all "commercial businesses" with an open wireless access point to have a "network gateway server" outfitted with a software or hardware firewall. Such a firewall, used to block intrusions from outside the local network, would be required even for a coffee shop that used an old-fashioned cash register instead of an Internet-linked credit card system that could be vulnerable to intrusions.

...

The proposed law has two prongs: First, "public Internet access" may not be provided without a network gateway server equipped with a firewall. Second, any business or home office that stores personal information also must install such a firewall-outfitted server even if its wireless connection is encrypted and not open to the public. All such businesses would be required to register with the county within 90 days.

This seems like the kind of well-meaning but basically useless measure you get when people who don't understand technology try to make rules for it. The reasoning goes something like this: Wireless networks are insecure. That's bad. Firewalls are used to secure your network. Therefore, businesses should be required to install firewalls.

Even if you believe (which I don't) that counties should be in the business of regulating people's network security, there are two problems with this proposal. First, there's no real evidence that open APs are the major threat to the security of commercial networks. After all, lots of intrusions happen over the Internet. The number of people who could potentially break into your system over the Internet vastly exceeds the number of people in the local area attached to your AP. And there's no talk here of requiring businesses who don't operate wireless networks to have firewalls.

Second, the requirement to have a firewall on your "gateway server" is basically meaningless. These days, some kind of firewall is a standard feature on even extremely low end wireless routers. And, of course, it's trivial to have a firewall but not configure it correctly. Unless Westchester is going to get into the business of certifying people's actual installations, just making people sprinkle on some firewall pixie dust is unlikely to have much of an effect.

 

November 4, 2005

On Nov 8, California voters will have an opportunity to vote on Proposition 77, which replaces the current legislative redistricting scheme with one performed by a panel of 3 retired judges. As I've mentioned before, the current redistricting scheme is extremely susceptible to gerrymandering and Prop 77 claims to fix that.

I haven't studied the Prop 77 redistricting scheme in detail but my initial impression is that it's superior to the current system in some important respects. The judges are selected by a complicated and partially (but not completely) random process which requires that there be representatives from both political parties and that they agree unanimously to any plan. This creates a barrier to Texas-style redistricting designed to consolidate the power of a single political party, but is still susceptible to collusion designed to reduce the number of competitive races.

One semi-weird feature of the proposed scheme is that the new districts need to be approved by voters but that happens in the first election after they're proposed and that election is performed with the new districts. That's probably not that great a design, but I don't see that it's really fatal. A second weird feature--and one that's being objected to widely--is that Prop 77 requires the next redistricting to happen immediately rather than in 2011 when it would otherwise occur. On the one hand, the current districts are gerrymandered and so this probably isn't too bad, but this still smacks of trying to change the rules in midstream, which has created a lot of resistance to the measure.

At the end of the day, I'd probably prefer an automated random process (despite the known problems), but I tend to think that the proposed scheme is superior to what we have now. However, it also seems that you could construct a similar scheme without some of the features that people find objectionable in this one.

 

November 2, 2005

Bruce Schneier writes:
We also need "SHA2," whatever that will look like. And a design competition is the best way to get a SHA2. (Niels Ferguson pointed out that the AES process was the best cryptographic invention of the past decade.) Unfortunately, we're in no position to have an AES-like competition to replace SHA right now. We simply don't know enough about designing hash functions. What we need is research, random research all over the map. Designs beget analyses beget designs beget analyses.... Right now we need a bunch of mediocre hash function designs. We need a posse of hotshot graduate students breaking them and making names for themselves. We need new tricks and new tools. Hash functions are a hot area of research right now, but anything we can do to stoke that will pay off in the future.

I think Bruce is mostly right here. Certainly, none of the cryptographers I know feel comfortable enough to recommend a new function for general use. And unlike block ciphers, we didn't have even a modest-sized pool of pre-existing algorithms that people felt were probably OK though not ideal (think IDEA, Blowfish, etc.) and were in wide use. Part of the problem here is the common heritage of so many of our hash functions and part of the problem is that the Wang attacks on SHA-1 took people by surprise, unlike with DES, where people had known it was aging for years and had had plenty of time to develop alternatives. Another part of the problem is that designing hashes wasn't very sexy. That's certainly changed, though.

 

November 1, 2005

At the NIST hash function workshop yesterday (liveblogging by Bruce Schneier), there was a lot of discussion about what the impact of an actual demonstrated collision in SHA-1 would be, as opposed to the "theoretical" 2^64 difficulty attacks published by Wang.

There are three reasons why you might think that an actual collision would be important. The first is that you don't 100% believe that Wang's attacks work, since after all they've never actually been tested. In that case, the existence of a collision is proof that the attacks actually work. I'm not qualified to have an opinion here, but my impression is that most cryptographers think that the attacks are likely to work. Assuming that they work, then 2^64 is totally within reach of a large-scale distributed computation mechanism, so it's basically a matter of time before someone publishes a collision, even if the analytic attacks are never improved.
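As a sanity check on "within reach", here's the arithmetic with some made-up but plausible round numbers for the hash rate and machine count:

hashes_needed = 2 ** 64
rate_per_machine = 10 ** 8   # assume ~100M SHA-1 trials/sec per machine
machines = 100_000           # a large distributed effort

seconds = hashes_needed / (rate_per_machine * machines)
print(f"{seconds / 86400:.0f} days")  # on the order of three weeks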

The second reason is that you think that the attacks work but that the public (as opposed to the cryptographic community) would react differently to the news of an actual collision than to a merely theoretical attack. This could either be because they haven't correctly marked Wang's papers to market or because they would overreact to the news of an actual collision. This doesn't seem right to me. In fact, I suspect that most people who have heard that SHA-1 is broken (and there aren't that many of those) don't even know that no actual collision has been published.

The final concern is the impact of a single collision. As has been observed by Daum and Lucks, you can exploit a single collision to generate an arbitrary number of pairs of documents with the same message digest but that display differently. As I've noted before, I don't think that this is a very significant attack (though Georg Illies presented an interesting paper yesterday showing how to exploit this for some fairly primitive file formats). Most of the good attacks using collisions (e.g., to forge a certificate) require the ability to generate collisions in at least semi-real-time, rather than starting from a single fixed pair.
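The reason one collision goes so far is the iterated (Merkle-Damgard) structure of these hashes: once two messages collide, appending any common suffix preserves the collision. Here's a toy sketch with a 16-bit hash (obviously not SHA-1) demonstrating the principle:

import hashlib

def toy_hash(data: bytes, state: int = 0x1234) -> int:
    """Toy 16-bit Merkle-Damgard hash over 1-byte blocks (demo only)."""
    for byte in data:
        block = state.to_bytes(2, "big") + bytes([byte])
        state = int.from_bytes(hashlib.sha256(block).digest()[:2], "big")
    return state

# Brute-force a collision: with 2**17 distinct messages and only a
# 16-bit state, the pigeonhole principle guarantees one.
seen = {}
for i in range(2 ** 17):
    msg = i.to_bytes(3, "big")
    h = toy_hash(msg)
    if h in seen:
        m1, m2 = seen[h], msg
        break
    seen[h] = msg

assert m1 != m2 and toy_hash(m1) == toy_hash(m2)

# The point: the single collision extends to any shared suffix, giving
# an unlimited supply of colliding document pairs.
suffix = b"...arbitrary common document content..."
assert toy_hash(m1 + suffix) == toy_hash(m2 + suffix)
print("one collision, infinitely many colliding pairs")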

Bottom line: I expect to see a published collision in SHA-1 in the next few years. I doubt that it will cause widespread panic. If anything, it might reassure people to see how much effort it took. All this assumes, of course, that no new analytic attack is published, so that the level of effort stays around 2^64 and the only difference is that a lot of computing power is applied to the problem. If the analytic attacks improve significantly, all bets are off.