EKR: September 2005 Archives


September 30, 2005

Last weekend I picked up the DVD set of the first season of House M.D. ($41.99 at Amazon).1 Objectively, it's ludicrous: Dr. House (Hugh Laurie) is a misanthropic, crippled, vicodin-addicted doctor specializing in diagnosis. Each episode features a patient with some baffling set of symptoms, requiring House to figure out what the (obscure) problem is, which he typically manages to do right as the patient is about to die. Naturally, he breaks all the rules, bullying patients and doctors alike in an effort to solve the problem of the week, saving the patient in the process.

Despite the absurdity of the premise, Hugh Laurie is convincing and hilarious as House. It's particularly hard to believe that this is the same man who used to play the foppish, good-natured, upper-class English twit Wooster. In fact, he's so convincing that apparently when director Bryan Singer saw his audition tape he thought Hugh Laurie was American.

If this sounds like the kind of thing you think you'll like, it probably is.

1. BTW, it appears that pretty much every program that's ever been on TV, no matter how lame, is available as a DVD box set. Exhibit 1: McMillan and Wife. Exhibit 2: Remington Steele. Exhibit 3: Hart to Hart. Scarecrow and Mrs. King isn't out yet, though.


September 29, 2005

The Times has yet another article about the Authors' Guild complaining about used book sales:
The study's findings are similar if different in scale to other recent studies. Ipsos BookTrends, a commercial research company, has reported that used books account for about 8 percent of overall sales of general-interest books to consumers.

Publishing companies and authors have long expressed concern over used-book sales, saying they cannibalize potential sales of new books and, because they generate no royalties for authors or revenue for publishers, they harm the ability of authors and publishers to make a living.

"It certainly is a threat," Paul Aiken, the executive director of the Authors Guild, said in an interview. The guild has complained in particular about Amazon.com, whose Internet site offers consumers the ability to buy used copies of a book on the same screen where it offers new copies. In many instances, used copies are made available for sale by outside parties almost as soon as a new book goes on sale.

Let's be realistic here. Most authors don't make much money from their books, and an 8% decrease in their royalties probably isn't that substantial. On the other hand, having more people read your books is extremely valuable, and since the people who buy your book used are probably less likely to have bought them new, the cost to you is probably fairly low.

What I find interesting here is the claim that the used copies are available almost as soon as the books go on sale. It seems to me that there are three likely possibilities:

  1. The people selling these books didn't like them and don't want them.
  2. The people selling these books have finished them.
  3. The people selling these books got them for free, e.g., as advanced copies from the publisher.
It's not clear what the overall effect of (1) and (2) is. After all, the seller did buy a new copy. Moreover, to some extent used sales drive up the initial price of the book, because people who would otherwise have been hesitant to buy are now willing to do so, secure in the knowledge that they can resell. Finally, if the books are on resale almost immediately after release, then I would imagine that category (2) is initially pretty small, since it takes a while for most people to get through books.

From the author's perspective, it's the third category of used book sales that is a real problem. Publishers give out free "review copies" of books in an effort to generate buzz. The authors don't get paid royalties for these books, and if they're being resold, that's money directly out of the author's pocket. I don't know how many of these books end up on the resale market, but that's a question that should be pretty easy to answer by marking the review copies and then randomly sampling the used market.
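As a sketch of what I mean: if review copies carried a distinguishing mark, a random sample of used listings would give you the fraction directly. The market composition below is entirely made up for illustration.

```python
import random

def estimate_review_fraction(used_market, sample_size, seed=0):
    """Randomly sample used-market listings and estimate the
    fraction that are marked review copies."""
    rng = random.Random(seed)
    sample = rng.sample(used_market, sample_size)
    hits = sum(1 for copy in sample if copy == "review")
    return hits / sample_size

# Hypothetical market: 10% review copies, 90% ordinary resales.
market = ["review"] * 100 + ["resale"] * 900
print(estimate_review_fraction(market, 200))
```

With a sample of 200, the standard error on the estimate is around two percentage points, which is plenty for deciding whether review copies are a big piece of the market.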

Note that it's also not clear how much substitution there really is. Ghose et al. have studied the issue and claim that the level is pretty small, but I'm still working through their paper and don't yet have a firm opinion on the quality of their results.


September 28, 2005

Save My Ass is a subscription service that automatically sends flowers to someone of your choosing (like your wife or girlfriend). You tell it about significant dates, and it also schedules deliveries at random intervals throughout the year.

Superficially this looks like a good deal: you get the credit for being regularly thoughtful without having to expend any effort beyond the one-time subscription. But consider: it's true that women (in my experience, flowers are usually sent by men to women and not the other way around) like flowers, but what's primarily going on is that sending flowers is a signal that you cared enough to think about it. That signal worked because it used to be expensive to send; you actually had to be thoughtful. Save My Ass lets you send that signal without really being thoughtful. So, does the recipient of the gift realize that and value it less?

My intuition is probably not. Emotional reactions are seldom logical and as Robert Cialdini observes, even clearly fake gestures of liking elicit powerful emotional responses (pointed out to me by Kevin Dick). And since our culture trains us so thoroughly that flowers are a signal of liking, the mechanism should still work in the short term.

On the other hand, if this becomes popular I expect the effect to diminish. A generation of women raised in an environment where flowers can be sent without any additional effort is likely to value them much less.


September 27, 2005

WaPo reports on a Senate bill which would allow DNA collection of anyone who was arrested by Federal authorities. The ACLU offers an interesting objection:
"DNA is not like fingerprinting," said Jesselyn McCurdy, a legislative counsel for the American Civil Liberties Union. "It contains genetic information and information about diseases." She added that the ACLU questions whether it is constitutional to put data from those who have not been convicted into a database of convicted criminals.

The provision, co-sponsored by Kyl and Sen. John Cornyn (R-Tex.), does not require the government to automatically remove the DNA data of people who are never convicted. Instead, those arrested or detained would have to petition to have their information removed from the database after their cases were resolved.

Privacy advocates are especially concerned about possible abuses such as profiling based on genetic characteristics.

"This clearly opens the door to all kinds of race- or ethnic-based stops" by police, said Jim Dempsey, executive director of the Center for Democracy and Technology, a digital policy think tank.

This is certainly true if the way that you do this is to retain a DNA sample, but that's not the only way to build things. The way that DNA matching works is that you compare the pattern of short repeated DNA sequences called Variable Number Tandem Repeats (VNTRs). If you record the VNTR pattern of a given sample, you can use the stored pattern to compare against the unidentified sample (you'd want to do this anyway to allow rapid search) and then discard the original sample. The VNTRs leak only a very small amount of information, so it's not clear how bad the privacy situation really is. I'd be fairly surprised if, for instance, it leaked information about diseases, though it's quite possible that it leaks information about race. Of course, all this is predicated on the assumption that the samples are discarded after they are fingerprinted. It's not clear that that will be the case.
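To make this concrete, here's a minimal sketch of what pattern-only matching looks like: a profile is just the repeat counts at a handful of loci, and two profiles match if they agree everywhere. The locus names and counts are purely illustrative, and I'm ignoring the population statistics that real forensic matching requires.

```python
# A profile maps each locus to the pair of repeat counts observed
# there (one per chromosome copy). No raw DNA is retained.

def matches(profile_a, profile_b):
    """Two profiles match if they agree at every locus,
    ignoring the order of the two alleles."""
    if profile_a.keys() != profile_b.keys():
        return False
    return all(sorted(profile_a[locus]) == sorted(profile_b[locus])
               for locus in profile_a)

suspect = {"locus1": (18, 24), "locus2": (11, 11)}
evidence = {"locus1": (24, 18), "locus2": (11, 11)}
print(matches(suspect, evidence))  # True
```

The point is that once the profile is extracted, the comparison step never needs the original sample at all.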


September 26, 2005

The FCC has issued a notice of proposed rulemaking that CALEA applies to ISPs, meaning that they'll be required to enable "lawful intercept" (i.e., wiretapping). As I read the writeup, this only applies to "interconnected VoIP providers", those which connect to the PSTN. That's good from a complexity perspective, since calls that are VoIP terminated on both ends sometimes use end-to-end encryption, which can't be easily tapped by the provider. (See here for my writeup of the VoIP wiretapping options.)

September 25, 2005

Union City is refunding over 3,000 red light tickets after it was discovered that the yellow cycle was set to only 3 seconds rather than the 4.3 seconds required for 45 MPH by State law. This was discovered after one Dave Goodson got a ticket and checked the timings for himself.
When first questioned by a reporter about the timing discrepancy, police officials said they believed the lights were set correctly and that Jocson and Dalisay simply had quoted Goodson the incorrect value. They promised, however, to investigate the matter.

The investigation revealed that the yellow signal was set too short at every photo-enforced intersection. Rather than place the blame on any one individual, the investigation pointed to a process of checks--any one of which would have pointed out the problem earlier--that all failed.

Foley said he and other officials were well aware of the minimum standards and had assumed that Jocson and Dalisay also knew them. Still, city engineers are required to provide a monthly audit of the signal settings. Those audits, which would have alerted officials to the problem, were never received, he said. By 10:21 a.m. last Saturday, all of the intersections had been set to the state-required standards.

There's no particular reason to believe that this was anything but an honest mistake, but it's equally obvious that there's an agency problem here: the city makes a lot of money ($136 per conviction in this case) off of fines for red light violations and so has extremely little incentive to correct timings that are too short. To make matters worse, too-short yellow timings are associated with increased collision frequencies [*], and cameras are preferentially emplaced at intersections with high collision rates, which suggests that intersections with red light cameras are more likely to have too-short timings.
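Incidentally, the 4.3-second figure is easy to sanity-check against the standard kinematic formula for minimum yellow time, t = reaction time + v/(2a). With the conventional assumptions of a 1-second perception-reaction time and 10 ft/s² of comfortable deceleration (my assumptions, not numbers from the article), 45 MPH works out to exactly 4.3 seconds:

```python
def yellow_interval(speed_mph, reaction_s=1.0, decel_ftps2=10.0):
    """Minimum yellow time: perception-reaction time plus the
    time needed to stop comfortably, t = t_r + v / (2a), v in ft/s."""
    v_ftps = speed_mph * 5280 / 3600  # mph -> ft/s
    return reaction_s + v_ftps / (2 * decel_ftps2)

print(round(yellow_interval(45), 1))  # 4.3
```

A 3-second yellow at 45 MPH would require drivers to brake at 16.5 ft/s², well beyond comfortable deceleration, which is exactly why too-short yellows produce violations.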


September 24, 2005

Palm's next Treo will run Windows Mobile. This must have been a pretty hard decision for Palm. On the one hand, PalmOS has been obsolete for years and they clearly needed to update their operating system. On the other hand, Palm's big advantage has always been the high quality of their UI. I haven't seen Windows Mobile lately, but its predecessor Windows CE was pretty clearly inferior to PalmOS from a UI perspective. Palm has a pretty large market of people who are used to their platform and have put up with lousy platform performance to avoid switching. Even if Windows Mobile is better in some abstract sense, if Palm doesn't do a pretty good job of keeping the UI the same, then customers will start evaluating their other options rather than just reflexively buying the next generation Palm.

September 23, 2005

From SANS:
The trend of putting trojaned downloads on software distribution sites continues unabated. A Korean site, officially **unaffiliated** with the Mozilla, Thunderbird, and Firefox development teams, distributes a Korean version of Mozilla Suite 1.7.6 and Thunderbird 1.0.2. Turns out, a couple of days ago, evil versions of Mozilla and Thunderbird for Linux appeared on this site. When installed, they would infect ELF binaries in /bin. The malware included a backdoor, although it had little spreading potential. Still, that's why, when you upgrade, make sure you download from a couple of mirrors and check that hash! Md5sum and SHA-1 are your friend. And, if you are really paranoid, RIPEMD-160 is a good acquaintance to have.

Update: According to information we've received (thanks, Roel!), Korean versions of Mozilla and Thunderbird distributed through **official** Mozilla FTP sites were also infected. So, if you use Korean Mozilla or Thunderbird, and downloaded the latest versions of thunderbird or mozilla, you may have been compromised. I suggest a good file integrity check, and perhaps a reinstall of your operating system and apps. Thanks again, Roel, for the clarification.

So, how did the infected versions get on the official site? Anybody got any more details?

Mark Kleiman correctly observes that boxing gloves make boxing more dangerous:
As long as prizefighters wear boxing gloves, which protect the delicate bones of their hands so they can safely concuss one another's brains, a few of them are going to die and a lot of them are going to suffer permanent neurological injury. The deaths are more newsworthy, but to my mind the injuries are more heartbreaking.

The contrast between the witty, agile Cassius Clay who struck down Sonny Liston and the brain-damaged stumblebum who is now Muhammed Ali is all the evidence anyone should need to conclude that a major reform is necessary.

And that reform couldn't be simpler: take off the gloves. Then fighters will pound one another's bodies for hours on end, as they did in the Gentleman Jim Corbett era. That might not be as exciting as watching shorter bouts decided by knockouts, but unlike contemporary prizefighting it would be a sport that a decent person who understood what was happening could watch without disgust.

The other simple change that would make boxing a lot safer is to eliminate the standing eight count after a knockdown. As with gloves, the standing eight count is supposedly protective, but actually has the reverse effect because it increases the amount of damage a fighter can take during a fight before it's stopped.

It's also highly unrealistic, as can be easily seen by watching mixed martial arts competitions (UFC, Pride, etc.) in which fighters are not protected after being knocked down. When your opponent is knocked down is exactly when you want to press your advantage. Under MMA rules, stunning your opponent even for a second or two is generally a fight-ender because it allows you to assume a dominant position and finish the fight. Under boxing rules, the ref would simply separate you and give the opponent a standing eight count.

It's also arguable that this change would make boxing a lot more interesting because it increases the chance that any given exchange will end the fight, rather than just being one of an endless series of knockdowns and eight counts.


September 22, 2005

The Authors' Guild is suing Google over Google Print. Here's the response from Susan Wojciki, Google VP of product management, and Jonathan Brand's analysis of the fair use aspects.

A few points I haven't seen made elsewhere.
There's a lot of attention being paid to the fact that Google is only presenting excerpts of the material. Here's Wojciki:

At most we show only a brief snippet of text where their search term appears, along with basic bibliographic information and several links to online booksellers and libraries.

I'm sure that's true, but that doesn't mean that Google isn't making the full book available. Repeated queries with overlapping search keys and a little screen scraping might well let you recover the entire book text. I know that Google has mechanisms that prevent large-scale automatic queries, but I don't know how hard those are to defeat in practice.
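The reassembly step is the easy part. Here's a toy illustration: given snippets whose edges overlap, you can stitch them back together greedily by matching each new snippet's prefix against the text recovered so far. (Getting the snippets out past Google's rate limits is the actual hard part, and presumably they could also randomize snippet boundaries.)

```python
def stitch(snippets):
    """Greedily merge snippets by finding, for each new snippet,
    the longest suffix of the text so far that is a prefix of it."""
    text = snippets[0]
    for snip in snippets[1:]:
        for k in range(min(len(text), len(snip)), 0, -1):
            if text.endswith(snip[:k]):
                text += snip[k:]
                break
        else:
            text += " " + snip  # no overlap found; just append
    return text

snips = ["it was the best of times",
         "best of times, it was the worst",
         "the worst of times"]
print(stitch(snips))
```

Queries chosen so that each snippet overlaps the last few words of the previous one would walk you through the book one window at a time.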

As has been noted, Google's ability to perform full-text searches depends on having a complete copy of the scanned book. What hasn't been much noted is that Google probably isn't just making one copy of the material. Remember that Google operates an enormous server farm. It's quite likely that they have to make a substantial number of copies for parallel searches (performance) and redundancy (high availability).

Acknowledgement The ideas in this post were developed during discussions between Hovav Shacham, Nagendra Modadugu, Cullen Jennings, and myself.

From Against the Gods: The Remarkable Story of Risk
Despite Emperor Frederick's patronage of Fibonacci's book and the book's widespread distribution throughout Europe, introduction of the Hindu-Arabic numbering system provoked intense and bitter resistance up to the early 1500s. Here, for once, we can explain the delay. Two factors were at work.

Part of the resistance stemmed from the inertial forces that opposed any change in matters hallowed by centuries of use. Learning radically new methods never finds an easy welcome.

The second factor was based on more solid ground: it was easier to commit fraud with the new numbers than with the old. Turning a 0 into a 6 or a 9 was temptingly easy, and a 1 could be readily converted into a 4, 6, 7, or 9 (one reason Europeans write 7 as [a 7 with a line through it, don't know the HTML code -- EKR]). Although the new numbers had gained their first foothold in Italy, where education levels were high, Florence issued an edict in 1229 that forbade bankers from using the "infidel" symbols. As a result, many people who wanted to learn the new system had to disguise themselves as Moslems in order to do so.

The invention of printing with movable type in the middle of the fifteenth century was the catalyst that finally overcame opposition to the full use of the new numbers. Now the fraudulent alterations were no longer possible. Now the ridiculous complications of using Roman numerals became clear to everyone.

Of course, the invention of computer-based accounting made it (once again) easy to make fraudulent alterations, but by then we'd developed mathematical techniques that made it impossible to commit accounting fraud.


September 20, 2005

Ed Felten compares the movie industry's MovieLabs project to "Perpetual Motion Labs":
Such a ploy might be very effective if it worked. Imagine that you somehow convinced policymakers that the auto industry could make cars that operated with no energy source at all. You could then demand that the auto industry make all sorts of concessions in energy policy, and you could continue to criticize them for foot-dragging no matter how much they did.

If you were using this ploy, the dumbest thing you could do is to set up your own "Perpetual Motion Labs" to develop no-energy-source cars. Your lab would fail, of course, and its failure would demonstrate that your argument was bogus all along. You would only set up the lab if you thought that perpetual-motion cars were pretty easy to build.

Which brings us to the movie industry's announcement, yesterday, that they will set up "MovieLabs", a $30 million research effort to develop effective anti-copying technologies. The only sensible explanation for this move is that Hollywood really believes that there are easily-discovered anti-copying technologies that the technology industry has failed to find.

So Hollywood is still in denial about digital copying.


This is a chance for Hollywood to learn what the rest of us already know — that cheap and easy copying is an unavoidable side-effect of the digital revolution.

It's certainly true that stopping widespread copying of copyrighted material is an extraordinarily difficult problem, and that the movie and record companies have been worried about it ever since recordable digital media started appearing (remember that the first round of this was about Digital Audio Tape), but that doesn't mean there's nothing you can do about it. There are two basic lines of attack on large-scale digital copying.

  1. Make it hard to extract the media in raw (copyable) form.
  2. Make it hard to transmit the raw media around--or at least easy to prosecute the offenders.

The first line of attack is basically a lost cause and has been ever since media started being released in digital form on CD/DVD. But from the media companies' perspective, things were pretty much under control before the Internet1 because there was no good way of moving the bits around. As long as people had to ship actual objects around, it was comparatively straightforward to investigate the crimes and prosecute the offenders using the same techniques you use for counterfeiters, etc.

What's really freaking the media companies out is that the Internet in its current form makes it very difficult to stop people from copying the bits around. But at the moment there are still things that you can do to make that inconvenient. If you look at the list of initial MovieLabs projects, you can see that a lot of them are oriented towards this goal:

  • Ways to jam camcorders being used inside movie theaters, or to project movies with flickering images that are invisible to the eye but will appear on unauthorized video recordings.
  • Network management technologies to detect and block illegal file transfers on campus and business networks.
  • Traffic analysis tools to detect illegal content sharing on peer-to-peer networks.
  • Ways to prevent home and personal digital networks from being tapped into by unauthorized users, while not preventing consumers from sending a movie to more than one TV set without having to pay for it each time.
  • Ways to link senders and receivers of movies transmitted over the Internet to geographic and political territories, to monitor the distribution of movies and prevent the violation of license agreements.

The second, third, and probably the fifth of these projects appear to be aimed at attacking transmission of the bits. Of course, you can't totally stop it, but you can make it a lot more inconvenient, especially if you don't care how much damage you do to the Internet in the process—which it seems likely the media companies don't. The first project is designed to stop a particular form of content extraction: movies that haven't been released on DVD but are available in theaters. The fourth project appears to be aimed at stopping home content extraction, which I agree is a dubious proposition.

Of course, even if the media companies manage to block Internet transmission of content via file sharing networks, it's only a short-term fix. At some point not too far in the future networks and storage will have so much capacity that friend-to-friend transmission of large content libraries will be practical, at which point the game is pretty much over. We're not there yet, though, and it's possible for attempts to block filesharing to screw up the Internet pretty badly in the meantime.

1 Yes, I realize that most media "piracy" happens in the form of pressing new media, not Internet sharing, but that's not as scary for three reasons (1) It's easier to investigate. (2) It happens mostly in markets where the media companies wouldn't sell much content anyway (keyword: market segmentation) (3) It's still clearly a commercial transaction so it doesn't run the risk that customers start thinking that all of this content should just be free.


September 19, 2005

Lieberman et al. report the somewhat surprising result that the 2nd generation antipsychotics are only marginally better than the 1st generation—or rather, we should say that they suck marginally less. To someone who hasn't been following this field, what's surprising is how high the discontinuation rate is: 74% overall and 64% for the best drug tested, olanzapine. Basically, if you're schizophrenic, the options are not good.

September 18, 2005

We were requested by the District Attorney of one of the Lake States to recover a body that, according to an informant, had been cast into a well about ten years earlier. ... They [the news media] were fascinated by this spectacle for several reasons. First, the District Attorney (DA) had called in expert help (an expert being defined here as someone from more than 100 miles away).

From L.J. Levine, H.R. Campbell, Jr., and J.S. Rhine, Perpendicular Forensic Archaeology, in Human Identification: Case Studies in Forensic Anthropology, T.A. Rathburn (ed) and J.E. Buikstra (ed).

The standard response to the Pledge of Allegiance controversy is "so what, it's just two words?" Colby Cosh deftly explains why this doesn't fly by imagining that Congress replaces the star field on the American flag with a big cross.

One difference between Cosh's example and the current situation is that a cross is clearly a Christian symbol and "under God" is nominally religiously semi-neutral. But of course nominally is the key word here, since basically it's Christian. The idea that there's a single unitary (or at least triune) deity pretty much excludes any major religions outside of what's now called the Judeo-Christian-Islamic tradition.

Within that tradition, Muslims, of course, refer not to God but to Allah. While it's true that Allah means "God", the primacy of Arabic in Islam seems to result in Allah being used whatever language is being spoken, much more like a personal name. It's very common to see English-speaking Muslims say "Allah", but Christians tend to use the word for "God" in whatever language they're speaking--even if they themselves learned it in a different language. Anyway, if you think this doesn't matter, consider how Americans would greet the suggestion that we should say "under Allah" instead of "under God."

That leaves us with Christians and Jews. I'm less familiar with how Jews view this topic, though I do know that many Jews prefer not to write or say "God", and instead will write "G-d" and say "Hashem" (literally, "the Name")1, so I suspect that many Jews would find a certain discomfort level with saying "under God" in a non-prayer context. This leaves us with Christians as the only people wholly comfortable with this language, which isn't surprising, because the real purpose of the "under God" language was always to reinforce the primacy of Christianity in American culture[*].

1. Of course, all of these are placeholders for the personal name of God, YHWH (pronounced Yahweh, not Jehovah). [*].


September 17, 2005

The House Energy and Commerce Committee has released discussion draft for comprehensive Communications Act "reform". As you would have expected, it's a total mess, with the biggest problem that it contemplates the FCC having an enormous amount of regulatory control over the Internet. Check out Section 102, which requires that ISPs register with the FCC:
       (1) REGISTRATION REQUIRED.—Any BITS provider offering BITS
           in any State shall file a BITS registration statement, and
	   any substantive amendments thereto, with the Commission, and
	   file a complete copy of such statement and amendment with
	   the State commission of such State.
       (2) PROHIBITION.—Subject to paragraph (3), no BITS
	   provider may offer BITS until
	   such provider's registration statement has become effective
	   in accordance with subsection (c).
       (3) TRANSITION.—If a provider was offering BITS prior
	   to the date of enactment of this Act, the commission 
	   shall, in order to provide for a reasonable transition 
	   period, provide a temporary waiver of the prohibition
	   in paragraph (2) during which such provider may offer
	   such service prior to the effective date of the
	   provider's registration statement.
    (b) FEDERAL FORM.—A BITS registration statement 
    shall be in such form, contain such information, and be 
    submitted at such time as the Commission shall require by
    regulation, after consultation with State commissions.
       (1) NOTICE OF FILING.—No BITS registration statement
           or any substantial amendment thereof filed with the 
	   Commission under this section shall be effective earlier
	   than 30 days following issuance of public notice by the
	   Commission of the acceptance for filing of such
	   registration statement or substantial amendment.
       (2) FAILURE TO SUPPLY INFORMATION.—The Commission
	   may disapprove a BITS registration statement that
	   the Commission determines fails to comply with the
	   requirements of the Commission under subsection (b).
       (3) OTHER GROUNDS FOR DISAPPROVAL.—The Commission 
           may disapprove a BITS registration statement if—
	   (A) the BITS provider or any of its officers has
               violated Commission rules, Federal or State law,
	       or has a notice of apparent liability pending
	       at the Commission; and
           (B) the Commission determines that the BITS provider's
	       offering of BITS could harm consumers.

At the moment, there's no registration requirement for ISPs. I can just buy a DS3 and some Ciscos and I'm in business. So, to start with, this bill would require registration. That wouldn't be so bad except that there's no requirement that the FCC act on my registration filing in any time period (at least not here). Can they stall me indefinitely? To make matters worse, if anyone at my company has violated any law, the Commission can just determine that my offering would harm consumers and refuse to register me at all. As I read this, if one of my employees has a DUI on their record, I'm subject to disapproval of registration at the FCC's discretion. Why is this a good thing?

    (a) FEDERAL AND STATE REGISTRATION.—Subject to subsection (b),
    each BITS provider has the duty—
       (1) to provide subscribers with access to lawful content,
           applications, and services provided over the Internet, and
	   not to block, impair, or interfere with the offering of,
	   access to, or the use of such content, applications,
	   or services;
       (2) to permit subscribers to connect and use devices of their
	   choosing in connection with BITS; and 
       (3) not to install network features, functions, or
	   capabilities that do not comply with the guidelines
	   and standards pursuant to section 106 of this Act.

Point (1) above is presumably supposed to please people who disapprove of carriers blocking services. In the current market, that requirement isn't super-necessary, since I can just get a different ISP. But with the FCC controlling entry to the market, this kind of requirement may in fact be necessary.

Also, take note of point (3), where ISPs are forbidden to install any feature that doesn't comply with the guidelines in 106, which reads (in part):

    The Commission—
       (1) shall establish procedures for Commission oversight
           of coordinated BITS network planning by BITS providers,
	   and the interconnectivity of devices (including devices
	   from unaffiliated providers) with such networks, for the
	   effective and efficient interconnections of BITS 
	   providers' networks; and
       (2) may participate in the development by appropriate 
           industry standards-setting organizations of BITS
	   network interconnectivity standards that promote
	   interconnection with—
           (A) BIT and BITS networks; and
	   (B) network capabilities and services by individuals
               with disabilities.

This is a little difficult to process, but it sure looks to me like the FCC can pretty much set any standards they want and then require ISPs to comply with them. Are there readers who interpret this differently?

Another classic is SEC 404.

    (a) MANUFACTURING.—A manufacturer of equipment used
        for BIT, BITS, VOIP service, or broadband video service
        shall ensure that equipment designed, developed, or
        fabricated after the date of enactment of this Act is
        designed, developed, and fabricated to be accessible to
        and usable by individuals with disabilities, unless the
        manufacturer demonstrates that taking such steps would
        result in an undue burden.
    (b) SERVICE PROVIDERS.—A BITS provider, VoIP service
	provider, or broadband video service provider shall
	ensure that the service it provides is accessible to
	and usable by individuals with disabilities, unless the 
	provider  demonstrates that taking such steps would result
	in an undue burden.

There's nothing wrong, of course, with manufacturers making their equipment accessible to the disabled, but it's not clear to me that the FCC should be in the business of requiring Cisco to have their management UI handicapped accessible (though they may already have it that way for all I know).

The idea that we need Congress and the FCC to tell us what kind of IT services we need would be a lot more comforting if they seemed to understand the Internet better. Emblematic of this is the fact that they treat generic Internet service (what they call BITS), VoIP, and broadband video separately. This makes sense in some telco world where each service is separately tariffed, but of course the Internet doesn't work like that at all; bits are bits. One could argue that VoIP services need some regulation because people's expectations have been set by the PSTN, but broadband video?


September 16, 2005

The NYT has an interesting article about how the Internet is disintermediating the real estate market. Realtors, of course, are trying to protect their 6%, primarily by controlling access to the Multiple Listing Service, but also by getting laws passed that make it hard for discount realtors to offer cut-rate access to the MLS, by requiring minimum levels of service--and thus setting a price floor.

One note, though:

There is one caveat: If you list there, you may be obligated to pay a commission to the buyer's agent, which is usually set at 3 percent. You can, however, build that commission into the price of the home, so the buyer actually pays it. Or, if the housing market is particularly hot in your area, you may be able to write into the contract that the buyer is responsible for paying his agent's commission.

This strikes me as fairly confused. There's some maximum amount buyers are willing to pay and they don't really care who the money goes to. If they have to pay the commission they'll just pay 3% less.
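To make the incidence point concrete, here's a toy calculation (the $500,000 figure and everything else here is hypothetical): whoever nominally pays the buyer's agent, the seller nets about the same.

```python
# Toy illustration with hypothetical numbers: the buyer has a fixed total
# willingness to pay, so shifting the 3% buyer's-agent commission onto the
# buyer just lowers what they'll offer for the house itself.
def seller_net(total_willingness, commission_rate, buyer_pays_agent):
    if buyer_pays_agent:
        # Buyer offers less, so that price + commission = willingness to pay.
        return total_willingness / (1 + commission_rate)
    # Seller pays the commission out of the sale price.
    return total_willingness * (1 - commission_rate)

seller_pays = seller_net(500_000, 0.03, buyer_pays_agent=False)  # 485,000
buyer_pays = seller_net(500_000, 0.03, buyer_pays_agent=True)    # ~485,437
```

The small gap between the two cases is just because the commission base shifts from the full price to the reduced price; economically the incidence is the same.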

One difficulty in deciding how much you're willing to pay for a real estate agent--and consequently how much the price gets driven down--is that the uncertainty in the market price of the house makes it very difficult to determine if your agent is doing a good job, and it's very easy to believe that a small difference in quality could affect the price of the house substantially. While I doubt that my agent expended anywhere near 3% of the price of my house during the driving around process, I think it's quite likely that she saved us 3% in the negotiation process.


September 15, 2005

The program for the NIST hash function workshop is up. There seems to be a good mix of papers, ranging from strengthening countermeasures to some new constructions and algorithms. Surprisingly, there don't seem to be any papers on block-cipher based constructions. (Paging Tom Shrimpton....) Steve Bellovin will be presenting our paper on deploying new hash algorithms as well. Not sure if I'm going to attend or not yet...
I've gotten interested in the Clear Registered Traveler program, which lets you shortcut a lot of airport security provided that you pass a bunch of pre-screening. If any EG reader has signed up for this and is willing to talk about it, please drop me an e-mail.

September 14, 2005

This week's Science carries a really interesting article by Rome et al. about extracting power from the human gait. The basic idea is that you wear a frame that carries a heavy sprung weight. The bouncing of this weight up and down while the user walks drives a generator. At ordinary walking paces with a 38 kg load this allows the production of around 6 W of power.

The really interesting part is that some of this energy appears to be free. If you're walking at 5.6 km/h and carrying a 29 kg load, the power input to the pack is 12.5 W. Humans operate at about 25% efficiency, so you'd expect a metabolic cost of around 50 W, but when you measure the delta in energy consumption (using O2 consumption) it's only about 19 W. It's not clear exactly what's going on, but it appears that the sprung pack is giving the wearers some sort of biomechanical assistance--the gait is different when the weight is sprung than when it is locked in place on the pack.
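The arithmetic behind the "free energy" observation, using the figures quoted above:

```python
# Figures from the discussion above: 12.5 W of mechanical power delivered
# to the pack, and roughly 25% muscle efficiency.
mechanical_input_w = 12.5
muscle_efficiency = 0.25

# If all 12.5 W came out of the wearer's metabolism, you'd expect:
expected_metabolic_w = mechanical_input_w / muscle_efficiency  # 50 W

# But the measured increase in metabolic cost (via O2 consumption):
measured_metabolic_w = 19.0

# Leaving roughly 31 W unaccounted for -- the apparently "free" portion.
unaccounted_w = expected_metabolic_w - measured_metabolic_w
```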

Other interesting points in the paper:

  • Soldiers carry an incredible number of batteries. This article claims that soldiers in Iraq are carrying an average of 20 pounds of batteries each, almost a third of their load!
  • The energy density of food vastly exceeds that of batteries. According to Rome et al., the density of food is 3.9x10^7 J/kg whereas zinc-air batteries are only 1.1x10^6 J/kg. Even factoring in the inefficiency of human energy production, eating and then running a generator is still a lot better than batteries.
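Here's a rough sketch of why the energy-density gap survives the inefficiency of human conversion. The 50% generator-chain loss below is my assumption, not a figure from the paper:

```python
# Energy densities from Rome et al. (J/kg).
food_j_per_kg = 3.9e7
zinc_air_j_per_kg = 1.1e6

# Assume ~25% muscle efficiency and a further (assumed) ~50% loss in the
# mechanical-to-electrical generator chain.
usable_food_j_per_kg = food_j_per_kg * 0.25 * 0.50

# Food still beats zinc-air batteries by roughly 4x per kilogram.
advantage = usable_food_j_per_kg / zinc_air_j_per_kg
```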

Of course, 29 kg (let alone 38 kg) is a fairly heavy pack, and this is just for the power generating system. You've still got to carry the rest of your gear. But lots of devices need far less than 6 W, so you could probably get away with a lighter load and generator. If you could get this down to 10-15 pounds total, that would be pretty interesting.


September 13, 2005

My friend Jason Fischl has just taken the job of Xten's CTO [*]. Xten is located in Vancouver, so I expect Jason to be buying drinks at IETF 64.

September 12, 2005

This morning's Times lets a bunch of lawyers and academics suggest five questions each that the Senate should ask John Roberts. Most of them aren't very interesting, but Glenn Reynolds surprises with:
1. The Ninth Amendment provides that "the enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people." Do you believe that this language binds federal courts, or do you believe - as Robert Bork does - that it is an indecipherable "inkblot?" If the former, how are federal courts to determine what rights are retained by the people? On the other hand, if the Ninth Amendment does not create enforceable rights, what is it doing taking up one-tenth of the Bill of Rights?

2. Justice Joseph Story wrote in 1833 that "since the American Revolution no state government can be presumed to possess the transcendental sovereignty to take away vested rights of property; to take the property of A and transfer it to B by a mere legislative act." Was Story wrong? Or was the Supreme Court wrong this year when it ruled in Kelo v. the City of New London that a government had the right to take property for the use of private developers?

3. Could a human-like artificial intelligence constitute a "person" for purposes of protection under the 14th Amendment, or is such protection limited, by the 14th Amendment's language, to those who are "born or naturalized in the United States?"

4. Does a declaration of war by Congress have the effect of suddenly making proper actions by the executive and Congress that would otherwise have been beyond their constitutional powers?

5. Is scientific research among the expressive activities protected by the First Amendment? If not, is Congress free to bar research based solely on its decision that there

Questions 1 and 2 are fairly directly challenging, which I appreciate. Question 3 is a really good one, not because it's something we're actually likely to face real soon but because it's related to a bunch of other questions of identity that I think we will have to face real soon and we'd get a chance to see what kind of thinker the person answering was. 4 seems on point and 5 is something that we will very likely have to deal with extremely soon.


September 11, 2005

Waco Tribune reports that a Texas parental consent law may make illegal abortions a capital crime:
Roberts spoke about a bill signed into law by Gov. Rick Perry in June at a Fort Worth church school that requires parental consent before minors can have abortions and places additional restrictions on late-term abortions.

In relation to those changes, Roberts noted that the Legislature two years ago altered the definition of an individual in homicide statutes from a human being who has been born and is alive to a human being who is alive, including an unborn child at every stage of gestation, from fertilization until birth.

There was debate when the definition of individual was changed about whether the effect would make abortion the equivalent of murder. So lawmakers took particular care to write into the homicide statute that a lawful medical procedure performed with consent by a physician or other licensed health-care provider, if the death of the unborn child was the intended result, is an abortion. That provided a lawful defense or exception to homicide laws.

Continuing to connect the statutory dots, however, Roberts told local prosecutors that there is no such defense provided for a doctor who performs an unlawful medical procedure, such as an abortion on a minor without parental consent.

So, in effect, the doctor would have killed a child younger than 6 in an illegal abortion and thereby subjected himself or herself to potential prosecution for capital murder, Roberts told the dumbfounded audience.

This is a case study of sorts on how changing one code can have dramatic effects on other codes that actually reference those statutes, Roberts said. I presented it as an unintended consequence on a change to the civil code, but you will have to talk to your local prosecutors there about how they will handle those situations. I just presented what the Legislature has done.


Yesterday at Noah's Bagels I was offered the opportunity to pay an extra $4.00 for the "Bagel box" (a dozen bagels plus two tubs of cream cheese) instead of the dozen bagels I was planning to buy. In return, Noah's would donate $.50 to the Red Cross. Wow, what a deal!
Brad DeLong writes:
Let me say that Michael Kinsley has just become the only person I know who lives and works in the west-coast earthquake zone who claims that he has not "given much thought to the risk of a big earthquake along the west coast."

The rest of us all have our out-of-the-area emergency contact phone numbers--people far away to call and coordinate information. We have our water, our bandages, and our splints stored in the basement. We have our bookcases bolted to the wall. We try to remember to keep our cars relatively gassed up. And we have all thought that if something really bad happens to LA, we in San Francisco will have to mobilize for the first three days--and vice versa. God knows we can't expect anything constructive from Bush's White House, or Chertoff's Homeland Security, or Brown's FEMA.

Well, it's true that I do have a fair amount of disaster preparedness kit, but Brad DeLong is the only person on the West Coast I know who keeps that stuff in the basement.... actually, the only one I know with a basement.


September 9, 2005

Now that companies are required to expense employee stock options, this creates the problem of figuring out what their value should be. [*]. The major problem appears to be that because employee stock options are restricted (e.g., they vest, can't be transferred, hedged, etc.), their value (and hence cost to the company) is substantially less than that of an ordinary option. Any expensing scheme needs to reflect the restricted nature of employee options and--unlike with ordinary options--we have no good models for doing so.

The predominant idea seems to be to use the market to value the options. For instance, the company might try to sell some subset of the options to institutional investors and use the price they're willing to pay as a proxy for the value of the option. Cisco proposed a plan like this but the SEC rejected it, saying that the restrictions don't really match those that employees experience and so the option valuations won't be right.

Any analytic model needs to take into account the probability that employees will quit and leave their options on the table. As that depends largely on employee psychology, it's probably going to be a lot less amenable to modelling than are ordinary options.
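Nobody has an accepted model here, and this is emphatically not how the SEC or anyone else actually values employee options, but as a sketch of how a quit-probability haircut might enter an analytic model, one could start from Black-Scholes and discount by the chance the employee is still around to exercise (all parameters hypothetical):

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(s, k, t, r, sigma):
    # Textbook Black-Scholes value of a European call.
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

def employee_option_value(s, k, t, r, sigma, annual_quit_rate):
    # Crude haircut: assume quitting before year t forfeits the option
    # entirely, and that quitting is independent of the stock price
    # (both assumptions are clearly wrong in practice).
    p_still_employed = (1.0 - annual_quit_rate) ** t
    return black_scholes_call(s, k, t, r, sigma) * p_still_employed

plain = black_scholes_call(100, 100, 4, 0.04, 0.30)
haircut = employee_option_value(100, 100, 4, 0.04, 0.30, annual_quit_rate=0.10)
```

With a 10% annual quit rate over four years, the option is worth only about two-thirds of its unrestricted value. The real difficulty, as noted above, is that quitting behavior is correlated with the stock price and with vesting schedules, which this toy model ignores entirely.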


September 8, 2005

I've managed to obtain a copy of the Peltre-Thormann report that WADA commissioned on the effectiveness of EPO testing. There is a lot of concern about the quality of the test but not much that casts doubt on positive results:
  • There weren't any reports of false positives. (This paper was written before the Beke case.)
  • The false negative rate appears to be unacceptably high, though they don't provide a specific number.
  • The test is incredibly expensive: €400-600 and time consuming--36 hours of lab time.
  • The test is extremely complicated so there is a lot of room for errors.
  • The different labs are all using slightly different procedures. This needs to be standardized.
  • More information needs to be collected on the longitudinal EPO profiles of specific athletes to get a clearer picture of what the real test results look like. This seems particularly important in light of the results with Beke.

Unless I'm missing something important, there's nothing really damning here. It's not clear to me why WADA removed it from their web site.


September 7, 2005

Some Harvard researchers are suggesting adulterating oral opioids with capsaicin to prevent diversion. The idea is that abusers typically crush then snort the drug. Since inhaling capsaicin is extremely unpleasant, this would deter diversion and abuse:
Recently, the Richard J. Kitz Professor of Anesthesia Research at Harvard Medical School hit on the idea of using the same irritating chemical to "burn" people who illegally use pain medications. When an abuser of a medication like OxyContin snorts, chews, or injects the drug, he or she would get intense hot pain instead of an expected happy high. A patient taking the same capsaicin-laced pill could get needed relief and avoid unpleasant sensations simply by swallowing the pills whole, as directed.

"If a formulation containing capsaicin is swallowed whole, release of the irritant in the stomach and small intestine would not cause discomfort," Woolf maintains. "The majority of the capsaicin would be cleared by the liver on first pass."

I see several potential problems with this scheme:

  • Many opioid abusers snort or inject the drug because it gives a more intense rush, but it's perfectly possible to get high by taking them orally, and some addicts already do so. It seems likely that they would just switch.
  • In my experience, eating spicy food isn't completely without side effects. I'm not sure people should have to suffer anal burning in order to get relief from chronic pain. Some experimentation would be required to determine if there were a level of capsaicin that causes pain upon snorting but not whole ingestion.
  • Oxycodone (the active ingredient in Oxycontin) is extremely soluble in water, but capsaicin is nearly insoluble in water. It should be easy to separate the opiate from the irritant by simply dissolving the mixture in water and decanting the solution off the precipitate.

All that said, this strategy is already being employed to some extent; a number of the commonly prescribed opioid preparations, such as Vicodin and Percocet, are already dispensed as a mixture of opioid and acetaminophen. As acetaminophen is toxic in high doses, this already presents a modest deterrent to abuse. Oxycontin, however, is a pure formulation, which, along with the high dosage levels in which it is typically prescribed, makes it an attractive target for abuse.

A gas station owner in Tennessee is the subject of a lawsuit for charging $7/gallon last Friday. [*]. See my previous post for why this is stupid.
Urine samples from Lance Armstrong during the 1999 Tour de France have tested positive for EPO. Naturally, Armstrong denies it, but Richard W. Pound, the chairman of the World Anti-Doping Agency says:
"There's not much of a middle ground, is there?" said Richard W. Pound, the chairman of the World Anti-Doping Agency. He added that it appeared that the "tests show there was EPO there" and that the EPO test was "as close to 100 percent reliable as you could get."

Reading the original papers on which the test is based gives you a rather different picture, however. Moreover, reading on in the article we find out that there are studies claiming to show false positives in urine from athletes who have been engaging in strenuous exercise.

Erythropoietin (EPO) is a hormone that stimulates red blood cell production. Because anemia is a common medical problem, synthetic (recombinant) versions of EPO have been developed by several pharmaceutical companies. However, EPO also has a use as a performance enhancer. Up to a point, the higher your red blood cell count, the higher the oxygen carrying capacity of your blood and the better your endurance performance. Unsurprisingly, it's become relatively common for endurance athletes to use EPO as a performance enhancing drug. Because rHuEPO is the same protein as EPO it's very difficult to distinguish individuals with normally high EPO levels from individuals who have been doping. And since one would expect endurance athletes to have very high hematocrits, this makes doping very hard to detect.

In 2000, Lasne, et al. invented a test that allegedly distinguished natural human EPO from rHuEPO. The test took advantage of the fact that although the protein sequence of natural EPO and rHuEPO is the same, the glycosylation is not. The test uses isoelectric focusing (a form of gel electrophoresis). rHuEPO produces a characteristic banding pattern on the gel that's different from natural EPO.

L'Affaire Armstrong
EPO use has been endemic in professional cycling for years, especially before a reliable test was developed. Rumors have long circulated that Lance Armstrong was using EPO during his TdF races (it's known that he used it during his cancer treatment and there's nothing wrong with that). However, even under the very strict TdF testing regime, Armstrong has always tested clean.

Last month, in what appears to have been part of a research program, the same French lab that developed the original EPO test retested a bunch of frozen urine samples from 1999 and at least a dozen of them tested positive for rHuEPO. The matter would have ended there except that somehow a reporter from L'Equipe obtained a copy of the results and the key that matched the samples to athletes. He claims that six of the samples were Armstrong's.

Reasons for skepticism
Assume for the moment that there wasn't any hanky-panky (mislabelling, contamination, etc.) in the sample handling. Should you conclude that Armstrong was doping in 1999? After going over the sections of the literature I was able to get my hands on, I think the answer is no, for three reasons.

I haven't been able to find any estimates of the error rates of the test procedure. If you read the original Lasne paper, the difference between rHuEPO and natural EPO is quite striking, but Lasne et al. don't report any statistical experiments that would let you estimate the false positive or false negative rate. I can't even figure out how many samples they used. A follow-on study by Khan et al. describing an improvement on this technique used a single pair of donors, so it's possible that the sample size wasn't large enough to provide an adequate baseline. A World Anti-Doping Agency report (by Peltre and Thormann) appears to describe problems in the original Lasne technique, but it's been removed from the WADA website so I can't say for sure. This isn't comforting.
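To see why an unknown false-positive rate matters so much, consider a Bayes'-rule sketch. None of these rates come from the Lasne or Khan papers -- which is exactly the problem:

```python
# Hypothetical Bayes'-rule illustration; all rates below are made up.
def p_doped_given_positive(prevalence, sensitivity, false_positive_rate):
    # P(doped | positive test) by Bayes' rule.
    true_positives = prevalence * sensitivity
    false_positives = (1.0 - prevalence) * false_positive_rate
    return true_positives / (true_positives + false_positives)

# If 30% of the retested samples really contained rHuEPO (70% sensitivity,
# 5% false positives), a positive test is fairly convincing:
high_prior = p_doped_given_positive(0.30, 0.70, 0.05)  # ~0.86
# But if only 5% did, the same positive test is closer to a coin flip:
low_prior = p_doped_given_positive(0.05, 0.70, 0.05)   # ~0.42
```

The same positive result means very different things depending on assumptions nobody can verify, which is why an unmeasured false-positive rate is so corrosive.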

The second problem is the long storage time of the samples, which were taken in 1999. My biochemistry is a bit rusty, but it seems possible that long storage could lead to chemical reactions that produce other isoforms. I don't see any evidence that anyone has done a controlled trial to determine whether this could lead to false positives. This seems like a topic that would be relatively easy to study, provided we have the archival samples.

The third, and most serious, problem is that there appears to be direct evidence that the Lasne test produces false positives. The Flemish triathlete Rutger Beke tested positive for EPO, but according to the Times article, it has been shown that under heavy training loads he produces positive results without having taken EPO (see here). His suspension has been lifted.

Bottom Line
Note that I'm not saying that Lance Armstrong wasn't taking EPO. It's certainly possible that he was. Unfortunately, the available testing methodology doesn't appear to be good enough to let us differentiate the possibility that he was from the possibility that he wasn't. Based on the studies I've read, it's not even clear that the testing methodology is really adequate to distinguish cases of current usage when the samples have been stored properly.


September 6, 2005

The California State Assembly just voted to legalize same sex marriage [*]:
SAN FRANCISCO, Sept. 6 - California lawmakers on Tuesday became the first in the country to legalize same-sex marriage, with the State Assembly narrowly approving a bill that defines marriage as between "two persons" instead of between a man and a woman.

Unlike Massachusetts, where gay men and lesbians are permitted to marry because of court rulings, the legislators in California voted to amend the state's family code without the threat of legal action.


The measure now goes to Gov. Arnold Schwarzenegger, a Republican, who has supported domestic partnership legislation in the past but has not taken a public position on the marriage bill.

A spokeswoman for Mr. Schwarzenegger, Margita Thompson, said after the vote that the governor believed that the issue of same-sex marriage should be settled by the courts, not legislators, but she did not indicate whether that meant he would veto the legislation. The bill did not pass with enough votes to override a veto.

Proposition 22 (passed in 2000) already prohibits same sex marriage, but is currently being challenged in court. It sounds like even if Schwarzenegger signs the bill, things will be left in a pretty uncertain state.


September 5, 2005

Reader Kevin Dick writes:
I'm a little gung-ho on the disaster preparedness front. I've probably gone through 5 evolutions over the last 15 years of researching, procuring, and storing supplies. Family and friends frequently ask, "Hey Kev, I'm not as crazy as you about all this stuff, what's the minimum I should do?" It would be nice if I could tell them to simply buy a couple of prepackaged 72-hour kits. Unfortunately, most such kits are very light on water and many of the other components are extremely cheaply made.

As Eric has noted, water is the number one requirement, perhaps exceeded only by prescription medications needed for chronic, life threatening conditions. So I've thought about water strategy quite a bit and have some minimum recommendations. Perhaps the only unobvious thing is that a water strategy requires a sanitation strategy. If you have to wash urine and feces off your hands, that wastes a lot of water. Fouling your water supply because of open waste and contaminated hands obviates all your initial effort. Having diarrhea also wastes a lot of water, hence the recommended Immodium.

I don't recommend specifically getting a filtration pump as part of these minimum requirements. They're kind of expensive, require you to get to a source of water, and can be finicky to use. If you have one for camping already, store it with your other home equipment. If you want something more than the minimum or are squeamish about drinking brown water, you might also want a pump.

I recommend having supplies in two locations, your home and each car. If you work, it's almost as likely that you'll be away from your home so you better have some water-oriented supplies with you. For your home, you should use a storage area outside, in the shade if you live in earthquake country. I use the hollow benches in my deck. Here are my recommendations for water and sanitation in each area:


For the home:
  • As many 5-gallon containers of water as you have room and patience to store. 2 is probably the minimum and 2.5 per person is probably the maximum. Use 1/2 tsp of household bleach as a preservative in each container. ($13 each)
  • 2 8oz pump containers of hand sanitizer. (< $10)
  • 4 rolls of toilet paper packed in individual gallon freezer bags (< $5)
  • Emergency/camp toilet w/10 bags ($15-$30)
  • 1 gallon of household bleach (<$10)
  • 1 24 ct package Immodium (<$10)
Total Cost: $100-$150


For each car:
  • Daypack w/2 mesh water bottle pockets (<$50)
  • 2 1L Nalgene bottles (< $20)
  • 1 package of 30 Micropur MP1 tablets (~$15)
  • 2-4 3-box packages of Aqua Blox (<$15)
  • 1 8oz pump container of hand sanitizer. (<$5)
  • 2 rolls of toilet paper packed in individual gallon freezer bags (< $3)
  • Box of gallon freezer bags (<$5)
  • 1 12 ct package Immodium (<$5)
Total Cost: $100-$125 per car

My personal bias towards having a filter comes from backpacking experience where weight restrictions preclude carrying all the water you'll need and it's common to have water that's hard to purify with chemicals. I'd also add one more thing to Kevin's list: some sort of sodium replacement. If you're drinking but not eating, it's easy to get hyponatremia under hot conditions. You don't need anything fancy: table salt will work just fine, but salt tablets are generally easier to get down.

Much has been made of the fact that FEMA Director Michael D. Brown has no discernible qualifications for his position other than having been the former roommate of previous FEMA Director Joe M. Allbaugh, whose principal qualification was as campaign manager for Bush-Cheney 2000. But of course, this is only notable in a system where there's an expectation that Secretary/Minister level positions will be filled by people with some qualifications, as opposed to politicians, as in the British system (ever see Yes, Minister?). The problem--and what appears to have happened here--is a system where political appointees are expected to be competent but where the Administration decides instead to appoint political hacks.

September 4, 2005

As the situation in New Orleans makes clear, protecting your supply of drinking water is a critically important element of disaster recovery. You can survive for weeks without food, but only days (the rule of thumb is three) without water. However, with a little planning, it's easy to ensure your supply under most conditions.

Ensuring you have water
The most surefire approach is simply to stockpile water. Basically, you take a bunch of food grade plastic water jugs/bottles/cans and pour water into them. You can easily find 5-7 gallon containers of this type at camping stores. How much water you'll need depends on your habits, climate, etc., but you should assume at least a minimum of a gallon per person per day, and probably more like two. You'll also want to add some sort of preservative to block bacterial growth. Bleach works well. This keeps the water good for about 5 years, at which point you have to drain and refill the containers. You can also buy portable long-lifetime water. Aquablox is a well-known brand in a convenient package. It's probably too expensive to go this route for bulk storage, but it's convenient for portable applications.

The big advantage of actually storing water is that it guarantees your supply. The big disadvantage is that it requires actually storing the water, so it clutters up your garage and however much you store is all you've got. Also, if your house/garage/etc. is damaged you may not be able to get at your stash. Still, it's not a bad precaution.
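If you want to size a stash, the arithmetic works out like this (the 2 gallons/person/day and 5-gallon containers follow the guidelines above; the 3-day planning horizon is the usual rule of thumb):

```python
import math

# Sizing stored water per the guidelines above: 1-2 gallons per person
# per day, with ~3 days as the usual planning minimum.
def gallons_needed(people, days, gal_per_person_day=2.0):
    return people * days * gal_per_person_day

def containers_needed(people, days, container_gal=5.0, gal_per_person_day=2.0):
    need = gallons_needed(people, days, gal_per_person_day)
    return math.ceil(need / container_gal)

family_gallons = gallons_needed(4, 3)        # 24 gallons
family_containers = containers_needed(4, 3)  # five 5-gallon containers
```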

Decontaminating water
The good news is that the problem usually isn't actually a lack of water, it's just that what water there is is contaminated.

There are five basic kinds of contamination to be concerned about:

  • Chemicals.
  • Particulates.
  • Bacteria and parasites
  • Salt.
  • Viruses.

As this Slate article points out, the short term risk of chemical contamination is fairly low, though you wouldn't want to drink chemically contaminated water in the long term. However, in most developed countries it's safe to assume--or at least it seemed so until recently--that water supplies will be restored within a few weeks, so this probably isn't a serious problem.

Particulates, bacteria, parasites, and viruses are a standard problem in camping and backpacking situations because a lot of wilderness water sources are contaminated, so there's a set of standard solutions: chemical purification, UV purification, and water filters. Each of these has advantages. Chemical purification is the cheapest and most portable. You can buy purification tablets or drops from camping stores or just use bleach. The big advantage of the camping preparations over bleach is that your water doesn't end up tasting and smelling funny. The big advantage of chemical purification is that it kills bacteria, protozoans and viruses. The disadvantage is that it's slow--you may have to wait up to four hours for the water to be safe to drink. Also, it doesn't do anything to remove dirt, silt, etc. and there are concerns about the level of purification you get if your water is contaminated with particulates. You can also get electrical salt-based purifiers. These use basically the same chemistry as chemical purification (at least some of the chemical methods) except that an electrical current is used to create the chlorine dioxide. These have a higher fixed cost but only require salt rather than specialized chemicals.

A newer option is UV purifiers. This is just a handheld version of the kind of UV-based purification that is common in municipal water treatment. UV kills bacteria, parasites, and viruses, and is much faster than chemicals, but still doesn't work well on cloudy water. It also chews through batteries really fast, which doesn't make it great for emergency use.

Probably the best technique in this situation is water filtration. There are a large variety of hand-pumped water filters available for around $50-$100. I use a Katadyn Hiker for backpacking, but all the major brands work well. Typical weights for small units are under a pound, and even a small unit can provide plenty of water for a medium-sized group: water production rates are about a liter a minute with fairly easy pumping action. The big advantage of these filters is that they will remove particulates as well as bacteria and parasites. You can also get them with activated charcoal filters that will help remove chemical contamination--though it's not clear how good a job these really do in this kind of situation. The two big problems with filters are that they don't remove viruses and that they clog, especially if you use them with really particulate-heavy water. I suspect that the virus issue isn't that important but the clogging is. You can get prefilters that will help but it's best to let as much sediment settle out of the water as you can as well. Note: I've never actually had a filter clog on me.

It's probably best to have both a filter and some method of chemical purification on hand. This gives you a backup in case your filter clogs and you can't clean it. You can also use your chemical purification method on the water you filter to kill viruses.

Desalinating Water
If you live in a coastal region, there's a good chance you'll have access to salt water. You can't drink salt water and unfortunately, water filters don't do anything to remove the salt. You can buy desalination units, even handheld ones, but they're expensive ($500-$2000), slow, and require enormous energy input to operate. Typical water production rates are between one and five liters an hour.

A desalinator is probably only really worth having if you think that there's a really good chance that salt water will be available but that fresh won't (like in a marine environment) or if you expect to have some way of powering it (for instance, off your car, since many of these units take 12 volt input).

Acknowledgement: Thanks to Kevin Dick for talking over these issues with me.

At the IETF Security Area Advisory Group (SAAG) meeting in Paris, Zachary Zeltsan from Lucent gave a talk on ITU X.805: Security Architecture for systems providing end-to-end communications. This document falls squarely into the ITU tradition of abstract models which don't really match up with networking practice.

Overview of X.805

The official title of X.805 is "Security Architecture for systems providing end-to-end communications", but the term architecture doesn't really mean what most people would take it to mean. Taxonomy or model might be better words, since the purpose seems to be to divide all possible network security tasks up into categories.

These categories exist along three axes: Dimensions, Layers, and Planes.


Security Dimensions appear to roughly correspond to what most comsec people would think of as basic security services. There are 8 dimensions:

  • Access Control
  • Authentication
  • Non-repudiation
  • Data confidentiality
  • Communications security
  • Data integrity
  • Availability
  • Privacy
This list is strangely non-orthogonal. Many of these dimensions depend on other dimensions (e.g., Privacy on Data confidentiality, or Non-repudiation on authentication/integrity). Also, some of them don't really make sense in an end-to-end context.

Another confusing factor is that in standard terminology "Communications security" is a collective term for all of the major network security services (authentication, origin integrity, confidentiality, etc.). Here, however, it seems to refer to arranging that your data isn't delivered to the wrong person. This isn't a concept that makes an enormous amount of sense from the end-to-end perspective. There are two reasons you might care about your data being misrouted: (1) you don't get it; (2) the wrong person does. The first issue is one of availability/DoS. The second is a confidentiality issue, solved by crypto, not routing.


Each of these dimensions can be applied at any of three layers (note that these layers are NOT the same as the layers in the OSI reference model, so you need to think of the cross-product of security layers and OSI layers.) The layers are:

  • Applications Security
  • Services Security
  • Infrastructure Security
I find this division to be fairly muddy. In particular, the line between the Applications Security Layer, which involves "applications accessed by Service Provider customers", and the Services Security Layer, which involves "services that Service Providers provide to their customers", seems vacuous. For instance, Instant Messaging is listed in Services but e-mail is listed in Applications.


"A Security Plane is a certain type of network activity protected by Security Dimensions...."

The security planes are:

  • Management Plane
  • Control Plane
  • End-user plane
I'm having a terrible time decoding this, but when I read it I get the impression that e.g., SIP would be part of the "Control Security" plane but RTP would be part of the End-User Security Plane.


There are 9 possible combinations of layers and planes, which are labelled modules 1 through 9. Each security dimension can be present in each module, so there are 72 different component interactions. The last 9 tables indicate the application of each dimension to each module in tabular form, by which I mean they handwave about the kind of service that you would expect each dimension to offer. e.g.,

Module 1, Infrastructure Layer, Management Plane: .... Non-repudiation: Provide a record identifying the individual or device that performed each administrative or management activity on the network device or communications link and the action that was performed. This record can be used as proof of the originator of the administrative or management activity. ....
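The combinatorics are easy to check by enumeration. A minimal sketch (the axis names are taken from the lists above; the enumeration itself is my illustration, not anything X.805 specifies):

```python
from itertools import product

# X.805's three axes, as listed earlier in this post
dimensions = ["Access Control", "Authentication", "Non-repudiation",
              "Data confidentiality", "Communications security",
              "Data integrity", "Availability", "Privacy"]
layers = ["Infrastructure", "Services", "Applications"]
planes = ["Management", "Control", "End-user"]

# Modules are the cross-product of layers and planes (numbered 1-9)
modules = list(product(layers, planes))
print(len(modules))   # 9

# Each of the 8 dimensions applies to each of the 9 modules
cells = list(product(modules, dimensions))
print(len(cells))     # 72
```

Each of those 72 cells is a spot where the document expects some security analysis or mechanism to live, which is the complexity problem discussed below.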

Assessment Of Use To Practitioners

So, the obvious question is: what does X.805 bring to the party?

The first thing you have to realize is that it doesn't embody any actual technology, even at the architectural level. It doesn't describe any security protocols or even provide detailed enough descriptions of the components (modules) and their interfaces that you would be able to determine, design, and build artifacts that fit into these modules and have any chance that they would interoperate in a sane way.

The abstract of X.805 indicates that it's the architecture for ITU security and that "To achieve such a solution in a multi-vendor environment, network security should be designed around a standard security architecture." In practice, however, X.805 is so vague that it's hard to see that designing something around it has any meaning at all. I'm pretty sure I could take any random protocol and force-fit it into this conceptual architecture (sound familiar?), but what would that achieve?

Based on the presentation at SAAG, it appeared that there was some feeling that the style of analysis promoted by X.805 would be potentially useful for others who wanted to design security protocols and systems. I don't believe that this is the case, for a number of reasons:

Extreme Complexity

The most basic problem with X.805 is its astounding complexity. Any system with 72 basic components is simply not a useful analytical tool--unless your goal is to produce reams of paper indicating that you've followed some analytical procedure given to you from on high. We have a hard enough time getting people to do *any* security analysis, let alone the kind that would be implied by X.805 (to the extent to which it implies anything).

Excessive Factorization/Componentization

The complexity problem is largely a result of the desire to factor the problem into as small pieces as possible (see the Security Dimensions comments above). Unfortunately, this isn't the way that real security protocols are designed. On the contrary, they generally offer a small number (1-3) of sets of security properties which cannot be orthogonally mixed and matched in the way implied by X.805. Creating a bunch of new services just confuses issues. Security can't really be provided a la carte like that.

Bad Fit With The Internet Model

At the end of the day, the outlook implied by X.805 doesn't fit well into the Internet model. The extensive componentization creates an impression of separate security services at each Module.Dimension point. By contrast, the trend in IETF is towards having a very small number of security protocols that we reuse as much as possible and that provide a relatively fixed set of security services.

In addition, the emphasis on provider security services in this document implies a commitment to a much more active core (despite the references to e2e) than has been characteristic of Internet designs (see the above comments about "Communications Security", which is interpreted as an availability issue in Internet security contexts). In general, the terminology and methodology implied here are so foreign to the Internet mindset that I doubt that X.805 can realistically be applied as an analytical tool.


The good news about X.805 is that it doesn't actually embody any security technology, so as far as I know there's no reason to believe that it's the start of some alternate ITU suite of security techniques--ITU's track record here is even worse than IETF's. The bad news is that as an analytical framework it's basically irrelevant to current network security practice. That's fine if people ignore it, but not so fine if it gets mandated as a hoop for people to jump through, which is unlikely to contribute to successful new protocol development.


September 3, 2005

Chief Justice Rehnquist has died. This creates an interesting tactical situation for both the Republicans and the Democrats, especially since it means that Bush only has limited flexibility (as a practical matter) as to when to appoint a new Chief.

UPDATE: 9/5/05 It's Roberts for Chief. Thanks to Chris Walsh for the heads-up. Now, does this increase or decrease the chance that the new Associate will be extremely conservative? How about that the Dems will seriously oppose Roberts?


September 2, 2005

Colby Cosh has a contrarian, but I think quite possibly correct take on looting:
but isn't much or most "looting" of the sort we're seeing in New Orleans just "salvage" in fast-forward? Are there really shop owners in downtown NoLa who think it's super important that their furniture or electronics are ruined by moisture over the next month rather than stolen? Isn't it arguably a good thing that valuables are being retrieved--by poor people who contrived to last out a hurricane without much help from the authorities--from a city that, for all relevant purposes, is now gone? I don't for one second apologize for anyone who uses violence against a neighbour; beheading and being fed to the crocodiles is too good for that sort. And non-violent looting cannot and should not be excused when the target is a good that's non-perishable and invulnerable to water, like jewelry. But it's pretty hard for me to get worked up about reports of people breaking and entering into pharmacies or convenience stores. Surely the wise merchant will have already written off inventory of that nature.

The non-perishable test is a pretty hard line to draw in practice, but purely as a matter of cost/benefit, this sounds right.


September 1, 2005

The gulf coast disaster has created a serious oil shortage, and right on cue politicians are complaining about price gouging. This Chicago Sun-Times article is a particular beauty:
Cars with Illinois plates had been lined up three deep at the pumps, sucking up Indiana gas at $2.69 per gallon, but overnight, the price shot up 50 cents a gallon.

"No lines today," Chris Patterson said Wednesday.

Regular unleaded at his Hammond, Ind., Marathon station two blocks from the Illinois state line cost $3.19. He blamed a new shipment that came in with a higher price.

"Everyone around us was running out of fuel last night," Patterson said. "We had people stacked up three to a pump."

Let's see. Before the price increase, gas stations were running out of gas. Afterwards, they're not. Look, it's the market in action.

Here's another classic. In the middle of an article complaining about prices, we get:

"If there is price gouging, it should be rooted out and punished," Blunt said. "Profiteering in a time of tragedy and crisis is both unconscionable and illegal."

Price hikes were evident at stations nationwide Wednesday as gasoline costs breached $3 a gallon in numerous states, the result of fuel pipeline shutdowns and delayed deliveries since Hurricane Katrina devastated Louisiana and Mississippi earlier this week.

Gas prices jumped by more than 50 cents a gallon Wednesday in Ohio, 40 cents in Georgia and 30 cents in Maine.

Concerns are now mounting over limited supplies of gasoline, including the possible return of long lines and scarcity reminiscent of the 1970s gas crisis.

In case it's not blindingly obvious, the cause of the long lines and scarcities of the 1970s gas crisis was price controls on gasoline.