COMSEC: November 2007 Archives

 

November 24, 2007

For some reason, Peter Cox's SIPtap program is getting press. First, it's immediately obvious to anyone with even minimal knowledge of networking that if you have access to the packets of a VoIP flow (or for that matter any other unencrypted network flow), you can reconstruct the data. That's why people use encryption, and why the IETF and others have spent a lot of time building security protocols for VoIP. So, this is hardly news. Many current VoIP phones already ship with some encryption, and the newer stuff will be more secure and easier to deploy.

OK, so it's common knowledge. On the other hand, Cox doesn't say he discovered it, just that this is a "proof of concept". Given that it's droolingly easy to write an RTP decoder and that VoIPong and Vomit and Wireshark already existed, it's hard to see exactly what concept is being proved, other than that with enough hype you can get your name in the paper.
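
To give a sense of how little work is involved, here's a minimal sketch (mine, not Cox's code) of a parser for the fixed RTP header defined in RFC 3550. Group packets by SSRC, sort by sequence number, hand the payloads to a codec, and you have most of a "VoIP tap":

    import struct

    def parse_rtp(packet: bytes) -> dict:
        """Parse the fixed 12-byte RTP header (RFC 3550) from a UDP payload.
        Ignores header extensions, padding, and anything codec-specific."""
        if len(packet) < 12:
            raise ValueError("too short to be RTP")
        b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", packet[:12])
        cc = b0 & 0x0F                       # number of CSRC identifiers
        header_len = 12 + 4 * cc
        return {
            "version": b0 >> 6,              # should be 2
            "payload_type": b1 & 0x7F,       # e.g., 0 = G.711 mu-law
            "sequence": seq,
            "timestamp": ts,
            "ssrc": ssrc,                    # identifies the media stream
            "payload": packet[header_len:],  # the encoded audio itself
        }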

UPDATE: Fixed typos

 

November 23, 2007

For obvious reasons, law enforcement and investigative agencies aren't incredibly fond of encrypted communications. The most popular responses to this difficulty have generally been one or more of:

  • Forbid strong crypto entirely.
  • Require "key escrow" where a copy of the keying material somehow goes to the LEA.
  • Get a copy of the keying material after the fact.
  • Use keyloggers or other invasive measures.

None of these have been particularly successful: the strong crypto cat is out of the bag, users have overwhelmingly rejected key escrow, and although people do sometimes have their keys subpoenaed (the UK has a law requiring compliance), there are standard cryptographic techniques that provide "perfect forward secrecy" so that even if your keys are disclosed after the fact your communications aren't readable. The government in the US has had some success with keyloggers, spyware, etc., but those either require physical access or compromise of the system in question.
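
The basic idea behind forward secrecy is to do the actual key agreement with ephemeral keys that get thrown away after each session, so the long-term keys only authenticate and never protect traffic. Here's a minimal sketch of that pattern using ephemeral X25519 Diffie-Hellman via the third-party Python "cryptography" package (the specific curve and library are just for illustration; real protocols such as TLS with ephemeral DH ciphersuites or ZRTP do the same thing, with authentication layered on top):

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    def run_session() -> bytes:
        # Each side generates a fresh, one-time key pair for this session only.
        alice_eph = X25519PrivateKey.generate()
        bob_eph = X25519PrivateKey.generate()

        # Both sides compute the same shared secret from the ephemeral keys.
        shared = alice_eph.exchange(bob_eph.public_key())
        assert shared == bob_eph.exchange(alice_eph.public_key())

        # Derive the traffic key; once the ephemeral private keys are
        # discarded, no later key disclosure can reproduce this value.
        return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                    info=b"session key").derive(shared)

The point is that compelling disclosure of a long-term key after the fact doesn't help: the ephemeral secrets that actually protected the traffic no longer exist.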

The popularity of combined software/service operations like Hushmail and Skype opens up a new avenue, however. It's recently come out that Hushmail has in the past handed over keys to the government for users who used their online encryption system. This was made easier by Hushmail's "software as a service" type architecture, where they do the encryption and decryption on their site. Hushmail also provides an option where you can download a Java applet, but it should be clear that under the right legal constraints, they could theoretically put a backdoor in the applet you downloaded, too.

Similarly, the German police have recently complained that they can't monitor Skype calls. They say they're not asking for the encryption keys, but because of Skype's architecture and the fact that Skype is involved in authenticating each call, it should be clear that Skype could mount a man-in-the-middle attack on your phone call and hand over the keys. They could also just give you an "upgraded" software version with a back door.
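
To make the trust problem concrete, here's a toy sketch (purely illustrative, not Skype's actual protocol) of why whoever vouches for the other side's key can intercept the call: if the directory hands Alice the interceptor's public key instead of Bob's, Alice happily keys with the interceptor and has no way to tell the difference.

    from cryptography.hazmat.primitives.asymmetric.x25519 import (
        X25519PrivateKey, X25519PublicKey)

    # The service's key directory: it vouches for which key belongs to whom.
    directory: dict[str, X25519PublicKey] = {}

    def register(name: str, key: X25519PrivateKey) -> None:
        directory[name] = key.public_key()

    def lookup(name: str, substitute: X25519PublicKey | None = None) -> X25519PublicKey:
        # An honest directory returns the registered key. A coerced one can
        # return a substitute key; the caller can't notice.
        return substitute if substitute is not None else directory[name]

    bob = X25519PrivateKey.generate()
    interceptor = X25519PrivateKey.generate()
    register("bob", bob)

    alice = X25519PrivateKey.generate()
    # Alice thinks she's keying with Bob, but the directory lied.
    session_key = alice.exchange(lookup("bob", substitute=interceptor.public_key()))
    assert session_key == interceptor.exchange(alice.public_key())

A real interception would also relay the call onward to Bob with a second key exchange, but the point stands: the party that authenticates the call controls who you're actually talking to.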

Combined software/service systems like Skype and Hushmail are uniquely susceptible to this kind of lawful intercept attack (or for that matter to cheating by the vendor of any kind). If you use third-party software, then you don't have to worry about your ISP cheating you, because they can't: they don't have the keys. And while your software vendor could potentially cheat you, they don't have the kind of constant contact with you that Skype or Hushmail does, so they would generally need to put a back door in every copy of the software, which carries a much higher risk of discovery and of users switching software. Who wants to run software with a deliberate back door?

 

November 12, 2007

Steve Burnett is giving an intro to crypto talk in which he explains that "cryptography is about turning sensitive data into gibberish in such a way that you can get the sensitive data back from the gibberish".

My observation: "This differs from standardization, where you can't get the sensitive data back from the gibberish."

 

November 5, 2007

I'll be speaking tomorrow at the Stanford Security Seminar:
Some Results From the California Top To Bottom Review

Eric Rescorla

In the spring of 2007, the California Secretary of State convened a team of security researchers to review the electronic voting systems certified for use in California. We were provided with the source code for the systems as well as with access to the hardware. Serious and exploitable vulnerabilities were found in all the systems analyzed: Diebold, Hart, and Sequoia. We'll be discussing the effort as a whole, providing an overview of the issues that all the teams found, and then discussing in detail the system we studied, Hart InterCivic.

Joint work with Srinivas Inguva, Hovav Shacham, and Dan Wallach

If you want to listen, heckle, whatever, it's at 4:30.