# New hash function: VSH

As I've mentioned before, the cryptographic community has pretty much decided that new asymmetric (public key) algorithms must be provably reducible to some hard (or at least believed hard) problem. Unfortunately, these algorithms tend to be much slower than the bit mashing algorithms that dominate symmetric cryptography. All this is to provide background for Contini, Lenstra, and Steinfeld's Very Smooth Hash (VSH), a number theoretic hash function for which the authors claim collision-resistance is reducible to factoring. The variant that provides security equivalent to 1024-bit RSA is about 25x slower than SHA-1.
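To give a flavor of why VSH is number theoretic rather than bit mashing, here is a toy sketch of its compression idea: repeated squaring modulo an RSA-style modulus, multiplying in small primes selected by the message bits. The parameters below are deliberately tiny and illustrative only; the real scheme uses a 1024-bit-plus modulus and many more primes, plus padding and length encoding that this sketch omits.

```python
# Toy sketch of the VSH compression idea (NOT the paper's actual scheme):
# x_{j+1} = x_j^2 * prod(p_i for set message bits) mod n

SMALL_PRIMES = [2, 3, 5, 7]  # real VSH uses the first k primes for large k

def vsh_toy(bits, n):
    k = len(SMALL_PRIMES)
    if len(bits) % k != 0:
        raise ValueError("pad the message to a multiple of k bits first")
    x = 1
    for j in range(0, len(bits), k):
        x = (x * x) % n                      # square the running value
        for i in range(k):                   # multiply in a prime per set bit
            if bits[j + i]:
                x = (x * SMALL_PRIMES[i]) % n
    return x

# n = 61 * 53 = 3233; a real modulus would be a hard-to-factor RSA number
print(vsh_toy([1, 0, 1, 1], 3233))  # -> 70  (1^2 * 2 * 5 * 7 mod 3233)
```

The point of this structure is the paper's claim: a collision would yield a nontrivial multiplicative relation among very smooth numbers modulo n, which the authors show would let you factor n.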

## 15 TrackBacks

Listed below are links to blogs that reference this entry: New hash function: VSH.

TrackBack URL for this entry: http://www.educatedguesswork.org/cgi-bin/mt/mt-tb.cgi/344

VSH is out ... Read More


Well, not only does that run slooooww, but it only reduces to factoring?

Aren't hash functions (one-way functions) thought to be in a harder class than public-key algorithms? Shouldn't our hash functions survive in a quantum-computing world (we hope)?

Well, we clearly know that for some hash functions, finding collisions is much easier than factoring. And while I agree that it would be nice to have hash functions that were secure against quantum computing, since a real quantum computer would pretty much totally destroy our key management system, I'm not hyper-worried about hash functions.

That said, I agree that this is too slow, but it presumably sets a baseline for provably secure hashes, so it will be interesting to see whether something faster is possible.

I disagree:

a: We can do key management without public key. It's a pain in the behind, but Kerberos (to take one example) manages well enough. So in the worst case, if quantum computing becomes practical, we fall back to trusted intermediaries.
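The trusted-intermediary fallback can be sketched in miniature. Below is a toy, hypothetical KDC in the spirit of Kerberos: it shares a long-term symmetric key with each principal and hands out a fresh session key "wrapped" for each side. The names and the HMAC-derived XOR wrapping are illustrative assumptions only, not a secure protocol (no nonces, no authentication of the ticket):

```python
import os, hmac, hashlib

def kdf(key, label):
    """Derive a 32-byte keystream from a key and a label via HMAC-SHA256."""
    return hmac.new(key, label, hashlib.sha256).digest()

def xor_wrap(wrapping_key, session_key):
    """XOR a 32-byte session key with a derived keystream (self-inverse)."""
    stream = kdf(wrapping_key, b"wrap")
    return bytes(a ^ b for a, b in zip(stream, session_key))

class ToyKDC:
    """Trusted intermediary: no public key crypto anywhere."""
    def __init__(self):
        self.longterm = {}

    def register(self, name):
        self.longterm[name] = os.urandom(32)
        return self.longterm[name]     # delivered out of band, once

    def issue(self, a, b):
        session = os.urandom(32)       # fresh key for this conversation
        return (xor_wrap(self.longterm[a], session),
                xor_wrap(self.longterm[b], session))
```

Each party unwraps its copy with its own long-term key (XOR wrapping is its own inverse), and both end up holding the same session key without any asymmetric operations, which is the "pain in the behind" trade: every principal must pre-register a secret with the intermediary.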

b: I really think factoring is too easy. Is it possible to make a hash function provably NP-complete? I believe there are some provably NP-complete symmetric key algorithms, so shouldn't there also be NP-complete hash functions? Factoring is just easy enough (witness the quantum factoring algorithm) that it worries me.

Provable NP-completeness says less than nothing about average-case hardness--that is, I'd be much less confident in a proposed cryptographic primitive if I were told that it was provably NP-complete. That's because most methods for proving a problem's NP-completeness show how to map any NP problem onto the original one--including, of course, ones that are very easy on average. And while that doesn't guarantee that the mapping will conserve the easiness-on-average of the other problem, the likelihood that a straightforward mapping will turn every such easy distribution into a hard one seems very small.

Proving average-case hardness is much better, but it's usually done through some form of self-reducibility proof (i.e., proving that the average case is as hard as the worst case), which is much easier if there's a lot of structure to work with--as in the case of the popular trapdoor functions, which are based on algebraic groups. The problem is that such structure also allows obvious avenues of attack, such as the subexponential factoring algorithms, which heavily exploit the structure of the algebraic groups on which the factoring-based trapdoor functions are based.

Now, that's not much a sacrifice in the case of trapdoor functions, since trapdoors already imply a certain amount of structure--it's hard to imagine creating one just by "bit-twiddling". But in the case of symmetric cryptosystems and cryptographic hash functions, such structure doesn't appear to be necessary. And if it's not necessary, it seems highly preferable to avoid it.

Now, it's possible that the new secure-as-factoring hash function could be made much faster than it currently is, while retaining its security. But then, it's also possible that SHA256 could be made much faster than it is, while retaining its security. In fact, I conjecture that the only reason at this point that we need to worry about the security of our "bit-twiddling"-based cryptographic primitives at all is that we keep trying to cut corners on security in the interest of speed. For example, it would be interesting to compare the speed of the new hash function against that of MD2--a very old, very slow hash function which was designed to be very "safe", and which has not, to the best of my knowledge, even come close to being broken yet.

MD2 does have some attacks, including one last year by Muller which could find preimages in 2^104 work rather than the designed 2^128 strength. There was also an old attack against the internal compression function by Rogier and Chauvaud from 1995/1997, but it did not impact the full version.

2^104 is of course still plenty strong, but as with SHA-1, any attack that significantly weakens the designed strength is cause for concern--and that attack is for preimages, not just collisions. And of course MD2's 128-bit output is not enough today, given the 2^64 work factor to find collisions.
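The 2^64 figure comes from the birthday bound: for an n-bit hash, a generic collision search succeeds after roughly 2^(n/2) evaluations. A back-of-envelope sketch of the standard approximation:

```python
import math

# Birthday-bound approximation: after q random n-bit hash outputs, the
# probability of at least one collision is about 1 - exp(-q*(q-1)/2^(n+1)).

def collision_probability(q, n_bits):
    return 1.0 - math.exp(-q * (q - 1) / 2.0 ** (n_bits + 1))

# For a 128-bit hash like MD2, ~2^64 evaluations already give good odds:
print(round(collision_probability(2 ** 64, 128), 3))  # -> 0.393
```

So a 128-bit output offers only about 64 bits of generic collision resistance, which is why newer designs moved to 160 bits and beyond.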

I agree that NP-completeness has not been very successful as a design strategy for crypto algorithms. The classic example is Ajtai-Dwork, which based a public-key system on a hard lattice problem, even showing that average-case complexity was the same as worst case, and yet it was broken anyway.

Provable security for hash functions like this VSH does seem to be a desirable feature, and if we could get it for a fast algorithm that would be great, but I don't expect it.

My guess is that we will end up with two families of hash functions in use, a slow one with some level of provability that is useful for hashing small messages, and a fast one that perhaps comes out of an AES-like competition which will be used for large messages.