Andy

May 23, 2011 at 10:40 pm

Why such short chain lengths? Can’t you decrease the rainbow table size, with a roughly linear increase in search time, by simply moving from 10,000-entry chains to 100,000-entry chains? Since chaining is cheap even with an expensive hash, it seems like an obvious win.
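
A rough back-of-the-envelope sketch of that tradeoff (with made-up parameters, not the post's actual ones): for a fixed keyspace, the number of chains you must store shrinks as chains get longer, but for rainbow tables the worst-case lookup cost grows roughly quadratically with chain length, since each possible position of the target hash requires walking the rest of the chain.

```python
# Rough time/memory tradeoff sketch for rainbow tables.
# Hypothetical numbers, not the parameters used in the post.

KEYSPACE = 64 ** 8          # e.g. 8-character passwords over a 64-symbol set
BYTES_PER_CHAIN = 16        # store only each chain's start and end points

for chain_len in (10_000, 100_000):
    chains = KEYSPACE // chain_len          # chains needed to cover the keyspace once
    table_bytes = chains * BYTES_PER_CHAIN
    # Lookup: for each of the chain_len possible positions of the target hash,
    # walk the remainder of the chain -> about chain_len^2 / 2 hash operations.
    worst_case_hashes = chain_len * chain_len // 2
    print(f"chain length {chain_len:>7}: "
          f"~{chains:,} chains, ~{table_bytes / 2**30:.0f} GiB, "
          f"~{worst_case_hashes:,} hashes per lookup (worst case)")
```

So going from 10,000 to 100,000 links cuts the table size by about 10x but multiplies the worst-case lookup work by about 100x, which is part of why longer chains slow things down faster than a linear estimate suggests.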

david_schuetz

May 24, 2011 at 2:31 pm

I needed something to work with, and in testing, 10,000 seemed workable while 100,000 was pretty slow. Keep in mind that the DES routine is slower than MD5 (about 9x), so keeping the chains about 1/10 as long keeps the times similar.

On the other hand, if you’re able to get a faster DES routine, especially one implemented on a GPU or something similar, then you’d definitely want to go with longer chain lengths.
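
For reference, a minimal sketch of what walking one chain looks like, assuming Python's standard-library crypt module for the DES step (Unix-only, and deprecated in newer Python versions) and a made-up MD5-based reduction function; the names walk_chain and reduce_hash are illustrative, not from the post:

```python
import crypt      # Unix-only; deprecated in Python 3.11+, but fine for illustration
import hashlib

CHARSET = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789./"

def reduce_hash(digest: str, position: int, length: int = 8) -> str:
    """Map a crypt hash back into the password space; 'position' makes each
    link use a different reduction, as rainbow tables require."""
    h = hashlib.md5(f"{digest}:{position}".encode()).digest()
    return "".join(CHARSET[b % len(CHARSET)] for b in h[:length])

def walk_chain(start: str, salt: str, chain_len: int) -> str:
    """Walk one chain of chain_len links and return its end point."""
    password = start
    for i in range(chain_len):
        digest = crypt.crypt(password, salt)   # the slow DES crypt step
        password = reduce_hash(digest, i)
    return password

# A 10,000-link chain: with DES crypt costing roughly 9x an MD5 call,
# chains about 1/10 as long keep generation and lookup times comparable.
print(walk_chain("aaaaaaaa", "ab", 10_000))
```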

Dennis F. OBrien

February 12, 2013 at 1:57 pm

About 15 years ago I did a similar exercise, though I used more of a probability model. I had some time on my hands and selected the upper/lower/numeric/plus/minus character set I had noticed was most common among previously guessed passwords. Then I applied each of the 4096 salts to each trial password, going aaaaaa, aaaaab, aaaaac… thru 99999999, to build files of every upper/lower/numeric/plus/minus password of at least 6 but fewer than 9 characters (likely unix/linux passwords). While the success rate via the resulting table lookup was over 85%, the amount of storage required to save every character combination under each of the 4096 salts was prohibitive. But today a 3 terabyte disk is about $100, so the limitations I had at the time likely no longer apply. Then again, with today's distributed storage, someone could likely build the entire rainbow set and turn it into a simple full table lookup, with no computation required.
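
A quick back-of-the-envelope of the storage such a full lookup table implies, assuming the character set described above (26 upper + 26 lower + 10 digits + '+' + '-', i.e. 64 symbols), password lengths 6 through 8, all 4096 crypt salts, and roughly 21 bytes per entry (the 13-character crypt output plus up to 8 characters of plaintext); all of these figures are assumptions for illustration:

```python
# Back-of-the-envelope storage estimate for the full lookup table described above.
CHARSET_SIZE = 64        # upper + lower + digits + '+' + '-'
SALTS = 4096             # all traditional crypt salts
BYTES_PER_ENTRY = 21     # 13-char crypt hash + up to 8 chars of plaintext

passwords = sum(CHARSET_SIZE ** n for n in (6, 7, 8))
entries = passwords * SALTS
total_bytes = entries * BYTES_PER_ENTRY

print(f"passwords:  {passwords:.3e}")
print(f"entries:    {entries:.3e}")
print(f"total size: {total_bytes / 2**60:.1f} EiB")
```

Under those assumptions the 8-character passwords dominate, and the full precomputed table across all 4096 salts works out to on the order of tens of exabytes, which gives a sense of why the salt was designed to make exactly this kind of precomputation painful.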