RSA-210 factored (mersenneforum.org)
130 points by Mithrandir on Oct 7, 2013 | 34 comments



To clarify, this is not a new factoring record for products of two primes. RSA-768, a 232-digit number (22 digits longer), was factored in 2009, and that record still stands. http://en.wikipedia.org/wiki/RSA_numbers#RSA-768

The algorithm used here is GNFS (general number field sieve), which is the same algorithm that's been used for about two decades. In other words, this has no impact on the security of RSA.

More information: http://en.wikipedia.org/wiki/Integer_factorization_records


While it has no impact on the expected complexity of factoring larger keys, this kind of thing is informative about the maturity of public factoring software: a single person with (presumably) access to a number of machines can pull off this kind of job on their own. Given the number of things that can go wrong in a complex method like the NFS, this is noteworthy.


How many servers does Google have? From search results, it seems to be more than 1 million.

I don't know the details of the algorithm, but can we assume that Google could factor 20 bits more?


The answer to "I don't know what I'm talking about, but can I assume..." is always 'no'.


You are wrong; my intuition was correct. The algorithm is quite parallelizable, and I have worked in the computer security field.

I wrote up a speculation, based on the difficulty of reading and understanding the papers, before this post disappears.


See, now you haven't assumed it, you've researched it.

But are you saying Google could factor all that... if they took all their servers that they ordinarily use for their business, turned their business off, and instead used them to factor an RSA key?


"RSA-210" is the official name for this challenge, but it is a bit of a misnomer. The modulus has 210 decimal digits but about 210 * log2(10) = 697 bits.

EDIT: Wikipedia article says 696 bits.
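A quick sanity check on the digit-to-bit conversion (a minimal sketch; pinning down the exact 696 requires the actual modulus, which the Wikipedia page lists):

    import math

    # A 210-digit number N satisfies 10**209 <= N < 10**210,
    # so its bit length, floor(log2(N)) + 1, is bounded by:
    low  = math.floor(209 * math.log2(10)) + 1   # 695
    high = math.floor(210 * math.log2(10)) + 1   # 698
    print(low, high)   # the RSA-210 modulus itself has 696 bits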


Ah, thank you. I hadn't heard of the challenge and didn't know the size of the number being factored. So this will mainly affect 1024 bit keys, rather than 2048+ bit keys, I presume.


It doesn't really affect even 1024-bit keys: the largest RSA key factored so far (this one comes close but isn't the record) is still quite far from 1024 bits. Remember that factoring difficulty grows superpolynomially with the number of bits, not linearly. But 2048-bit keys are the recommendation, I believe.
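To put a rough number on "quite far": plugging bit sizes into the standard GNFS heuristic running time L_N[1/3, (64/9)^(1/3)] and ignoring the o(1) term (so treat this as a very rough sketch, not a real cost model) suggests a 1024-bit modulus is around ten thousand times harder than this 696-bit one:

    import math

    def gnfs_effort(bits):
        # Heuristic GNFS cost exp(c * (ln N)^(1/3) * (ln ln N)^(2/3)),
        # with c = (64/9)^(1/3); the o(1) term is dropped.
        ln_n = bits * math.log(2)
        c = (64 / 9) ** (1 / 3)
        return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

    print(gnfs_effort(1024) / gnfs_effort(696))   # ~1e4, very roughly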


For new keys, 4096 is preferred.


I don't think it's that simple. A 4096 bit key is extremely unlikely to get you fewer bits of security than a 2048 bit key. But the thing that makes 2048 bit keys a realistic threat is likely to make the "preference" be for something other than RSA entirely.


If 2048-bit RSA is vulnerable, then RSA is probably toast.


I know some of the CAs are using 4096-bit keys for their long term signing keys, but those are going to be in use for over a decade and are a major target. Everyone else seems to use 2048-bit.


I know that being able to factor ~696 out of 1024 bits isn't that impressive (overall). But the fact that it's been possible to develop the software this far shows that it's mainly a hardware problem. With enough parallelization onto GPU cores in a significant system, I think cracking a 1024-bit key could come to less than a week.


In case you've never heard of the challenge: https://en.wikipedia.org/wiki/RSA_Factoring_Challenge


From the logs:

> Mon Sep 23 11:09:41 2013 commencing Lanczos iteration (32 threads)

> Mon Sep 23 11:09:41 2013 memory use: 26956.9 MB

> [...]

> Thu Sep 26 07:17:57 2013 elapsed time 51:56:44

I’m not sure that I’m understanding all the details. Does this mean that they factored a 210-digit number in 52 hours on a single machine?


No. What you're seeing in that log is only the last two steps of the number field sieve: the linear algebra and the square root step. The rest of the details are missing.

The 52 hours you're seeing is the time for the square root. That time is essentially zero in relation to the whole process.

You can also see that the last 1033963 - 1026631 = 7332 iterations of Lanczos (on a 64Mx64M ~24GB matrix!) took 15 hours. We can then estimate that the matrix step required around 3 months on that particular setup.
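Back-of-the-envelope on that estimate, using the iteration counts quoted from the full log:

    # Iteration counts as quoted above from the full msieve log.
    total_iters = 1033963
    last_chunk  = 1033963 - 1026631    # 7332 iterations
    hours       = 15.0                 # time for that last chunk
    est_days    = total_iters / last_chunk * hours / 24
    print(est_days)                    # ~88 days, i.e. roughly 3 months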

We don't know anything about the sieving, but I'd venture it took significantly more computing power than the matrix, as is usually the case.


The sieving stage is massively parallel and doesn't require large amounts of memory; with enough machines you could perform it in under a day (and with more machines, within an hour). A 100,000-node botnet could be very useful in this regard.
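The parallelism is conceptually trivial: each machine takes a disjoint slice of the special-q range and works independently. A toy sketch (hypothetical chunking; real jobs use the range files of lasieve or CADO-NFS):

    # Toy illustration of why sieving parallelises so well: workers get
    # disjoint special-q slices and never need to communicate.
    def split_q_range(q_start, q_end, machines):
        step = (q_end - q_start) // machines
        bounds = [q_start + i * step for i in range(machines)] + [q_end]
        return list(zip(bounds, bounds[1:]))

    # e.g. four workers covering q in [100e6, 200e6)
    print(split_q_range(100_000_000, 200_000_000, 4))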

Knowing when you've got enough relations is an inexact science, given that there will be some duplicates amongst relations generated on separate machines, so it's not uncommon for a final bit of sieving to be required once msieve is pointed at the combined relations file (which is what you see in the log).

It's the block Lanczos and matrix square root steps that cannot be parallelised efficiently[1].

Interest in the various Number Field Sieve algorithms (SNFS/GNFS) is one of the main reasons I started (and finished!) a part time degree in Mathematics to go along with my Comp Sci degree. Next on the list (after a break from 8 years of part time study) will be the M.Sc. courses[2].

1. It can be parallelised, but not efficiently, i.e. throwing 10 similar machines at the problem does not get the job done in a tenth of the time, only in ~95% of the time. No efficient algorithm exists for doing this. Yet. If one does appear, then RSA starts to look very, very shaky.

2. The first of which (M820 The Calculus of Variations and Advanced Calculus) can be downloaded from here:

    http://chilli.open.ac.uk/dr9/
Yes, the entire course can be downloaded (although it's more of a guided read of Apostol than a series of lectures).


I'm not aware that M823 can be downloaded (it's the first of the two number theory courses using Apostol).


Yes, sorry, got the courses mixed up.


I was curious as to how the cash prizes [1] for this challenge compare to playing the lottery. Consider the next smallest one left: RSA-768. By my very rough estimates [2], $1 worth of computing time on a typical desktop gives you a probability of 10^-58 of factoring the number by picking random numbers smaller than its square root.

[1] https://en.wikipedia.org/wiki/RSA_Factoring_Challenge [2] https://www.google.com/#q=%241+%2F+(%240.09+%2F+kwh)+%2F+100...


What hardware is used for the first round of sieving? Is it "just" distributed computing on normal machines, or are they using special FPGA farms?


For those of us who don't understand, what does this mean? Is RSA less secure now?


It's as secure as it ever was; this just shows how advancing technology lets you factor larger numbers. RSA depends on a math problem that takes a very long time to solve unless you know one of the prime factors. What this is saying is that on a ~$5,000 computer, it takes a bit over a couple of days to factor a 697-bit RSA number. This is more a demonstration of why you need to continually increase RSA key sizes; at this point, a 1024-bit number is probably within range of something a three-letter agency could factor within reason.


The implication of the parent is that with a ~$10,000 computer you can take a couple of days to factor a 698-bit RSA number. Or $20,000 can factor 699 in a couple of days; $40,000 gets you 700, $80K for 701, $160K 702, $320K 703, $640K 704, $1.28M 705, $2.56M 706, call it $5M 707, $10M 708, $20M 709, $40M 710, $80M 711, $160M 712, $320M 713, $640M 714, call it $1.2B 715, $2.4B 716, $4.8B 717, $9.6B 718, $19.2B 719, $38.4B 720, $76.8B 721, $153.6B 722, call it $300B for 723. We'll stop here because long before reaching this amount you would have realized massive economies of scale, such as running entire plants making custom chips. Then again, we're talking about what can be done in a "couple of days".

If we extrapolate the "couple of days": at 4 days we can add +1 bit; 8 days, +2 bits; 16 days, +3; 32, +4; 64, +5; 128, +6; 256, +7; 512 days (1.4 years), +8; 2.8 yrs, +9; 5.6 yrs, +10.

By that time, whatever hardware is sitting there will be obsolete again.

Still, we're up to 733 bits. If we assume some massive growth and large economies of scale it is quite conceivable that $300B gets you a 10,000,000x increase on the bang per buck based on economies of scale alone (23 bits) working with today's technology; or that by waiting, within 5 years breakthrough technology would cause another 1,000,000 fold increase (call it another 20 bits). We are now up to 776 bits. That is just 248 bits away from 1024 bits:

If we make ALL of the above assumptions, and you throw $300B at the problem for 5 years and get to experience 1 million fold better technology and also a ten million fold better price than the commodity demonstration, you can brute force

1 / 452312848583266388373324160190187140051835877600158453279131187530910662656th of the keyspace.
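(A quick check of that arithmetic: 1024 - 776 = 248, and the denominator above is exactly 2^248.)

    # The reachable fraction of a 1024-bit brute-force space, given 776
    # bits of effort, is 1 / 2**(1024 - 776):
    print(2**248)
    # 452312848583266388373324160190187140051835877600158453279131187530910662656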

Thus I would say that this demonstration poses NO threat from "advancing technology", on the basis provided.


I stand corrected by the two replies!

According to this article: http://en.wikipedia.org/wiki/Key_size

"For example, the security available with a 1024-bit key using asymmetric RSA is considered approximately equal in security to an 80-bit key in a symmetric algorithm (Source: RSA Security)."

"As of 2003 RSA Security claims that 1024-bit RSA keys are equivalent in strength to 80-bit symmetric keys, 2048-bit RSA keys to 112-bit symmetric keys and 3072-bit RSA keys to 128-bit symmetric keys. RSA claims that 1024-bit keys are likely to become crackable some time between 2006 and 2010 and that 2048-bit keys are sufficient until 2030. An RSA key length of 3072 bits should be used if security is required beyond 2030. NIST key management guidelines further suggest that 15360-bit RSA keys are equivalent in strength to 256-bit symmetric keys."

As such my post contains very grave misinformation and should be disregarded!

The analysis in it applies to symmetric cipher key size.


RSA doesn't scale the same way as a symmetric encryption algorithm, though. It took a cluster that was rather heavy-duty for its day months to crack RSA-512 ( http://web.archive.org/web/20070621021111/http://rsa.com/rsa... ). NIST itself states that RSA-1024 should no longer be considered secure ( https://blogs.rsa.com/rsa-768-factored/ is the press release regarding RSA-768 being factored; NIST's own pages are part of the shutdown ). While your numbers are true for a symmetric algorithm, factoring-based problems don't scale the same way.


Some time ago Eran Tromer gave an estimate of single-digit millions of dollars for a device that could factor a 1024 bit key in a year. I can't quite tell what you're suggesting about the useful lifespan of a 1024 bit key, but I feel like Tromer's opinion represents a growing consensus.

You might also think in terms of the security level that a 1024, 2048, 4096 &c key gets you. It isn't 1024 bits for a 1024 bit RSA key!


See above. Another commenter pointed out that the 52 hours figure was only the last step in the factorization process, and that the complete process likely took months. That adds an order of magnitude to all the figures in the reply to your comment, as well.


No, it's the solution to one of the problems in the RSA Factoring Challenge [1].

[1] http://en.wikipedia.org/wiki/RSA_Factoring_Challenge


It just means that lower-bit keys are less secure, as was already known. Folks have been trying to get people to switch to higher-bit keys as a result. RSA-210 is just the name of the challenge: https://en.wikipedia.org/wiki/RSA_numbers#RSA-210

1024-bit keys are still in common use, and more people are switching to 2048-bit. You're still fine.


How did they know it was the product of two primes in the first place?


It's a competition set up with prizes for factoring certain 'semiprimes' (numbers with exactly two prime divisors) [1]. So unless the challenge organizers had messed up, this was guaranteed.

[1] https://en.wikipedia.org/wiki/RSA_Factoring_Challenge


Checking whether a number is prime is much easier than actually factoring it (when it is composite).

http://en.wikipedia.org/wiki/Primality_test
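For the curious, here's a minimal Miller-Rabin sketch of the kind of probabilistic test that article describes (illustrative only; real implementations are more careful about base selection and round counts):

    import random

    def is_probable_prime(n, rounds=20):
        # Miller-Rabin: probabilistic, but overwhelmingly reliable
        # with enough rounds.
        if n < 2:
            return False
        for p in (2, 3, 5, 7, 11, 13):
            if n % p == 0:
                return n == p
        # Write n - 1 as d * 2**r with d odd.
        d, r = n - 1, 0
        while d % 2 == 0:
            d //= 2
            r += 1
        for _ in range(rounds):
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(r - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False   # found a witness: n is composite
        return True

    print(is_probable_prime(2**127 - 1))   # True; 2**127 - 1 is prime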



