Cracking encryption codes is an activity that has been carried out for millennia, since concealing information from outsiders is as old as humanity's politics and wars. The story of the Enigma machine in the Second World War, the race between the Allied and Axis powers to read each other's secret messages, and later the contest between the Soviets and the US over the security of communications, eventually turned into a battle of mathematics. That is what encryption is about today: a contest over who can bring the stronger combination of mathematics and computing power to bear, either to build more secure systems or to break each other's codes.
A few years ago, 128-bit encryption seemed secure; now it seems that even 768-bit RSA keys are not (link to article):
Most modern cryptography relies on single large numbers that are the product of two primes. If you know the numbers, it's relatively easy to encrypt and decrypt data; if you don't, finding the numbers by brute force is a big computational challenge. But this challenge gets easier every year as processor speed and efficiency increase, making "secure" a bit of a moving target. The paper describes how the process was done with commodity hardware, albeit lots of it.
Although most people aren't going to have access to these sorts of clusters, they represent a trivial amount of computing power for many organizations. As a result, the authors conclude, "The overall effort is sufficiently low that even for short-term protection of data of little value, 768-bit RSA moduli can no longer be recommended." 1024-bit values should be good for a few years still.
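The quoted passage can be illustrated with a toy RSA example. The primes and messages below are deliberately tiny, purely hypothetical values chosen so the brute-force factoring step finishes instantly; real RSA moduli are hundreds of digits long, which is exactly why factoring them takes the massive clusters the article describes.

```python
# Toy RSA sketch: tiny illustrative primes, NOT a real cryptosystem.
p, q = 61, 53          # the two secret primes
n = p * q              # public modulus (3233)
phi = (p - 1) * (q - 1)
e = 17                 # public exponent, coprime with phi
d = pow(e, -1, phi)    # private exponent: modular inverse of e (Python 3.8+)

msg = 42
cipher = pow(msg, e, n)          # encrypt: msg^e mod n
assert pow(cipher, d, n) == msg  # decrypt with the private key

# An attacker who can factor n recovers the private key:
def factor(n):
    """Brute-force trial division -- feasible only because n is tiny."""
    f = 2
    while n % f:
        f += 1
    return f, n // f

p2, q2 = factor(n)                    # recover the secret primes
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))  # rebuild the private exponent
print(pow(cipher, d2, n))             # prints 42 -- the "secret" message
```

The asymmetry the quote describes is visible here: encryption and decryption are each a single modular exponentiation, while the attack requires factoring `n`. With a 768-bit modulus the trial-division loop above would run for longer than the age of the universe, which is why the researchers needed far more sophisticated methods and large amounts of commodity hardware.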