this is why you should disable broken encryption standards (for example in your SSL/TLS configuration)
in 1998 Bruce Schneier, still one of the most influential thinkers on online security, wrote the following
"Cryptographic algorithms have a way of degrading over time. It's a situation that most techies aren't used to: Compression algorithms don't compress less as the years go by, and sorting algorithms don't sort slower. But encryption algorithms get easier to break; something that sufficed three years ago might not today.
Several things are going on. First, there's Moore's law. Computers are getting faster, better networked, and more plentiful. The table "Cracking for Dollars" on page 98 illustrates the vulnerability of encryption to computer power. Cryptographic algorithms are all vulnerable to brute force--trying every possible encryption key, systematically searching for hash-function collisions, factoring the large composite number, and so forth--and brute force gets easier with time. A 56-bit key was long enough in the mid-1970s; today that can be pitifully small. In 1977, Martin Gardner wrote that 129-digit numbers would never be factored; in 1994, one was.
Aside from brute force, cryptographic algorithms can be attacked with more subtle (and more powerful) techniques. In the early 1990s, the academic community discovered differential and linear cryptanalysis, and many symmetric encryption algorithms were broken. Similarly, the factoring community discovered the number-field sieve, which affected the security of public-key cryptosystems."
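Schneier's brute-force point is easy to quantify with back-of-the-envelope arithmetic. A minimal sketch in Python (the rate of one billion keys per second is an illustrative assumption, not a measured figure for any real attacker):

```python
# Worst-case time to exhaust a keyspace by brute force.
# The rate below (1 billion keys/second) is an assumed, illustrative figure.
RATE = 10**9  # keys tried per second (assumption)

def worst_case_years(key_bits: int, rate: float = RATE) -> float:
    """Years needed to try every key of the given length at the given rate."""
    seconds = 2**key_bits / rate
    return seconds / (365.25 * 24 * 3600)

print(f"56-bit key:  {worst_case_years(56):.2e} years")   # ~2.3 years
print(f"128-bit key: {worst_case_years(128):.2e} years")  # astronomically long
```

note how each extra bit doubles the work: the gap between "pitifully small" and "comfortable" is not gradual but exponential, which is also why Moore's law keeps eating fixed key lengths from below.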
so when we see that critical belgian online governmental services still accept a number of encryption standards that are broken or breakable, it is flabbergasting - we take issue with it and, for exactly this reason, cannot call those services secure
this means that you should always upgrade the encryption standards of your online services, and of your data on the servers, in the backups and in the archives
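as a concrete sketch of what "upgrading the standards of your online services" means on the client side: Python's standard ssl module lets you refuse the broken protocol versions outright (requiring TLS 1.2 as the floor is our assumption of current good practice, not something mandated by the services discussed above):

```python
import ssl

# Build a client context that refuses SSLv3, TLS 1.0 and TLS 1.1 outright.
# Setting TLS 1.2 as the minimum is an assumption of current good practice.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# the context will now refuse to negotiate anything older than TLS 1.2
assert ctx.minimum_version == ssl.TLSVersion.TLSv1_2
```

the same idea applies server-side (for example via the ssl_protocols directive in an nginx configuration) and to data at rest: re-encrypt old archives and backups with a current algorithm instead of leaving them under whatever was state of the art when they were written.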
for example, since the 1940s the British and American intelligence services have always archived encrypted traffic even when, at the time, they couldn't decrypt it - it is precisely because some encrypted cables between the Russian embassy in Washington and Russia could later be decoded that several Russian spies were found among those working on the US-UK atomic program - and that is why the NSA, and probably others, keep aside all the encrypted traffic they intercept (so they can decode it whenever the standard is broken or they obtain the necessary private keys)