We are not saying that these sites are bad or that they are hackable; we are just saying that they don't receive the care they should, and that administrators should at least Google themselves now and then, just to make sure Google hasn't published information or loopholes before others find them.
It is good practice that when a technical error occurs, the website doesn't show the technical explanation (and everything that comes with it) to the visitor. You can just as well redirect the visitor to an apology page, to a 404 page, or back to the homepage.
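As a sketch of that idea: on an Apache server (an assumption; other web servers have equivalent settings) you can map error codes to a friendly page in a `.htaccess` file, and keep error details in the log instead of on the screen. The page `/oops.html` is a hypothetical apology page of your own:

```apache
# Send visitors to a friendly page instead of a technical error dump.
# /oops.html is a hypothetical apology page on your own site.
ErrorDocument 500 /oops.html
ErrorDocument 404 /oops.html

# If the site runs PHP (an assumption): hide errors from visitors,
# but still write them to the server log for the administrator.
php_flag display_errors off
php_flag log_errors on
```

This way a visitor who triggers a bug sees an excuse, while you still get the technical explanation where it belongs: in your logs.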
It is also good practice to patch and update your services and scripts. If you want to be sure, you can use Metasploit to test against your own site.
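A minimal sketch of both steps, assuming a Debian-style server (use your own distribution's package manager otherwise) and a Metasploit install; `www.example.com` stands in for your own site, and only test sites you own:

```shell
# Keep system packages current (Debian/Ubuntu example).
apt-get update && apt-get upgrade

# Sketch of a Metasploit console session against your own site:
msfconsole
msf > use auxiliary/scanner/http/http_version
msf > set RHOSTS www.example.com
msf > run
```

The `http_version` scanner is just one harmless starting point: it shows what your server announces about itself, which is exactly the kind of information Google and others can pick up.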
What is a bit amazing in this web 2.0 world is the high number of websites that still waste all that time on programs and scripts implementing guestbooks, forums, photo galleries and the like. By now there are enough free services that do all of that for you, without any risk to you. Many websites could have avoided their programming problems if they had used these services.
The websites discussed will appear in this feed: http://rss.furl.net/member/mailforlen.rss?topic=be_googled