
Protecting Yourself from Google Hackers

The following list provides some basic methods for protecting yourself from Google hackers:

  • Keep your sensitive data off the web! Even if you think you're only putting your data on a web site temporarily, there's a good chance that you'll either forget about it, or that a web crawler might find it. Consider more secure ways of sharing sensitive data, such as SSH/SCP or encrypted email.

  • Googledork! Use the techniques outlined in this article (and the full Google Hacker's Guide) to check your site for sensitive information or vulnerable files. Use gooscan from http://johnny.ihackstuff.com to scan your site for bad stuff, but first obtain express permission from Google in advance! Without that permission, Google could come after you for violating their terms of service. The author is not currently aware of the exact consequences of such a violation. But why anger the "Goo-Gods"?!

  • TIP: Check the official googledorks web site on a regular basis to keep up with the latest tricks and techniques.

  • Consider removing your site from Google's index. The Google webmasters FAQ provides invaluable information about ways to properly protect and/or expose your site to Google. From that page: "Please have the webmaster for the page in question contact us with proof that he/she is indeed the webmaster. This proof must be in the form of a root level page on the site in question, requesting removal from Google. Once we receive the URL that corresponds with this root level page, we will remove the offending page from our index." In some cases, you may want to remove individual pages or snippets from Google's index. This is also a straightforward process that can be accomplished by following the steps outlined at http://www.google.com/remove.html.

  • Use a robots.txt file. Web crawlers are supposed to follow the robots exclusion standard. This standard outlines the procedure for "politely requesting" that web crawlers ignore all or part of your web site. Note that hackers may not have any such scruples: this file is only a suggestion, not an enforcement mechanism. The major search engines' crawlers, however, do honor this file and its contents. For examples and suggestions for using a robots.txt file, see http://www.robotstxt.org.
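To see how a compliant crawler interprets exclusion rules, here is a minimal sketch using Python's standard-library `urllib.robotparser`. The paths and domain in the sample robots.txt are hypothetical, chosen purely for illustration:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt asking all crawlers ("User-agent: *") to skip
# two directories. The directory names are illustrative, not prescriptive.
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /backups/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A well-behaved crawler such as Googlebot checks these rules before fetching.
print(parser.can_fetch("Googlebot", "http://example.com/private/passwords.txt"))  # False
print(parser.can_fetch("Googlebot", "http://example.com/index.html"))             # True
```

Remember that `can_fetch` returning `False` only means a polite crawler will skip the URL; an attacker reading robots.txt sees exactly which directories you consider sensitive, so never rely on it as an access control.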

Thanks to God, my family, Seth, and the googledork community for all the support. Happy Googling! j0hnny (http://johnny.ihackstuff.com)
