
Stuart McClure's Daily Security Tips for the Week of October 21st

Take a tip from Stuart McClure, the lead author of the best-selling, critically acclaimed security book, "Hacking Exposed." This week, Stu shares some web security tips for vendors and developers.

Web Security Tip for Friday, October 25th, 2002

Brute force authentication – Every web page that requests a username and password can (eventually) be broken into. Given enough time and luck, an attacker can "guess" the username and password, and automated tools such as Brutus (http://www.hoobie.net/brutus/) make the process almost laughably simple. Make sure your usernames and passwords are difficult to guess, or you may get a visit from a hacker looking to make fun of your laziness.
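To see why weak passwords fall so quickly, here is a minimal sketch of the dictionary-style guessing a tool like Brutus automates. The login check is simulated locally for illustration (the username, passwords, and `check_login` function are all hypothetical); a real tool would issue one HTTP request per guess.

```python
def check_login(username, password):
    # Stand-in for the target site's authentication check
    # (hypothetical credentials, for demonstration only).
    return (username, password) == ("admin", "letmein")

def brute_force(username, wordlist):
    """Try each candidate password; return the first that succeeds."""
    for guess in wordlist:
        if check_login(username, guess):
            return guess
    return None

common_passwords = ["password", "123456", "admin", "letmein", "qwerty"]
found = brute_force("admin", common_passwords)
print(found)  # a weak password falls to even a tiny wordlist
```

The defense follows directly: a password that is not in any wordlist forces the attacker into a full keyspace search, which takes far longer than most attackers will bother with.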

Web Security Tip for Thursday, October 24th, 2002

Sniffing – Using a packet capture tool such as tcpdump (http://www.tcpdump.org), Snort (http://www.snort.org), or SnifferPro from NAI (http://www.sniffer.com), you can sometimes uncover sensitive information that should otherwise remain private. For example, if you request credit card information from a user, you will need to use some form of encryption to keep that sensitive information away from prying sniffer eyes. If you do not, an attacker can view the information and use it to commit identity fraud.
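A concrete illustration of the risk: HTTP Basic authentication only base64-encodes credentials, so a single sniffed request header gives them up immediately. The captured header below is a fabricated example, but the decoding step is exactly what an attacker would do with real capture output.

```python
import base64

# A fabricated header of the kind a sniffer would pull off the wire
# from an unencrypted HTTP session using Basic authentication.
captured_header = "Authorization: Basic YWxpY2U6czNjcjN0"

encoded = captured_header.split("Basic ")[1]
# Base64 is an encoding, not encryption: one call reverses it.
username, password = base64.b64decode(encoded).decode().split(":", 1)
print(username, password)  # → alice s3cr3t
```

This is why the tip says "some form of encryption": only a properly encrypted channel (such as SSL) keeps the captured bytes from being readable.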

Web Security Tip for Wednesday, October 23rd, 2002

State management – Maintaining a user's state as they traverse the many links within a typical eCommerce web site is difficult at best. The only way to tackle this task adequately is to understand attack techniques such as cookie spoofing and session ID reverse-engineering, and to learn their countermeasures: cookie validation, randomly generated session IDs, and the use of strong encryption.
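Two of those countermeasures can be sketched in a few lines: generate session IDs from a cryptographically strong random source so they cannot be predicted, and validate cookies server-side with a keyed hash so a spoofed or tampered cookie is rejected. This is a minimal illustration, not a complete session framework; all names here are hypothetical.

```python
import hashlib
import hmac
import secrets

SERVER_KEY = secrets.token_bytes(32)  # kept secret on the server

def new_session_id():
    # Cryptographically random: not guessable or enumerable,
    # unlike sequential or time-based IDs.
    return secrets.token_hex(16)

def sign_cookie(session_id):
    # Append a keyed HMAC so the server can detect tampering.
    sig = hmac.new(SERVER_KEY, session_id.encode(), hashlib.sha256).hexdigest()
    return f"{session_id}.{sig}"

def validate_cookie(cookie):
    # Recompute the signature; reject anything that does not match.
    session_id, _, sig = cookie.rpartition(".")
    expected = hmac.new(SERVER_KEY, session_id.encode(), hashlib.sha256).hexdigest()
    return session_id if hmac.compare_digest(sig, expected) else None

cookie = sign_cookie(new_session_id())
print(validate_cookie(cookie) is not None)           # genuine cookie accepted
print(validate_cookie("spoofedid." + "0" * 64))      # forged cookie rejected (None)
```

Without the signature, a user could simply edit the session ID in their cookie file and ride someone else's session.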

Web Security Tip for Tuesday, October 22nd, 2002

Source sifting – You can sift through each file saved during the crawling exercise with automated tools, or perform these steps manually. Searching for nuggets of information such as comment fields, HTML tags, and form actions is all in an attempt to understand the application's design and potential flaws. For example, simply identifying the technologies in place by viewing the URLs (.asp for Active Server Pages, or .pl for Perl) can provide a first step in understanding the potential risks in place.
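A small sketch of what automated sifting looks like: pull developer comments and form actions out of a saved page. The sample page and its contents below are fabricated for illustration; a real run would feed in each file saved by the crawl.

```python
from html.parser import HTMLParser

class Sifter(HTMLParser):
    """Collect HTML comments and form action URLs from a page."""
    def __init__(self):
        super().__init__()
        self.comments = []
        self.actions = []

    def handle_comment(self, data):
        # Developer comments often leak design details or credentials.
        self.comments.append(data.strip())

    def handle_starttag(self, tag, attrs):
        if tag == "form":
            self.actions.append(dict(attrs).get("action", ""))

page = """
<!-- TODO: remove debug login before launch -->
<form action="/cgi-bin/login.pl" method="post">
  <input name="user"><input name="pass" type="password">
</form>
"""

sifter = Sifter()
sifter.feed(page)
print(sifter.comments)  # leftover developer comments
print(sifter.actions)   # a .pl action hints the backend is Perl
```

Even this tiny example surfaces two of the "nuggets" mentioned above: a comment a developer forgot to strip, and a URL extension revealing the server-side technology.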

Web Security Tip for Monday, October 21st, 2002

Crawling – The first step in reviewing your web applications is to crawl the targeted web site. By crawling I mean using an automated tool that requests a page from each and every link on your website. Using Windows-based tools like Teleport Pro (http://www.tenmax.com/teleport/pro/home.htm) and Black Widow (www.softbytelabs.com) or UNIX-based tools like wget (www.gnu.org), you can obtain a local copy of every web page and perform extensive analysis of its design, intent, and technology usage. Of course, if your site is filled with links to alternate or external websites, you will want to restrict the crawl to the targeted website only; most crawling tools can restrict themselves in this manner.
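The same-site restriction those tools apply comes down to a host check on each extracted link, which can be sketched as follows. The HTML snippet and host names are fabricated examples; a full crawler would fetch each surviving URL and repeat.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def same_site_links(base_url, html):
    # Resolve relative links against the base URL, then keep only
    # those whose host matches the targeted site.
    parser = LinkExtractor()
    parser.feed(html)
    host = urlparse(base_url).netloc
    absolute = (urljoin(base_url, href) for href in parser.links)
    return [url for url in absolute if urlparse(url).netloc == host]

page = ('<a href="/products.asp">Products</a> '
        '<a href="http://partner.example.net/">Partner</a>')
print(same_site_links("http://www.example.com/", page))
# only the example.com link survives; the external partner link is skipped
```

This is the behavior to look for in whatever crawler you choose: without it, a single outbound link can send the crawl wandering across half the Internet.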

Check back here every weekday for another security tip!
