The security auditor should pay careful attention to the way in which software is developed for use within the systems being audited. Many security-related issues are in some part due to "bugs" introduced during the programming stage.
The design, development, code review, testing, and source code control/change-management processes should be carefully inspected to ensure that they are engineered to reduce the likelihood of bugs being introduced into the system. Chapter 15 ("Secure CGI/API Programming") of Web Security and Commerce (O'Reilly & Associates, 2001), by Simson Garfinkel and Gene Spafford, offers excellent, specific programming guidelines covering the different programming languages. Also check out the IBM Web site.
One of the most common coding mistakes on a Web site is to accept file uploads without sufficiently restricting the file name (for example, to .doc, .gif, .jpg, or .txt files), which allows a hacker to upload .exe files or other malicious content to the site. Programmers should be aware of these kinds of implications.
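The upload restriction described above can be sketched as a simple extension allowlist check. This is a minimal illustration, not a complete defense; the function name and the allowed extensions are assumptions taken from the example in the text.

```python
import os

# Hypothetical allowlist of permitted upload extensions, per the text.
ALLOWED_EXTENSIONS = {".doc", ".gif", ".jpg", ".txt"}

def is_allowed_upload(filename: str) -> bool:
    """Reject any upload whose extension is not on the allowlist."""
    # Use only the base name so path tricks like "../evil.exe" are neutralized.
    base = os.path.basename(filename)
    _, ext = os.path.splitext(base)
    return ext.lower() in ALLOWED_EXTENSIONS

print(is_allowed_upload("report.doc"))   # True
print(is_allowed_upload("payload.exe"))  # False
print(is_allowed_upload("photo.JPG"))    # True
```

Note that an allowlist checks only the final extension; in practice the server should also verify content type and store uploads outside any executable path.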
The main points to bear in mind are these:
Are good developers, DBAs, and network administrators employed who are aware of the risk of security holes?
Do these employees keep up-to-date with the latest issues via training courses, Web sites, newsgroups, and mailing lists?
Are good coding standards in place, with common routines developed for error handling, session timeouts, and exceptional conditions?
Are designs and code reviewed appropriately for security flaws before code is released for testing?
Do test plans include testing for security issues?
Is source code managed effectively so that bugs are not reintroduced by deploying the wrong code?
Does the code log all required activity (the IP address from which a user logged on, failed login attempts, database transaction records)?
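The "common routines" question above can be made concrete with a shared session-timeout check that every page handler calls. This is a minimal sketch; the function name and the 15-minute limit are assumptions, not values from the text.

```python
import time
from typing import Optional

# Hypothetical site-wide policy: sessions idle longer than this are expired.
SESSION_TIMEOUT_SECONDS = 15 * 60

def session_expired(last_activity: float, now: Optional[float] = None) -> bool:
    """Shared routine: True if the session has been idle past the timeout."""
    if now is None:
        now = time.time()
    return (now - last_activity) > SESSION_TIMEOUT_SECONDS

# A handler would check this before serving any authenticated request.
print(session_expired(last_activity=0.0, now=100.0))   # False
print(session_expired(last_activity=0.0, now=1000.0))  # True
```

Centralizing the check in one routine, rather than repeating the arithmetic in every handler, is exactly the kind of common error-handling code the question asks about.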
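The logging question above can be sketched as a single audit routine that records the source IP of each logon and flags failed attempts. The logger name and message format here are hypothetical; they only illustrate the kind of record the auditor should expect to find.

```python
import logging

# Hypothetical audit logger; a real system would write to protected storage.
audit_log = logging.getLogger("audit")
logging.basicConfig(format="%(asctime)s %(levelname)s %(message)s",
                    level=logging.INFO)

def record_login(username: str, source_ip: str, success: bool) -> str:
    """Build one audit line, log it, and return it for inspection."""
    status = "ok" if success else "FAILED"
    line = f"login {status} user={username} ip={source_ip}"
    # Failed attempts are logged at a higher severity so they stand out.
    audit_log.log(logging.INFO if success else logging.WARNING, line)
    return line

record_login("alice", "203.0.113.7", success=True)
record_login("mallory", "198.51.100.9", success=False)
```

Database transaction records would follow the same pattern: one shared routine, called from every code path, so no activity escapes the audit trail.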