Media and Boot Security
Other seldom-addressed issues are boot media and drive accessibility. In settings that expose your machines to public use or access, such as in a university computer lab, you should disable floppy or CD-ROM boot access. Typically, you do this through the system BIOS settings.
In older systems, this isn't an issue. In fact, only in recent years have PC-based CD-ROM drive manufacturers incorporated exotic boot options. (Workstation-based SCSI systems have been bootable for much longer.) Likewise, only recently did the majority of BIOS chips begin supporting user-defined boot options.
The reason for disabling boot options is this: If you don't, anyone walking by can insert boot or installation media and overwrite your drive, install software, or copy or read files on unprotected, non-NTFS, or poorly controlled Unix partitions. (Note also that a properly configured boot disk can, under the right conditions, bypass some or all of your security measures.)
How you disable these boot options varies. In some cases, the BIOS supports an implicit restriction, offering a Disable Floppy Boot or a Disable CD-ROM Boot feature, or both. In other cases, you must force a prohibition by specifying a particular boot sequence.
The term boot sequence refers to what drives the system should search to find the bootable partition. Today, it's common for BIOS chips to offer widely diverse boot sequence options, such as A, C, IDE01, IDE02, CDROM, OTHER, ALL, and so on. Many offer preset combinations, such as the following:
A, C, CDROM
C
C, A
CDROM, C, A
IDE01, IDE02, CDROM, C
In situations where your BIOS does not offer an implicit restriction, choose C only (if that option is available). This forces the system to boot exclusively from the C drive. (In cases where the preset combinations permit you to exclude the CD-ROM, but force drive A in their sequences, toggle the Disable Floppy Seek on Boot option.)
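Conceptually, the firmware simply walks the configured boot sequence and hands control to the first device that reports a bootable partition. The following Python sketch illustrates that search; the device names and probe results are invented for illustration and do not represent any real BIOS interface:

```python
# Illustrative sketch of how firmware walks a configured boot sequence.
# Device names and the "bootable_devices" probe results are invented.

def find_boot_device(boot_sequence, bootable_devices):
    """Return the first device in the configured sequence that reports
    a bootable partition, or None if the search is exhausted."""
    for device in boot_sequence:
        if device in bootable_devices:
            return device
    return None

# A locked-down lab machine: boot only from the fixed disk.
print(find_boot_device(["C"], {"A", "C"}))                # C
# An attacker's floppy is ignored because A is not in the sequence.
print(find_boot_device(["C"], {"A"}))                     # None
# A permissive default sequence hands control to the floppy first.
print(find_boot_device(["A", "C", "CDROM"], {"A", "C"}))  # A
```

The second call shows why the C-only sequence matters: even with bootable media inserted, a drive that never appears in the sequence is never consulted.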
If you're using SCSI drives, however, disabling boot features is more complicated. Here, you must review your SCSI adapter's documentation. Only in rare cases can you control SCSI device boot control from the system BIOS. (Exceptions include situations where your SCSI is on-board, as in ASUS boards that have two, and sometimes four, SCSI connectors permanently installed on the motherboard.)
Most SCSI adapters have their own BIOS, which permits you to set which drives are bootable. If you establish such settings, ensure that you either set the SCSI adapter's administrative password (if it has one), or otherwise set your BIOS password. Standalone SCSI adapters kick in after the BIOS finishes its hardware diagnostic routines.
Biometric Identification: A Historical Perspective
Biometric identification is a new field, but its roots reach back to ancient Egypt, where pharaohs "signed" decrees with their thumbprints. In more recent times, Sir Francis Galton significantly advanced biometric identification when, in 1893, he demonstrated that no two humans' fingerprints were alike, even in cases of identical twins.
Sir Edward Henry exploited this when he developed the Henry System of fingerprint analysis, which, though waning, is still in use today. Henry's system classified our fingertip ridges into loops of varying dimension. By analyzing these and establishing eight to sixteen points of comparison between samples, cops could positively identify criminals.
NOTE
Fingerprint analysis is lauded as infallible, and in most cases it is, provided the subject has fingerprints. Not everyone does. Some skin diseases distort fingerprints or deny them altogether. One example is epidermolysis, an inherited condition that mostly attacks unborn children. Epidermolysis victims sometimes have partial fingerprints, and sometimes none at all.
Until the mid-20th century, fingerprinting technology was surprisingly primitive. Obtaining and analyzing prints involved direct physical hand-to-ink impressions. Armed with these prints, which were stored on paper cards, criminologists made visual comparisons against samples from crime scenes.
More advanced technology has since surfaced. Today, the FBI stores 200 million fingerprints (29 million of which are unique) using the Fingerprint Image Compression Standard (FICS). FICS provides digital, space-efficient storage, and reduces terabytes of data to a fraction of their original size. And, as you might expect, computers now do most of the matching digitally.
Contemporary digital fingerprinting technology is now inexpensive enough that vendors can incorporate it into PCs. Compaq, Sony, and many other manufacturers now offer fingerprint ID systems for PC models, and this trend is growing. Such systems capture your prints with a camera and use the resulting image to verify your identity.
Fingerprints are merely the beginning, though. In recent years, scientists have used several unique biological characteristics to reliably identify users, and of these, retinal patterns offer high assurance levels.
The retina is a thin layer of light-sensitive tissue at the back of your eye that converts light to electrical signals and transmits them to your brain. Retinal scanners focus on two layers. The outer layer contains reflective, photoreceptive structures called rods and cones that process light. Beneath these, in the choroid layer, lies a complicated system of blood vessels.
Retinal scans bombard your eye with infrared light, causing the cones and rods to reflect this light. The resulting reflection in turn reveals an imprint of your retina's blood vessel patterns. These patterns, and in some cases, their digital or cryptographic values, constitute your retinal "fingerprint."
Experts report that retinal scans are largely superior to fingerprints for identification purposes. Retinal patterns, for example, offer more points for matching than fingerprints do (anywhere from 700 to 4,200). For this reason, experts class retinal scanners as high biometrics, or biometric systems with exceptionally high degrees of assurance.
Indeed, only in rare cases are retinal scans insufficient, such as where users are blind, partially blind, or have cataracts. If anything, retinal scanners are too sensitive. They will sometimes bear disproportionately high false negative or rejection rates. That is, almost no chance exists that a retinal scanner will authenticate an unauthorized user, but it might reject a legitimate one.
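The trade-off described above is easy to see in miniature. The following Python sketch is an invented toy, not any real biometric algorithm: it counts agreeing feature points against a threshold, and shows how raising the threshold drives false acceptances toward zero while making false rejections of legitimate users more likely:

```python
# Toy sketch of threshold-based biometric matching. The feature-point
# sets and thresholds below are invented for illustration only.

def matches(sample_points, stored_points, required_points):
    """Authenticate only if enough points agree between the live
    sample and the enrolled template."""
    agreeing = len(set(sample_points) & set(stored_points))
    return agreeing >= required_points

enrolled = set(range(700))          # e.g., hundreds of retinal match points
good_scan = set(range(650))         # legitimate user, slightly noisy capture
impostor = set(range(5000, 5040))   # unrelated pattern, no overlap

# A strict threshold keeps the impostor out...
print(matches(impostor, enrolled, 600))   # False
# ...but can also falsely reject a legitimate user (a false negative):
print(matches(good_scan, enrolled, 660))  # False
# Lowering the threshold trades false rejections for false acceptances.
print(matches(good_scan, enrolled, 600))  # True
```

In this toy model, the legitimate scan agrees on 650 of 700 points, so a threshold of 660 rejects it, which is the "too sensitive" behavior the text describes.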
More recent technology focuses on voice patterns, but such systems can be unreliable. Instances can arise where voice recognition fails because the user has bronchitis, a cold, laryngitis, and so forth.
Using Biometric Access Control Devices
There are pros and cons to biometric access control. On the one hand, such controls offer extreme assurance. On the other, practical obstacles exist to instituting a wholly biometric approach.
First, when expanding biometric controls beyond the scope of your own workstation, you face privacy issues. For example, suppose you decide to institute biometric access controls enterprise-wide. Even if your employees sign a release, they could later sue for invasion of privacy, and perhaps prevail.
NOTE
Privacy concerns are more real than imagined. Experts say that retinal scans can detect drug abuse, hereditary disease, and even AIDS. Maintaining a retinal pattern database could therefore expose you to litigation. Fingerprints can reveal criminal convictions, too, which also constitute sensitive data. For a closer look at these techniques and their implications, check out A Primer on Biometric Technology, a PDF file located at http://www.rand.org/publications/MR/MR1237/MR1237.ch2.pdf.
Biometric access controls also have social implications. Even if your employees don't voice it, they might resent such controls, and see them as a privacy violation. This could cultivate a hostile work environment, even if not overtly.
Perhaps the strangest drawback of biometric access controls, though, is their sheer effectiveness, an issue to consider before deploying them. Most biometric systems perform at least simple logging, and thus create an incontrovertible record of who did what and when. In lawsuits or criminal actions, your opponents could use your biometric system's records against you, because the logs deprive your personnel of plausible deniability.
Finally, biometric access controls are impractical in environments that extend beyond your local network. You can't, for example, force remote users to use biometric devices, nor do all remote systems offer biometric support.
These issues aside, biometric access controls are excellent when used in-house, in close quarters among trusted co-workers. I recommend using them in your inner office on machines used to control and administrate your network.
To learn more about biometric identification, check out these sites:
Biometrics: Promising Frontiers for Emerging Identification Markets; MSU-CSE-00-2; Anil K. Jain and Lin Hong and Sharath Pankanti; February 2000. http://www.cse.msu.edu/publications/tech/TR/MSU-CSE-00-2.ps.gz. (PostScript and gzipped)
Bio1 (http://www.bio1.com): A resource for papers, statistics, standards, and studies.
A View From Europe (http://www.dss.state.ct.us/digital/news11/bhsug11.htm): An interview with Simon Davies that focuses on biometric privacy issues.
Fight the Fingerprint (http://www.networkusa.org/fingerprint.shtml): A group that sees a biometric future (and doesn't like it). As its opening page explains: "We Stand Firmly Opposed to All Government Sanctioned Biometrics and Social Security Number Identification Schemes!"
The BioAPI Consortium (http://www.bioapi.org/): A group established to help developers integrate biometric identification into existing standards and APIs.
The Biometric Consortium (http://www.biometrics.org/): "...the US Government's focal point for research, development, test, evaluation, and application of biometric-based personal identification/verification technology..."