Internationally recognized computer security experts offer advice and tools for achieving security through computer forensic analysis.
- Authors are renowned for developing notorious computer-security software such as SATAN (Security Administrator's Tool for Analyzing Networks) and the Coroner's Toolkit.
- Covers looking at systems in a new way--one that affords the opportunity to understand how systems can be compromised.
- Only titles approved by Brian Kernighan are allowed into the Professional Computing Series.
"Don't look now, but your fingerprints are all over the cover of this book. Simply picking it up off the shelf to read the cover has left a trail of evidence that you were here.
"If you think book covers are bad, computers are worse. Every time you use a computer, you leave elephant-sized tracks all over it. As Dan and Wietse show, even people trying to be sneaky leave evidence all over, sometimes in surprising places.
"This book is about computer archeology. It's about finding out what might have been based on what is left behind. So pick up a tool and dig in. There's plenty to learn from these masters of computer security."
--Gary McGraw, Ph.D., CTO, Cigital, coauthor of Exploiting Software and Building Secure Software
"A wonderful book. Beyond its obvious uses, it also teaches a great deal about operating system internals."
--Steve Bellovin, coauthor of Firewalls and Internet Security, Second Edition, and Columbia University professor
"A must-have reference book for anyone doing computer forensics. Dan and Wietse have done an excellent job of taking the guesswork out of a difficult topic."
--Brad Powell, chief security architect, Sun Microsystems, Inc.
"Farmer and Venema provide the essential guide to 'fossil' data. Not only do they clearly describe what you can find during a forensic investigation, they also provide research found nowhere else about how long data remains on disk and in memory. If you ever expect to look at an exploited system, I highly recommend reading this book."
--Rik Farrow, Consultant, author of Internet Security for Home and Office
"Farmer and Venema do for digital archaeology what Indiana Jones did for historical archaeology. Forensic Discovery unearths hidden treasures in enlightening and entertaining ways, showing how a time-centric approach to computer forensics reveals even the cleverest intruder."
--Richard Bejtlich, technical director, ManTech CFIA, and author of The Tao of Network Security Monitoring
"Farmer and Venema are 'hackers' of the old school: They delight in understanding computers at every level and finding new ways to apply existing information and tools to the solution of complex problems."
--Muffy Barkocy, Senior Web Developer, Shopping.com
"This book presents digital forensics from a unique perspective because it examines the systems that create digital evidence in addition to the techniques used to find it. I would recommend this book to anyone interested in learning more about digital evidence from UNIX systems."
--Brian Carrier, digital forensics researcher, and author of File System Forensic Analysis
Computer forensics--the art and science of gathering and analyzing digital evidence, reconstructing data and attacks, and tracking perpetrators--is becoming ever more important as IT and law enforcement professionals face an epidemic in computer crime. In Forensic Discovery, two internationally recognized experts present a thorough and realistic guide to the subject.
Dan Farmer and Wietse Venema cover both theory and hands-on practice, introducing a powerful approach that can often recover evidence considered lost forever.
The authors draw on their extensive firsthand experience to cover everything from file systems, to memory and kernel hacks, to malware. They expose a wide variety of computer forensics myths that often stand in the way of success. Readers will find extensive examples from Solaris, FreeBSD, Linux, and Microsoft Windows, as well as practical guidance for writing one's own forensic tools. The authors are singularly well-qualified to write this book: They personally created some of the most popular security tools ever written, from the legendary SATAN network scanner to the powerful Coroner's Toolkit for analyzing UNIX break-ins.
After reading this book you will be able to
The book's companion Web site contains complete source and binary code for open source software discussed in the book, plus additional computer forensics case studies and resource links.
About the Authors.
I. BASIC CONCEPTS.
1. The Spirit of Forensic Discovery.
Unusual Activity Stands Out.
The Order of Volatility (OOV).
Layers and Illusions.
The Trustworthiness of Information.
The Fossilization of Deleted Information.
Archaeology vs. Geology.
2. Time Machines.
The First Signs of Trouble.
What's Up, MAC? An Introduction to MACtimes.
Limitations of MACtimes.
Argus: Shedding Additional Light on the Situation.
Panning for Gold: Looking for Time in Unusual Places.
DNS and Time.
Journaling File Systems and MACtimes.
The Foibles of Time.
II. EXPLORING SYSTEM ABSTRACTIONS.
3. File System Basics.
An Alphabet Soup of File Systems.
UNIX File Organization.
UNIX File Names.
UNIX File Types.
A First Look Under the Hood: File System Internals.
UNIX File System Layout.
I've Got You Under My Skin: Delving into the File System.
The Twilight Zone, or Dangers Below the File System Interface.
4. File System Analysis.
Preparing the Victim's File System for Analysis.
Capturing the Victim's File System Information.
Sending a Disk Image Across the Network.
Mounting Disk Images on an Analysis Machine.
Existing File MACtimes.
Detailed Analysis of Existing Files.
Wrapping Up the Existing File Analysis.
Intermezzo: What Happens When a File Is Deleted?
Deleted File MACtimes.
Detailed Analysis of Deleted Files.
Exposing Out-of-Place Files by Their Inode Number.
Tracing a Deleted File Back to Its Original Location.
Tracing a Deleted File Back by Its Inode Number.
Another Lost Son Comes Back Home.
Loss of Innocence.
5. Systems and Subversion.
The Standard Computer System Architecture.
The UNIX System Life Cycle, from Start-up to Shutdown.
Case Study: System Start-up Complexity.
Kernel Configuration Mechanisms.
Protecting Forensic Information with Kernel Security Levels.
Typical Process and System Status Tools.
How Process and System Status Tools Work.
Limitations of Process and System Status Tools.
Subversion with Rootkit Software.
Command-Level Evasion and Detection.
Kernel Rootkit Installation.
Kernel Rootkit Operation.
Kernel Rootkit Detection and Evasion.
6. Malware Analysis Basics.
The Dangers of Dynamic Program Analysis.
Program Confinement with Hard Virtual Machines.
Program Confinement with Soft Virtual Machines.
The Dangers of Confinement with Soft Virtual Machines.
Program Confinement with Jails and chroot().
Dynamic Analysis with System-Call Monitors.
Program Confinement with System-Call Censors.
Program Confinement with System-Call Spoofing.
The Dangers of Confinement with System Calls.
Dynamic Analysis with Library-Call Monitors.
Program Confinement with Library Calls.
The Dangers of Confinement with Library Calls.
Dynamic Analysis at the Machine-Instruction Level.
Static Analysis and Reverse Engineering.
Small Programs Can Have Many Problems.
Malware Analysis Countermeasures.
III. BEYOND THE ABSTRACTIONS.
7. The Persistence of Deleted File Information.
Examples of Deleted Information Persistence.
Measuring the Persistence of Deleted File Contents.
Measuring the Persistence of Deleted File MACtimes.
The Brute-Force Persistence of Deleted File MACtimes.
The Long-Term Persistence of Deleted File MACtimes.
The Impact of User Activity on Deleted File MACtimes.
The Trustworthiness of Deleted File Information.
Why Deleted File Information Can Survive Intact.
8. Beyond Processes.
The Basics of Virtual Memory.
The Basics of Memory Pages.
Files and Memory Pages.
Anonymous Memory Pages.
The savecore Command.
Static Analysis: Recognizing Memory from Files.
Recovering Encrypted File Contents Without Keys.
File System Blocks vs. Memory Page Technique.
Recognizing Files in Memory.
Dynamic Analysis: The Persistence of Data in Memory.
File Persistence in Memory.
The Persistence of Nonfile, or Anonymous, Data.
The Persistence of Memory Through the Boot Process.
The Trustworthiness and Tenacity of Memory Data.
Appendix A. The Coroner's Toolkit and Related Software.
Data Gathering with grave-robber.
Time Analysis with mactime.
File Reconstruction with lazarus.
Low-Level File System Utilities.
Low-Level Memory Utilities.
Appendix B. Data Gathering and the Order of Volatility.
The Basics of Volatility.
The State of the Art.
How to Freeze a Computer.
There was a time when a computer could tick away year after year without coming under attack. Today, only minutes pass between plugging in to the Internet and being attacked by some other machine--and that's only the background noise level of nontargeted attacks. For examples of Internet background radiation studies, see CAIDA 2003, Cymru 2004, or IMS 2004.
With this book, we summarize experiences in post-mortem intrusion analysis that we accumulated over a decade. During this period, the Internet grew explosively, from less than a hundred thousand connected hosts to more than a hundred million (ISC 2004). This increase in the number of connected hosts led to an even more dramatic--if less surprising--increase in the frequency of computer and network intrusions. As the network changed character and scope, so did the character and scope of the intrusions that we faced. We're pleased to share some of these learning opportunities with our readers.
In that same decade, however, little changed in the way that computer systems handle information. In fact, we feel that it is safe to claim that computer systems haven't changed fundamentally in the last 35 years--the entire lifetime of the Internet and of many operating systems that are in use today, including Linux, Windows, and many others. Although our observations are derived from today's systems, we optimistically expect that at least some of our insights will remain valid for another decade.
The premise of the book is that forensic information can be found everywhere you look. With this guiding principle in mind, we develop tools to collect information from obvious and not-so-obvious sources, we walk through analyses of real intrusions in detail, and we discuss the limitations of our approach.
Although we illustrate our approach with particular forensic tools in specific system environments, we do not provide cookbooks for how to use those tools, nor do we offer checklists for step-by-step investigation. Instead, we present a background on how information persists, how information about past events may be recovered, and how the trustworthiness of that information may be affected by deliberate or accidental processes.
In our case studies and examples, we deviate from traditional computer forensics and head toward the study of system dynamics. Volatility and the persistence of file systems and memory are pervasive topics in our book. And while the majority of our examples are from Solaris, FreeBSD, and Linux systems, Microsoft's Windows shows up on occasion as well. Our emphasis is on the underlying principles that these systems have in common: we look for inherent properties of computer systems, rather than accidental differences or superficial features.
Our global themes are problem solving, analysis, and discovery, with a focus on reconstruction of past events. This approach may help you to discover why events transpired, but that is generally outside the scope of this work. Knowing what happened will leave you better prepared the next time something bad is about to occur, even when that knowledge is not sufficient to prevent future problems. We should note up front, however, that we do not cover the detection or prevention of intrusions. We do show that traces from one intrusion can lead to the discovery of other intrusions, and we point out how forensic information may be affected by system-protection mechanisms, and by the failures of those mechanisms.
We wrote this book for readers who want to deepen their understanding of how computer systems work, as well as for those who are likely to become involved with the technical aspects of computer intrusion or system analysis. System administrators, incident responders, other computer security professionals, and forensic analysts will benefit from reading this book, but so will anyone who is concerned about the impact of computer forensics on privacy.
Although we have worked hard to make the material accessible to nonexpert readers, we definitely do not target the novice computer user. As a minimal requirement, we assume strong familiarity with the basic concepts of UNIX or Windows file systems, networking, and processes.
The book has three parts: we present foundations first, proceed with analysis of processes, systems, and files, and end the book with discovery. We do not expect you to read everything in the order presented. Nevertheless, we suggest that you start with the first chapter, as it introduces all the major themes that return throughout the book.
In Part I, "Basic Concepts," we introduce general high-level ideas, as well as basic techniques that we rely on in later chapters.
In Part II, "Exploring System Abstractions," we delve into the abstractions of file systems, processes, and operating systems. The focus of these chapters is on analysis: making sense of information found on a computer system and judging the trustworthiness of our findings.
In Part III, "Beyond the Abstractions," we look beyond the constraints of the file, process, and operating system abstractions. The focus of this part is on discovery, as we study the effects of system architecture on the decay of information.
The appendices present background material: Appendix A is an introduction to the Coroner's Toolkit and related software. Appendix B presents our current insights with respect to the order of volatility and its ramifications when capturing forensic information from a computer system.
In the examples, we use constant-width font for program code, command names, and command input/output. User input is shown in bold constant-width font. We use $ as the shell command prompt for unprivileged users, and we reserve # for super-user shells. Capitalized names, such as Argus, are used when we write about a system instead of individual commands.
Whenever we write "UNIX," we implicitly refer to Solaris, FreeBSD, and Linux. In some examples we include the operating system name in the command prompt. For example, we use solaris$ to indicate that an example is specific to Solaris systems.
As hinted at earlier, many examples in this book are taken from real-life intrusions. To protect privacy, we anonymize information about systems that are not our own. For example, we replace real network addresses with private network addresses such as 10.0.0.1 or 192.168.0.1, and we replace host names or user names. Where appropriate, we even replace the time and time zone.
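The consistent-substitution approach described above can be sketched in a few lines. This is our own illustration, not a tool from the book: the function name `anonymize_addresses` and the policy of drawing replacements from the 10.0.0.0/8 private range in first-seen order are assumptions made for the example.

```python
import re

# Pool of RFC 1918 private addresses to substitute for real ones.
# (Assumed policy: assign them in the order real addresses are first seen.)
PRIVATE_POOL = ["10.0.0.%d" % n for n in range(1, 255)]

def anonymize_addresses(text, mapping=None):
    """Replace each distinct IPv4 address in text with a private address.

    The same real address always maps to the same replacement, so
    cross-references within a log remain consistent after anonymization.
    """
    if mapping is None:
        mapping = {}
    ipv4 = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

    def repl(match):
        addr = match.group(0)
        if addr not in mapping:
            mapping[addr] = PRIVATE_POOL[len(mapping) % len(PRIVATE_POOL)]
        return mapping[addr]

    return ipv4.sub(repl, text)
```

For example, `anonymize_addresses("from 203.0.113.9 to 198.51.100.2 and 203.0.113.9")` yields `"from 10.0.0.1 to 10.0.0.2 and 10.0.0.1"`. Keeping the mapping stable matters for forensic write-ups: readers can still follow one (anonymized) host across an entire trace.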
The examples in this book feature several small programs that were written for the purpose of discovery and analysis. Often we were unable to include the entire code listing because the additional detail would only detract from the purpose of the book. The complete source code for these and other programs is made available online at these Web sites: http://www.fish.com/forensics/
On the same Web sites, you will also find bonus material, such as case studies that were not included in the book and pointers to other resources.