This chapter is from the book

3.5 Anonymity

Anonymity is concerned with protecting the identity of agents with respect to particular events or messages. In this case, unlike secrecy, the messages themselves need not be protected, only their association with particular agents. Hence it is natural to model events in the system as consisting of two components: the identity of the agent performing that event, and the content itself. For anonymity, we consider events of the form a.x, where a is the identity of the agent and x is the content of the event.

An anonymity protocol will be designed to achieve some kind of transaction or exchange of messages without revealing the identity of some or all of the participants. This means not only that the names of the participants should not appear directly, but also that their identities should not be deducible from the information that is available. The threat is often considered to be more passive than for secrecy and authentication: agents are not actively working to subvert the protocol, but simply observe the occurrence of events and make deductions about the participants. Thus an anonymity protocol will generally run over a fairly reliable medium. However, such protocols can also be analyzed in the presence of stronger threats or a more unreliable medium (one that might misdeliver messages, for example) to see whether their anonymity properties still hold.

The principle of anonymity is that a data item that could have originated from one agent could equally have originated from any other (perhaps any other from some given set of users). Hence we wish our definition to capture the notion that any message of the form i.x could equally well have been of the form j.x. If Anonusers is the set of all users whose identities should be masked by the system in providing anonymity, then the set of messages we wish to confuse for a given piece of information x is given by the set A:

A = {a.x | a ∈ Anonusers}
Rather than deal directly with the identity of users, we can capture anonymity by requiring that whenever any event from the set A occurs, it could equally well have been any other event. In terms of agent identity and content, this means that if an observer has access only to the content of the message then it is not possible to deduce the identity of the agent associated with it.

A protocol described as a CSP system P will provide anonymity on the set A if any arbitrary permutation pA of the events in A, applied pointwise to all of the traces of the system, does not alter the set of possible traces of the system. This means that the other events from A that can appear in the trace are independent of the identity of the agent performing events from A. If A is of the form above (a set of possible users associated with a piece of information) then permutations will correspond to permutations on agents.
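The permutation condition can be illustrated with a minimal executable sketch (Python rather than CSP; the two-agent system and the event encoding are illustrative assumptions, not part of the original model): a system provides anonymity on A exactly when applying any permutation of A pointwise to its traces leaves the trace set unchanged.

```python
from itertools import permutations

# Events in A are (agent, content) pairs; other events are plain strings.
# The agent names below are assumptions for illustration.
A_agents = ["alice", "bob"]

def rename(event, perm):
    """Apply an agent permutation pointwise to a single event."""
    if isinstance(event, tuple) and event[0] in perm:
        return (perm[event[0]], event[1])
    return event

def provides_anonymity(traces, agents):
    """A system provides anonymity on A iff every permutation of the
    agents, applied pointwise to all traces, leaves the trace set unchanged."""
    trace_set = {tuple(t) for t in traces}
    for p in permutations(agents):
        perm = dict(zip(agents, p))
        permuted = {tuple(rename(e, perm) for e in t) for t in trace_set}
        if permuted != trace_set:
            return False
    return True

# Symmetric system: either agent may emit x; the observer sees only "out".
symmetric = [[("alice", "x"), "out"], [("bob", "x"), "out"]]
# Asymmetric system: only alice ever emits x, so swapping agents changes the traces.
asymmetric = [[("alice", "x"), "out"]]
```

The symmetric system satisfies the check and the asymmetric one fails it, matching the intuition that a message i.x must be replaceable by j.x without changing observable behaviour.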

Anonymity is often relative to particular observers or particular viewpoints. In other words, anonymity is provided in cases where an observer has access only to certain kinds of information, and might not be provided in cases where more information is available. For example, a donation to a charity or a political party would be anonymous if the only information available is details of the amounts of money passing through particular accounts, but might not be anonymous if all details of particular transactions are available.

In general, an observer does not have complete access to all the events occurring in a system, but has only limited or no direct access to some events. The events that an observer has access to could be captured as another set B.

It is an immediate requirement for anonymity that A ∩ B = ∅. If an observer has direct access to the very events that we wish to mask, then it will always be possible to tell some events in A (in particular, those also in B) from some others.

The events that are not in B are those events that the observer does not have direct access to. From the point of view of modelling the system in order to analyze for anonymity, the events other than A and B should be abstracted, since the system to be analyzed for anonymity should encapsulate the information available to the observer.

If C is the set of events that are to be abstracted from P, then the system to be analyzed is AbsC(P), where AbsC is an abstraction mechanism such as hiding, masking, or renaming. This situation is pictured in Figure 3.9. In each case the requirement will be to check, for any arbitrary permutation pA on A, lifted to apply to events and thus to traces, that

pA(AbsC(P)) = AbsC(P)
Figure 3.9. Analyzing P for anonymity

The dining cryptographers

We will consider the toy example of the dining cryptographers protocol. This is concerned with a situation in which three cryptographers share a meal. At the end of the meal, each of them is secretly informed by their organization whether or not she is paying. At most one of them is paying; if none of them is, then the organization is itself picking up the bill.

The cryptographers would like to know whether it is one of them who is paying, or whether it is their organization that is paying; but they also wish to retain anonymity concerning the identity of the payer if it is one of them. They will use the following protocol to achieve this.

They each toss a coin, which is made visible to them and their right-hand neighbour. Each cryptographer then examines the two coins that they can see. There are two possible announcements that can be made by each cryptographer: that the coins agree, or that they disagree. If a cryptographer is not paying then she will say that they agree if the results on the coins are the same, and that they disagree if the results differ; a paying cryptographer will say the opposite.

If the number of ‘disagree’ announcements is even, then the organization is paying. If the number is odd, then one of the cryptographers is paying. The two cryptographers not paying will not be able to identify the payer from the information they have available.
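The announcement rule and the parity property can be checked with a short simulation (a Python sketch, not the CSP model; the coin layout, in which cryptographer i compares coins i and (i+1) mod 3, is an assumed encoding):

```python
from itertools import product

def announcements(payer, coins):
    """Public announcements for one run of the protocol.
    payer is 0, 1, 2, or None (the organization pays); coins is a
    3-tuple of 'heads'/'tails'. Cryptographer i compares coins[i]
    with coins[(i + 1) % 3]; a paying cryptographer says the opposite."""
    result = []
    for i in range(3):
        same = coins[i] == coins[(i + 1) % 3]
        if payer == i:
            same = not same          # the payer lies about what she sees
        result.append("agree" if same else "disagree")
    return result

def organization_paid(anns):
    """An even number of 'disagree' announcements means the organization paid."""
    return anns.count("disagree") % 2 == 0

# Exhaustive check over all eight coin tosses and all four possible payers.
for coins in product(["heads", "tails"], repeat=3):
    assert organization_paid(announcements(None, coins))
    for i in range(3):
        assert not organization_paid(announcements(i, coins))
```

Around the cycle the coin comparisons cancel in pairs, so with no payer the number of disagreements is always even; a single lying payer flips exactly one announcement and makes it odd.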

The protocol is modelled in CSP as the parallel combination of cryptographers and coins, and a master process dictating who pays, as illustrated in Figure 3.10. The events of the form pays.i and notpays.i are the instructions from the organization concerning payment. Events of the form look.i.j.x model cryptographer i reading value x from coin j. The channels out.i are used for the cryptographers to make their declaration.

Figure 3.10. Components of the protocol

The Master process nondeterministically chooses either to pay, or one of the cryptographers to pay.


Each cryptographer process follows the protocol. This is described in CSP as follows:


Each coin is modelled as a choice between reading heads and reading tails:


The master either sends a pay message to one of the cryptographers and a do-not-pay message to the other two, or a do-not-pay message to all of them.

The system is constructed from the cryptographers and coins, which are two collections of independent processes.


They must synchronize on the events representing the examination of coins, and with the Master process, which decides who is paying.


It is also possible to provide a parametric description of the system for an arbitrary number n of cryptographers; but automatic verification via model-checking will be possible only once a particular n is chosen.
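The parametric version can likewise be sketched executably (Python rather than CSP; the ring layout, in which cryptographer i compares coins i and (i+1) mod n, is an assumption). As with model-checking, the exhaustive check below fixes a particular n:

```python
from itertools import product

def announcements(n, payer, coins):
    """Announcements for n cryptographers; payer is an index or None.
    Cryptographer i compares coins[i] with coins[(i + 1) % n], and a
    paying cryptographer announces the opposite of what she sees."""
    anns = []
    for i in range(n):
        same = coins[i] == coins[(i + 1) % n]
        if payer == i:
            same = not same
        anns.append("agree" if same else "disagree")
    return anns

def organization_paid(anns):
    """Parity rule: an even number of 'disagree' means nobody at the table paid."""
    return anns.count("disagree") % 2 == 0

# The parity property, checked exhaustively for one chosen n.
n = 4
for coins in product(["heads", "tails"], repeat=n):
    assert organization_paid(announcements(n, None, coins))
    for i in range(n):
        assert not organization_paid(announcements(n, i, coins))
```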


There are a number of parameters that dictate the precise form of anonymity available to the dining cryptographers:

  • the events A for which anonymity is provided;

  • the events B of the system that can be observed;

  • the way the remaining events Σ \ (A ∪ B) are abstracted.

If anonymity is to be provided with respect to the environment of Meal (for example, an observer on another table), then the set A for anonymity to be provided is simply {pays.i | 0 ≤ i ≤ 2}: the observer should not be able to distinguish the paying cryptographer from the others. Such an observer should only see the publicly announced values on out: occurrences of the notpays and look events should be abstracted. In this situation, the parameters are as follows:

  • A = {| pays |}

  • B = {| out |}

  • the remaining events will be hidden: this means that the observer does not even know whether and how often these events occur.

The system under consideration will be

Meal \ {| notpays, look |}

and the question of anonymity will be whether or not

pA(Meal \ {| notpays, look |}) = Meal \ {| notpays, look |}

for an arbitrary permutation pA.

It might be plausible to expect that look events can be observed, simply requiring that the particular values associated with them are abstracted. In this case a renaming abstraction is more appropriate, using the following alphabet renaming:

flook(look.i.j.x) = look.i.j
flook(a) = a for all other events a
The system flook(Meal) allows the occurrence of user i inspecting coin j to be observed, but abstracts the information given by the coin. If this form of abstraction is instead used (with notpays events remaining completely hidden) then the equivalence to be checked for anonymity is

pA(flook(Meal) \ {| notpays |}) = flook(Meal) \ {| notpays |}
The cryptographers themselves are privy to more information than an outsider watching the protocol running. They have access to the values of coins, and also they know whether they themselves are paying or not. In principle this additional information might be enough to break anonymity, and indeed anonymity on the set A is trivially not provided, since cryptographer i can distinguish pays.i from pays.j for some other j: pays.i is in both A and B.

In this case, the requirement is that anonymity is provided for each cryptographer against the other two. In other words, if a particular cryptographer is paying then each of the other two should not know whether it is her or the third cryptographer. From the point of view of each cryptographer, this means that the other two pays events should be indistinguishable.

Since the situation is symmetric, we will consider anonymity with respect to cryptographer 0. In this case we have the following parameters defining the anonymity property. A is the set of events that must be indistinguishable, and B is the set of events that are available to cryptographer 0:

  • A = {pays.1, pays.2}

  • B = {pays.0, notpays.0} ∪ {| look.0 |} ∪ {| out |}

  • C = Σ \ (A ∪ B)

There is in fact only one nontrivial permutation pA to consider: the one swapping pays.1 and pays.2. The equivalence to check is that

pA(Meal \ C) = Meal \ C
This equivalence holds, and hence the protocol is correct.
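This check can be reproduced concretely (a Python sketch rather than the CSP equivalence itself; the coin layout, with cryptographer i comparing coins i and (i+1) mod 3, is an assumed encoding): cryptographer 0's possible observations — her notpays instruction, the two coins she can read, and the public announcements — form the same set whether cryptographer 1 or cryptographer 2 pays.

```python
from itertools import product

def announcements(payer, coins):
    """Public announcements: cryptographer i compares coins[i] with
    coins[(i + 1) % 3]; the payer announces the opposite of what she sees."""
    anns = []
    for i in range(3):
        same = coins[i] == coins[(i + 1) % 3]
        if payer == i:
            same = not same
        anns.append("agree" if same else "disagree")
    return tuple(anns)

def crypt0_views(payer, visible_coins):
    """The set of observations available to cryptographer 0, ranging over
    all possible coin tosses: her own instruction, the coins whose indices
    are listed in visible_coins, and the three public announcements."""
    views = set()
    for coins in product(["heads", "tails"], repeat=3):
        seen = tuple(coins[j] for j in visible_coins)
        views.add(("notpays.0", seen, announcements(payer, coins)))
    return views
```

In this encoding cryptographer 0 reads coins 0 and 1 only, and swapping pays.1 with pays.2 leaves her set of possible observations unchanged, mirroring the equivalence Meal \ C = pA(Meal \ C).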

The threat model considered above is benign: the participants all behave as they should, and we consider whether this allows any information to leak out. However, we may also consider situations where attempts are made to subvert the protocol. For example, a cryptographer might have access to the value on the third coin. This can be captured by allowing the cryptographer access to one of the look events at that third coin. For example, the set B that cryptographer 0 has access to might include events of the form look.2.2.

In this case anonymity is lost. In the configuration

  • A = {pays.1, pays.2}

  • B = {pays.0, notpays.0} ∪ {| look.0, look.2.2 |} ∪ {| out |}

  • C = Σ \ (A ∪ B)

we find that


is a trace of Meal \ C but that it is not a trace of pA(Meal \ C). In other words, heads on all three coins are observed and so out.2.disagree is consistent with pays.2. However, it is not consistent with pays.1: if pays.2 is replaced by pays.1 then the rest of the trace is not consistent with that change. This means that the rest of the trace contains enough information to distinguish pays.1 from pays.2, and hence break the anonymity required for cryptographers 1 and 2.

It is instructive to consider this example when look.2.2 is not available. In this case we have


as the equivalent trace of Meal \ C. But now this is also a trace of pA(Meal \ C): the rest of the trace is consistent with pays.1, since the value of Coin(2) could be tails, leading to out.2.disagree. If the value of Coin(2) is not available then anonymity is again obtained.
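The loss and recovery of anonymity can also be reproduced executably (a Python sketch; the coin layout, with cryptographer i comparing coins i and (i+1) mod 3, is an assumed encoding): once cryptographer 0 can also read coin 2 — the analogue of adding look.2.2 to B — her possible observations under pays.1 and pays.2 no longer coincide, and hiding that coin again restores the equivalence.

```python
from itertools import product

def announcements(payer, coins):
    """Cryptographer i compares coins[i] with coins[(i + 1) % 3];
    a paying cryptographer announces the opposite of what she sees."""
    anns = []
    for i in range(3):
        same = coins[i] == coins[(i + 1) % 3]
        if payer == i:
            same = not same
        anns.append("agree" if same else "disagree")
    return tuple(anns)

def crypt0_views(payer, visible_coins):
    """Possible observations of cryptographer 0 over all coin tosses:
    her instruction, the coins she can read, and the announcements."""
    return {
        ("notpays.0",
         tuple(coins[j] for j in visible_coins),
         announcements(payer, coins))
        for coins in product(["heads", "tails"], repeat=3)
    }

# With the third coin visible, pays.1 and pays.2 are distinguishable.
assert crypt0_views(1, [0, 1, 2]) != crypt0_views(2, [0, 1, 2])
# Hiding the third coin again restores the equivalence.
assert crypt0_views(1, [0, 1]) == crypt0_views(2, [0, 1])
```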
