Notes
Chapter 5

We Always Need To Know Who You Are

  1. C. P. Pfleeger and S. L. Pfleeger, Security in Computing, 4th ed. (Upper Saddle River, NJ: Prentice-Hall, 2007).
  2. See chapter 8 for a discussion on how to properly (i.e., securely) use passwords.
  3. Such a document can be an arbitrary piece of data, like a digital bank note or a Word document, for example. As we will discuss later, digital signatures are also used to sign certificates and credentials to prove their authenticity.
  4. Public keys in and of themselves are just pieces of random-looking data that do not contain any identifying information. By looking at a public key, you cannot see who it belongs to. You either need to keep track of who gave it to you, or you need a trusted source that reliably tells you who it belongs to.
  5. M. Bauer, M. Meints, and M. Hansen, D3.1: Structured Overview on Prototypes and Concepts of Identity Management Systems (FIDIS, 2005).
  6. The identity provider may in fact forward additional information to the relying party.
  7. Through a mechanism called OAuth.
  8. See BankID and DigiD.
  9. G. Alpár, J.-H. Hoepman, and J. Siljee, "The Identity Crisis: Security, Privacy and Usability Issues in Identity Management," Journal of Information System Security 9, no. 1 (2013): 23–53.
  10. Assuming for the moment (unrealistically) that users at least use different, hard-to-guess passwords for each of their accounts.
  11. Most people are in fact exposed to essentially this risk without being aware of it. Most of us have a single email address that we use either as the account name for all our accounts or as the recovery email address for when we forget our passwords. A hacker with access to this email account can invoke a password recovery process and thus gain access to all our online accounts.
  12. C. M. Ellison, "The Nature of a Usable PKI," Computer Networks 31 (1999): 823–830.
  13. K. Cameron, "The Laws of Identity" (May 11, 2005).
  14. Instead of writing that a certain attribute type has a certain attribute value, we will more colloquially use the term attribute to refer to the type and use the term value to refer to the value.
  15. We restrict our discussion here to identity management for natural persons, but pretty much everything easily translates to non-natural persons or arbitrary objects or entities that can be assigned attributes in some context.
  16. Traditionally, a certificate that ties a claim to a specified individual (usually specified through its public key) would have been used for this. Certificates are much simpler to construct than credentials. They lack the basic privacy protective features of credentials, however. This is why we do not explicitly discuss them in the main text of this chapter.
  17. In fact, it does not contain the actual user key, but a derived value that ensures that in order to use the credential, the user key must be known. Conceptually, we could say it contains the hash of the user key.
  18. W. Lueks, G. Alpár, J.-H. Hoepman, and P. Vullers, "Fast Revocation of Attribute-Based Credentials for Both Users and Verifiers," Computers & Security 67 (2017): 308–323.
  19. This is called the all-or-nothing defense against credential pooling.
  20. Biometrics such as fingerprints or retina scans could in theory be used for this, but in practice biometrics have many limitations (such as poor revocability and poor error rates at a decent security level).
  21. Traditionally, such systems have been called anonymous credentials in the literature. Recently, the terminology has changed to stress the fact that even though the underlying technology guarantees full anonymity, this anonymity is easily thwarted once you disclose a single identifying attribute, like your name or social security number.
  22. In many, but not all, cases, the user has to be fully identified in order to determine the appropriate attribute values. This step of the issuing process is therefore not privacy preserving at all. In some cases, however, showing some more or less anonymous attribute contained in another credential may be enough to issue a new credential. For example, we could imagine that a special "is allowed to buy alcoholic beverages" credential can be obtained in the US whenever a person seeking it can prove to be over twenty-one years old.
  23. D. Chaum, "Security without Identification: Transaction Systems to Make Big Brother Obsolete," Communications of the ACM 28, no. 10 (1985): 1030–1044.
  24. See U-Prove. See also S. Brands, Rethinking Public Key Infrastructures and Digital Certificates: Building in Privacy (Cambridge, MA: MIT Press, 2000).
  25. See Identity Mixer; IBM Research Zürich Team, Specification of the Identity Mixer Cryptographic Library (Zürich: IBM Research, February 2012); J. Camenisch and A. Lysyanskaya, "An Efficient System for Non-transferable Anonymous Credentials with Optional Anonymity Revocation," in Advances in Cryptology—EUROCRYPT 2001, International Conference on the Theory and Application of Cryptographic Techniques, ed. B. Pfitzmann (Berlin: Springer, 2001), 93–118.
  26. See the Privacy by Design Foundation.
  27. See https://github.com/privacybydesign.
  28. See, for example, AgeChecked.
  29. J.-J. Quisquater, M. Quisquater, M. Quisquater, L. C. Guillou, M. A. Guillou, G. Guillou, A. Guillou, G. Guillou, S. Guillou, and T. A. Berson, "How to Explain Zero-Knowledge Protocols to Your Children," in Advances in Cryptology—CRYPTO '89, 9th Annual International Cryptology Conference, ed. G. Brassard (Berlin: Springer, 1989), 628–631.
  30. This rules out a classical challenge-response protocol based on signatures, typically used in authentication schemes to prove possession of a private key corresponding to a known public key. In such a protocol, the verifier sends a random challenge to the prover, asking the prover to sign the challenge with her private key. The verifier can verify the correctness of the response using the public key. But the same response can also be shown to anybody else to convince them the prover must have known the private key.
  31. If Victor and Walter are really good friends, then Walter may trust Victor enough to be sure he won't try to deceive him. But in that case, he might as well stop wasting his time watching the full video and simply trust Victor when he tells him Peggy knows the secret.
  32. Goffman, The Presentation of Self in Everyday Life.
  33. B. van den Berg, "The Situated Self: Identity in a World of Ambient Intelligence" (PhD diss., Erasmus University Rotterdam, 2009).
  34. I. Kerr, C. Lucock, and V. Steeves, Lessons from the Identity Trail: Anonymity, Privacy and Identity in a Networked Society (Oxford: Oxford University Press, 2009).
  35. M. Koning, P. Korenhof, G. Alpár, and J.-H. Hoepman, "The ABCs of ABCs: An Analysis of Attribute-Based Credentials in the Light of Data Protection, Privacy and Identity," in Internet, Law & Politics: A Decade of Transformations: Proceedings of the 10th International Conference on Internet, Law & Politics, ed. J. Balcells (Barcelona: UOC-Huygens Editorial, 2014), 357–374.
  36. B. Latour, Reassembling the Social: An Introduction to Actor-Network-Theory (Oxford: Oxford University Press, 2005).
  37. Via Google Pay for Android or Apple Wallet for iOS. The discussion assumes that these wallets indeed only store their information locally.
  38. Because of the strong unlinkability properties of attribute-based credentials, a voter could present their eligibility attribute repeatedly and thus vote many times without anybody noticing. Attribute-based credentials can actually prevent this by forcing the voter to also show a fixed and voting-specific pseudonym that nevertheless is unknown to the issuer and hence cannot be linked back to the identity of the voter.
  39. This is very similar to the trust on first use (TOFU) principle for public key authentication, as is used, for instance, by the Secure Shell (SSH) protocol. See D. Wendlandt, D. G. Andersen, and A. Perrig, "Perspectives: Improving SSH-style Host Authentication with Multi-path Probing," in Proceedings of the 2008 USENIX Annual Technical Conference (USENIX '08), ed. R. Isaacs and Y. Zhou (USENIX Association, 2008), 321–334.
  40. M. S. Merkow, "Secure Electronic Transactions (SET)," in The Internet Encyclopedia, vol. 3, ed. H. Bidgoli (Hoboken, NJ: Wiley, 2004), 247–260.
  41. J.-H. Hoepman, "Privately (and Unlinkably) Exchanging Messages Using a Public Bulletin Board," in ACM Workshop on Privacy in the Electronic Society 2015, ed. I. Ray, N. Hopper, and R. Jansen (New York: ACM, 2016), 85–94.
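
Note 17 says a credential conceptually contains the hash of the user key rather than the key itself, so that using the credential requires knowing the key. A minimal sketch of that idea, using a plain hash commitment purely for illustration (real attribute-based credentials use algebraic commitments, not plain hashes):

```python
import hashlib
import secrets

# The user key stays secret with the user.
user_key = secrets.token_bytes(32)

# The credential embeds only a value derived from the key,
# so the key itself is never revealed by the credential.
bound_value = hashlib.sha256(user_key).digest()

# At showing time, only someone who knows user_key can
# reproduce the bound value and thus use the credential.
assert hashlib.sha256(user_key).digest() == bound_value
```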
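
The zero-knowledge proofs mentioned in notes 29–31 let Peggy convince Victor she knows a secret without revealing it. One classic instance of such a proof of knowledge is the Schnorr identification protocol, sketched below with tiny demo parameters of my own choosing (real deployments use large cryptographic groups); this is an illustration of the general technique, not the protocol used by any particular credential system.

```python
import secrets

# Toy group parameters: p = 2q + 1 with p, q prime;
# g = 4 generates the order-q subgroup of squares mod p.
p = 2039
q = 1019
g = 4

x = secrets.randbelow(q)   # Peggy's secret
y = pow(g, x, p)           # public key, known to Victor

# 1. Commitment: Peggy picks a random nonce r and sends t.
r = secrets.randbelow(q)
t = pow(g, r, p)

# 2. Challenge: Victor sends a random challenge c.
c = secrets.randbelow(q)

# 3. Response: Peggy sends s = r + c*x mod q (x stays hidden).
s = (r + c * x) % q

# 4. Verification: g^s == t * y^c (mod p) holds iff Peggy knows x.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
```

The response s leaks nothing about x on its own (r masks it), which is exactly the property note 30 says a plain signed challenge-response lacks.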