
A privacy threat analysis framework: supporting the elicitation and fulfillment of privacy requirements


Abstract

Ready or not, the digitalization of information is here, and privacy may well be at stake. Although digital privacy is an identified priority in our society, few systematic, effective methodologies exist that deal with privacy threats thoroughly. This paper presents a comprehensive framework for modeling privacy threats in software-based systems. First, it provides a systematic methodology for modeling privacy-specific threats. Analogous to STRIDE, an information flow-oriented model of the system is leveraged to guide the analysis and to provide broad coverage. The methodology instructs the analyst on which issues should be investigated and where in the model they could emerge. This is achieved by (i) defining a list of privacy threat types and (ii) providing mappings between threat types and the elements in the system model. Second, the work provides an extensive catalog of privacy-specific threat tree patterns that can be used to detail the threat analysis outlined above. Finally, it provides the means to map existing privacy-enhancing technologies (PETs) to the identified privacy threats, which simplifies the selection of sound privacy countermeasures.
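To make the elicitation step concrete, the sketch below (Python; not part of the original paper) crosses a small, hypothetical DFD of a social network with a partial threat-type-to-element mapping based on the appendix misuse cases; the authoritative mapping table is the one defined in the paper itself.

    # Illustrative sketch (not the paper's authoritative table): threat
    # elicitation crosses threat categories with the DFD elements they can
    # affect. The element types and the partial mapping below follow the
    # appendix misuse cases (MUC 2-10) and are assumptions for illustration.

    from itertools import product

    # Hypothetical DFD elements of a simple social network scenario.
    DFD_ELEMENTS = {
        "user": "entity",
        "user-portal data stream": "data flow",
        "social network database": "data store",
    }

    # Partial threat-type-to-element mapping (see the paper for the full table).
    THREAT_APPLIES_TO = {
        "Linkability": {"entity", "data flow"},
        "Identifiability": {"entity", "data flow", "data store"},
        "Information disclosure": {"data flow", "data store"},
        "Content unawareness": {"entity"},
        "Policy/consent noncompliance": {"data store"},
    }

    def elicit_threats(elements, mapping):
        """Yield every (threat category, element) pair the analyst must examine."""
        for (name, etype), (threat, types) in product(elements.items(), mapping.items()):
            if etype in types:
                yield threat, name

    for threat, element in elicit_threats(DFD_ELEMENTS, THREAT_APPLIES_TO):
        print(f"Examine: {threat} at '{element}'")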



Acknowledgments

This research is partially funded by the Interuniversity Attraction Poles Programme of the Belgian State (Belgian Science Policy) and by the Research Fund K.U. Leuven.

Author information

Correspondence to Mina Deng.

Appendix A: Misuse case examples

1.1 MUC 2: Linkability of the user-portal data stream (data flow)

Summary: Data flows can be linked to the same person (without necessarily revealing the person's identity)

Asset: PII of the user

  • The user:

    • data flows can be linked to each other, which might reveal the person's identity

    • the attacker can build a profile of a user’s online activities (interests, active time, comments, updates, etc.)

Primary misactor: skilled insider/skilled outsider

Basic Flow:

  1. The misactor intercepts or eavesdrops on two or more data flows.

  2. The misactor can link the data flows to each other and possibly, by combining this information, link them to the user/data subject (see the sketch after this misuse case).

Trigger: by misactor, can happen whenever data are communicated

Preconditions:

  • No anonymous communication system used

  • Information disclosure of data flow possible

Prevention capture points:

  • Use strong anonymous communication techniques

  • Provide confidential channel

Prevention guarantee: Impossible to link data to each other
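A minimal sketch of this threat, with made-up flows and quasi-identifiers assumed for illustration: the misactor links the two intercepted flows because they share an identifying attribute, even though neither contains the user's name.

    # Minimal sketch of MUC 2 (illustrative, assumed data): two intercepted
    # data flows are linked through a shared quasi-identifier, even though
    # neither flow names the user.

    flow_a = {"ip": "203.0.113.7", "session": "x81f", "topic": "hiking"}
    flow_b = {"ip": "203.0.113.7", "session": "b2c9", "topic": "privacy"}

    QUASI_IDENTIFIERS = ("ip", "session")

    def linkable(a, b, keys=QUASI_IDENTIFIERS):
        """Two flows are linkable if any quasi-identifier value matches."""
        return any(a.get(k) == b.get(k) for k in keys)

    print(linkable(flow_a, flow_b))  # True: the shared IP links both flows to one person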

1.2 MUC 3: Linkability of the social network users (entity)

Summary: Entities (with different pseudonyms) can be linked to the same person (without necessarily revealing the person's identity)

Asset: PII of the user

  • The user:

    • data can be linked to each other, which might reveal the person's identity

    • attacker can build a profile of a user's online activities (interests, active time, comments, updates, etc.)

Primary misactor: skilled insider/skilled outsider

Basic Flow:

  1. The misactor intercepts or eavesdrops on two or more pseudonyms.

  2. The misactor can link the pseudonyms to each other and possibly, by combining this information, link them to the user/data subject.

Trigger: by misactor, can happen whenever data are communicated

Preconditions:

  • Information disclosure of the data flow possible

  • Different “pseudonyms” are linked to each other based on content of the data flow

Prevention capture points:

  • protection of information such as user temporary ID, IP address, time and location, session ID, identifier and biometrics, computer ID, and communication content, e.g. apply data obfuscation to protect this information (security)

  • message and channel confidentiality provided

Prevention guarantee: Impossible to link data to each other

1.3 MUC 4: Identifiability at the social network database (data store)

Summary: The user's identity is revealed

Asset: PII of the user

  • The user: revealed identity

Primary misactor: skilled insider/skilled outsider

Basic Flow:

  1. The misactor gains access to the database.

  2. The data is linked to a pseudonym.

  3. The misactor can link the pseudonym to the actual identity (identifiability of entity).

  4. The misactor can link the data to the actual user's identity.

Alternative Flow:

  1. The misactor gains access to the database.

  2. The misactor can link information from the database to other information (from another database, or information that might be publicly accessible).

  3. The misactor can re-identify the user based on the combined information.

Trigger: by misactor, can always happen

Preconditions:

  • no or insufficient protection of the data store

  • no data anonymization techniques used

Prevention capture points:

  • protection of the data store (security)

  • apply data anonymization techniques (e.g. k-anonymity; see the sketch after this misuse case)

Prevention guarantee: hard to impossible to link data to identity (depending on the applied technique)
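As one possible instantiation of the data anonymization capture point, the sketch below checks k-anonymity over a set of quasi-identifiers (in the spirit of Sweeney's k-anonymity); the records, attributes, and threshold are illustrative assumptions.

    # Sketch of a k-anonymity check for the data store (illustrative records
    # and quasi-identifiers; not the paper's tooling).

    from collections import Counter

    records = [
        {"zip": "3001", "age_range": "20-30", "interest": "hiking"},
        {"zip": "3001", "age_range": "20-30", "interest": "privacy"},
        {"zip": "3000", "age_range": "30-40", "interest": "music"},
    ]

    QUASI_IDENTIFIERS = ("zip", "age_range")

    def is_k_anonymous(rows, qids, k):
        """True if every combination of quasi-identifier values occurs at least k times."""
        groups = Counter(tuple(r[q] for q in qids) for r in rows)
        return all(count >= k for count in groups.values())

    print(is_k_anonymous(records, QUASI_IDENTIFIERS, k=2))  # False: one group has size 1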

1.4 MUC 5: Identifiability of user-portal data stream (data flow)

Summary: The user's identity is revealed

Asset: PII of the user

  • The user: revealed identity

Primary misactor: insider/outsider

Basic Flow:

  1. The misactor gains access to the data flow.

  2. The data contains personally identifiable information about the user (user relationships, address, etc.).

  3. The misactor is able to extract personally identifiable information about the user/data subject.

Trigger: by misactor, can happen whenever data is communicated

Preconditions:

  • no or weak anonymous communication system used

  • Information disclosure of data flow possible

Prevention capture points:

  • apply anonymous communication techniques

  • Use confidential channel

Prevention guarantee: hard to impossible to link data to identity (depending on the applied technique)

1.5 MUC 6: Identifiability of users of the social network system (entity)

Summary: The user's identity is revealed

Asset: PII of the user

  • The user: revealed identity

Primary misactor: skilled insider/skilled outsider

Basic Flow:

  1. The misactor gains access to the data flow.

  2. The data contains the user's password.

  3. The misactor has access to the identity management database.

  4. The misactor can link the password to the user.

Alternative Flow:

  1. The misactor gains access to the data flow.

  2. The data contains the user's password.

  3. The misactor can link the user's password to the user's identity (the password is the user's initials followed by the birthdate), as sketched after this misuse case.

Trigger: by misactor, can happen whenever data are communicated and the user logs in using his “secret”

Preconditions:

  • Insecure IDM system OR

  • weak passwords used and information disclosure of data flow possible

Prevention capture points:

  • Strong pseudonymity technique used (e.g. strong passwords)

  • privacy-enhancing IDM system

  • Data flow confidentiality

Prevention guarantee: hard(er) to link log-in to identity.
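The alternative flow above can be sketched as follows, with made-up names and birthdates: because the password follows a guessable pattern (initials plus birthdate), a sniffed password alone re-identifies the user against a hypothetical member list.

    # Sketch of the MUC 6 alternative flow (all names and dates are made up):
    # a weak password built from initials + birthdate lets a misactor map a
    # sniffed password back to a member of the social network.

    members = [
        {"name": "Alice Norton", "birthdate": "19900512"},
        {"name": "Bob Vance", "birthdate": "19851102"},
    ]

    def candidate_password(member):
        """Reproduce the weak pattern: lowercase initials followed by the birthdate."""
        initials = "".join(part[0] for part in member["name"].split()).lower()
        return initials + member["birthdate"]

    def identify(sniffed_password, member_list):
        """Return the member whose guessed password matches the sniffed one, if any."""
        for member in member_list:
            if candidate_password(member) == sniffed_password:
                return member["name"]
        return None

    print(identify("an19900512", members))  # 'Alice Norton'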

1.6 MUC 7: Information disclosure at the social network database (data store)

Summary: Data are exposed to unauthorized users

Asset: PII of the user

  • The user: revealed sensitive data

Primary misactor: skilled insider/skilled outsider

Basic Flow:

  1. The misactor gains access to the database.

  2. The misactor retrieves data to which he should not have access.

Trigger: by misactor, can always happen

Preconditions:

  • no or insufficient internal access policies

Prevention capture points:

  • strong access control policies (security), for example rule-based access control based on friendships in the social network (see the sketch after this misuse case)

Prevention guarantee: hard to impossible to obtain data without the necessary permissions
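A minimal sketch of the friendship-based access control mentioned above; the friendship graph and the policy (owner and direct friends may read) are assumptions for illustration, not a policy prescribed by the paper.

    # Sketch of friendship-based access control for the social network
    # database (illustrative graph and policy).

    friends = {
        "alice": {"bob"},
        "bob": {"alice", "carol"},
    }

    def may_read_profile(requester, owner, graph=friends):
        """Grant read access only to the profile owner and the owner's direct friends."""
        return requester == owner or requester in graph.get(owner, set())

    print(may_read_profile("bob", "alice"))    # True: bob is alice's friend
    print(may_read_profile("carol", "alice"))  # False: request denied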

1.7 MUC 8: Information disclosure of communication between the user and the social network (data flow)

Summary: The communication is exposed to unauthorized users

Asset: PII of the user

  • The user: revealed sensitive data

Primary misactor: skilled insider/skilled outsider

Basic Flow:

  1. The misactor gains access to the data flow.

  2. The misactor retrieves data to which he should not have access.

Trigger: by misactor, can happen whenever messages are being sent

Preconditions:

  • communication goes through an insecure public network

Prevention capture points:

  • messages sent between the user and the social network web client are encrypted and a secure communication channel is ensured (see the sketch after this misuse case)

Prevention guarantee: hard to impossible to gain access to the data flow without the right permissions
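As a sketch of the encryption capture point, the example below symmetrically encrypts a message with the third-party Python cryptography package (Fernet) as a stand-in for a properly established secure channel such as TLS; key establishment and distribution are out of scope and simply assumed.

    # Sketch of message confidentiality on the user-portal data flow.
    # Requires the third-party `cryptography` package; Fernet stands in for
    # a negotiated secure channel (e.g. TLS), whose key setup is assumed.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # in practice established by the channel handshake
    channel = Fernet(key)

    message = b"status update: at the clinic until 5pm"
    ciphertext = channel.encrypt(message)

    # An eavesdropper who intercepts `ciphertext` learns nothing about the content.
    assert channel.decrypt(ciphertext) == message
    print(ciphertext != message)  # True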

1.8 MUC 9: Content unawareness

Summary: The user is unaware that his or her anonymity is at risk because too much personally identifiable information is released

Asset: PII of the user

  • The user: revealed identity

Primary misactor: skilled insider/skilled outsider

Basic Flow:

  1. The misactor gains access to the user's online comments.

  2. The misactor profiles the user's data and can identify the user.

Trigger: by misactor, can always happen

Preconditions:

  • User provides too much personal data

Prevention capture points:

  • User provides only the minimal set of required information (see the sketch after this misuse case)

Prevention guarantee: user will be informed about potential privacy risks
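One way to sketch the minimal-information capture point: before submission, the client keeps only the fields the service actually requires and warns the user about everything it withholds. Field names and the required set are illustrative assumptions.

    # Sketch of a client-side data minimization step (illustrative attribute
    # names; the required set would come from the service's policy).

    REQUIRED_FIELDS = {"display_name"}

    def minimize(profile, required=REQUIRED_FIELDS):
        """Split a profile into data to submit and data to withhold."""
        submit = {k: v for k, v in profile.items() if k in required}
        withheld = sorted(set(profile) - required)
        return submit, withheld

    profile = {"display_name": "hiker99", "birthdate": "1990-05-12", "home_address": "Main St 1"}
    submit, withheld = minimize(profile)
    print("submitting:", submit)
    print("warning, withholding identifying fields:", withheld)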

1.9 MUC 10: Policy and consent noncompliance

Summary: The social network provider does not process the user's personal data in compliance with the user's consent, e.g. it discloses the database to third parties for secondary use

Asset: PII of the user

  • The user: revealed identity and personal information

  • The system/company: negative impact on reputation

Primary misactor: Insider

Basic Flow:

  1. The misactor gains access to the social network database.

  2. The misactor discloses the data to a third party.

Trigger: by misactor, can always happen

Preconditions:

  • misactor can tamper with privacy policies and make consents inconsistent OR

  • policies not managed correctly (not updated according to user’s requests)

Prevention capture points:

  • Design the system in compliance with legal guidelines for privacy and data protection, and keep internal policies consistent with the policies communicated to the user (see the sketch after this misuse case)

  • Legal enforcement: the user can sue the social network provider whenever his or her personal data are processed without consent

  • Employee contracts: employees who share information with third parties will be penalized (dismissal, fines, etc.)

Prevention guarantee: Legal enforcement will lower the threat of an insider leaking information, but it will still be possible to breach the user's privacy
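A minimal sketch of a consent check preceding any third-party disclosure, assuming purpose-based consent records and a communicated policy; the purposes and data structures are illustrative placeholders, not the paper's policy language.

    # Sketch of a consent/policy compliance gate before disclosure
    # (illustrative purposes and records).

    consents = {"alice": {"service_provision"}}   # purposes the user agreed to
    communicated_policy = {"service_provision"}   # purposes promised to all users

    def may_disclose(user, purpose):
        """Allow disclosure only if both the user's consent and the communicated
        policy cover the requested purpose."""
        return purpose in consents.get(user, set()) and purpose in communicated_policy

    print(may_disclose("alice", "service_provision"))  # True
    print(may_disclose("alice", "marketing"))          # False: noncompliant disclosure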


Cite this article

Deng, M., Wuyts, K., Scandariato, R. et al. A privacy threat analysis framework: supporting the elicitation and fulfillment of privacy requirements. Requirements Eng 16, 3–32 (2011). https://doi.org/10.1007/s00766-010-0115-7
