

Privacy-Enhancing Technologies: From Buzz to Reality

Enterprises are adopting more robust privacy and data protection regulations, but there are limits to an approach based solely on privacy-enhancing technologies. Here are four things to keep in mind.

In the wake of the adoption of more robust privacy and data protection regulations, privacy-enhancing technologies (PETs) have received increased attention. The European Union Agency for Cybersecurity (ENISA) released its PETs readiness report in 2016, building on its earlier groundwork on Privacy by Design. Since then, PETs have frequently been included in regulations around the world (see Resources at the end of this article).


In fact, even in jurisdictions without robust privacy frameworks, PETs are increasingly seen as a must-have. The U.S., in collaboration with the United Kingdom, is setting up prize challenges with a view to advancing PETs. Gartner predicts PETs will be adopted by the majority of large organizations by 2025.

The list of PETs has grown over the years and now spans a wide range of techniques: tokenization, k-anonymization, global and local differential privacy, federated learning, homomorphic encryption, secure multiparty computation, and trusted execution environments.
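
To make the first technique on that list concrete, here is a minimal tokenization sketch in Python. It assumes a simple keyed-hash (HMAC) approach; the key, field names, and values are purely illustrative, and a real deployment would also need key management and often a token vault.

```python
# Illustrative sketch only: HMAC-based tokenization of a direct identifier.
# The key below is a placeholder, not a real secret.
import hmac
import hashlib

SECRET_KEY = b"replace-with-a-managed-secret"

def tokenize(value: str) -> str:
    """Replace a direct identifier with a keyed, deterministic token."""
    digest = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]  # shortened for readability

record = {"email": "jane.doe@example.com", "purchase": 42.50}
record["email"] = tokenize(record["email"])
print(record)  # the raw email is gone, but equal inputs map to equal tokens,
               # so records can still be joined by whoever holds the key
```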

However, although the PETs space is maturing, there are limits to an approach based solely on these technologies. Because each PET pursues a narrow aim (as explained below), implementing one does not necessarily mean that all privacy and data protection requirements have been taken care of. Moreover, implementing a PET is resource-intensive and, if not done properly, can lead to information leakage.

In addition, in some cases PETs have been used to help legitimize unfair processing practices or make processing activities more opaque, thus undermining key privacy and data protection requirements such as fairness and transparency. As privacy and data protection compliance entails tradeoffs, it’s important to be aware of PETs’ limits.

PETs Considerations

With so many PETs to consider, how should data teams really think about them? Here are four things to keep in mind.

1. PETs are not always interchangeable.

To understand what a specific PET actually achieves, it’s essential to unpack its aims. For example, does the PET ensure that:

  • It's not reasonably possible to distinguish an individual within a crowd
  • It's not feasible to link events to a specific individual
  • Unauthorized users are effectively prevented from accessing the data
  • Individuals can deny either their participation in a data set or the value of a specific attribute (or both)

For example, secure multiparty computation guarantees only the confidentiality of the input data; it does not offer any protection against inference risks.
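
To give a rough sense of how that input confidentiality is achieved, here is a minimal additive secret-sharing sketch in Python, the building block behind many secure multiparty computation protocols. The three-party setup, party names, and salary figures are invented for illustration; real protocols add networking, verification, and protections against malicious parties. Note that the exact result is still disclosed, which is where inference risk enters.

```python
# Illustrative sketch only: additive secret sharing over a large prime field.
import random

PRIME = 2**61 - 1  # all arithmetic is done modulo this prime

def share(secret: int, n_parties: int) -> list[int]:
    """Split a secret into n random-looking shares that sum to it mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

salaries = {"party_a": 52_000, "party_b": 61_000, "party_c": 58_000}

# Each party splits its input; no single share reveals the underlying salary.
all_shares = [share(v, 3) for v in salaries.values()]

# Each party locally sums the shares it holds; only the partial sums are combined.
partial_sums = [sum(held) % PRIME for held in zip(*all_shares)]
total = sum(partial_sums) % PRIME
print(total)  # 171000 -- the inputs stayed confidential, but the exact output
              # is revealed, so inferences about individual inputs remain possible
```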

2. Although PETs all help mitigate risk, they don't necessarily offer the same guarantees.

For example, compare k-anonymization with (global) differential privacy. The first technique aims to let individuals hide in a group: a situationally relevant attacker cannot distinguish individuals within a given cohort, yet the data remains at the individual level. The second lets individuals deny their participation in the data set altogether and therefore only allows aggregate queries of the data. Some PETs offer formal mathematical guarantees, while others, such as federated learning or secure multiparty computation, don't and are thus considered "softer."
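
A toy sketch of that contrast may help, using numpy's Laplace sampler as a stand-in for a production differential privacy library. The records, quasi-identifiers, and epsilon value are all illustrative.

```python
# Illustrative sketch only: k-anonymity check vs. a noisy differentially private count.
from collections import Counter
import numpy as np

records = [
    ("30-39", "F", "flu"), ("30-39", "F", "flu"),
    ("30-39", "F", "cold"), ("40-49", "M", "flu"),
]

# k-anonymity: every (age band, sex) group must hold at least k records,
# but the released data stays at the individual level.
k = 2
groups = Counter((age, sex) for age, sex, _ in records)
print({g: n >= k for g, n in groups.items()})  # the lone 40-49/M record fails k=2

# Global differential privacy: only aggregate answers are released, with noise
# calibrated so that any one person's presence barely changes the result.
epsilon = 1.0
true_count = sum(1 for _, _, dx in records if dx == "flu")
noisy_count = true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)
print(round(noisy_count, 1))  # e.g., 3.4 -- individuals can plausibly deny being counted
```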

3. Consider newer PETs.

One intuitive PET is the trusted third party (TTP): a mutually trusted organization that acts as a clearing house and buffer for interorganizational processing of sensitive data. In practice, however, it is hard to unpack all the "good reasons" that would justify trusting a TTP, precisely because a TTP is, by definition, a third party. This has given rise to two modern-day PETs: trusted execution environments (TEEs) and secure multiparty computation (SMC).

TEEs simulate TTPs in hardware by using processor features for secure process isolation: they can insulate code and data from other processes, even privileged ones. SMC, in stark contrast, offers a framework for building distributed cryptographic protocols that let the parties simulate a TTP without actually having one (organizational or physical).


4. Fill in the gaps PETs leave.

Because each PET addresses only a limited set of privacy requirements, data teams should identify precisely what they need from the PETs they've selected and use alternative controls, such as organizational measures, for the remaining requirements, including those related to fairness and transparency. Data teams should also think early about how to combine PETs into a privacy-enhancing tech stack. Because PETs are usually use-case specific, it is essential to support a variety of PETs within a data science environment and to be able to smoothly adapt the selected PET(s) to the needs of each project.

A Final Word

Privacy-mature data teams will soon discover that well-bred PETs can help protect non-personal data, such as commercially sensitive data, as well as personal data. In a world where data sovereignty appeals to an increasing number of governments, restrictions on the disclosure and sharing of different types of sensitive data will only become more prevalent.

Resources

In France, the CNIL-Inria Privacy Award has recognized work on PETs.

In the United Kingdom, the Royal Society released its first report on PETs in 2019. Thereafter, the Centre for Data Ethics and Innovation released its PETs Adoption Guide, which includes a repository of real-world use cases in healthcare, finance, and other industries.

Also in the U.K., the Information Commissioner’s Office is currently working on revised guidance about anonymization, pseudonymization, and PETs.

In Canada, the historical home of the privacy by design concept, the Office of the Privacy Commissioner has completed its own review of the PET landscape.

A report from the Mobey Forum’s new AI and Data Privacy Expert Group revealed a blind spot in the banking industry regarding the importance of emerging PETs.

In 2019, the World Economic Forum explored the ability of PETs to “unlock new value in the financial services industry by facilitating new forms of data-sharing.”


 
