In 2010, the total amount of data generated worldwide was approximately 2 zettabytes (for perspective, that is 2 billion terabytes). By 2020, this figure had surged to 64.2 zettabytes, a 32-fold increase within a decade. By 2025, global data creation is predicted to exceed 180 zettabytes, nearly triple the 2020 volume.
While this burgeoning trove of information is a prime resource for data-driven innovation, it also poses substantial privacy risks to individuals and legal liabilities to organisations. Unsurprisingly, the rapid adoption of technology and the surge in data collection, use and transfers have led to unprecedented rates of data theft and breaches. In 2024, IBM reported that the global average cost of a data breach had climbed to USD 4.88 million, up 10% from 2023 and the highest on record.
With data functioning as currency in today’s digital economy, safeguarding information is non-negotiable. Now, Privacy-Enhancing Technologies (PETs) are coming to the forefront as key tools to address these challenges, offering a pathway to balance innovation with privacy.
If you work in the IT sector or an adjacent field, PETs may already be part of your professional vernacular. Less well known outside the tech sphere, PETs comprise a suite of methodologies designed to protect individual privacy by minimising the exposure of Personally Identifiable Information (PII) during processing and analysis. As data continues to grow in both volume and importance, PETs are gaining traction as critical enablers that let organisations maximise the full value of data and its insights while maintaining privacy and compliance with regulations.
According to ISACA, a global authority on IT governance, the quintessential PET techniques include, but are not limited to, the following:
1. Trusted Execution Environment (TEE): A secure area of a computer processor, isolated from the Operating System (OS), that stores data and code and protects them from unauthorised access or modification even if the OS is compromised.
2. Differential Privacy: Introduces controlled noise into datasets so that individual data points remain confidential while aggregate analysis stays accurate (see the first sketch after this list).
3. Synthetic Data: Artificially generated data that mimics the statistical properties of real-world data without revealing sensitive information about individuals in the original dataset (second sketch below).
4. Federated Learning: Enables multiple entities to collaboratively train Machine Learning (ML) models without sharing raw training data, maintaining data locality and privacy.
5. Homomorphic Encryption: Allows computations to run directly on encrypted data, producing encrypted results that only the key holder can decrypt, so the underlying data is never exposed (third sketch below).
6. Secure Multiparty Computation (SMPC): A form of confidential computing that splits data into shares distributed across multiple parties; each party computes privately on its own share, and the partial results are recombined into the final output without any party seeing the others' data (fourth sketch below).
7. Zero-Knowledge Proof: A cryptographic method that allows one party to prove knowledge of certain data to another party without revealing the information itself (final sketch below).
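To make the Differential Privacy item concrete, here is a minimal Python sketch of the Laplace mechanism, the textbook way to answer a single numeric query with a privacy guarantee. The dataset, clipping bounds and epsilon value are illustrative assumptions, not recommendations:

```python
import numpy as np

def dp_mean(values, epsilon, lo, hi):
    """Differentially private mean via the Laplace mechanism."""
    clipped = np.clip(values, lo, hi)        # bound each record's influence
    sensitivity = (hi - lo) / len(clipped)   # max change from altering one record
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return float(clipped.mean() + noise)

ages = np.array([34, 41, 29, 55, 38, 47, 62, 31])
print(dp_mean(ages, epsilon=1.0, lo=18, hi=90))  # close to the true mean of 42.125
```

A smaller epsilon buys stronger privacy at the cost of noisier answers; real deployments also track the cumulative privacy budget across repeated queries.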
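The Synthetic Data item can be illustrated crudely by fitting a simple statistical model to sensitive records and sampling artificial ones. Production-grade generators (copulas, GANs, diffusion models) are far more sophisticated and also capture correlations between columns, which this per-column Gaussian deliberately ignores; the records here are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# "Real" records (age, monthly income): stand-ins for sensitive data.
real = np.array([[34, 5200], [41, 6100], [29, 4300],
                 [55, 8800], [38, 5900]], dtype=float)

# Fit an independent Gaussian per column, then sample artificial records.
mean, std = real.mean(axis=0), real.std(axis=0)
synthetic = rng.normal(mean, std, size=(1000, real.shape[1]))

print(synthetic[:3].round(1))  # plausible-looking rows that match no real individual
```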
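For Homomorphic Encryption, the sketch below assumes the third-party python-paillier package (installable via pip install phe), which implements the additively homomorphic Paillier scheme. It lets an untrusted server add numbers it can never read:

```python
from phe import paillier  # third-party: python-paillier

public_key, private_key = paillier.generate_paillier_keypair()

# The server receives only ciphertexts and sums them blindly.
salaries = [5200, 6100, 4300]
encrypted = [public_key.encrypt(s) for s in salaries]
encrypted_total = sum(encrypted[1:], encrypted[0])  # addition on ciphertexts

print(private_key.decrypt(encrypted_total))  # 15600, visible only to the key holder
```

Paillier supports addition (and multiplication by plaintext constants) on ciphertexts; fully homomorphic schemes allow arbitrary computation at a much higher cost.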
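Secure Multiparty Computation can be demonstrated with additive secret sharing, one of its simplest building blocks. In this toy sketch, three hospitals jointly compute a total patient count without any party (or the aggregator) ever seeing an individual hospital's number; the modulus and counts are illustrative:

```python
import secrets

P = 2**61 - 1  # prime modulus; all arithmetic is mod P

def share(secret, n_parties):
    """Split `secret` into n random additive shares that sum to it mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

counts = [120, 340, 95]                 # each hospital's private input
all_shares = [share(c, 3) for c in counts]

# Party i locally sums the i-th share of every input...
partial_sums = [sum(col) % P for col in zip(*all_shares)]

# ...and only the recombined total is ever revealed.
print(sum(partial_sums) % P)  # 555
```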
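Finally, a Zero-Knowledge Proof can be sketched with the classic Schnorr identification protocol, in which a prover convinces a verifier that it knows a secret exponent x without revealing it. The parameters below are deliberately tiny and offer no real security:

```python
import secrets

# Toy parameters: p = 2q + 1 with q prime; g generates the order-q subgroup.
p, q, g = 2039, 1019, 4

x = secrets.randbelow(q)        # prover's secret
y = pow(g, x, p)                # public value: y = g^x mod p

r = secrets.randbelow(q)        # prover commits to a random nonce
t = pow(g, r, p)

c = secrets.randbelow(q)        # verifier issues a random challenge

s = (r + c * x) % q             # prover's response; reveals nothing about x alone

# Verifier accepts iff g^s == t * y^c (mod p); no information about x leaks.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof accepted")
```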
These technologies are increasingly being integrated into sectors such as healthcare, finance, and consumer protection. In healthcare, for instance, a PET like SMPC can let researchers across different institutes analyse medical data together, improving outcomes for diseases and treatments, without disclosing sensitive information. Historically, the legal risks of processing and sharing sensitive information with third parties have limited how fully data could be used in analysis; PETs offer a way around this roadblock, extracting value from data that would otherwise sit idle. On the consumer front, regulators can likewise leverage PETs to detect fraud and unfair practices without direct access to sensitive customer data.
By protecting individual privacy, PETs allow organisations to de-risk data collection and usage, making it safer to innovate with data while upholding compliance and trust, and unlocking benefits for society.
For many regulators and experts around the world, PETs are emerging as a top-of-mind solution, and they are taking active steps to support meaningful adoption of these technologies within organisations.
Just this November, the UK's Information Commissioner's Office (ICO) and the Responsible Technology Adoption (RTA) Unit within the Department for Science, Innovation and Technology (DSIT) released a PETs Cost-Benefit Awareness Tool designed to help organisations looking to adopt PETs assess the associated costs and benefits. Meanwhile, the U.S. National Science Foundation launched a new Privacy-Preserving Data Sharing in Practice (PDaSP) program in June to advance PETs and promote their use in solving real-world problems, in line with the White House's 2023 Executive Order on Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence.
Closer to home, the 2024 Privacy Enhancing Technology (PET) Summit Asia-Pacific (APAC) during Singapore's Personal Data Protection (PDP) Week spotlighted PETs as transformative tools for responsible innovation. Experts discussed their applications across diverse domains, with a specific focus on themes of AI, security and integrity.
One area panellists honed in on at the summit was the potential of PETs to enable trustworthy AI systems.
Christian Reimsbach-Kounatze, Information Economist / Policy Analyst, Organisation for Economic Co-operation and Development (OECD), underscored the importance of using federated learning in conjunction with other PETs to mitigate the risk of data leakage. He shared that “even if you use Privacy Enhancing Technologies (PETs), you have to acknowledge that leakages are still possible and it's well documented in the case of Federated Learning. Depending on how your queries are designed, there is still a possibility that you could extract insights from the data. That's why you need to combine them, for instance, with Homomorphic Encryption, Secure Multiparty Computation (SMPC) or Trusted Execution Environments (TEE).”
This approach also reflects the concept of layered security, as represented by the Swiss Cheese Model of defence in depth: by combining PETs strategically, organisations can ensure that a breach of one layer does not compromise the entire system.
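As a minimal sketch of such layering (assuming a toy linear model, invented data, and Gaussian noise as a crude stand-in for a calibrated differential-privacy mechanism or secure aggregation), the federated-averaging loop below trains a shared model across three clients whose raw data never leaves them:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def local_update(weights, X, y, lr=0.1):
    """One gradient step on a client's private data; the data itself stays local."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

# Three clients with private datasets drawn from the same underlying model.
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

weights = np.zeros(2)
for _ in range(100):
    # Each client trains locally and perturbs its update before sharing:
    # a second privacy layer stacked on top of federation.
    updates = [local_update(weights, X, y) + rng.normal(scale=0.01, size=2)
               for X, y in clients]
    weights = np.mean(updates, axis=0)  # the server only ever sees noisy updates

print(weights.round(2))  # approaches [2.0, -1.0] despite no data ever being pooled
```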
The need for rigorous and ethical testing has also become paramount in responsible AI development. April Chin, Managing Partner and CEO, Resaro, spotlighted the role of synthetic data in addressing one of AI's most pressing challenges: the scarcity of high-quality, privacy-compliant datasets for testing. Not only is synthetic data recognised as a privacy-enhancing technique under EU data protection frameworks such as the ePrivacy Directive and the GDPR, it also allows enterprises to augment test sets and independently verify the robustness of their AI systems. This approach helps organisations circumvent traditional barriers in AI testing, ensuring compliance while maintaining data utility.
Of course, finding the right balance between data protection and usability is key in ensuring these techniques can be leveraged effectively. "You can adjust the level of synthetic data protection depending on the risk level that you face...if the level of protection is too high, the data may not be very useful for building models," commented Manprit Singh, CTO Data and AI Healthcare, Microsoft. This flexibility enables enterprises to tailor their PET strategies to achieve both privacy and functionality.
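That dial between protection and utility can be made tangible with a small experiment: perturbing records with increasing Laplace noise (a crude stand-in for turning up the protection level of a synthetic dataset) and watching how their usefulness, here measured as correlation with the real values, falls away. The data and scales are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(seed=2)
real = rng.normal(loc=50, scale=10, size=10_000)  # pretend sensitive measurements

for noise_scale in (1.0, 10.0, 100.0):
    protected = real + rng.laplace(scale=noise_scale, size=real.shape)
    r = np.corrcoef(real, protected)[0, 1]
    print(f"noise scale {noise_scale:>5}: correlation with real data {r:.3f}")
```

At low noise the protected data is still highly informative; at high noise it is private but nearly useless for modelling, which is exactly the balance practitioners must tune.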
Scaling PET adoption requires collaboration across industries and geographies. During a panel on "Global Trends on How PETs Enable Innovation and Cross-Border Flows" at the summit, William Malcolm, Senior Director of International Privacy, Legal, and Consumer Protection, Google, reframed PETs as enablers of innovation, rather than mere compliance tools. “Most security technologies are sold as vacuum cleaners, you buy them because you have to”, he remarked. “But they allow you to do new things, create new business models, and engage with new customers,” he continued, underscoring their transformative potential.
In another panel, Nina Liguda, Privacy & Data Protection Office Program Lead, TikTok Singapore, and Anne Flanagan, Vice President for Artificial Intelligence, Future of Privacy Forum, emphasised the value of open-source collaboration in testing and refining PET implementations. Organisations can leverage resources from research bodies and regulatory sandboxes, such as IMDA's PET Sandbox, to test and refine solutions; platforms like these provide the space to experiment and address challenges collaboratively.
As technologies like generative AI continue to evolve, PETs will become ever more essential in balancing innovation with the imperative of data privacy. Successful adoption, however, requires a holistic, collaborative approach, one that combines regulatory guidance, industry innovation, technological synergy, and the appropriate application of PETs at different stages of the AI development lifecycle. By investing in PET capabilities, organisations can unlock the true potential of data-driven AI while upholding privacy and security, a crucial step towards a more trustworthy and responsible digital economy.
Capabara, our Next-Gen AI Capability-as-a-Service platform, is currently available on a limited trial basis. If your company is interested in developing Gen AI competencies and capabilities with us, write in to sales@straitsinteractive.com with your company email address. You can also stay tuned to Capabara’s latest developments by following our CAPABARA LinkedIn page or heading over to capabara.com to find out more about how your organisation can be empowered through safe and secure Gen AI.
DPOinBOX AI is our latest iteration of privacy management software, powered by generative AI, to help Data Protection Officers with compliance and data governance. Sign up for a demo at dpoinbox.ai to see how it can augment your data protection practices.
This article was first published on The Governance Age on 27 November.