


Charting the Future of Privacy Enhancing Technologies in Education Data: Insights from the Legal Convening
26 Mar 2025
Written by Avery Freeman

On March 12, 2025, the Georgetown University’s Massive Data Institute (MDI), the Future of Privacy Forum and Data Foundation convened a group of legal experts, policymakers, technologists, and researchers to tackle a new and emerging issue in the use of education data: How can Privacy Enhancing Technologies (PETs) be leveraged within the legal frameworks that govern data sharing?

With the growing interest in increasing access to education and workforce data to drive policy decisions, PETs – such as secure multi-party computation (SMC), homomorphic encryption (HE), and secure hashing – offer promising options. These technologies could allow researchers and policymakers to access and share data to generate valuable insights while reducing risks associated with sharing personally identifiable information (PII). However, questions remain about how these technologies can be used effectively within existing legal frameworks, particularly the Family Educational Rights and Privacy Act (FERPA).
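As an illustration of the simplest of these techniques, secure hashing, the sketch below pseudonymizes student identifiers with a keyed hash (HMAC) so that records can be linked consistently without exposing raw IDs. The key and identifiers are hypothetical, and a real deployment would involve careful key management; this is a minimal sketch, not a complete PET implementation.

```python
import hmac
import hashlib

def pseudonymize(student_id: str, secret_key: bytes) -> str:
    """Return a keyed hash (HMAC-SHA256) of a student identifier.

    A keyed hash is preferable to a plain hash: without the secret key,
    a third party cannot re-derive a token from a guessed identifier
    (a dictionary attack).
    """
    return hmac.new(secret_key, student_id.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# Hypothetical key; in practice it would be generated and guarded
# by the data holder, never shared with the receiving party.
KEY = b"example-secret-held-by-data-holder"

token_a = pseudonymize("student-12345", KEY)
token_b = pseudonymize("student-12345", KEY)
token_c = pseudonymize("student-67890", KEY)

# The same identifier always maps to the same token, so records can be
# linked across datasets, while distinct identifiers yield distinct tokens.
assert token_a == token_b
assert token_a != token_c
```

Because the tokens are deterministic under the key, two datasets hashed with the same key can be joined on the token column while the raw identifiers stay with the original data holder.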

The themes and recommendations outlined here are meant to advance the dialogue on privacy-enhancing technologies in education data and should not be attributed to any specific organization or participant.

 

Key Themes from the Convening

Defining “Disclosure” in a PETs-Enabled World

One of the fundamental legal questions raised during the convening was whether access to data by a PET constitutes a disclosure under FERPA. FERPA defines “disclosure” of education records or personally identifiable information from education records as “[p]ermitting access to, releasing, transferring, or otherwise communicating personally identifiable information contained in education records by any means (including oral, written, or electronic means) to any party except the party identified as the party that provided or created the record” (34 CFR § 99.3).

If a school encrypts student records using a given PET before sharing them with a third party for analysis, and that third party never has the ability to decrypt the data, does that count as a disclosure?

Legal and data experts debated whether current definitions of “disclosure” under FERPA need updating to account for scenarios where data is accessed but not in a form that allows re-identification. The heart of the conflict lies in whether FERPA's pre-digital era language can properly address computational arrangements where data technically changes hands but remains mathematically protected from human interpretation. While PETs offer mathematical guarantees for data security, regulatory frameworks must address both privacy protections and legitimate access requirements, as compliance involves not just preventing data exposure but also ensuring proper authorization for data processing.

Ownership and Control of Educational Records

Another key issue discussed was ownership of the data created through PETs. If a school uses a contracted third-party system under its direct control to generate a new, de-identified dataset (such as de-identified student performance data), who owns that dataset? Third-party systems could include cloud computing platforms, analytics providers, secure computation providers, AI-powered educational tools, or research institutions collaborating with schools. However, ambiguity remains around what constitutes control over such records, especially when private vendors and artificial intelligence tools are involved in their creation.
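To make the scenario concrete, here is a minimal sketch of how such a de-identified dataset might be produced: direct identifiers are dropped, and records whose quasi-identifier combination is too rare are suppressed, since small groups are easier to re-identify. All field names, records, and the suppression threshold are illustrative, not drawn from any real system or standard.

```python
from collections import Counter

# Hypothetical student performance records; all values are invented.
records = [
    {"name": "A. Lee",  "id": "s1", "grade_level": 9,  "district": "North", "score": 88},
    {"name": "B. Kim",  "id": "s2", "grade_level": 9,  "district": "North", "score": 91},
    {"name": "C. Diaz", "id": "s3", "grade_level": 9,  "district": "North", "score": 77},
    {"name": "D. Roy",  "id": "s4", "grade_level": 12, "district": "South", "score": 95},
]

DIRECT_IDENTIFIERS = {"name", "id"}
MIN_GROUP_SIZE = 3  # suppress any quasi-identifier group smaller than this

def deidentify(rows, group_keys=("grade_level", "district")):
    # Step 1: remove direct identifiers from every record.
    stripped = [{k: v for k, v in r.items() if k not in DIRECT_IDENTIFIERS}
                for r in rows]
    # Step 2: count how many records share each quasi-identifier
    # combination, and keep only records in sufficiently large groups.
    counts = Counter(tuple(r[k] for k in group_keys) for r in stripped)
    return [r for r in stripped
            if counts[tuple(r[k] for k in group_keys)] >= MIN_GROUP_SIZE]

released = deidentify(records)
# The lone grade-12 record is suppressed (its group has only one member);
# the three grade-9 records are released without names or IDs.
assert len(released) == 3
assert all("name" not in r and "id" not in r for r in released)
```

Even a simple pipeline like this raises the ownership question the participants debated: the released dataset is derived from the school's records but materialized by a third-party system.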

The Role of PETs in Enabling Research

FERPA lacks a broad research exemption, though it does include certain exceptions, such as the audit and evaluation exception and the studies exception. As a result, educational institutions cannot share identifiable student data with researchers unless they obtain individual consent from parents or students, or unless the research qualifies under these narrow exceptions, for example research conducted “for or on behalf of” the educational institution under a written agreement. This restriction creates significant barriers for researchers studying important educational questions, particularly those requiring large datasets or longitudinal analyses.

Participants explored how PETs could expand access to education data for research by ensuring compliance with these exceptions. For example, some PETs could help agencies share and connect data more securely. If implemented properly and in compliance with legal requirements, such approaches could be valuable in understanding long-term student outcomes, such as how early education interventions impact workforce participation, while maintaining appropriate privacy protections.
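As a sketch of how agencies might connect data without exchanging raw identifiers, the example below has two hypothetical agencies tokenize their IDs with a shared keyed hash and link records on the resulting tokens. All agency names, identifiers, keys, and values are invented for illustration; a real deployment would need a legal agreement covering the key exchange itself.

```python
import hmac
import hashlib

def token(raw_id: str, key: bytes) -> str:
    """Keyed hash of an identifier, used as a linkage token."""
    return hmac.new(key, raw_id.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical key agreed between the two agencies under a data-sharing
# agreement; neither agency sends raw identifiers to the other.
SHARED_KEY = b"key-agreed-between-agencies"

# Agency A: early-education records, keyed only by hashed ID.
education = {token(i, SHARED_KEY): attended_prek
             for i, attended_prek in [("p1", True), ("p2", False), ("p3", True)]}

# Agency B: workforce records, likewise keyed by hashed ID.
workforce = {token(i, SHARED_KEY): employed
             for i, employed in [("p1", True), ("p2", True), ("p4", False)]}

# Longitudinal analysis on the overlap, matched purely on tokens:
linked = [(education[t], workforce[t])
          for t in education.keys() & workforce.keys()]
assert len(linked) == 2  # p1 and p2 appear in both datasets
```

Whether such an exchange of tokens constitutes a FERPA "disclosure" is precisely the kind of question the convening surfaced; the technique reduces exposure of identifiers but does not by itself settle the legal analysis.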

However, one challenge raised was whether PETs should be used as a workaround for legal restrictions. Some participants cautioned against viewing PETs as a means to avoid compliance with FERPA, emphasizing that privacy-enhancing tools should be implemented in alignment with existing laws rather than as a means to bypass them. Educational institutions must carefully consider whether using these technologies aligns with both the technical requirements and the underlying intent of FERPA's privacy protections.

Bridging the Gap Between Legal and Technical Communities

A recurring theme was the need for better education and collaboration between legal experts and technologists. Some attorneys approach PETs with caution as they work to understand the scope of how these technologies function in practice. Simultaneously, certain technologists raise important considerations about whether protecting data through PETs, while potentially meeting FERPA's technical requirements for confidentiality, adequately addresses the possibility of student re-identification within the broader data ecosystem.

Drawing parallels to the early days of cloud computing, experts noted that similar questions were raised regarding the risks of cloud storage. Over time, as standards and best practices developed, cloud adoption became mainstream. A similar trajectory may be needed for PETs, with increased investment in legal-technical education and clearer guidance.

 

Looking Ahead: Recommendations for Policy and Practice

The convening concluded with a discussion on next steps for integrating PETs into education data governance. Key recommendations included:

  • Clarifying FERPA and Applicable State Guidance: Explicit guidance is needed on how, or whether, PETs align with FERPA and relevant state requirements, particularly regarding legitimate purpose, de-identification, and disclosure.
  • Developing Standards for PET Implementation: Policymakers and technical experts should collaborate on establishing best practices for implementing PETs in a secure, private, and compliant manner in education and workforce data systems.
  • Building Legal-Technical Capacity: Government agencies, educational institutions, and research organizations should invest in training programs to bridge the gap between legal and technical expertise in data privacy.
  • Exploring Legislative Reforms: Given the limitations of FERPA’s current framework, there may be a need for policy updates that explicitly account for the role of privacy-enhancing technologies in responsible data sharing.

As education and workforce data become increasingly essential for transforming policy, ensuring privacy while maximizing data utility will be an important challenge. The Data Foundation looks forward to advancing this dialogue, fostering collaboration, and supporting efforts to responsibly integrate PETs into data governance frameworks.

 

For more insights and to stay updated on future convenings, follow https://datafoundation.org/events, https://fpf.org/fpf-events/upcoming/, and https://mdi.georgetown.edu/privacy-enhancing-technologies/.

 


