A lot of stuff happened in 1974, including, but not limited to:
- “Hooked on a Feeling” (by one-hit wonder Blue Swede)
- The Exorcist, released in the final days of 1973, packed theaters
- All in the Family and Archie Bunker graced our TVs for a fifth season
- More “pocket” calculators (for really big pockets)
- More “word processors” (fancy typewriters)
- Nixon’s resignation signaling the end of Watergate
- The first bar-code scanner hit the stores
- George Foreman faced Muhammad Ali
And oh yes—the Family Educational Rights and Privacy Act, known as FERPA.
Passed with swift bipartisan efficiency, FERPA provided safeguards for student educational records, from K-12 through higher ed. It let parents access and request corrections to their children’s school records, required their consent before those records could be shared, and transferred these rights to students once they turned eighteen.
But for all that happened in 1974, plenty of things hadn’t been invented yet, and therefore weren’t covered by FERPA. Like:
- Personal computers
- Compact discs
- The Internet
- Flat-screen TVs
- Social media (Mark Zuckerberg’s parents hadn’t even met yet)
- Mobile devices
- Justin Bieber
And oh yes—online education.
FERPA, you hardly know us
Like other privacy issues in the digital era, protecting student privacy has become a thorny, contentious, not-likely-to-end-soon drama.
Protagonists include parents, students, and privacy advocates, rightly anxious about who is managing all those digital footprints and toward what end. Bold educators and entrepreneurs hoping to transform teaching and learning through cutting-edge tools and approaches. And institutions often playing it safe, even at the cost of missed learning opportunities.
What they can all agree on is that the current privacy legislation doesn’t begin to accommodate today’s educational landscape. Considering all that has happened since 1974, we shouldn’t be surprised that FERPA today is not up to the job of privacy protection:
- Regulatory scope is too narrow. FERPA applies only to educational institutions, and its sole enforcement weapon is cutting off federal funds (a recourse so drastic it has never been used). It doesn’t cover the many third-party vendors who typically make up a school’s technological ecosystem, vendors who often manage large amounts of student data.
- “Educational record” as defined by FERPA is woefully outdated. What might have been pages in a cabinet in 1974 can now encompass a digital galaxy of student artifacts and interaction, much of it not currently regulated.
- FERPA has no comprehensive digital strategy. Education in the twenty-first century is complex and multifaceted. What’s needed is a thoughtful plan that engages parents and students in the process of managing personal data, while also balancing privacy with freedom for students, educators, and the ongoing pursuit of innovation.
Legislators have tried, up to a point. They’ve revised FERPA a few times, and the 1978 Protection of Pupil Rights Amendment (PPRA) further expanded privacy protection to cover what schools might gather via surveys and other means.
More recently, genuine signs of progress have come out of Washington. A May 2014 White House report on big data and privacy included recommendations for modernizing FERPA, and this was followed, in 2015, by the introduction of a bipartisan bill, the Student Digital Privacy and Parental Rights Act.
But not much has happened since then. Maybe we shouldn’t expect much help from Congress, at least any time soon. After all, they are an ocean liner, not a jet ski, and changes of direction do not come easy.
Sharing is caring?
A central episode in the student privacy saga is the demise of inBloom.
Funded by the Gates Foundation and the Carnegie Corporation, this non-profit had a broadly ambitious goal: gather huge amounts of data on K-12 students across nine states and analyze the data to help schools provide personalized learning. Then a storm of protest erupted over privacy risks. States began pulling out, and in 2014, inBloom abruptly shut down.
As any educator will tell you, failure is almost always instructive. So what can we learn here?
- Trust, knowledge, and collaboration make the world go round. Much of the pushback came down to simple fear of the unknown: potential security abuses both unintentional (hackers) and intentional (marketers). If inBloom had been more proactive about informing and engaging the public (such as providing clear opt-out paths), things might have turned out differently.
- Innovation must coexist with reality. Part of inBloom’s model was the collection and storage of critical personal information on its own servers. This data might have yielded many useful insights, but its sensitive nature became too big a burden to overcome.
- We need more dialogue about personalized learning. Seemingly lost in the uproar over data security was any serious discussion about the implications of unique student experiences. Ultimately, we may discover that we can’t get something for nothing: the most effective personalized learning will demand the free flow of personal data. But no matter how we all land on this subject, we should be having conversations about it. Only then can we best understand what to safeguard and how.
Next steps (with or without FERPA)
Just because the government isn’t quite addressing modern privacy concerns doesn’t mean the rest of us–parents, students, privacy advocates, teachers, educational institutions, third-party vendors–can’t keep moving forward. In fact, there are examples of this happening already:
- In the absence of formal regulation, many ed-tech vendors have drawn up their own safeguard policies, and more than 250 companies have thus far signed on to a Student Privacy Pledge for the software industry (our application has been submitted!).
- A federally funded project from Carnegie Mellon University, called LearnSphere, is creating an open data repository that is similar to what inBloom developed, but with some key differences: no personal information is collected, and there’s no central storage for the data.
Of course, there’s still plenty more we can do, with or without government intervention. While the May 2014 White House report focused on privacy regulations, it also recognized the importance of digital literacy as a twenty-first century skill set:
“In order to ensure students, citizens, and consumers of all ages have the ability to adequately protect themselves from data use and abuse, it is important that they develop fluency in understanding the ways in which data can be collected and shared…although such skills will never replace regulatory protections, increased digital literacy will better prepare individuals to live in a world saturated by data.”
A key FERPA concept is “legitimate educational interest”—the operating principle for determining if educators can rightly share student data. The language is vague and, especially nowadays, ripe for loopholes big enough to fit a data warehouse.
Not surprisingly, “legitimate educational interest” has prompted some anxious responses, not only from parents, students, and privacy advocates suspicious of its latitude, but also from institutions wary of interpreting it too liberally.
But maybe it’s time we embrace “legitimate educational interest” as the learning challenge it is, exploring its dimensions and implications from all angles. Considering the rich digital resources available to educators, how should they share data in a way that best serves a student’s right to both privacy and a transformative education?
Let’s also include students in the discussion. Along with making informed, intelligent consent decisions, addressing this question can be part of the digital literacy they should be acquiring.
Because despite many limitations, FERPA does enshrine an essential truth: as a student, it’s your education, and you have a right to manage it how you want.