Are schools leaking personal data on their students?

The latest research indicates most are. And the data they are leaking flows to third parties through remote learning apps and school websites.

Remote Learning Apps: feeding our children’s information to data brokers

As a society, we have underestimated the threat to children’s privacy posed by the online learning applications most schools use today. Students’ data is flowing to third parties despite laws specifically intended to protect it, and likely in ways those laws would prohibit if the parties involved were aware of it.

Serious student privacy issues began before COVID-19

While the COVID-19 pandemic expanded the scope and severity of the privacy problem by pushing schools into online learning, the data abuse began well before the pandemic.

The current privacy epidemic is the result of multiple factors. First is the widespread use of off-the-shelf code that bundles third-party functionality, which collects, records, and often sells or re-shares student data with still other parties. Second is the lack of visibility into which third parties are present in these apps’ data chains and what they are doing with users’ personal data.

No real choice

Parents generally oversee their children’s use of social media and watch to make sure that they don’t put too much information online. This stems from concerns over stalkers and predators, but it also helps prevent data brokers from getting too much data about kids before they’re even old enough to understand the risks. 

But with the remote learning applications schools use for online instruction, the story is different. You don’t really have a choice unless you decide to homeschool your kids. And even then, you may end up using online tools or apps that expose you to many of the same problems.

So, what are online learning apps doing with our children’s data?

Most of us assume schools and their vendors are following all the rules, especially around student privacy. Two recent studies, however, suggest that these apps could be moving a lot of personal student data without their developers, or the schools, even knowing it.

This week, the Me2B Alliance released a US-based study of 73 mobile applications used by 38 schools across the country.

A few of their findings:

  • On average, each app was sending data to 10.6 third-party data channels
  • Some of these apps were sending data to what Me2B classifies as “high risk” or “very high risk” third parties (those that share data onward with many other entities); this risk was significantly lower with iOS apps than with Android ones
  • Data collection was happening even when the apps were not actively being used, and often even when the user was not signed in
  • Data shared with third parties included mobile advertising identifiers (MAIDs), unique identifiers used to build consumer profiles; see the sketch below
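To make the MAID risk concrete, here is a minimal sketch of how a data broker can use the identifier as a join key. Everything in it is hypothetical (the app names, event shapes, and identifier are invented for illustration, not taken from the Me2B report):

```typescript
// Hypothetical analytics events as two unrelated school apps might emit them.
// The MAID (IDFA on iOS, AAID on Android) is the cross-app join key.
interface AnalyticsEvent {
  maid: string;      // mobile advertising identifier (shared device-wide)
  app: string;       // which app reported the event
  event: string;     // what the user did
  timestamp: number; // Unix seconds
}

const fromHomeworkApp: AnalyticsEvent = {
  maid: "38400000-8cf0-11bd-b23e-10b96e40000d",
  app: "example-homework-tracker",
  event: "assignment_viewed",
  timestamp: 1620000000,
};

const fromQuizApp: AnalyticsEvent = {
  maid: "38400000-8cf0-11bd-b23e-10b96e40000d", // same device, same MAID
  app: "example-quiz-game",
  event: "session_start",
  timestamp: 1620003600,
};

// A broker receiving both streams can simply group by MAID, merging two
// apparently unrelated apps' data into one profile of the same child.
const merged = [fromHomeworkApp, fromQuizApp].filter(
  (e) => e.maid === fromHomeworkApp.maid
);
console.log(`Events linked to one device: ${merged.length}`);
```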

More privacy questions than answers

Me2B didn’t dig deeper into whether these third parties were handling data correctly, but it did conclude that the privacy notices associated with these apps are unlikely to be current or complete. This is especially true for apps that share data with third parties who themselves pass data onward.

Evidence of a wider problem

Me2B is not the only organization to look at the student privacy problem. IDAC also studied applications used in an educational context (not necessarily in schools), releasing its findings in September 2020.

They manually tested 98 iOS and Android apps and ran automated testing on 421 (the specific apps are listed in their report).

They found that:

  • A small number of the apps were collecting and sharing location data and persistent personal identifiers, and a few were also exposing personal information in URL query strings, an unsafe practice because query strings are routinely captured in server logs and can leak through Referer headers.
  • Potentially worse, 79 of the 123 apps they manually tested were communicating personal data to multiple third parties (parties other than the company that owns the app), and automated testing produced similar results.
  • A significant number of the Android apps were engaging in “ID bridging”: sending both the AAID (a resettable, “less sensitive” identifier for ad targeting) and the persistent Android ID. Sending both circumvents the user’s ability to reset the AAID and disconnect from their prior history, and it violates Google’s developer policy; see the sketch after this list.
  • Finally, many of the apps use third-party code in the form of analytics or advertising SDKs, meaning they were bringing third parties into the mix by default, and in ways they likely were not aware of.
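To illustrate what ID bridging can look like on the wire, here is a hypothetical sketch. The endpoint, parameter names, and values are all invented (real SDKs vary widely); it also shows why carrying identifiers in a URL query string is risky:

```typescript
// Hypothetical SDK beacon illustrating "ID bridging": the resettable AAID
// is sent alongside the persistent Android ID, so resetting the AAID no
// longer disconnects the device from its prior history.
const beacon = new URL("https://collect.example-analytics.net/v1/event");
beacon.searchParams.set("aaid", "10a4f9c2-7c7d-4e43-b6a1-2f5e9d3c8b01"); // user can reset this
beacon.searchParams.set("android_id", "9774d56d682e549c");               // this survives the reset
beacon.searchParams.set("event", "lesson_opened");

// Query strings like this also end up in server access logs, proxy logs,
// and Referer headers: one reason personal data in URLs is unsafe.
console.log(beacon.toString());
```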

Remote learning apps may just be the tip of the privacy iceberg

While these studies focused on mobile apps, the problem isn’t exclusive to them. Any website that collects personal data without being transparent about exactly how that data is used, and who else is accessing it, presents the same risks.

Educational websites that incorporate third-party code expose users to all the risks associated with that code: who else is the site owner sharing data with, and why? This is not only a big privacy problem but also a major cybersecurity problem. Third-party code integrated into a school’s website increases the risk of undetected attacks on that codebase, such as the MageCart attacks on web payment forms.

Privacy is a core technological problem that’s getting worse

It’s worth repeating that these privacy issues didn’t originate with the COVID-19 pandemic. Schools were already using apps to communicate with students and parents, make announcements, track homework assignments, and handle many other tasks before the pandemic hit.

For example, one school was even using software to approve and create digital hall passes in 2019, as covered in the Washington Post. All the pandemic did was massively accelerate the use of educational technology, as schools were forced to migrate to online learning to continue operating without endangering lives.

Now, almost everyone is affected

The number of students using these apps increased tremendously over the last year, and the day-to-day functions performed by learning software expanded as well. Some companies that previously had relatively little involvement with education (e.g., Zoom) suddenly found themselves deep in the market, and not without growing pains.

In this kind of rapidly changing environment, schools were understandably prioritizing how best to get their core functions online. They did not have time to properly vet the nuances of privacy and data handling. Most assumed that child-data and general privacy protections were covered. But clearly, in some cases, schools were encouraged to cut corners, ramp up quickly, and work out the details later.

With schools already under-resourced, and their staff told to get everything online in a relatively short period of time, it’s no surprise that they missed some of these critical privacy issues.

Aren’t there laws protecting children’s personal information?

Absolutely. In the US, there’s the Children’s Online Privacy Protection Act (COPPA), which places restrictions on operators of websites or online services that collect personal information from children under age 13 or market to that age group.

There’s also the Family Educational Rights and Privacy Act (FERPA), which protects student educational records. In the EU, the GDPR places special restrictions on consent and the use of data pertaining to children (defined by default as under age 16, though member states can lower that to 13).

The problem with enforcement

Some of this data collection and use is unlawful, and that’s not a hard call: the consents and privacy statements being published are clearly incomplete. But people should understand that initiating an enforcement action is not straightforward.

One of the problems with using third-party code is that your codebase becomes extremely complex, and you don’t have full visibility into it or control over it. It gets even worse as third parties pass data to fourth parties, fifth parties, and so on.

Organizations operating an app or website which includes third-party code are often unaware of all the third parties within their ecosystem, much less what those parties are doing with the data they collect. 

Case in point: when IDAC reached out to a couple of the app developers to ask why they were sharing data with third parties, the developers were unaware it was even happening. (In both cases they shut it down once they found out, so the sharing clearly wasn’t supporting any critical operational function.)

Data leaks, data breaches, and hacks

It’s bad enough that you don’t necessarily understand what the code in your mobile app or on your school’s website does with your users’ data; worse, you also become vulnerable to attacks on that third-party code.

For example, there’s a loose criminal group called “MageCart” whose stock in trade is compromising third-party code in web payment forms and shopping carts to steal credit card numbers and defraud users. We’ve written about MageCart’s disastrous attacks on corporations like British Airways on this blog before.

Because these are attacks against the third-party codebase rather than your own, and because the MageCart group is careful in how it operates, they are especially difficult to catch before they do damage. So far, MageCart has mostly gone after the big fish, but the risk of exposure and the likelihood of a data exploit remain for everyone else.
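One partial defense against this kind of third-party script tampering is Subresource Integrity (SRI): the page pins a cryptographic hash of the script it expects, and the browser refuses to execute anything that doesn’t match. Here is a minimal sketch, using only Node’s built-in modules; the vendor file and CDN URL are hypothetical:

```typescript
// Generate a Subresource Integrity (SRI) value for a pinned third-party script.
import { createHash } from "crypto";
import { readFileSync } from "fs";

const scriptBody = readFileSync("vendor/analytics.js"); // hypothetical local copy
const digest = createHash("sha384").update(scriptBody).digest("base64");

// The browser will refuse to run the script if its contents ever change,
// e.g. after a MageCart-style compromise of the vendor's CDN.
console.log(
  `<script src="https://cdn.example-vendor.com/analytics.js" ` +
    `integrity="sha384-${digest}" crossorigin="anonymous"></script>`
);
```

Note that SRI only helps with scripts that are versioned or self-hosted; a vendor script that changes in place will simply stop loading until you re-pin it.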

What can be done?

If you’re a parent or student, you don’t have a lot of options here. You can use the privacy settings available to you in your OS or web browser and restrict data flow to the extent possible. You can also pressure your schools to review the privacy practices of their apps.

If you’re a school or school system, you can have your counsel review your legal agreements with app vendors for privacy provisions and ensure that all privacy notices you and your students receive are up to date. You should raise questions when you have concerns. And you absolutely should ensure that all of your faculty, staff, and students get appropriate training on online privacy, which very few have actually had.

If you’re a vendor providing educational services, you need to get your arms around your environment, and the third (fourth, fifth, etc.) parties within it, as soon as possible.

Don’t wait for a fine or breach

Now that the initial panic phase of the pandemic is over, privacy enforcement actions are coming. The US FTC and the European data protection authorities (DPAs) are likely to get very aggressive if you’ve been giving student data to commercial organizations.

To start, make sure you understand everything in your ecosystem and where it sends data. Make sure your privacy notices are accurate, up to date, and complete.
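One low-tech way to start the mapping, as a sketch: load your site with the browser’s developer tools open, export the network activity as a HAR file, and list every third-party host the page contacted. The file name and domain below are hypothetical:

```typescript
// List third-party hosts contacted by a page, from a DevTools HAR export.
import { readFileSync } from "fs";

const OWN_DOMAIN = "myschool.example.edu"; // hypothetical: your own domain
const har = JSON.parse(readFileSync("school-site.har", "utf8"));

const thirdPartyHosts = new Set<string>();
for (const entry of har.log.entries) {
  const host = new URL(entry.request.url).hostname;
  if (!host.endsWith(OWN_DOMAIN)) thirdPartyHosts.add(host);
}

// Every host printed here is a party your privacy notice should account for.
console.log([...thirdPartyHosts].sort().join("\n"));
```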

You should also close off any channels where data is flowing unnecessarily. Remember, you need a specific purpose to move sensitive data, and if you didn’t know the data was moving in the first place, you legally couldn’t have had a specific purpose for moving it.
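On the web side, a Content-Security-Policy header is one concrete way to close channels: the browser will refuse to load scripts from, or send data to, any host you haven’t allow-listed. A minimal sketch, assuming an Express server and hypothetical vendor hostnames:

```typescript
// Allow-list where the browser may load scripts from and send data to;
// every other destination is blocked by the browser itself.
import express from "express";

const app = express();

app.use((_req, res, next) => {
  res.setHeader(
    "Content-Security-Policy",
    [
      "default-src 'self'",
      // hypothetical: the one analytics vendor you have actually vetted
      "script-src 'self' https://cdn.vetted-vendor.example.com",
      "connect-src 'self' https://collect.vetted-vendor.example.com",
    ].join("; ")
  );
  next();
});

app.listen(3000);
```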

Tech solutions to tech problems

Here is where a solution like Lokker can help. Not only will a free Lokker Privacy Scan map your entire ecosystem, showing you everything that is happening, but our Privacy Automation Platform also lets you selectively shut off or anonymize individual data flows to third-party apps. This cuts off your website’s ability to leak sensitive information at the source.

Opt-in vs Opt-out Privacy

At Lokker, we are all about changing the rules of how companies use and protect private data. Today’s systems for website publishing and mobile application development were not built with the privacy-first mentality they should have been.

Lokker firmly believes in Privacy by Design and will always work to develop solutions that fix the current “opt-out” privacy assumptions by moving companies to a zero-trust, “opt-in”-only default for private data sharing of any kind, for children and adults alike. And we will always strive to do this in the most efficient and transparent ways possible.