Last year, like many new parents, I was walking the tightrope of keeping my young child healthy and happy. When my daughter left infancy and became a much more aware toddler, I decided it was high time to put her in preschool. It was better than her staring at the same four walls of the living room while I contemplated the health risks over and over.
After a few internet searches and some phone calls, I chose one that was close and had spots open (which were pretty hard to come by). When I started the enrollment process, I saw a flyer in the huge packet that immediately threw me into a new set of worries I didn’t want to deal with: “We also use Brightwheel, a mobile application to log attendance, share milestones, and keep parents up to date on daily interactions.”
I don’t know what goes through other parents’ minds at this point, but I do privacy- and security-oriented work as my day job at the Electronic Frontier Foundation, so I couldn’t help but look at the security controls Brightwheel gave me as a parent. This was my child’s data, left up to some company. Don’t get me wrong, the app provided some comfort, allowing me to see my baby smiling, making friends, and riding bikes during outside playtime.
Especially in that first week, when you aren’t there to oversee every aspect of her life for the first time. But looking at my account, I saw very few settings that said anything about security. There was a PIN code to check her in and out, but that was about it.
Over several months, I looked at the gigantic amount of data that was being shared and stored by this app every day. Diaper changes, story time pictures, nap times, etc. The more data about my daughter I saw, the more my worry grew.
By October 2021, I couldn’t sit on this any longer. I wouldn’t call myself a hacker by the definition in most people’s heads. But in this case, being a mother meant doing everything in my power to keep my daughter safe.
So I began a months-long dive into the landscape of early education apps, and I didn’t like what I found. I am lucky in where I work. Some cold emails and a little networking later, a coworker (also a new parent being asked to use Brightwheel) and I finally got a meeting with an actual person at the company.
The meeting was productive in the sense that Brightwheel seemed to understand the concerns but confirmed how woefully behind the entire industry was in privacy and security protections. For example, a very basic and well-known protection measure is two-factor authentication. You know how some services now require you to enter a one-time code in addition to your password? That’s two-factor authentication, which gives an enormous bang for your buck in terms of security.
It’s been spreading rapidly, and at least offering it is pretty much an industry standard these days. Brightwheel now has two-factor authentication available for all school or day care administrators and parents, but it is the only one of these day care apps to have done so. Which is bullshit.
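For anyone curious what those one-time codes actually are under the hood, here is a minimal sketch using the open-source pyotp library. It is purely illustrative, with placeholder account and issuer names, and makes no claim about how Brightwheel or any other day care app implements two-factor authentication.

```python
# Time-based one-time password (TOTP) sketch using the pyotp library.
# Illustrative only; real services add rate limiting, backup codes, and so on.
import pyotp

# At enrollment, the service generates a shared secret and shows it to the
# user, usually as a QR code scanned by an authenticator app.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)
print(totp.provisioning_uri(name="parent@example.com", issuer_name="ExampleDaycareApp"))

# At login, after the password checks out, the service asks for the current
# six-digit code and verifies it against the same shared secret.
submitted_code = totp.now()  # in real life, typed in by the user from their phone
print("Code accepted:", totp.verify(submitted_code))
```

The point is that a stolen password alone is no longer enough; an attacker would also need the code generated on the user’s own device at that moment.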
Several of these companies don’t disclose what data they collect and where it goes. And what we’ve found is that, in some cases, they track and share information in much the same way Facebook is known to. That’s bad enough when it’s data about adults on a public social media site, but it’s horrifying when it’s information about a preschooler.
Figuring out the privacy and security issues around the app your child’s day care uses isn’t like researching how to sleep-train a baby or what high chair to use, where parents can easily find trusted sources of information. This information isn’t out there. Parents and administrators are being sold on convenience, but they aren’t given even the most basic tools to choose a secure app.
And for those of us who have the know-how to find these vulnerabilities and fix them, we’ve run into the problem of companies not wanting to hear about it. As an ethical hacker, my plan was to privately disclose what I found to the companies and wait 90 days for a response (a common security industry practice). Even there, I hit roadblocks.
Beyond not finding a way to contact them on their websites, I discovered that researchers based in Germany released a paper in March 2022 identifying security and privacy problems with 42 early education and day care management applications. In addition to outlining the vulnerabilities, the paper also explained that the researchers did their due diligence by ethically reporting the issues and had almost no response from the companies. That’s unacceptable.
If your company handles sensitive information, and researchers do the work of figuring out how to make your product more secure for you, not responding to them is a terrible practice. I published my own research into these apps on EFF’s website, where you can dig into the technical details, but the major takeaway is that these services are not as secure as they can or should be. We have some very basic demands for all of these companies, starting with offering two-factor authentication and being transparent about what data they collect and where it goes. In addition, we would like to see it become standard for these apps to secure any messages sent between the schools and parents.
End-to-end encryption would do that, and there’s no need for a server to see the updates on a child’s life. And finally, these companies need to monitor and proactively respond to reports of problems with their applications. It should not take a technologist who happens to work at a digital privacy organization and a coworker who happens to be a lawyer on these same issues cold-emailing and working contacts to get a meeting.
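To make the end-to-end encryption point above concrete, here is a rough sketch using the PyNaCl library: the message is encrypted on the sender’s device with the recipient’s public key, so any server relaying it only ever handles ciphertext. This is a generic example of the technique, not a description of any of these companies’ systems.

```python
# End-to-end encryption sketch using PyNaCl (Python bindings for libsodium).
# A relay server sitting between the two devices would only see ciphertext.
from nacl.public import PrivateKey, Box

# Each party generates a keypair on their own device; only public keys are shared.
teacher_key = PrivateKey.generate()
parent_key = PrivateKey.generate()

# The teacher encrypts a daily update for the parent's public key.
sender_box = Box(teacher_key, parent_key.public_key)
ciphertext = sender_box.encrypt(b"Nap time went great today!")

# Only the parent's private key can decrypt it.
receiver_box = Box(parent_key, teacher_key.public_key)
print(receiver_box.decrypt(ciphertext).decode())  # Nap time went great today!
```

The important property is that the decryption key never leaves the parent’s device, so the company operating the relay could not read the message even if it wanted to.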
Being able to get daily updates on how your child is faring in day care is extremely comforting to a parent. It was for me. Unfortunately, that comfort was soon outweighed by the danger I found.