Follow the code: What the VTech data breach means for developers

"This is the type of situation that can put privacy regulators and advocates into action across an entire industry," says privacy consultant and iKids guest columnist Linnette Attai, of the unprecedented VTech hacking scandal. She also outlines best practices for kidtech developers in light of recent events.
December 10, 2015

By Linnette Attai, founder of compliance consulting firm PlayWell, LLC

For those in the children’s tech industry, it’s the nightmare scenario: A trusted brand in kids electronics is the victim of a crime. The website for VTech’s Learning Lodge app store was hacked, giving some still-unknown party access to data from more than four million parents and more than six million children. How could such a thing happen? And what does it mean for companies like VTech?

Unfortunately for VTech and its customers, hacking into the database was apparently not all that difficult. It’s been widely reported that the hacker used a common tactic – the SQL injection attack. In plain English, this means the attacker submitted specially crafted input that the server mistook for part of a database command. It’s one of the oldest and best-known security flaws in the book, widely exploited by hackers. There are simple steps companies can take to help protect against it – steps that VTech seems to have missed.
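To make the flaw concrete, here is a minimal sketch (illustrative only – not VTech’s actual code) of how an injection works and how the standard defense, a parameterized query, stops it. The table and data are invented for the example; the key point is that the placeholder version passes user input to the database as data, never as SQL.

```python
import sqlite3

# In-memory database standing in for a real user store (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (email TEXT, name TEXT)")
conn.execute("INSERT INTO users VALUES ('parent@example.com', 'Pat')")

def find_user_unsafe(email):
    # VULNERABLE: user input is spliced directly into the SQL string,
    # so input like  ' OR '1'='1  changes the meaning of the command.
    query = "SELECT name FROM users WHERE email = '%s'" % email
    return conn.execute(query).fetchall()

def find_user_safe(email):
    # SAFE: the ? placeholder sends the input to the database as data,
    # so it can never be interpreted as part of the command itself.
    return conn.execute(
        "SELECT name FROM users WHERE email = ?", (email,)
    ).fetchall()

malicious = "' OR '1'='1"
print(find_user_unsafe(malicious))  # returns every row in the table
print(find_user_safe(malicious))    # returns nothing: no such email exists
```

The unsafe version hands the attacker the whole table; the safe version treats the same input as a harmless, nonexistent email address. Every mainstream database library supports parameterized queries, which is why this class of attack is considered preventable.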

To make things even easier for the hacker, some of the data was not encrypted either in transit or at rest. Sensitive information, such as answers to password recovery questions, was stored in plain text. And apparently the data from children was stored in a way that allowed it to be easily connected to the parent users. Although VTech didn’t collect the last names of children, the hacker could connect a child’s first name to a parent’s last name because of the way VTech designed its database. Data exposed included parent names, addresses, password recovery answers, children’s first names, gender, birthdates and possibly photos, audio files and chat logs.
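Storing recovery answers in plain text is likewise avoidable. A hedged sketch of the standard alternative, using only Python’s standard library: keep a salted, slowly computed hash of the answer and compare against it, so a stolen database does not reveal the answer itself. (A production system would typically reach for a vetted library such as bcrypt or argon2; the function names here are invented for illustration.)

```python
import hashlib
import hmac
import os

def hash_answer(answer):
    # Store a random salt plus a slow, salted hash -- never the plain text.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", answer.lower().encode(), salt, 100_000)
    return salt, digest

def verify_answer(answer, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", answer.lower().encode(), salt, 100_000)
    # Constant-time comparison avoids leaking information through timing.
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_answer("Rex")
print(verify_answer("rex", salt, digest))   # True: correct answer still verifies
print(verify_answer("Fido", salt, digest))  # False: wrong answer is rejected
```

With this design, a hacker who dumps the database gets salts and digests, not the answers themselves – a meaningful difference when the answers describe a family’s children.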

There’s more, but you get the idea. Security fundamentals seem to have been missing.

One look at the VTech Kids website tells us that privacy fundamentals may have been missing, as well. VTech’s privacy policy promises users that registration data and personally identifiable information sent to VTech is encrypted in transit and housed in a database that is not accessible over the internet. Privacy Policy 101: What you say must be what you do, and it seems that this was not the case with VTech.

And what about all of the data that was exposed? Good privacy practices require minimizing the collection of data. In the case of VTech, it’s not entirely clear why all the data was required in the first place.

There are a number of reasons to minimize data collection. When it comes to children under the age of 13, COPPA requires companies to collect only the data necessary to provide a service or feature to young users. Regardless of the age of the user, VTech has illustrated another reason to collect only what you really need: data can be a liability.

So, what happens now?

A global hack of children’s data is unprecedented, and it’s not surprising that the situation has raised the ire of regulators around the world. In the US, Congress has asked VTech to answer questions about its privacy and security practices, as have the Attorneys General from Illinois and Connecticut. Hong Kong’s Office of the Privacy Commissioner for Personal Data is investigating, and so is the UK’s Information Commissioner’s Office. At least one class action lawsuit is being filed. The financial price from penalties, attorney fees and lost consumer trust will be astronomical.

However, it’s not just VTech that will be under scrutiny. This is the type of situation that can put privacy regulators and advocates into action across an entire industry. Mobile apps and interactive toys, already questioned regularly about privacy practices, may be viewed with even more skepticism. It can also make parents very, very nervous about technology. And hackers will look for new targets.

Which means that developers now have a lot more to consider. The situation shows us that the compliance mantra of “designing with privacy and security in mind” is not just a nice policy to have, but rather an actionable practice for companies to implement. Perhaps your company already has solid compliance practices in place, but keeping those practices fresh is as important as establishing them in the first place.

As the climate heats up, there are five distinct areas on which developers should focus in order to keep their compliance program in good shape:

1. Security: Review your security policies and practices. What physical, technical and administrative controls are in place? Are they aligned with industry best practices?
2. Privacy: When was the last time you reviewed your privacy policy and practices? Have you kept them current as both your product and the legislation has evolved? Are you minimizing the data you collect?
3. Education and training: Do all of your employees understand your privacy and security policies? Do they have at least a basic understanding of the regulatory requirements? How often do you provide training?
4. Culture of Compliance: How is the importance of data privacy and security communicated within your organization? Is it top of mind for employees as they design, produce and market products? How can you breathe new life into those responsibilities so that they remain fresh for your teams?
5. Industry Awareness: How do you stay informed about regulatory action impacting your business? Is your team aware of policy changes as they unfold?

Unfortunately, there is no such thing as a data system that is 100% secure. As our industry brings innovative technology products to market, we will continue to see criminals attempt to exploit vulnerabilities in the systems. We will also continue to see skeptical eyes cast on new product ideas. The most important thing developers can do is to keep building on existing compliance practices. The more practices evolve and the more that privacy and security by design becomes ingrained in company cultures, the easier it will be to innovate within the bounds of good privacy and security practices.

PlayWell, LLC guides companies in the US and abroad through the regulatory and self-regulatory environments surrounding digital and mobile privacy, user safety, security, product development, content, advertising and marketing. Attai has more than 20 years of experience in the compliance industry, with special expertise in the youth entertainment and education sectors.

