Who’s watching the kids?

According to Flurry Analytics, not only is Christmas the biggest day for new device activation, it's also when the largest number of apps are traditionally installed. For developers, parents and regulators, this is a wake-up call to pay better attention to what, and how, kids are playing online. Privacy and safety expert Linnette Attai, founder of PlayWell LLC, has some advice.
January 21, 2015

The Christmas tree has been recycled, the menorah de-waxed, and school is back in session. Now, what about all those gifts? Many kids (and parents) couldn’t wait to use the fancy new devices that Santa dropped off. According to Flurry Analytics, not only is Christmas the biggest day for new device activation, it’s also when the largest number of apps are traditionally installed. Not surprisingly, games and messaging see the greatest action. For developers, parents, and regulators (informal or legal), this is a wake-up call to pay better attention to what, and how, kids are playing online. Privacy and safety expert Linnette Attai, founder of PlayWell LLC, has some advice.

WS: What are the biggest safety issues with kids’ apps? 

LA: At the top of the list of concerns are predators, bullying and children in distress. When planning any social features, it’s incredibly important to build in protections that minimize risk, support users properly, and ensure that kids can use your app safely. You can hope for the best, but you need to plan for the worst.

WS: Do you differentiate between privacy and safety? 

LA: Definitely! These are related but different concepts, even though they’re often used interchangeably. When we refer to privacy, we’re talking primarily about practices around data collection and use, including statutory requirements such as COPPA, FERPA (Family Educational Rights and Privacy Act), PPRA (Protection of Pupil Rights Amendment) and a variety of state regulations. Safety refers to online events such as cyberbullying and sexting, as well as the physical safety of users that may be compromised as a result of an online interaction. This is most relevant when there are community and social features. Poor privacy practices can definitely impact safety, but there are no regulatory requirements specifically dealing with user safety. It’s very common to see attention to safety get missed, or shortchanged, in the development process.

WS: Isn’t enforcement an FTC job? What are they doing? 

LA: Enforcement of COPPA falls to the FTC. They actively monitor websites and apps, and investigate complaints. When COPPA was updated, developers had both an official and a bit of an unofficial grace period to get into compliance. That’s over now, and we expect enforcement action to pick up. We’ve seen a few FTC COPPA settlements over the past several months, and one warning letter. State Attorneys General are also empowered to bring enforcement action around COPPA. Maryland and New Jersey have already taken action this year, and other states are sure to follow. Despite all of that, given the sheer volume of websites and apps, it’s challenging for regulators to monitor everyone. Industry needs to own the responsibility that comes with creating products intended for young users, and operate in alignment with the existing regulations.

WS: What should developers know? And what do you do to help them?

LA: It’s critically important for developers to understand privacy requirements. Whether for entertainment or education technologies, regulation is an active, evolving area.  If you’re not aware of the requirements or don’t know where to start, get help.  The nightmare is not only the potential harm to a user and the regulatory repercussions, but the headline, “App Developer X Violates Children’s Privacy.”  It’s very hard to recover from that type of misstep when the products you create are for children.

Part of my aim is to help developers understand the special considerations that wrap around our industry, and how to apply that knowledge. There are over 50 years of regulatory action to guide us. Having immersed myself in the history behind it all, I see where the hot buttons are, and use that to predict what’s coming next and why. Understanding what’s come before helps me not only map out a strategy for successful product development now, but also guide the creation of business models that will stand the test of time.

WS: Student data privacy regulation is in the news now. How do you work with ed tech companies to help them navigate?

LA: Ed tech companies are under extraordinary compliance pressures right now.  Last year saw over 100 bills across 30 states, all focused on industry responsibility for maintaining student data privacy.  So far we’ve seen 15 more bills this year.  On the federal front, we’ve seen proposed legislation to update the Family Educational Rights and Privacy Act (FERPA), and President Obama recently announced that ensuring student data privacy is a White House priority.

For those in the ed tech space, the time to understand their obligations and those of schools to protect student data privacy is NOW!  I work with ed tech companies to bring their products into compliance with federal and state student data privacy regulation, and leverage my experience with schools to help companies properly navigate the purchase process.  My aim is to help bridge the gap between compliant, responsible ed tech companies and schools.  It’s a combination of compliant practices and compliance communications and positioning that brings transparency about products to the schools that may want to put the technology in the hands of students.

As a side note to that, I’m the architect and author of the first industry self-regulatory assessment program for FERPA. In addition to working on product development projects, I build large-scale compliance policies and company programs, and I essentially designed this as the model for creating an official, federal safe harbor program should one exist in the future.

I also provide COPPA and FERPA training for industry and schools, and run workshops for educators, parents and youth groups interested in better understanding how to choose the right technology and manage privacy and safety in the digital world.

WS: Parents are pretty busy managing all kinds of on- and off-line issues. Other than (often unsuccessfully) monitoring every moment of activity, are there other ways parents can protect their kids? 

LA: In my workshops I often hear from parents who are simply overwhelmed by all the different devices, apps and sites their kids are using. They don’t know how to manage it all. The truth is, we can’t expect parents to be up to speed on every product out there.

However, parents can dig into the settings of devices, set up parental controls, review privacy settings with their kids, turn off location features, use filtering software and most importantly, talk to their children about good behavior and safety online and off. I often remind parents that technology has changed, but good parenting hasn’t.  Kids can be taught that the rules, norms and expectations for behavior that exist in the real world apply to the online world as well.  We talk to kids all the time about real world safety.  We need to have those same conversations – early and often – about safety in the online world as well.

WS: Sadly, it’s pretty easy for a child under 13 to circumvent the system. Do you see this changing?

LA: With the tipping point for kids having a phone sitting at about 8 years old and dropping, I don’t see that changing anytime soon.  When parents upgrade their phone, they’re often handing the old one back to the child.  Some children as young as 6 are sporting smartphones.  Unfortunately, not only do many kids circumvent the system, parents either don’t know their kids are on sites that are meant for older users, or they know it and allow their kids to lie about their age.  It concerns me because not only are there privacy issues with that, but safety issues as well.  Parents often don’t realize that when we build social networking sites and apps for adults, we don’t build in the same safety protections that we would if the product were built for kids.

From the industry side, we need to continue to use age gates properly, develop them in a compliant manner, and continue to educate parents.  We tend to forget that young kids still want to be kids.  There’s room—and a need—for robust, engaging, entertaining and safe experiences for them, and when done well, the kids will show up!
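For developers wondering what “using age gates properly” can look like in practice, here is a minimal sketch of one common approach: a neutral gate that asks for a full birth date without hinting at the cutoff, and remembers a failed attempt for the session so a child can’t simply back up and enter an older age. The class name, threshold handling, and session flag are illustrative assumptions, not from any particular framework or from PlayWell LLC.

```python
from datetime import date

# COPPA applies to children under 13 (assumed fixed threshold for this sketch)
COPPA_AGE_THRESHOLD = 13

class AgeGate:
    """Neutral age gate: collects a full birth date (no yes/no "are you 13?"
    prompt that telegraphs the right answer) and locks the session after an
    under-age response so an immediate retry with an older date is rejected."""

    def __init__(self):
        self._blocked = False  # session flag; persists until the app restarts

    def check(self, birth_date: date, today: date = None) -> bool:
        today = today or date.today()
        if self._blocked:
            return False  # a prior under-age answer locks this session
        # Compute age, accounting for whether the birthday has passed this year
        age = today.year - birth_date.year - (
            (today.month, today.day) < (birth_date.month, birth_date.day)
        )
        if age < COPPA_AGE_THRESHOLD:
            self._blocked = True
            return False
        return True

gate = AgeGate()
print(gate.check(date(2005, 6, 1), today=date(2015, 1, 21)))  # under 13: False
print(gate.check(date(1990, 6, 1), today=date(2015, 1, 21)))  # still blocked: False
```

In a real app the blocked flag would typically live in persistent session storage rather than a Python object, and passing the gate only means COPPA's parental-consent requirements don't apply; it is not a substitute for the safety protections discussed above.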

For more of Linnette’s ideas and observations, go to

Send your thoughts to me at
