When the US Congress laid down the law on child privacy in 1998, the world was a very different place. Sure, there was an internet, and mobile phones had already advanced beyond the bricks we chuckle nostalgically about today. But there were no mobile apps, no app stores and no virtual worlds as we know them.
In fact, the world has changed so much since the Children’s Online Privacy Protection Act of 1998 (COPPA) took effect on April 21, 2000, that the FTC decided it had to get with the times. To do just that, the US regulator recently began consulting with the public on a list of changes it proposed in September 2011, changes that would redefine the way data is collected from kids under 13 and, perhaps just as importantly, the way parents sign off on it.
Then on August 1, the FTC issued a revised set of proposed rules based on the enormous public response to the 2011 notice. The revised rules, which expand the definition of “personal information” and take into account third-party advertising networks and plug-in developers, could go into effect after a 30-day comment period. If kids’ digital media developers weren’t nervous before, they have more reason to be now.
The latest efforts of the FTC come at a critical time, as the industry’s success with COPPA compliance has been intermittent to date. Some sites do certain things well; others, not at all. A significant number of mobile app developers, however, don’t even know COPPA applies equally to them. Guess what? It does.
The FTC’s proposal in 2011 was as much a clarification as it was a whole new set of standards. It pinpointed the “who,” and looked to clearly define the “what” and “how” to ensure that if any company targets kids under the age of 13 in the US in any manner with digital content, it knows COPPA applies to its practices.
Major concerns about minor privacy
There are many things that COPPA is not. COPPA, for example, doesn’t really care all that much about the suitability of your content for kids. Nor does it particularly care about how much money you earn from targeting them. (If sites and apps extract too much money, that’s a matter of protecting parents’ wallets, not kids’ privacy.)
COPPA is really only about protecting privacy, and the FTC’s new proposed rules hope to offer kids a more clearly defined safety net when it comes to their app or online adventures.
The central issue when it comes to kids’ digital privacy is parental oversight—especially the way permission is garnered from parents. The industry currently relies heavily on a scheme called “email plus.” Essentially, kids wanting access to a site have to provide a parent’s email address so that the site or app can contact them for sign-off. But it doesn’t take much time to figure out the inherent flaw in that particular scheme—and it takes kids even less. The fact is that no one knows whose email address kids really enter.
But the approach was readily adopted by the industry because it’s automated, cheap and easy to scale to any application—it just doesn’t necessarily always involve parents.
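For developers unfamiliar with the mechanics, the email plus flow can be sketched roughly as follows. This is a minimal illustration, not any site’s actual implementation: the in-memory stores and the commented-out mail helper are invented, and a real service would sit behind a web framework, a database and a mail provider.

```python
import secrets

# Hypothetical in-memory stores; a real site would use a database.
PENDING = {}       # token -> child account id
APPROVED = set()   # child ids with confirmed parental consent

def request_consent(child_id, parent_email):
    """Step 1 of email plus: the child supplies a parent's address and
    the site mails out a sign-off link. Nothing here verifies that the
    address actually belongs to a parent -- the scheme's known flaw."""
    token = secrets.token_urlsafe(16)
    PENDING[token] = child_id
    # send_email(parent_email, consent_url)  # hypothetical mail helper
    return token

def confirm_consent(token):
    """Step 2: whoever clicks the emailed link 'grants' consent."""
    child_id = PENDING.pop(token, None)
    if child_id is None:
        return False
    APPROVED.add(child_id)
    return True

def may_collect_data(child_id):
    """Gate any personal-data collection on recorded consent."""
    return child_id in APPROVED
```

As the sketch makes plain, the scheme’s convenience is also its weakness: consent is granted to whoever controls the inbox, parent or not.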
There are alternatives, of course, but they are far from perfect, too. Some services require a parent’s credit card number for authorization. The FTC is suggesting the solution might be an authorization that references a parent’s government-issued identification (such as a Social Security number), or even a short video call with parents. One of the FTC’s new recommendations is that mobile app developers and ad networks collecting information from kids under 13 become responsible for obtaining parental consent before kids can use their services. The agency also wants websites that cater to both children and adults to age-screen all visitors in order to protect those who are actually underage.
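Age-screening all visitors is typically done with a neutral age gate: ask every visitor for a birthdate up front, then route under-13s into a parental-consent flow rather than into the full experience. A minimal sketch, with illustrative routing labels of my own:

```python
from datetime import date
from typing import Optional

COPPA_AGE = 13  # COPPA applies to kids under 13

def age_on(birthdate, today):
    """Whole years between birthdate and today."""
    years = today.year - birthdate.year
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years

def screen_visitor(birthdate, today: Optional[date] = None):
    """Route a visitor based on the age they report at the gate."""
    today = today or date.today()
    if age_on(birthdate, today) < COPPA_AGE:
        return "needs_parental_consent"
    return "full_experience"
```

The obvious caveat, which applies to any age gate, is that it only screens the age visitors choose to report.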
While both rules are well-intentioned, many in the industry who readily acknowledge the flaws of email plus caution that the alternatives might be worse.
“A lot of developers are going to take a long, hard look at whether or not it is viable to get parental consent,” warns Stuart Drexler, an L.A.-based product developer and brand strategist who has worked with huge online properties such as Moshi Monsters and Club Penguin. “It will call into question a lot of their existing data. They will have to re-qualify it in some way, shape or form, perhaps, or they may have to rip out all kinds of features if they don’t have the right parental approvals. I think that’s really problematic.”
Worse still, a more complicated approval regimen may stop developers from launching kid-targeted activities in the first place and actively discourage innovation. If that’s the case, Shai Samet, founder and president of the kidSAFE Seal Program, suggests that more and more kids will turn to sites with content not intended for them, such as YouTube and Facebook. (Facebook, for its part, is looking to revisit the age policy that currently prohibits kids under 13 from having their own accounts.)
But email plus has other issues as well. Currently, notes Samet, COPPA consent rules are less strict if the information collected is for internal use only. If consent rules become more stringent, or if the definition of personal information changes to include photos, videos and audio files as the FTC is suggesting it might, then sites will need explicit parental permission before allowing children to upload certain photos or videos for contests, even if they are not intended for public viewing.
Of course, one of the biggest problems when it comes to privacy monitoring on mobile apps is device pass-along. Parents often hand kids their mobiles or tablets to keep them occupied. Ray Sharma, president and founder of Toronto, Canada-based game developer XMG Studio, describes it as a “fundamental issue” with mobile, but also one of the platform’s best features.
“It’s a control issue,” he observes, “unless you start getting users to sign in and authenticate their identity every time.”
Sharma suggests this is an area where operating system makers could make a significant difference by creating safe modes of play, or ways for parents to quickly switch over OS functionality. Like it or not, the OS players are already in the privacy fight, so they would do well to consider it.
Companies doing it for themselves
At the end of the day, no privacy protection system is foolproof or ideal, but with its latest round of checks and balances the FTC is hoping it can keep pace with the fast-moving mobile and online industry and use its resources to enforce the law.
Additional stipulations in the FTC’s new proposed rules expand the definition of a “website or an online service directed to children” to include apps targeting kids, and they update the definition of personal information to include cookies and IP addresses.
Smart companies, therefore, have decided to be aggressively proactive.
Disney, for example, recently announced an online safety education drive for Club Penguin, backed by US$4.7 million in media spend to support campaigns reaching 100 million households in Europe, the Middle East and Africa. The site, like some other online services, also limits chat functionality to a pre-defined dictionary of words and invests heavily in robust live moderation. Some sites and apps also offer parental control dashboards.
Sharma points out that child privacy often comes down to the headspace and intentions of the designer. “We’re more in the indie developer mentality,” he notes. “We only want to use [user] data to make the game better, therefore it can be 100% blind collection.”
Some sites have begun turning to third-party sources to make sure their verification systems are robust enough to comply with the suggested new COPPA regulations.
Marshall Harrison, founder and CEO of Westport, Connecticut-based Imperium, developed ChildGuardOnline as an alternative to email plus. The service works with clients to verify parental consent using simple but hard-to-fake parental data (from a parent’s date of birth to the last four digits of their Social Security number). It also offers parents an interface to keep tabs on kids’ usage and rescind permissions if need be.
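Knowledge-based checks of this sort boil down to matching a handful of parent-supplied fields against a trusted record. The sketch below is a generic illustration of that idea, not ChildGuardOnline’s actual API; the record store and field names are invented.

```python
# Hypothetical trusted records, keyed by the parent's email address.
# A real verifier would query a vetted identity database instead.
RECORDS = {
    "parent@example.com": {"dob": "1975-03-14", "ssn_last4": "6789"},
}

def verify_parent(email, dob, ssn_last4):
    """Grant consent only when every supplied field matches the record.
    Harder to fake than a bare email address, though still no proof
    that the person typing is actually the child's parent."""
    record = RECORDS.get(email)
    if record is None:
        return False
    return record["dob"] == dob and record["ssn_last4"] == ssn_last4
```

The design trade-off is the familiar one: the more fields a stranger would struggle to supply, the stronger the check, but also the more friction and sensitive data the parent must hand over.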
Sites can also turn to Samet’s kidSAFE Seal Program, or to an FTC-approved safe harbor program, to be certified as both COPPA compliant and, perhaps more importantly, kid-content friendly.
The one thing that all seem to agree on is that there is no one-size-fits-all solution to child privacy issues. While clarifying COPPA regs will help, ultimately the responsibility lands on parents, developers and the general public—especially the latter, who can devastate the worst offenders by publicly spreading the word online and in their communities.