Monday, October 23, 2017

What Can Voice-Activated Device Makers Legally Do With Recordings Of Kids’ Voices?

From your watch to your TV to your crockpot to your kids’ toys, the products we use in our homes are increasingly voice-activated. Unlike previous generations of devices, these newer ones are listening, getting smarter, adapting to multiple users with different accents and cadences. To do that, they listen to, record, and often transmit recordings of everyone in earshot of the device — including kids, whose private details are specifically protected by federal law, but who sometimes end up ordering hundreds of dollars’ worth of cookies. So how can Amazon, Google, Apple, or any tech company legally make an always-on device that doesn’t violate your little one’s privacy?

The Federal Trade Commission has issued guidance clarifying what it expects companies that collect children’s voice recordings to do with that data in order to stay within the boundaries of the law.

The Law

Children have more privacy protections under the law than the rest of us, thanks to the Children’s Online Privacy Protection Act of 1998 (COPPA).

COPPA requires any entity knowingly collecting personal data from children under age 13 to adhere to a specific set of privacy, data storage, and disclosure guidelines. “Personal information” includes anything that can serve as a unique identifier: a Social Security number, home address, phone number, screen name, geolocation information, a photo, or anything else that could tie an account to a single, particular child.

Entities covered by the rule must:

  • Post privacy policies
  • Provide notice to, and obtain consent from, parents about privacy practices
  • Give parents the option of letting kids’ data be used internally but not shared with third parties
  • Permit parents access to review their kids’ data or have it deleted
  • Keep kids’ data confidential and secure
  • Limit how long kids’ data is retained once it is no longer needed, and take “reasonable measures” to protect it from unauthorized access or use

The first version of COPPA debuted so long ago that the children born that year are now all legal adults themselves. However, the rule was updated in 2013 to broaden the definition of “personal data” that sites and services collect to include things that modern devices easily gather but that their counterparts of two decades ago did not: geolocation data, photos, videos, and voice recordings.

The rule doesn’t say you cannot collect the data; it says you have to do it under certain conditions and for certain reasons.

The FTC says it “will not take an enforcement action against an operator for not obtaining parental consent” before collecting a child’s voice if it is “collected solely as a replacement of written words, such as to perform a search or fulfill a verbal instruction or request.”

Basically, as long as Amazon doesn’t retain the recording of your kindergartner asking Alexa to play “Let It Go” for the four millionth time this week, or use it for other purposes than to fill your home with Elsa claiming her freedom yet again, it’s in the clear.

Are There Loopholes?

As is always the case with the law, the devil is in the details.

The FTC said today that there are limitations on the non-enforcement policy. For example, if a device prompts a child to say personal information — like their name or address — then that recording must still be handled under COPPA guidelines.

Companies must also explain in their privacy policies what they do with data and how they store and delete it, the commission adds.

This guidance, however, only covers voice activation or detection used “solely as a replacement for written words.”

Many requests to a home assistant fit under that umbrella. You can queue up a certain piece of media on your home devices with a mouse click or a tap, or you can say, “Play this.” You can turn on your computer or phone to order something from Amazon, or you can say, “Alexa, buy this.” Those are substitutions for written words.

But what the new guidance does not seem to cover are voice actions that may not be substitutions, because there is nothing they can be substituted for. For example, there’s no way to give a voice-activated doll a written instruction; such a use simply doesn’t exist.

The Toys Are Listening

As technology becomes increasingly voice-activated, the matter of what to do with all those recordings has become a more pressing problem.

Devices like the Amazon Echo, Google Home, and other “smart” home assistants are listening to and learning from the children in the house just as much as from the adults, to be sure. But there is also an entire category of “smart” toys designed to listen specifically to children.

Those, as we’ve reported before, present a huge number of challenges to kids’ privacy.

The My Friend Cayla and i-Que robot dolls, for example, send voice recordings they gather from children to a third party — one that is also a defense contractor — which then uses that data to “train” its software.

More recently, the same coalition that discovered what Cayla and i-Que were doing also presented research showing that children’s smartwatches are likewise hackable and doing who-knows-what with kids’ personal data.

Earlier this year, a different connected toy, CloudPets, was found to have left its server unsecured and at least 2 million recordings of children vulnerable to anyone who digitally strolled by to pick them up.

No less an agency than the FBI has advised parents to be very careful when buying internet-connected toys, as so many of them take advantage of kids’ data and leave them in the lurch when it comes to privacy.


