Don't let Mattel's new "digital nanny" trade children's privacy for profit

No one would knowingly compromise their child’s healthy development. But Mattel is hoping we will—and banking on it. 

In July, Mattel will release Aristotle, a Wi-Fi-enabled “digital nanny.” The company says the device will help parents nurture and teach their child from infancy to adolescence. We want parents to know what Mattel will be taking from them in exchange for this “help.” 

Aristotle is an Amazon Echo-style listening and talking device with a camera. To work, it collects and stores data about a child’s activity and interactions with it. Because Aristotle connects to other apps and online retailers, that data may be shared with those partner corporations, which may use it for a wide variety of purposes, including targeted marketing of other products to children and families. 

Will you help us stop Mattel from using this device to trade children’s privacy for profit?

In an appeal to stressed and overworked parents, Mattel describes Aristotle as a “smart baby monitor” that can “soothe” crying babies with nightlights, lullabies, and sleep sounds. We understand how tempting that offer is to parents, but as pediatrician and CCFC Board member Dr. Dipesh Navsaria notes: 

“A baby awakening in the night needs more than smoke-and-mirrors ‘soothing’ from a machine. They need the nuanced judgment of a loving caregiver, to decide when the child needs care and nurturing and when the child should be allowed to soothe themselves.”

Baby monitors can be helpful. But Aristotle isn’t a monitor; it’s an intruder. It tracks babies' feeding, sleeping, and changing patterns, stores and analyzes that data, and prompts parents to buy diapers, formula, and other products from its corporate partners.

Aristotle is meant to live in a child’s bedroom from birth to adolescence, reading bedtime stories, projecting videos, and delivering content from an endless stream of partners selling music, games, and apps. Mattel calls Aristotle a “persona, and something that the child can become comfortable with and feel close to.” And if you ask the device, it says that its “purpose in life is to help comfort, entertain, teach, and also learn from you, as we grow together.” 

In other words, Mattel wants Aristotle to have as much access to kids as possible, and hopes that its perky “kindergarten teacher” voice distracts parents from the uneasy reality that their child’s oldest friend isn’t a person, but a data-collecting, branded-content-delivering robot. 

What impact does a lifelong relationship with a corporate network disguised as a friend have on children’s development? "Honestly speaking, we just don’t know,” Robb Fujioka, Mattel’s chief products officer, admitted in an astounding moment of truth-telling. “If we’re successful, kids will form some emotional ties to this,” he said. “Hopefully, it will be the right types of emotional ties.”

To Mattel, the right types of emotional ties are ones that lead to profits, not happy and healthy kids. Multinational corporations should not decide what’s right for kids’ emotions, and young children should not be guinea pigs for AI experiments.

Even limited use of Aristotle could pose a significant risk to children. As Marc Rotenberg, President of the Electronic Privacy Information Center (EPIC), says: 

“Companies that offer Internet-connected toys are simply spying on young children. And they can’t even protect the data they secretly gather. They have already lost passwords and personal data and exposed families to ransomware demands. Toys that spy are unsafe for children.”

Please join us in telling Mattel: Put the well-being of children, and the privacy of families, ahead of corporate profits. Don’t sell Aristotle.

-------------------------------------------------------

What experts say about Mattel's Aristotle

On the "right" emotional ties: 

“Beyond the callous willingness to use our children as subjects in an unnecessary experiment, there is the more important point about this deployment of AI: With this object, there are no right ties. There are only inauthentic and disappointing ties.” - Sherry Turkle, PhD, author of Reclaiming Conversation: The Power of Talk in a Digital Age

On "soothing" and "learning": 

“A baby awakening in the night needs more than smoke-and-mirrors ‘soothing’ from a machine. They need the nuanced judgment of a loving caregiver, to decide when the child needs care and nurturing and when the child should be allowed to soothe themselves.” - Dr. Dipesh Navsaria, Pediatrician and CCFC Board member

“Story time is about much more than listening to someone read a book. The benefits of bedtime stories come not just from the stories themselves, but from the bonding ritual and emotional and physical interaction between parent and child. Stories delivered by a robot bring no nurturing benefit.” - Dr. Robert Needlman, co-founder of Reach Out and Read and Professor of Pediatrics at Case Western Reserve University School of Medicine 

On commercialism: 

“Aristotle appears to be a device for spying on children from birth to age 13. And worse, it seems that the vast data collected from this spying may be used to market to children and manipulate their behavior.” - Professor Angela Campbell, CCFC Board Member and Director of the Communications and Technology Clinic at Georgetown Law 

“Mattel claims that Aristotle will help you while educating and nurturing your child. But if used as directed, Aristotle will only weaken essential bonds between kids and parents, while promoting children’s lifelong attachment to corporations and the things they sell. Buy a baby monitor if you need one, not Mattel’s latest attempt to capture your child’s heart and mind.” - Susan Linn, Founder of the Campaign for a Commercial-Free Childhood and author of The Case for Make Believe

On privacy: 

“Companies that offer Internet-connected toys are simply spying on young children. And they can’t even protect the data they secretly gather. They have already lost passwords and personal data and exposed families to ransomware demands. Toys that spy are unsafe for children.” - Marc Rotenberg, President of the Electronic Privacy Information Center (EPIC)

“This is not a toy in the classic sense. It’s a data collection device owned and operated by a for-profit corporation with a fiscal responsibility to its shareholders to maximize profit. Children cannot consent to the type of surveillance a machine like this will perform on them, and they are too young to understand what it means to provide Mattel and its corporate partners with this sensitive information, or to understand what it means to interact with artificial intelligence.” - Kade Crockford, Director of the Technology for Liberty Program at the ACLU of Massachusetts