Bearistotle

In January 2017, Mattel and Microsoft announced the launch of Aristotle, a digital assistant explicitly aimed at very young children. Marketed by Mattel and built on Microsoft's AI technology, the device was intended to work with children from the first weeks of their lives.

Crying, for example, can trigger Aristotle to play a lullaby or a recorded message from the parent. Conversely, a child’s crying can also trigger nothing at all, letting the child settle down on their own. Parents would be able to configure these behaviors via the app.

To state the obvious, the developmental risks to a newborn of receiving a recorded message in lieu of parental attention are unknown, but I don't think we are at a place where we want to "disrupt" parenting.

Concerns about Aristotle mounted after the initial announcement. Many were privacy-related, but many others had nothing to do with privacy: they focused on the blatant irresponsibility and lack of humanity involved in outsourcing care for a child to a plastic gadget that collected data and shuffled it off to remote storage. As recently as six days ago, Mattel talked about the product as if it were going to be released.

The following quotation cites Alex Clark, a Mattel spokesperson, in an article from September 29th.

Aristotle wasn’t designed to store or record audio or video, Clark said. No third parties will have access to any personally-identifiable information, and any data shared is entirely anonymous and fully encrypted, he said.

A few key points jump out from this fantastic piece of doublespeak.

  • First, as of six days ago, the company was defending Aristotle. This suggests that they were still considering releasing this device.
  • Second, the definition of "store" needs to be clarified. Are they saying that the device has no local storage and simply transmits everything it captures? As it stands, the statement is empty. A useful statement would define what the device transmits, what it stores, and who can access each. But, of course, he is just a spokesperson. Truth costs extra.
  • Third, the last sentence makes two astounding claims: third parties can't access personally identifiable information, and any data shared is "entirely anonymous and fully encrypted." To start, it's refreshing to hear explicit confirmation that Mattel was planning on sharing data with third parties. However, the claim about not sharing personal information is a red herring. Without clarity on how they are anonymizing information, what the prohibitions are on attempts to re-identify the data set, why they are sharing data, and with whom they are sharing it, they aren't offering anything reassuring here. Finally, claiming that data are "fully encrypted" is meaningless: encrypted in transit? At rest? Between storage devices inside their network? While strong encryption is a necessary starting point, encryption isn't a blanket. There are multiple layers to using encryption to protect information, and a robust security program relies on both human and technical safeguards. Encryption is a piece of this, but only a piece.
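The re-identification worry above is not hypothetical hand-wringing; it can be sketched in a few lines. Below is a minimal illustration (all data and field names invented for this example) of a linkage attack: an "anonymized" usage log with direct identifiers removed can be joined to a second dataset on quasi-identifiers like ZIP code and a child's birth month, putting names back on the records.

```python
# Hypothetical illustration of a linkage attack on "anonymous" data.
# Both datasets and all field names are invented for this sketch.

# "Anonymized" device records: names removed, but quasi-identifiers
# (ZIP code, child's birth month) remain.
device_records = [
    {"zip": "02138", "birth_month": "2017-01", "cries_per_night": 4},
    {"zip": "02139", "birth_month": "2016-11", "cries_per_night": 1},
]

# A separate dataset an attacker might plausibly obtain
# (e.g., a leaked or purchased customer list).
customer_list = [
    {"name": "A. Parent", "zip": "02138", "birth_month": "2017-01"},
    {"name": "B. Parent", "zip": "02139", "birth_month": "2016-11"},
]

# Join on the quasi-identifiers: each "anonymous" record gets a name back.
reidentified = [
    {**customer, **record}
    for record in device_records
    for customer in customer_list
    if customer["zip"] == record["zip"]
    and customer["birth_month"] == record["birth_month"]
]

for row in reidentified:
    print(row["name"], "->", row["cries_per_night"], "cries/night")
```

This is why "we removed personally identifiable information" means little without stated limits on re-identification: the identifying power often lives in the combination of ordinary-looking fields.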

Yesterday, Mattel announced that they were cancelling Aristotle. This is the right decision, but we shouldn't confuse it with good news. It was only two years ago that Mattel brought Spy Barbie -- complete with multiple security issues -- into the world.

People of all ages are currently exposed via devices with sub-par privacy and security practices, and privacy policies that do not respect the people buying and using the products. Everything from Amazon's Echo and Alexa products, to Google Home and Family products, to Siri, to Cortana, to Google's voice search on phones, to "Smart" TVs, to connected toys, to online baby monitors -- all of these devices have potential security issues and opaque privacy terms. In most cases, people using these products have no idea what information is collected, when it is collected, how long it is stored, who can access it, or how it can be used over time. When adults use these devices around kids, we send the clear message that this invisible and constant surveillance should not be questioned because it provides a convenience.

The mistake Mattel made this time was introducing a utilitarian object. If they had wrapped Aristotle in a toy, they'd be home free.

My prediction: in 2018, Bearistotle will be the must-have toy of the season -- the friendliest, most helpful bear any child will ever need. It will retail for the bargain price of $499.99, and if you enable geotagging it will create a digital portfolio of childhood highlights to use in preschool applications.