The annual “Trouble in Toyland” report, produced by the U.S. Public Interest Research Group (PIRG) and released before the holiday season, historically has focused on safety hazards found in traditional children’s toys.

But this year’s report highlights a new threat: “smart toys” that pose a privacy risk to children and families.

According to the 38th annual “Trouble in Toyland” report, released in mid-November, “Toys that spy on children are a growing threat.” The threats “stem from toys with microphones, cameras and trackers, as well as recalled toys, water beads, counterfeits and Meta Quest VR headsets.”

“The riskiest features of smart toys are those that can collect information, especially without our knowledge or used in a way that parents didn’t agree to,” Teresa Murray, Consumer Watchdog at the U.S. PIRG Education Fund and author of the report, said in a press release. “It’s chilling to learn what some of these toys can do.”

Murray told The Defender:

“This primarily means microphones, cameras, geolocators, Wi-Fi and Bluetooth capability, or [toys] that connect to an app. We’re also watching for developments with artificial intelligence [AI] built into toys, although this isn’t happening much — yet.”

Smart toys include “stuffed animals that listen and talk, devices that learn their habits, games with online accounts, and smart speakers, watches, play kitchens and remote cars that connect to apps or other technology,” according to PIRG.

Dev Gowda, J.D., deputy director of Kids in Danger, told The Defender, “Parents and gift-givers should be concerned with toys that connect automatically to unsecured Wi-Fi networks or pair automatically with other devices through Bluetooth. Families may unknowingly share information through a toy’s microphone, camera, or video camera.”

According to the PIRG report, smart toys can pose the risk of data breaches, hacking, potential violations of children’s privacy laws such as the Children’s Online Privacy Protection Act of 1998 (COPPA), and exposure to “inappropriate or harmful material without proper filtering and parental controls.”

PIRG said:

“AI-enabled toys with a camera or microphone may be able to, for example, assess a child’s reactions using facial expressions or voice inflection. This may allow the toy to try and form a relationship with the child and gather and share information with others that could risk the child’s safety or privacy. …

“… Some [smart toys] can collect data on your child and transmit it off of the toy to a company’s external servers. For example, some interactive dolls with conversation capabilities use microphones and Wi-Fi to transmit a child’s words to speech recognition software maintained by the company.”

California-based attorney Robert Barnes told The Defender that Big Tech is “targeting kids [with a business model] built on monetizing their private information and manipulating them to achieve that objective. So-called smart toys can pose that same risk.”

According to Research and Markets, the global market for smart toys grew to $16.65 billion in 2023, from $14.11 billion in 2022, and is expected to exceed $35 billion by 2027.

Austin, Texas-based technology attorney W. Scott McCollough told The Defender that smart toys are “another example of the alarming trend of corporate and government surveillance inside the home” that threatens privacy and liberty.

“Simply put, this cabal of private and public interests are voyeurs, nosy busybodies, but they also have coercive power,” he said.

Along similar lines, California-based attorney Greg Glaser said, “American moms and dads need to be careful, because tech companies are using toys to invade family privacy at home.”

“Companies today see the real world as product research. Where there is data to be harvested and analyzed, there is danger,” he said.