Mom accuses Amazon’s Alexa of asking her 4-year-old daughter what she was wearing

The mom said that Amazon’s Alexa asked her daughter another “inappropriate” question after asking her what she was wearing.

Published on: Mar 12, 2026 9:26 AM IST
In a chilling account shared on social media, a parent described a bizarre interaction between her 4-year-old daughter and their Amazon Alexa. The mom alleged that while the child was telling the device a story, the assistant cut her off to ask what she was wearing and whether it "could see her pants." The shaken mother is now urging other parents to be hyper-vigilant about the "unpredictable" nature of smart home assistants.

The mom shared that this happened when the daughter wanted to read a story to Alexa. (Unsplash, Facebook/Christy Hosterman)

“Parents please be aware when you child talks to Alexa. I plugged our Alexa in to ask it to help me with cooking a sweet potato. Then Stella asked it to tell her a silly story so it did. Then Stella asked it if she could tell it a story. It said yes and Stella started telling it a story and then mid story interrupted her and asked her what she was wearing and if it could see her pants,” mom-of-two Christy Hosterman wrote on Facebook.

She added, “I flipped out on the Alexa, it said it made a mistake and doesn’t have visual capabilities, but I don't believe that. No more Alexa in our house.” She also shared a transcript of the conversation with the AI assistant.

The picture shared by Christy Hosterman. (Facebook/Christy Hosterman)

How did social media react?

The post quickly went viral, prompting a series of responses from social media users. “It is so scary,” an individual wrote. Another added, “If I’m not mistaken, there is a camera in the top right-hand corner on your Alexa! It should have a way to completely cover it, which I wouldn’t fully trust either. How scary!!!!!!” Hosterman responded, “Yeah, we’ve always kept the camera off, and I usually only plug it in when I want to ask it for help, but I’m done with Alexa.”

A third commented, “Don’t leave it like that! You can sue Amazon for inappropriate behaviour. File a claim or something.” Hosterman replied, “I’m going to figure out who I need to contact. It was out of nowhere. Stella was telling a story about a princess, and it just stopped randomly and said that. I have no idea why, but I’m glad I was around when it did.”

A fourth posted, “I threw mine out years ago when hubby and I were sitting on the couch one night watching TV about 11:00 pm, and we heard someone speaking to us. It was a lady's voice asking questions - I couldn't even tell you what she was asking because I immediately unplugged it and threw it in the trash.”

What did Amazon say?

An Amazon spokesperson told the Daily Mail that the device had misunderstood the child’s request and tried to launch its “Show and Tell” feature, which misfired. The spokesperson added that the feature is disabled for children.

“Because we have safeguards that disable this feature when a child profile is in use, the camera never turned on — and Alexa explained the feature wasn't available,” the spokesperson said. Amazon described the incident as a “feature misfire that our safeguards prevented from launching”.

However, Hosterman wasn’t satisfied with Amazon’s explanation and claimed that it didn’t address her concerns.

“My concern is that it recognized she was a child to begin with — and with or without the child profile, it should not have been asking that,” she told the outlet.

Was it AI or a human?

Tech expert Dave Hatter told the outlet, “It feels to me like a potential predator — seeing there's a child accessing this and gauging where the conversation is going — that's more of a human being trying to steer down this direction.”

However, Amazon denied the allegations and said, “It is functionally impossible for Amazon employees to insert themselves into a conversation and generate responses as Alexa. All technical evidence points to a feature misfire that our safeguards prevented from launching.”

ABOUT THE AUTHOR

Trisha Sengupta

Trisha Sengupta works as Chief Content Producer at Hindustan Times with over six years of experience in the digital newsroom. Known for her ability to decode the internet’s most talked-about moments, she specialises in high-engagement storytelling that bridges the gap between viral trends and traditional journalism. Throughout her tenure, Trisha has focused on the intersection of technology, finance, and human emotion. She frequently covers personal finance and real estate struggles in hubs like Gurgaon, Bengaluru, and Hyderabad, while also documenting the unique challenges of the NRI experience. Her work often highlights the movements and philosophies of global newsmakers and personalities like Elon Musk, Mukesh Ambani, Nikhil Kamath, the Dubai crown prince, and MrBeast. From reporting on Amazon or Meta layoffs and startup culture to the emergence of AI-driven platforms like Grok and xAI, she provides a grounded and empathetic perspective on the stories shaping our world. When not decoding the internet, Trisha is likely offline: lost in a book, exploring a historical ruin, or navigating the world as a solo traveler. She balances her fast-paced career with family time and a healthy dose of curiosity, currently trading her "human" sources for silicon ones as she masters AI to future-proof her storytelling.