People anxious about deviating from what AI tells them to do?

Reddit r/artificial / 4/4/2026

💬 Opinion · Signals & Early Trends · Ideas & Deep Analysis

Key Points

  • A Reddit user describes a friend who followed ChatGPT’s hair-dye steps instead of the dye manufacturer’s instructions and became visibly stressed about deviating from the AI’s guidance.
  • The user notes that the boxed dye instructions explicitly specify the correct method for that formula (mix and apply all at once), which conflicts with what ChatGPT recommended.
  • The core issue highlighted is not technical accuracy but the emotional effect of trusting AI advice even when reliable, on-hand instructions contradict it.
  • The post asks whether others have experienced similar anxiety or overreliance on AI instructions in everyday, non-AI tasks.

My friend came over yesterday to dye her hair. She had asked ChatGPT for the 'correct' way to do it. Chat told her to dye the ends first, wait about 20 minutes, and then do the roots.

Because of my own experience with dyeing my hair, that made me sceptical, so I read the instructions in the box dye package. It specifically said to mix it and apply everything all at once. That's how this particular formula is designed to work.

I read the instructions on the package out loud and told her we should just follow what the manufacturer says. She got visibly stressed and told me that 'ChatGPT said to do it differently'.

I pointed out that the company that made the dye probably knows how its own product is supposed to be applied. She still seemed anxious about going against what ChatGPT told her to do.

It was such a weird moment. She was genuinely stressed about ignoring the AI even though the real instructions were right there in her hands.

Has anybody had similar experiences?

submitted by /u/qxrii4a