Writing the loss function: AI, feeds, and the engagement optimizer

Reddit r/artificial / 5/4/2026


Key Points

  • The article argues that much of the low-quality “AI slop” on social media is driven by recommender systems that optimize for what performs with audiences rather than for overall quality.
  • It suggests the problem is not that the AI is broken, but that the systems are intentionally targeting engagement-relevant objectives, which can reward content that works for someone “approximately like you.”
  • It frames the “loss function” as the real driver of outcomes, implying that changing metrics/objectives would be necessary to reduce harmful or low-value content loops.
  • The piece emphasizes that recommender systems are operating as designed, shifting responsibility toward the optimization goals chosen by product and platform operators.
  • It implicitly critiques the way training/optimization signals can produce feedback loops that amplify suboptimal content over time.
There is growing AI slop on social media. Recommender systems push whatever works, and there is always some slop that works for someone approximately like you. These systems are functioning exactly as intended, which means the issue is what they are optimizing for, not the AI itself.
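The "what they are optimizing for" point can be sketched with a toy ranker. Everything here is hypothetical (item names, scores, and the objectives themselves); it only illustrates how swapping the objective, not the model, changes which content surfaces:

```python
# Hypothetical sketch: the same ranker under two objectives.
# Names and numbers are illustrative, not any platform's actual system.

def rank(items, objective):
    """Sort candidate items by a scoring objective, best first."""
    return sorted(items, key=objective, reverse=True)

# Toy candidates: predicted engagement vs. an (assumed) quality estimate.
items = [
    {"name": "thoughtful_post", "engagement": 0.4, "quality": 0.9},
    {"name": "ai_slop",         "engagement": 0.7, "quality": 0.1},
]

# Objective 1: engagement only -- slop ranks first.
def engagement_only(item):
    return item["engagement"]

# Objective 2: engagement discounted by quality -- slop ranks last.
def quality_aware(item):
    return item["engagement"] * item["quality"]

print(rank(items, engagement_only)[0]["name"])  # ai_slop
print(rank(items, quality_aware)[0]["name"])    # thoughtful_post
```

The model's predictions never change between the two runs; only the objective does, which is the post's point about where responsibility sits.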

submitted by /u/AWildMonomAppears