We should decelerate AI adoption by law, at least for the short term.

Reddit r/artificial / 5/2/2026

💬 Opinion · Signals & Early Trends · Ideas & Deep Analysis · Industry & Market Moves

Key Points

  • The author argues that the biggest near-term risk of AI is not the end-state technology but the speed of adoption, which can cause sudden job displacement.
  • They claim that unlike prior innovations, LLM-driven change can happen almost overnight and potentially eliminate many white-collar roles in a very short time.
  • The piece warns that rapid automation will likely worsen existing wealth disparities, regardless of whether people are skeptical or enthusiastic about AI.
  • Cultural appeals to "slow down" adoption are unlikely to work because corporations prioritize growth and profit, not social stability.
  • The author concludes that governments should intervene—potentially through legal measures—to enable a smoother transition to an AI-centered economy in the short term.

This is probably a controversial take in this sub, but to be clear, this is not an anti-AI post; it is just about our implementation of it.

My biggest fear of AI is not the final product. I am fully confident that in 100 years, once we adjust to an AI-centred economy, there won't be any major problems. Not to say it would be perfect, but I think we would eventually structure ourselves around it in a (somewhat) healthy way.

My primary concern is the short term. With every innovation, there is a generally accepted level of job loss; that will just happen. It usually wasn't a big deal, because innovation and adoption have historically been slow processes.

But with AI, particularly LLMs, this is happening all at once (almost overnight) and has the potential to wipe out every single white-collar job. Whether you are a Luddite or an Accelerationist, you cannot deny that it is going to have a huge effect on the economy and will deepen the wealth disparities that already exist.

Culturally, it is not enough to say "let's slow down our adoption of this, so millions don't lose their jobs." That will do nothing. Corporations do not exist to follow cultural norms or keep society from cracking; they exist to grow and make money, which is not illegal by any stretch.

However, I think that now, more than ever, governments should step in, in some capacity, to give us a smoother transition to a fully AI-centred future. I know this is vague ("stepping in" means something different to everyone), but this argument is more philosophical than strictly political.

submitted by /u/palopatrol