Even Microsoft knows Copilot shouldn't be trusted with anything important

The Register / 4/3/2026

💬 Opinion · Ideas & Deep Analysis · Tools & Practical Usage

Key Points

  • Microsoft’s Copilot terms of service explicitly frame the tool as being for “entertainment” and warn that it can produce incorrect information.
  • The article argues that these contractual limitations effectively mean Copilot should not be trusted for anything important or high-stakes decision-making.
  • It highlights a mismatch between how such assistants are often marketed/used and the level of reliability and responsibility implied by the terms.
  • The piece encourages users and organizations to treat Copilot outputs as non-authoritative and to verify critical information through other means.
  • It positions the situation as a cautionary signal about governance, accountability, and risk management when adopting AI assistants.

Terms admit it is for entertainment only and may get things wrong

Thu 2 Apr 2026 // 17:04 UTC

A recent surge of interest in Microsoft's Terms of Use for Copilot is a reminder that AI helpers are really just a bit of fun.

Although the document was last updated in late 2025, the Terms of Use for Copilot for Individuals recently attracted fresh attention from netizens. They include this gem: "Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don't rely on Copilot for important advice. Use Copilot at your own risk."

Regular readers of The Register won't be shocked by Microsoft's warning that Copilot gets things wrong and should not be relied on. The company itself has long acknowledged the assistant's limitations. During the London leg of its AI tour, for example, every demonstration of Copilot wizardry came with a warning that the tool could not be fully trusted and that human verification was required.

The same applies to any other AI assistant: they can be useful, but their output still needs checking, particularly on anything consequential, such as medical advice or an investment plan.

As one commenter on Hacker News pointed out: "Anthropic does a somewhat similar thing. If you visit their ToS (the one for Max/Pro plans) from a European IP address, they replace one section with this: Non-commercial use only. You agree not to use our Services for any commercial or business purposes and we (and our Providers) have no liability to you for any loss of profit, loss of business, business interruption, or loss of business opportunity." (The Register checked this from a US and a European IP and can confirm this is the case.)

The commenter added: "It's funny that a plan called 'Pro' cannot be used professionally."

As for Copilot's Terms of Use, they may not be new, but the attention is useful for two reasons. It is a reminder to read the text users so often click through, and it underlines that chatbots such as Copilot are neither companions nor dependable sources of advice.

Instead, they are error-prone tools that can be helpful one moment and confidently wrong the next. Some in the tech industry may market AI assistants as though they put a genius in every laptop, but Microsoft's own warning is rather less grand: "It can make mistakes, and it may not work as intended."

Copilot for Individuals may be for entertainment purposes only. Microsoft 365 Copilot, meanwhile, can be just as inaccurate, only with fewer laughs. ®
