OpenAI now lets you screenshot your privacy in the foot

The Register / 4/23/2026


Make your model smarter through self-surveillance

Wed 22 Apr 2026 // 19:56 UTC

Those who cannot remember Microsoft Recall are condemned to repeat it. 

Today, that applies to OpenAI, which has quietly introduced an opt-in research preview called Chronicle. It's designed to capture the user's screen and feed those images to OpenAI's Codex agent so it has access to more contextual information.

"Chronicle augments Codex memories with context from your screen," the company explains in its documentation. "When you prompt Codex, those memories can help it understand what you've been working on with less need for you to restate context."

For those who have forgotten or may have missed the outcry, Microsoft in 2024 introduced a Windows feature called Recall that takes screenshots of the user's desktop environment every few seconds and saves the results to disk. The idea is that providing Copilot services with more contextual information makes them more useful.

The cybersecurity community promptly piled on, describing Recall as a keylogger, a privacy nightmare, and litigation bait. After a few months of public bludgeoning, Microsoft made some revisions to appease critics. 

Nonetheless, browser maker Brave went on to offer Recall screenshot blocking, which looks like a worthwhile endeavor given our own tests that found Recall saving images of credit card numbers and passwords despite supposed sensitive information filters.

OpenAI perhaps forgot about Microsoft's reputational flogging, or maybe it believes the needs of the model outweigh the needs of the few who bother with security and privacy. Another possibility is that the AI biz has embraced masochism as a public relations strategy.

No sooner had OpenAI's Chronicle documentation appeared this week than security researcher Michael Taggart took note of the resemblance, writing, "Oh my god, OpenAI reinvented Recall, but for macOS."

On the plus side, Chronicle is self-inflicted – it's opt-in – and available only in the Codex app for macOS.

The strikes against it are more extensive. OpenAI's documentation explains some of these problems: "Before enabling, be aware that Chronicle uses rate limits quickly, increases risk of prompt injection, and stores memories unencrypted on your device."

So it burns through Codex rate limits faster, increases the user's exposure to prompt injection through screen captures that may contain malicious instructions, and sends selected screenshot data to OpenAI's servers to generate local memories from OCR and other extracted context. That's not the most compelling sales pitch.

At least the local image storage is brief – OpenAI says screenshots are stored for only six hours.

But the data derived from those images via OCR text extraction may persist beyond that time in "memories" – text-based Markdown files that make information available in later sessions.

OpenAI's description of the memory generation process omits some details. The company says screen captures are temporarily stored on-device, then processed on its servers to generate "memories," which in turn get stored on-device.

The screen captures transmitted to OpenAI are not used for training or stored – unless required by law – the documentation claims. However, it's not clear whether the memories – the OCR-derived text – are stored on company servers, or could be stored given a lawful demand to do so. The Register asked OpenAI to clarify, and will update this story if we hear back.

In any event, while screen captures are short-lived, the text stored in memories ($CODEX_HOME/memories_extensions/chronicle/) remains until deleted. It's worth noting that anyone using Chronicle may end up re-sharing captured content with OpenAI through prompts to Codex that use those stored memories.
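Because those memories are plain Markdown files on disk, anyone worried about what Chronicle has retained can inspect or purge the directory by hand. A minimal shell sketch – the `$CODEX_HOME` variable and the `memories_extensions/chronicle/` path come from OpenAI's documentation, but the `~/.codex` fallback below is our assumption, not something the docs confirm:

```shell
# Sketch: inspect and purge Chronicle's on-device memories.
# $CODEX_HOME is from OpenAI's docs; the ~/.codex fallback is an assumption.
CHRONICLE_DIR="${CODEX_HOME:-$HOME/.codex}/memories_extensions/chronicle"

# See what Chronicle has remembered (Markdown files, newest first)
ls -lt "$CHRONICLE_DIR" 2>/dev/null || true

# Wipe the lot; the memories persist until deleted, so this is the off switch
rm -rf "$CHRONICLE_DIR"
```

The short-lived screen captures live in a separate directory that would warrant the same scrutiny, though deleting memories does nothing about data already sent to OpenAI's servers.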

OpenAI does acknowledge that Chronicle poses some risk: "Both directories for your screen captures and memories might contain sensitive information. Make sure you do not share content with others, and be aware that other programs on your computer can also access these files."

You've been warned: The footgun shoots you in the foot. ®
