Deliver hyper-personalized viewer experiences with an agentic AI movie assistant using Amazon Bedrock AgentCore and Amazon Nova Sonic 2.0
Amazon AWS AI Blog / 3/31/2026
💬 Opinion · Developer Stack & Infrastructure · Signals & Early Trends · Tools & Practical Usage
Key Points
- The post demonstrates how to build an agentic AI movie assistant that delivers hyper-personalized viewing recommendations through natural dialogue.
- It uses Amazon Bedrock AgentCore alongside the Strands Agents SDK to orchestrate agent behavior and integrate user preference understanding.
- Amazon Nova Sonic 2.0, a speech-to-speech model, powers the assistant’s real-time voice interaction for a more engaging “entertainment concierge” experience.
- A Model Context Protocol (MCP) layer is used to connect the agent with contextual information so the assistant can tailor results to individual user preferences.
- Two concrete use cases are presented to show how the architecture improves the overall user viewing experience.
In this post, we walk through two use cases that enhance the user viewing experience using agentic AI tools and frameworks, including the Strands Agents SDK, Amazon Bedrock AgentCore, and Amazon Nova Sonic 2.0. The agentic AI system uses the Model Context Protocol (MCP) to deliver a personal entertainment concierge that understands user preferences through natural dialogue.
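The full architecture lives in the linked AWS post; as a rough, hypothetical illustration of the personalization idea, here is a minimal sketch of the kind of preference-aware recommendation tool an agent might call (all names, data, and scoring logic are invented for illustration, not taken from the post):

```python
# Hypothetical sketch of a preference-aware recommendation tool of the kind
# an agent could expose over MCP. All names and data are illustrative; the
# actual post builds on Strands Agents, Bedrock AgentCore, and Nova Sonic 2.0.
from dataclasses import dataclass, field


@dataclass
class UserProfile:
    """Preferences the agent accumulates through natural dialogue."""
    liked_genres: set[str] = field(default_factory=set)
    disliked_genres: set[str] = field(default_factory=set)


def score(movie: dict, profile: UserProfile) -> float:
    """Simple overlap score: reward liked genres, penalize disliked ones."""
    genres = set(movie["genres"])
    return len(genres & profile.liked_genres) - 2 * len(genres & profile.disliked_genres)


def recommend(catalog: list[dict], profile: UserProfile, top_k: int = 3) -> list[str]:
    """Return the titles the profile scores highest."""
    ranked = sorted(catalog, key=lambda m: score(m, profile), reverse=True)
    return [m["title"] for m in ranked[:top_k]]


catalog = [
    {"title": "Neon Heist", "genres": ["thriller", "crime"]},
    {"title": "Starlight Sonata", "genres": ["romance", "drama"]},
    {"title": "Quantum Alley", "genres": ["sci-fi", "thriller"]},
]
profile = UserProfile(liked_genres={"thriller", "sci-fi"}, disliked_genres={"romance"})
print(recommend(catalog, profile, top_k=2))  # → ['Quantum Alley', 'Neon Heist']
```

In the architecture the post describes, a tool like this would sit behind the MCP layer, so the dialogue agent can update the profile from conversation and fetch tailored results on demand.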