Deliver hyper-personalized viewer experiences with an agentic AI movie assistant using Amazon Bedrock AgentCore and Amazon Nova Sonic 2.0

Amazon AWS AI Blog / 3/31/2026

💬 Opinion · Developer Stack & Infrastructure · Signals & Early Trends · Tools & Practical Usage

Key Points

  • The post demonstrates how to build an agentic AI movie assistant that delivers hyper-personalized viewing recommendations through natural dialogue.
  • It uses Amazon Bedrock AgentCore alongside the Strands Agents SDK to orchestrate agent behavior and integrate user preference understanding.
  • Amazon Nova Sonic 2.0 powers the assistant’s natural voice conversations, creating a more engaging “entertainment concierge” experience.
  • A Model Context Protocol (MCP) layer is used to connect the agent with contextual information so the assistant can tailor results to individual user preferences.
  • Two concrete use cases are presented to show how the architecture improves the overall user viewing experience.

In this post, we walk through two use cases that enhance the viewing experience using agentic AI tools and frameworks, including the Strands Agents SDK, Amazon Bedrock AgentCore, and Amazon Nova Sonic 2.0. The system uses the Model Context Protocol (MCP) to connect the agent with contextual data, delivering a personal entertainment concierge that understands user preferences through natural dialogue.
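The core pattern described above can be illustrated with a minimal, self-contained sketch: an agent fetches per-user context from a preference layer (standing in for the MCP integration), then tailors its recommendations accordingly. The names `PreferenceStore` and `MovieConcierge` are hypothetical illustrations of the pattern, not the actual Strands Agents SDK or Bedrock AgentCore APIs.

```python
# Simplified sketch of the personalization pattern, assuming a toy catalog.
# PreferenceStore and MovieConcierge are illustrative names, not real SDK classes.
from dataclasses import dataclass, field


@dataclass
class PreferenceStore:
    """Stands in for the MCP context layer: returns per-user preferences."""
    prefs: dict = field(default_factory=dict)

    def get(self, user_id: str) -> dict:
        # Unknown users fall back to empty preferences.
        return self.prefs.get(user_id, {"genres": [], "mood": None})


CATALOG = [
    {"title": "Blade Runner", "genre": "sci-fi"},
    {"title": "The Godfather", "genre": "crime"},
    {"title": "Spirited Away", "genre": "animation"},
]


class MovieConcierge:
    """Toy agent loop: fetch context, filter the catalog, return picks."""

    def __init__(self, store: PreferenceStore):
        self.store = store

    def recommend(self, user_id: str) -> list[str]:
        prefs = self.store.get(user_id)
        liked = set(prefs.get("genres", []))
        picks = [m["title"] for m in CATALOG if m["genre"] in liked]
        # With no matching preferences, fall back to the full catalog.
        return picks or [m["title"] for m in CATALOG]


store = PreferenceStore({"alice": {"genres": ["sci-fi"], "mood": "tense"}})
agent = MovieConcierge(store)
print(agent.recommend("alice"))  # → ['Blade Runner']
```

In the full architecture, the preference lookup would go through an MCP server, the agent loop through the Strands Agents SDK running on Bedrock AgentCore, and the dialogue through Nova Sonic 2.0's voice interface.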