A Serverless AI Agent Built on Amazon Bedrock

by Ravi Yeluru

What is Amazon Bedrock?
Amazon Bedrock is AWS’s bold answer to making generative AI accessible to everyone. It gives developers access to powerful foundation models—like Anthropic's Claude, AI21's Jurassic, and Amazon's own Titan—through simple API calls.
No GPUs. No model training. No infrastructure to babysit.
Just plug in, prompt the model, and get smart responses.
The key difference? Bedrock lets you integrate these LLMs natively into your AWS applications. That means you can trigger AI responses from Lambda functions, pipe data from S3, schedule daily reports, and scale to thousands of users without lifting a finger.
Amazon Bedrock represents a major step forward in enabling developers to build AI-powered applications using foundation models without needing to manage infrastructure. While AWS users are familiar with services like Lambda, S3, API Gateway, and CloudFront, Bedrock is relatively new and still underexplored by most developers. The idea of AI agents—intelligent systems that can perceive, reason, and act—is also evolving beyond traditional chatbots.
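To make "just plug in and prompt" concrete, here is a minimal sketch of calling a Claude model through Bedrock from Python using boto3's bedrock-runtime client. The region and model ID are assumptions, so swap in whichever region and Claude version your account actually has access to.

```python
import json
import boto3

# Bedrock runtime client; region and model ID are assumptions --
# use whatever your account has enabled.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
MODEL_ID = "anthropic.claude-3-haiku-20240307-v1:0"  # placeholder model choice

def ask_claude(prompt: str) -> str:
    """Send a single prompt to Claude through Bedrock and return the text reply."""
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [{"role": "user", "content": prompt}],
    }
    response = bedrock.invoke_model(modelId=MODEL_ID, body=json.dumps(body))
    payload = json.loads(response["body"].read())
    return payload["content"][0]["text"]

print(ask_claude("In one sentence, what does Amazon Bedrock do?"))
```

That's the entire integration surface: one API call in, natural language out. Everything else in this project is ordinary serverless plumbing.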
Use Case: Personalized Stock Market Insights by AI
Imagine waking up each morning to an email in your inbox—not just a newsletter, but a smart research assistant that has already combed through stock data and breaking news to give you a personalized market briefing.
A real-time, AI-generated summary tailored to your favorite stocks—complete with market data, news, and a plain-English recommendation like Buy, Sell, or Hold.
This is something you can build, today, using nothing but:
  • A bit of React and JavaScript
  • A dash of Python
  • Serverless AWS services
  • And the magic of Amazon Bedrock
What You'll Build
In this hands-on project, you’ll create an AI stock research agent that:
  • Lets users search for companies and tickers
  • Asks for their email address
  • Stores their selections in DynamoDB
  • Runs a daily scheduled Lambda (see the scheduling sketch after this list)
  • Fetches price and volume using the FMP API
  • Pulls recent news using NewsAPI
  • Crafts a smart prompt for Claude via Bedrock
  • Sends a personalized HTML email with insights
  • Includes a working unsubscribe link
  • Deploys the React frontend on S3 + CloudFront
  • Works fully serverless, end to end
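For the daily schedule in particular, one way to wire it up (the console or SAM/CDK work just as well) is a couple of boto3 calls that create an EventBridge rule and point it at the report Lambda. This is a rough sketch; the rule name, account ID, and function ARN below are placeholders.

```python
import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

# Placeholder names and ARN for illustration only.
RULE_NAME = "daily-stock-report"
FUNCTION_ARN = "arn:aws:lambda:us-east-1:123456789012:function:daily-report"

# Fire once a day at 12:00 UTC.
rule = events.put_rule(
    Name=RULE_NAME,
    ScheduleExpression="cron(0 12 * * ? *)",
    State="ENABLED",
)

# Allow EventBridge to invoke the Lambda, then register it as the rule's target.
lambda_client.add_permission(
    FunctionName=FUNCTION_ARN,
    StatementId="allow-eventbridge-daily-report",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule["RuleArn"],
)
events.put_targets(
    Rule=RULE_NAME,
    Targets=[{"Id": "daily-report-lambda", "Arn": FUNCTION_ARN}],
)
```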
What Makes This an AI Agent?
An AI agent is more than just code running in the cloud. It’s a self-operating system that connects the dots between data, reasoning, and action. AI agents can take initiative, pull live data, analyze it in real-time, and make decisions autonomously. That’s exactly what we are going to build in this post. Every morning, it wakes up, gathers fresh stock prices and news, reasons through it with a large language model (Claude on Amazon Bedrock), and delivers personalized insights directly to your inbox.
But how is this different from just writing a Python script or building a chatbot?
It’s a much bigger story—and I break it down fully in this companion article on AI agents.
👉 If you're curious about what agents are, how they're structured, and how they reason, that article is your deep dive.
Tech Stack Overview
Before we dive into the how-to, let’s walk through the ingredients that make this agent work.
This is a fully serverless application, which means there are no servers to manage, patch, or scale. Everything runs in response to events — and each part of the tech stack plays a specific role in the pipeline.
We start with the frontend, which is a simple React app. This is what the user interacts with — a search bar, stock suggestions, and a form to enter their email. Once a user selects up to 10 stocks and hits subscribe, the data is securely sent to our backend.
The React app is hosted on Amazon S3 and made publicly accessible (and blazing fast) using CloudFront, Amazon’s content delivery network. This combo makes it super easy to deploy frontend changes.
The backend is handled entirely by AWS Lambda, where each function runs in isolation, only when triggered. For example, when someone subscribes, a Lambda function stores the data into a DynamoDB table. When it’s time to send daily reports, another Lambda wakes up, fetches the stock data, prompts the AI, and sends out the emails.
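Here's a rough sketch of what that subscribe function could look like, assuming API Gateway proxies the POST body through unchanged and the table name is StockSubscribers (a placeholder, with email as the partition key):

```python
import json
import boto3

# Placeholder table name; create it with "email" as the partition key.
table = boto3.resource("dynamodb").Table("StockSubscribers")

def lambda_handler(event, context):
    """Triggered by API Gateway when a user hits Subscribe on the React app."""
    body = json.loads(event.get("body") or "{}")
    email = body.get("email")
    stocks = body.get("stocks", [])

    if not email or not stocks or len(stocks) > 10:
        return {"statusCode": 400, "body": json.dumps({"message": "Invalid request"})}

    # One item per subscriber: { email, stocks[] }
    table.put_item(Item={"email": email, "stocks": stocks})

    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},  # CORS for the S3/CloudFront frontend
        "body": json.dumps({"message": "Subscribed"}),
    }
```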
The data storage is powered by Amazon DynamoDB — a fast, serverless NoSQL database. We use it to store each user’s email and the list of stocks they selected. It’s highly scalable and requires zero maintenance.
The core of the intelligence is handled by Claude, a large language model from Anthropic, accessed through Amazon Bedrock. This is where the magic happens. Bedrock lets us pass structured prompts to Claude and receive a full analysis of each stock — written in natural language, with a recommendation at the end.
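The prompt itself is just a string assembled from the fetched numbers and headlines. Something along these lines works, reusing the ask_claude() helper sketched earlier; the ticker, price, and headline in the usage line are made up purely for illustration.

```python
def build_prompt(ticker: str, price: float, volume: int, headlines: list[str]) -> str:
    """Turn raw market data and news into a structured prompt for Claude."""
    news_block = "\n".join(f"- {h}" for h in headlines) or "- No recent headlines."
    return (
        f"You are a stock research assistant. Analyze {ticker}.\n"
        f"Latest price: ${price}\n"
        f"Volume: {volume}\n"
        f"Recent headlines:\n{news_block}\n\n"
        "Write a short, plain-English summary of what this means for the stock, "
        "and end with a single-word recommendation: Buy, Sell, or Hold."
    )

# Illustrative call with made-up data, reusing ask_claude() from earlier:
insight = ask_claude(build_prompt("AAPL", 189.40, 52_000_000, ["Apple unveils new AI features"]))
```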
To power the analysis, we integrate with two public APIs:
  • Financial Modeling Prep (FMP): For live stock prices and volume
  • NewsAPI: For the most recent headlines related to each stock
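Both are plain REST calls, so two small helpers with the requests library cover it. The endpoints and field names below are paraphrased from each provider's documentation and should be double-checked against the current docs; the API keys are assumed to arrive through Lambda environment variables.

```python
import os
import requests

FMP_KEY = os.environ["FMP_API_KEY"]      # set as Lambda environment variables
NEWS_KEY = os.environ["NEWS_API_KEY"]

def fetch_quote(ticker: str) -> dict:
    """Latest price and volume for a ticker from Financial Modeling Prep."""
    url = f"https://financialmodelingprep.com/api/v3/quote/{ticker}"
    data = requests.get(url, params={"apikey": FMP_KEY}, timeout=10).json()
    quote = data[0]  # FMP returns a list with one quote object per ticker
    return {"price": quote["price"], "volume": quote["volume"]}

def fetch_headlines(ticker: str, limit: int = 5) -> list[str]:
    """Most recent headlines mentioning the ticker from NewsAPI."""
    resp = requests.get(
        "https://newsapi.org/v2/everything",
        params={"q": ticker, "sortBy": "publishedAt", "pageSize": limit, "apiKey": NEWS_KEY},
        timeout=10,
    ).json()
    return [article["title"] for article in resp.get("articles", [])]
```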
Finally, once the insights are ready, we send out beautifully formatted HTML emails using Amazon Simple Email Service (SES). SES handles delivery, reputation management, and bounce tracking — and it does it affordably, at scale.
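Sending the email itself is a single SES call. A minimal sketch, assuming the sender address is already verified in SES and that the unsubscribe endpoint lives behind API Gateway (both values below are placeholders):

```python
import boto3

ses = boto3.client("ses")

# Placeholders: the sender must be a verified SES identity, and the
# unsubscribe URL points at the API Gateway + Lambda combo described below.
SENDER = "reports@example.com"
UNSUBSCRIBE_URL = "https://example.execute-api.us-east-1.amazonaws.com/prod/unsubscribe"

def send_report(recipient: str, html_body: str) -> None:
    """Deliver the AI-generated briefing as an HTML email via SES."""
    html = f"""{html_body}
    <p style="font-size:12px;color:#888">
      <a href="{UNSUBSCRIBE_URL}?email={recipient}">Unsubscribe</a>
    </p>"""
    ses.send_email(
        Source=SENDER,
        Destination={"ToAddresses": [recipient]},
        Message={
            "Subject": {"Data": "Your daily stock briefing"},
            "Body": {"Html": {"Data": html}},
        },
    )
```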
Let’s review everything at a glance:
  • Frontend: React app, hosted on S3 and served via CloudFront
  • API layer: Amazon API Gateway
  • Compute: AWS Lambda, triggered by API calls and a daily EventBridge schedule
  • Storage: Amazon DynamoDB for subscriber emails and stock picks
  • Intelligence: Claude, accessed via Amazon Bedrock
  • Market data: Financial Modeling Prep (FMP)
  • News: NewsAPI
  • Email delivery: Amazon SES
Everything ties together cleanly and deploys without touching a single server.
Now that we understand the moving parts, let’s look at how they talk to each other.
End-to-End Architecture Diagram
Let’s look at how everything connects — from the moment a user visits the website to the moment they receive a personalized AI-generated stock insight in their inbox.
At a high level, the system flows like this: a user subscribes through the React frontend (S3 + CloudFront), API Gateway passes the request to a Lambda that writes to DynamoDB; then, once a day, EventBridge triggers another Lambda that fetches prices and news, asks Claude (via Bedrock) for insights, and emails them through SES.
Let’s break this down:
  • The React frontend is statically hosted and accessible via a clean CloudFront URL.
  • User interaction (stock selection + email input) triggers a POST request to API Gateway.
  • Lambda stores this data in DynamoDB in a clean { email, stocks[] } structure.
  • Once a day, a scheduled Lambda wakes up (via EventBridge), pulls all active subscribers, fetches their stock data and news, and calls Claude via Bedrock to generate insights.
  • The response is formatted into a neat HTML email and sent using SES.
  • Every email includes a live unsubscribe link, which hits another API Gateway + Lambda combo to remove the record from DynamoDB and display a confirmation.
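The unsubscribe handler can be just as small. A sketch, assuming the email arrives as a query-string parameter and the table is the same StockSubscribers placeholder used earlier:

```python
import boto3

# Same placeholder table as the subscribe handler.
table = boto3.resource("dynamodb").Table("StockSubscribers")

def lambda_handler(event, context):
    """Hit from the unsubscribe link; removes the subscriber and confirms in the browser."""
    email = (event.get("queryStringParameters") or {}).get("email")
    if not email:
        return {"statusCode": 400, "body": "Missing email parameter."}

    table.delete_item(Key={"email": email})

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "text/html"},
        "body": f"<h3>{email} has been unsubscribed. Sorry to see you go!</h3>",
    }
```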
This is a truly end-to-end serverless AI agent, and every component is scalable, cost-efficient, and production-ready.