
Redis at AWS re:Invent 2025: Advancing cloud and AI workloads

January 08, 2026
4 minute read
Cassy Speirs
Molly Zeiger

AWS re:Invent 2025 has wrapped up. Redis showed up with a clear focus on customers, partners, and real-world cloud workloads. As a Diamond Sponsor, we lit up the Venetian with packed sessions, a standing-room-only booth, high-value executive meetings, and three unforgettable events (including one very surreal GenAI experience).

Here’s what we brought to Las Vegas, and how Redis is shaping the future of real-time and AI-powered apps.

Telling the Redis story: Faster apps, smarter AI, lower cost

Redis open source continues to lead, and 8.4 extended that lead with more performance and better observability. Redis Cloud cemented itself as the best way to run Redis anywhere, and Redis Flex turned heads with its scale and cost efficiency.

Redis for AI owned the conversation this week. Semantic caching, vector search, agent memory – these are powering real workloads for companies like Sky and iFood. The nonstop traffic at our LangCache, Vector Sets, RDI (Redis Data Integration), and Redis Insight demos made that clear.
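To make the semantic-caching idea concrete, here is a minimal, self-contained sketch of the pattern: reuse a stored answer when a new query's embedding is close enough to a previously cached one. This is the behavior that LangCache and Redis vector search provide at scale; the in-memory store, toy embeddings, and 0.9 threshold below are illustrative assumptions, not the product's API.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

class SemanticCache:
    """Toy semantic cache: return a stored answer when a new query's
    embedding is similar enough to a cached query's embedding."""

    def __init__(self, threshold=0.9):
        self.threshold = threshold
        self.entries = []  # list of (embedding, answer) pairs

    def get(self, embedding):
        best, best_sim = None, 0.0
        for emb, answer in self.entries:
            sim = cosine(embedding, emb)
            if sim > best_sim:
                best, best_sim = answer, sim
        return best if best_sim >= self.threshold else None

    def put(self, embedding, answer):
        self.entries.append((embedding, answer))

# Hypothetical embeddings; in practice these come from an embedding model.
cache = SemanticCache(threshold=0.9)
cache.put([1.0, 0.0, 0.2], "Paris")
hit = cache.get([0.98, 0.05, 0.21])  # near-duplicate query: cache hit
miss = cache.get([0.0, 1.0, 0.0])    # unrelated query: cache miss
```

The payoff is that semantically repeated questions skip the expensive LLM call entirely, which is where the latency and cost savings come from.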

RDI was a standout

RDI resonated strongly with teams looking to bring fresh, reliable data into Redis without adding complexity. Live demos showed how RDI keeps Redis in sync with operational systems in real time, powering low-latency apps, analytics, and AI use cases without brittle pipelines or heavy lifting. From event-driven architectures to AI-ready data flows, attendees quickly saw how RDI turns Redis into a true real-time data hub, not just a cache. The steady crowds and deep technical questions made it clear: customers want simpler, faster ways to move data into Redis, and RDI delivers.
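The core pattern RDI automates is change data capture: consume insert, update, and delete events from a source database and mirror them into Redis so the cache never goes stale. A minimal sketch of that apply loop, with a plain dict standing in for Redis and an assumed event shape (this is not RDI's actual configuration or API):

```python
# Toy CDC apply loop: mirror source-database change events into a
# key-value store. The event dict shape {"op", "key", "value"} is an
# illustrative assumption for this sketch.

cache = {}  # stands in for Redis in this self-contained example

def apply_event(event):
    """Apply one change event to the cache."""
    op = event["op"]
    if op in ("insert", "update"):
        cache[event["key"]] = event["value"]
    elif op == "delete":
        cache.pop(event["key"], None)

# A stream of change events, as they might arrive from a source database.
events = [
    {"op": "insert", "key": "user:1", "value": {"name": "Ada"}},
    {"op": "update", "key": "user:1", "value": {"name": "Ada L."}},
    {"op": "insert", "key": "user:2", "value": {"name": "Grace"}},
    {"op": "delete", "key": "user:2", "value": None},
]
for e in events:
    apply_event(e)
# cache now holds only the latest state: {"user:1": {"name": "Ada L."}}
```

In a real deployment the hard parts are ordering, backpressure, schema transformation, and recovery after failure, which is exactly the undifferentiated heavy lifting RDI is built to remove.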

A growing relationship with AWS

Our presence this year reinforced how strong the Redis and AWS partnership has become. Thousands of AWS customers already rely on Redis for high-performance, real-time workloads across industries. Together, we help organizations scale globally, simplify operations, and roll out new real-time and AI-driven apps faster. That collaboration continues to deepen, and we’re focused on helping joint customers modernize, adopt real-time AI patterns, and expand into new data-intensive use cases.

Breakout sessions that put customers at the center

Sky's Global Streaming Technology team showed how Redis supports the future of streaming with real-time performance at a global scale. iFood walked through how they modernized their Redis cache to deliver consistently low-latency experiences. Both rooms were full, which only reinforced that customers tell our story better than we ever could.


A theater lineup powered by partners

The booth theater ran nonstop with sessions from Arcade AI, LangChain, Baseten, Stacker Group, Tensormesh, Tavily, Weights and Biases, and Redis experts. These talks showcased what’s possible when Redis meets the fastest-moving AI ecosystem in the industry.

Hands-on demos that made it real

We didn’t just talk about Redis. We showed Redis in action. Attendees lined up for Celebrity Face Match using Redis Vector Sets, Redis Chat powered by agent memory, an AI agent for financial advisors using Redis and Amazon Bedrock, RDI for real-time sync, LangCache for semantic caching, Redis Insight visualizations, and a pricing calculator that routinely showed savings of more than $250K a year for 1.6 TB workloads.

Building community with unforgettable events

The Redis Welcome Party kicked off the week with more than 500 attendees and immediately created a space where customers, partners, and builders could connect in a genuine way. People stayed because the energy was high and the conversations were captivating.

The VIP Executive Reception brought key customer leaders together for focused discussions in a setting that encouraged candor and collaboration. It wasn’t just a networking hour; it gave executives room to talk strategy and compare where they’re betting on real-time data and AI.

The Hallucination Hub added a creative spark to the week. It blended GenAI storytelling with real Redis performance concepts, drawing steady interest and sparking conversations about what’s possible in production. Our partners Arcade.dev, Baseten, and Tavily helped turn it into an experience people kept talking about.

Industry roundtables that cut through the noise

The Redis Partner team held focused meetings on financial services, AI, and high resilience, three areas where real-time performance on AWS is becoming more important. The conversations explored how leaders are building trustworthy AI, low-latency decisioning, and resilient architectures in the cloud.

Across all three sessions, a clear theme emerged: Organizations running on AWS are looking for a real-time backbone they can trust, and Redis is increasingly at the center of those strategies. As companies modernize and scale their cloud footprints, Redis continues to play a defining role in powering high-accuracy, always-on systems.

Looking ahead

re:Invent 2025 confirmed what we already knew. Redis is the engine behind real-time apps and is now the foundation for the next generation of AI. Flex, vector search, agent memory: these aren’t incremental updates. They’re shifting what builders can deliver.

And we’re nowhere close to being done.

See you next year, Vegas.

Get started with Redis today

Speak to a Redis expert and learn more about enterprise-grade Redis today.