
Visit Redis at re:Invent 2023

A lot has changed since last re:Invent. 

Last November, only the most astute followers of technology trends would have questioned whether this blog – or any blog – was generated by AI. Given the recent emergence of large language models (LLMs) like ChatGPT, that is now a legitimate question we frequently ask about the content we consume.

Dystopian nightmares and ethical concerns aside, AI has suddenly emerged as a tremendous opportunity for businesses and society as a whole. Teams are investing heavily in this technology because it promises to make them far more agile and efficient. AI can automate repetitive tasks, reduce human error, interact with customers, and even aid innovation (such as developing software) or make smart, high-stakes decisions (such as detecting cancer). If you’re like the majority of technology teams today, you’re suddenly facing the challenges that come with your company’s clamor to realize these benefits and incorporate AI into your technology stack.

Powering the next generation of AI apps

Just as Redis was at the forefront of a new generation of real-time applications over a decade ago, it’s also emerged as a preferred technology for the coming generation of AI-driven apps. Redis is now an essential component of many AI-based Q&A chatbots, recommendation systems, document search and retrieval systems, and fraud detection applications.  

Find us at re:Invent and we’d be happy to dive deeper into how Redis can make it easier for you to build LLM-powered AI apps. Discover how we can make your AI apps faster, more accurate, easier to build, and less expensive with Retrieval-Augmented Generation (RAG), semantic caching, and semantic search.
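To give a flavor of what that looks like in code, here’s a minimal sketch of the vector similarity search that underpins RAG and semantic search, written with redis-py against a local Redis instance with the Search capability enabled. The index name, embedding dimension, and random placeholder vectors are purely illustrative; in a real app, the vectors would come from your embedding model.

```python
import numpy as np
import redis
from redis.commands.search.field import TextField, VectorField
from redis.commands.search.indexDefinition import IndexDefinition, IndexType
from redis.commands.search.query import Query

r = redis.Redis(host="localhost", port=6379)  # assumes Redis with the Search capability

DIM = 384  # illustrative embedding size; match your embedding model

# Create a vector index over hashes stored under the "doc:" prefix.
r.ft("docs").create_index(
    (
        TextField("content"),
        VectorField("embedding", "HNSW", {"TYPE": "FLOAT32", "DIM": DIM, "DISTANCE_METRIC": "COSINE"}),
    ),
    definition=IndexDefinition(prefix=["doc:"], index_type=IndexType.HASH),
)

# In a real app these vectors come from an embedding model; random values
# here just keep the sketch self-contained.
doc_vec = np.random.rand(DIM).astype(np.float32)
r.hset("doc:1", mapping={"content": "Redis supports vector search.", "embedding": doc_vec.tobytes()})

# Retrieve the 3 documents closest to a query vector (KNN search).
query_vec = np.random.rand(DIM).astype(np.float32)
q = (
    Query("*=>[KNN 3 @embedding $vec AS score]")
    .return_fields("content", "score")
    .sort_by("score")
    .dialect(2)
)
results = r.ft("docs").search(q, query_params={"vec": query_vec.tobytes()})
for doc in results.docs:
    print(doc.content, doc.score)
```

The same KNN query pattern is what a semantic cache uses to decide whether a new prompt is close enough to one it has already answered.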

If you’re interested in learning more about retrieval-augmented generation, come see our very own AI expert, Sam Partee, present “Real-time RAG: How to Augment LLMs with Redis and Amazon Bedrock” on November 29 at 12:00 PM at the Lightning Theater – Data Zone.

Do more with less

Cost savings have been another major focus in the months since last re:Invent, as headlines have been replete with stories of missed revenue projections. Businesses have had to tighten their belts, and technology budgets have been no exception. As a result, teams running workloads in AWS must do more with less.

Thankfully, if you’re looking to reduce costs in AWS, Redis Cloud offers up to 70% savings over ElastiCache! Find us at re:Invent to learn how we can help you weather the economic storm with optimal infrastructure sizing, multi-tenancy, cost-efficient replication, auto-tiering, and best-in-class performance that delivers up to 230% more throughput than ElastiCache on the same infrastructure footprint.

Work smarter, not harder

But in tough economic times you don’t just look for ways to use less infrastructure or cut budget; you need to find and eliminate complexity. Thankfully, Redis has been against complexity since its inception – it’s even a core principle of the Redis manifesto. True to form, we’ve been busy identifying ways to reduce complexity for our customers and our vibrant open source community.

We’ve continued our fight against database and cloud service sprawl, providing customers with flexibility and simplicity in their data layer. Many know Redis as a cache, but with Redis Enterprise you can do more than you thought possible. Instead of running an array of individual services for caching, session storage, messaging, JSON documents, feature storage, and more, you can simplify your technology stack and reduce vendor sprawl by consolidating around Redis. Redis Enterprise was built for all your real-time use cases, beyond caching, to help you do things like:

  • Instantly detect fraud to keep banks and consumers a step ahead of criminals
  • Provide responsive, online gaming leaderboards at massive scale (a quick sketch follows this list)
  • Use a lightning fast in-memory primary database for a wide variety of real-time use cases
  • Achieve the speed needed for the next generation of AI and ML applications with Redis Enterprise as a vector database or online feature store
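
As a taste of the leaderboard use case mentioned above, here’s a minimal sketch built on Redis sorted sets with redis-py; the key name, players, and scores are illustrative.

```python
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# Record scores; ZADD keeps the set ordered by score automatically.
r.zadd("leaderboard:global", {"alice": 4200, "bob": 3900, "carol": 5100})

# Top 3 players, highest score first.
print(r.zrevrange("leaderboard:global", 0, 2, withscores=True))
# [('carol', 5100.0), ('alice', 4200.0), ('bob', 3900.0)]

# A single player's rank (0-based, highest first) and score.
print(r.zrevrank("leaderboard:global", "alice"), r.zscore("leaderboard:global", "alice"))
```

Because sorted sets keep members ordered by score, rank and top-N lookups stay fast even as the leaderboard grows.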

We’ve also doubled down on our commitment to making our software easy and delightful to build with and use, expanding official support to five client libraries: Jedis, node-redis, redis-py, NRedisStack, and go-redis. And we’ve introduced a new tool, Redis Data Integration, that ingests data from a variety of sources and transforms it into real-time data in Redis Enterprise.

Join Redis at re:Invent!

In case you couldn’t tell, we’re obsessed with our customers and our community, and we’ve been hard at work innovating to make your lives easier. We have a lot to share this year at re:Invent, and we’d love to hear from you as well.

  • Come chill out with us at booth 1681! We’d love to talk about Redis, your apps, and your challenges, and give you the latest and greatest Redis swag. Since we love speed (and know you do too), don’t forget to sign up for our daily raffle for a Premium Lego Technic Race Car Model.
  • Check out our session on Retrieval-Augmented Generation (RAG) on November 29 at 12:00 PM at the Lightning Theater 5 – Data Zone.

And don’t forget to book a meeting with Redis executives at AWS re:Invent to talk about the specific needs of your organization and find out how Redis can help. Space is limited, so book your meeting now.