How Rappi Cut Search Latency by 40% with Oracle AI Vector Search
Rappi, Latin America’s fastest-growing on-demand delivery app serving over 300 cities, replaced its keyword-based search engine with Oracle AI Vector Search and Oracle Cloud Infrastructure Generative AI to enable semantic and image-based product discovery. The upgrade reduced search response latency by 40% and improved conversion rate by 25%, driving higher engagement and order volumes across the platform.
Impact
- 40% reduction in search response latency
- 25% improvement in conversion rate
Challenge
Rappi’s keyword-based search could not handle vague queries, misspellings, or low-interaction keywords. This limited its ability to accurately interpret user intent and surface relevant products, constraining conversion rates across a catalog of millions of items spanning restaurants and retail merchants.
Solution
Rappi deployed Oracle AI Vector Search on Oracle Autonomous AI Database and Oracle Cloud Infrastructure Generative AI to enable semantic and image-based search that interprets user intent and matches queries to catalog items based on underlying meaning rather than keyword overlap, without requiring data movement between systems.
What Leaders Say
“By implementing Oracle AI Vector Search, we not only get lower latency, but we’ve also improved our conversion rate. Users are more engaged with the Rappi app to discover new products. At the end of the day, that means more orders that contribute to the growth of our business.”
Full Story
Rappi was founded in 2015 as a grocery delivery service and has since expanded into a super app covering food, retail, pharmacy, and financial services across more than 300 Latin American cities. With millions of user queries per minute flowing through its delivery platform and a catalog spanning restaurants and retail merchants, the quality of its search experience directly shapes order conversion and customer retention.
The company’s previous keyword-based search engine created a structural problem: it could not resolve vague queries, handle misspelled words, or interpret search intent from low-interaction keywords. A user searching for “something light for lunch” or making a typo in a restaurant name received poor or irrelevant results. This limited the app’s ability to surface relevant products and constrained conversion rates across its catalog.
Rappi selected Oracle AI Vector Search on Oracle Autonomous AI Database, running on Oracle Cloud Infrastructure, as its core search and data processing platform. The architecture brings AI capabilities directly to where the company’s data resides, eliminating the need to move data between systems or maintain separate vector databases. Oracle Cloud Infrastructure Generative AI powers large language model functionality within the search layer, enabling natural language and image-based queries against Rappi’s extensive retail and restaurant catalogs.
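The core idea behind this architecture can be illustrated with a minimal sketch of the semantic-search pattern: catalog items and user queries are mapped to embedding vectors, and matches are ranked by vector similarity rather than keyword overlap. The vectors, item names, and helper functions below are hypothetical stand-ins for illustration only; in Rappi's deployment, embeddings would come from the Oracle Cloud Infrastructure Generative AI layer and the nearest-neighbor search would run inside Oracle AI Vector Search, not in application code.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def semantic_search(query_vec, catalog, top_k=3):
    """Rank catalog items by embedding similarity to the query vector."""
    scored = [(cosine_similarity(query_vec, vec), name)
              for name, vec in catalog.items()]
    scored.sort(reverse=True)
    return [name for _, name in scored[:top_k]]

# Toy 3-dimensional embeddings (real embedding models use hundreds
# of dimensions); the items and values here are made up.
catalog = {
    "caesar salad":  [0.9, 0.1, 0.0],
    "veggie wrap":   [0.8, 0.3, 0.1],
    "double burger": [0.1, 0.2, 0.9],
}

# A vague query like "something light for lunch" embeds near the
# lighter dishes, so meaning-based ranking surfaces them even with
# zero keyword overlap between the query and the item names.
query = [0.85, 0.2, 0.05]
print(semantic_search(query, catalog, top_k=2))
# → ['caesar salad', 'veggie wrap']
```

This is why typos and low-interaction phrasings stop mattering: ranking depends only on where the query lands in embedding space, not on matching the literal strings in the catalog.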
The shift to semantic search had an immediate effect on performance. Search response latency fell by 40%, while the increased relevance of results drove a 25% improvement in conversion rate. Users engaging with the Rappi app through natural language or image searches now receive faster and more contextually accurate results, increasing product discovery and driving higher average order values.
Rappi views Oracle as a strategic partner in its ongoing AI evolution. The combination of reliability, support for advanced AI capabilities, and the ability to scale to millions of queries per minute made Oracle the platform of choice for this infrastructure migration. The company continues to expand its AI use cases on Oracle’s infrastructure as it looks to deepen personalization and product recommendation across the platform.