Technology · Product Development

How Rappi Cut Search Latency by 40% with Oracle AI Vector Search

Rappi, Latin America’s fastest-growing on-demand delivery app serving over 300 cities, replaced its keyword-based search engine with Oracle AI Vector Search and Oracle Cloud Infrastructure Generative AI to enable semantic and image-based product discovery. The upgrade reduced search response latency by 40% and improved conversion rate by 25%, driving higher engagement and order volumes across the platform.

Impact

40% reduction in search response latency

25% improvement in conversion rate

Challenge

Rappi’s keyword-based search could not handle vague queries, misspellings, or low-interaction keywords, limiting its ability to accurately interpret user intent and surface relevant products — constraining conversion rates across a catalog of millions of items spanning restaurants and retail merchants.

Solution

Rappi deployed Oracle AI Vector Search on Oracle Autonomous AI Database and Oracle Cloud Infrastructure Generative AI to enable semantic and image-based search that interprets user intent and matches queries to catalog items based on underlying meaning rather than keyword overlap, without requiring data movement between systems.

Tools & Technologies

Oracle AI Vector Search · Oracle Autonomous AI Database · Oracle Cloud Infrastructure Generative AI

What Leaders Say

“By implementing Oracle AI Vector Search, we not only get lower latency, but we’ve also improved our conversion rate. Users are more engaged with the Rappi app to discover new products. At the end of the day, that means more orders that contribute to the growth of our business.”

Juan Diego Sánchez, VP of Product Consumers, Rappi

Full Story

Rappi was founded in 2015 as a grocery delivery service and has since expanded into a super app covering food, retail, pharmacy, and financial services across more than 300 Latin American cities. With millions of user queries per minute flowing through its delivery platform and a catalog spanning restaurants and retail merchants, the quality of its search experience directly shapes order conversion and customer retention.

The company’s previous keyword-based search engine created a structural problem: it could not resolve vague queries, handle misspelled words, or interpret search intent from low-interaction keywords. A user searching for “something light for lunch” or making a typo in a restaurant name received poor or irrelevant results. This limited the app’s ability to surface relevant products and constrained conversion rates across its catalog.
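The failure mode is easy to reproduce. The following minimal sketch (not Rappi's actual engine, just an illustration of exact-token matching) shows how a single-character typo makes a keyword search return nothing, because the engine compares literal tokens rather than meaning:

```python
# Toy keyword search over a tiny catalog: an item matches only if it
# shares at least one exact token with the query.
CATALOG = ["cheese burger", "veggie salad", "chicken soup"]

def keyword_search(query, catalog):
    """Return catalog items sharing an exact token with the query."""
    q_tokens = set(query.lower().split())
    return [item for item in catalog
            if q_tokens & set(item.lower().split())]

print(keyword_search("burger", CATALOG))   # exact token: finds "cheese burger"
print(keyword_search("burguer", CATALOG))  # one-letter typo: finds nothing
```

A vague query like "something light for lunch" fails the same way: none of its tokens appear in any item name, so token overlap surfaces nothing, regardless of how relevant the salad actually is.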

Rappi selected Oracle AI Vector Search on Oracle Autonomous AI Database, running on Oracle Cloud Infrastructure, as its core search and data processing platform. The architecture brings AI capabilities directly to where the company’s data resides, eliminating the need to move data between systems or maintain separate vector databases. Oracle Cloud Infrastructure Generative AI powers large language model functionality within the search layer, enabling natural language and image-based queries against Rappi’s extensive retail and restaurant catalogs.
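Conceptually, vector search replaces token matching with distance ranking: queries and catalog items are embedded as vectors, and the engine returns the items whose vectors lie closest to the query's. The sketch below illustrates the idea with hand-made toy vectors standing in for real embedding-model output; Oracle AI Vector Search performs the equivalent nearest-neighbor ranking inside the database, but this is a conceptual illustration, not its API:

```python
# Conceptual semantic retrieval: rank items by cosine similarity between
# a query vector and precomputed item vectors. The 3-dimensional vectors
# below are hypothetical stand-ins for embedding-model output.
import math

ITEM_VECTORS = {
    "cheese burger": [0.9, 0.1, 0.0],
    "veggie salad":  [0.1, 0.9, 0.2],
    "chicken soup":  [0.2, 0.3, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def semantic_search(query_vector, item_vectors, k=1):
    """Return the k catalog items closest to the query vector."""
    ranked = sorted(item_vectors,
                    key=lambda name: cosine(query_vector, item_vectors[name]),
                    reverse=True)
    return ranked[:k]

# A query like "something light for lunch" would embed near the salad:
print(semantic_search([0.15, 0.85, 0.25], ITEM_VECTORS))  # ['veggie salad']
```

Because ranking happens in vector space rather than on literal tokens, typos and vague phrasings that embed close to the right items still retrieve them, which is what enables the natural language and image queries described above.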

The shift to semantic search had an immediate effect on performance. Search response latency fell by 40%, while the increased relevance of results drove a 25% improvement in conversion rate. Users engaging with the Rappi app through natural language or image searches now receive faster and more contextually accurate results, increasing product discovery and driving higher average order values.

Rappi views Oracle as a strategic partner in its ongoing AI evolution. The combination of reliability, support for advanced AI capabilities, and the ability to scale to millions of queries per minute made Oracle the platform of choice for this infrastructure migration. The company continues to expand its AI use cases on Oracle’s infrastructure as it looks to deepen personalization and product recommendation across the platform.

Similar Cases

Pfizer
93% database reduction

Pfizer achieved a 93% database reduction and 20% cost avoidance by migrating their global SAP environment to S/4HANA on IBM Power10 infrastructure.

Pharmaceuticals · Technology · IBM Consulting · IBM Power Virtual Server
Classmethod
Up to 90% reduction in development time

Classmethod, a leading Japanese cloud integrator, deployed Claude Code across its engineering teams to address chronic developer shortages. The tool automated code generation, review, and testing workflows, reducing development time by up to 90% on specific tasks and cutting code review time by 80%.

Technology · Claude Code
Confluent
15,000+ hours saved monthly

Confluent, a data streaming platform company with 2,000+ employees and 4,000+ customers, deployed Glean to solve the knowledge fragmentation that came with rapid growth from 250 to 2,000+ employees across 20+ systems. Glean indexed the company's full tool stack — Slack, Salesforce, Confluence, and more — enabling instant knowledge retrieval across all teams. The result: 15,000+ hours saved monthly, a 13% increase in support team satisfaction, and over 70% employee adoption.

Technology · Glean
Nextdoor
2–3x engineering productivity improvement

Nextdoor, the neighborhood social network, deployed Glean as a unified Work AI layer embedded directly into the tools employees already use. Rather than mandating adoption, the team built a self-reinforcing learning loop of Slack channels, live office hours, and quick-win storytelling that turned early experimentation into company-wide AI habits — with engineering productivity gains of 2–3x and RevOps workflows shrinking from hours to minutes.

Technology · Glean
Lusha
300% increase in outbound leads

Lusha is a B2B sales intelligence platform with 1.5 million users and a database of over 200 million business contacts. By deploying Elasticsearch as both a full-text search engine and a vector database for AI-powered lead recommendations, Lusha helps customers generate 300% more leads, achieve conversion rates up to 10x higher, and realize return on investment of up to 1,000%.

Technology · Elasticsearch
Tabnine
50% improvement in response times

Tabnine integrated Claude 3.5 Sonnet via Amazon Bedrock into its AI coding assistant, serving over 1 million monthly developers. The migration delivered 50% faster response times, a 20% increase in free-to-paid conversions, and a 20–30% reduction in churn, while meeting strict security and compliance requirements for regulated industries.

Technology · Amazon Bedrock · Claude
Factory
550,000 hours of development time saved

Factory built autonomous AI agents called Droids using Claude 3 Opus and Claude 3 Haiku to automate labor-intensive software engineering tasks at scale. These Droids handle code review, documentation, refactors, migrations, and feature requests across the entire software development lifecycle. Customers using Factory Droids saved 550,000 hours of development time and saw a 20% reduction in development cycle time.

Technology · Claude 3 Haiku · Claude 3 Opus
Apollo
35% increase in meeting bookings

Apollo integrated Claude 3.5 Haiku into its sales engagement platform to power intelligent, personalized messaging and prospect research at scale. The AI-driven features help sales teams write highly effective outreach without requiring technical expertise, generating over 5 million Claude-powered messaging actions per month. Customers using Claude-powered messaging saw a 35% boost in meeting bookings and a 15% increase in retention rates.

Technology · Claude 3.5 Haiku