AI tools that generate, understand, and reason with natural language, including foundation models, instruction-tuned LLMs, and multimodal models.
30 Use Cases · 21 Tools · 29 Companies
Large Language Models Tools
Fully managed service for foundation model deployment
Large language model by Anthropic designed for safe, helpful enterprise AI applications and workflows.
Azure-hosted OpenAI models for enterprise AI applications
Large language model by OpenAI for conversational AI and natural language understanding in enterprise apps.
Large language model by Anthropic for natural language understanding and generation in enterprise apps.
Efficient large language model by Anthropic for fast, low-cost inference in high-volume applications.
Fast, compact large language model by Anthropic for high-throughput, cost-efficient AI applications.
Advanced large language model by Anthropic balancing high capability with cost for enterprise deployments.
Fast, cost-efficient large language model by Anthropic optimized for high-volume, latency-sensitive tasks.
Flagship large language model by Anthropic for the most complex reasoning and enterprise AI tasks.
Frontier large language model by Anthropic with advanced reasoning for complex enterprise AI tasks.
High-capability large language model by Anthropic balancing performance and speed for business applications.
Large language model by Cohere optimized for enterprise text generation and retrieval-augmented tasks.
Large language model by Google with multimodal capabilities for reasoning, coding, and text generation.
Google multimodal AI model family
Open-source foundation model family by IBM optimized for enterprise language and code tasks.
General-purpose large language model used for text generation and understanding in AI applications.
Large language model technology for generating and understanding natural language in AI applications.
Foundation model technology for generating text, answering questions, and powering conversational AI.
Generative AI platform by OpenAI providing LLM APIs for building intelligent applications.
Suite of large language models by OpenAI powering text generation, reasoning, and conversational AI.
Use Cases (30)
TaskUs is a leading outsourced digital services company providing next-generation customer experience (CX) for innovative global brands. To move beyond flat-file embedding storage and scaling limitations, TaskUs built TaskGPT—a proprietary GenAI platform—with Pinecone as the core vector database for semantic search, RAG-based knowledge retrieval, and client-specific recommendations. The result: a 20% reduction in average handle time and a 5% increase in customer satisfaction across client deployments.
Assembled is a workforce management and customer support optimization platform serving enterprises like Stripe, Etsy, and DoorDash. To power Assembled Assist, the company built a hybrid RAG pipeline combining Pinecone vector search with Algolia keyword retrieval and LLMs from OpenAI and Anthropic. Support tasks that previously took 40 minutes now complete in 2 minutes—a 95% reduction in handling time.
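Assembled's pipeline fuses two ranked result lists: one from Pinecone vector search and one from Algolia keyword search. The blurb does not say how the lists are merged; a common technique for this is reciprocal rank fusion (RRF). The sketch below is a minimal illustration of RRF with made-up document IDs, not Assembled's actual implementation:

```python
def rrf_merge(ranked_lists, k=60):
    """Merge several ranked result lists via reciprocal rank fusion.

    Each document's fused score is the sum of 1/(k + rank) over every
    list it appears in; k=60 is the conventional damping constant.
    """
    scores = {}
    for results in ranked_lists:
        for rank, doc_id in enumerate(results, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Highest fused score first.
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical outputs: one list from vector search, one from keyword search.
vector_hits = ["doc_a", "doc_b", "doc_c"]
keyword_hits = ["doc_b", "doc_d", "doc_a"]
merged = rrf_merge([vector_hits, keyword_hits])
# doc_b and doc_a, which appear in both lists, outrank single-list hits.
```

RRF is popular for hybrid retrieval because it needs no score normalization: it works purely on ranks, so incompatible vector-similarity and keyword-relevance scores never have to be put on a common scale.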
Fujitsu, the global IT and digital transformation company with 124,000 employees, partnered with Cohere to develop Takane — a state-of-the-art Japanese large language model built on the Cohere Command series. Designed for private deployment in regulated sectors such as finance, healthcare, and government, Takane delivers world-class performance on the JGLUE benchmark and is now integrated into Fujitsu’s AI service offerings and data intelligence platform.
Icatu Seguros, one of Brazil’s largest life and pension insurers, deployed A.V.I.—a WhatsApp-based AI assistant powered by generative AI and orchestrated by n8n—to put real-time quoting and product information directly in brokers’ hands. The assistant reduced quotation time from roughly five minutes to under 40 seconds, now serves more than 1,000 brokers daily, and earned second place at the Gartner Eye on Innovation Awards for Insurance 2025.
Morrisons, one of the UK’s largest supermarkets serving nine million customers weekly across 500 stores, migrated its on-premise data warehouse to BigQuery and Looker, reducing reporting lag by 98.96% from one day to 15 minutes. Real-time data now powers Vertex AI demand forecasting models and a customer-facing Product Finder app that receives 50,000 hits per day during peak periods.
Giles AI, a London-based healthcare AI startup, built its medical research assistant on Google Cloud using Vertex AI, Gemini Pro, and Document AI to help researchers extract structured insights from millions of scientific articles. The platform achieved 95% accuracy in data extraction, a 98% agreement rate with human researchers, and helped one clinical customer cut research task time by 85%.
Fifth Dimension, a global AI platform for commercial real estate asset managers and owner-operators, built a multi-model workflow on Google Cloud using Gemini for large-scale document ingestion and Claude for high-precision reasoning. The platform compressed investment memo drafting from days or weeks to just 30 minutes and achieved 99.9% reliability for multi-hour workflows, driving deals with top-10 U.S. asset managers.
Etsy, the global marketplace for handcrafted and vintage goods, serves nearly 90 million buyers across more than 130 million listings from 5 million sellers. Using Vertex AI, BigQuery, Dataflow, and Gemini, the company built a personalized search and discovery platform it calls “algotorial curation” — increasing listings per theme by 80x, driving a 5% lift in SEO-driven visits, and delivering a 3% conversion improvement for sellers.
Class Editori, a leading Italian media company specializing in finance, fashion, and lifestyle with 40 years of content archives, partnered with Softlab to build MFGPT on Google Cloud — one of Italy’s first generative AI agents in the media industry. The system unified four decades of journalistic archives and real-time financial data into BigQuery, powered by Gemini and Vertex AI, converting trial users into paid subscribers and securing B2B enterprise agreements with major financial institutions.
AXA Switzerland, the country’s leading insurer covering over 40% of Swiss companies, migrated its entire data infrastructure to Google Cloud and deployed BigQuery, Vertex AI, and Gemini to become a data-driven organization. The transformation reduced complex query times from days to minutes or seconds and generated a high double-digit million Swiss franc profit improvement through Smart Data initiatives.
Epic Systems — the healthcare technology company behind MyChart, used by 195 million patients — deployed Claude Code across its entire workforce, not just engineers. Today, more than half of Claude Code usage at Epic comes from non-technical employees, including a pharmacist who built a fully interactive MyChart prototype without writing a single line of code.
The Metropolitan Museum of Art partnered with OpenAI to create a conversational AI experience called "Chat with Natalie" for its Sleeping Beauties fashion exhibition, letting visitors interact with a historically accurate AI portrayal of a 1930s New York socialite whose wedding dress is on display.
Fifth Dimension, a UK-based AI analytics company serving the real estate industry, migrated to Google Cloud to overcome critical infrastructure bottlenecks. By adopting Vertex AI, Cloud Run, and serverless architecture, the company achieved 50x processing scalability, 6x revenue growth, and a 30% reduction in infrastructure costs — all within a rapid growth trajectory from founding in 2023 to global scale by 2025.
Airtree, a $2 billion Australian venture capital firm, deployed Claude Cowork as shared firm-wide infrastructure to unify fragmented data across tools like Notion, Slack, Google Drive, and Affinity. The team built custom Skills to automate board meeting prep, market research, and portfolio reporting — cutting multi-hour tasks down to minutes. What began as individual productivity gains quickly scaled into a collaborative system where Skills built by one person benefit the entire firm.
Anything built a full-stack AI coding agent on Claude and the Agent SDK, enabling 1.5 million non-technical users to create production-ready software — from recruiting platforms to mobile apps — without writing a single line of code. In just five months, users shipped over 800,000 apps with a 91–96% agent success rate. Claude's reliable tool-calling, coding quality, and personality made it the clear choice for Anything's agent architecture.
InpharmD's AI assistant, Sherlock, leverages Pinecone's vector database to deliver fast, accurate drug information to healthcare professionals. By embedding 30 million medical documents into a RAG pipeline, InpharmD achieved 70% better query accuracy, 95x faster first response times, and 80% cost savings on data storage.
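The RAG pattern InpharmD describes (embed documents, retrieve the nearest matches for a query, and ground the LLM prompt in them) can be sketched with a toy in-memory index. All vectors, document IDs, and function names below are illustrative assumptions; a production system would use Pinecone's client and a real embedding model rather than hand-written vectors:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy in-memory index standing in for a vector database such as Pinecone.
# Keys are document IDs; values are (tiny, made-up) embedding vectors.
index = {
    "aspirin-dosing": [0.9, 0.1, 0.0],
    "warfarin-interactions": [0.1, 0.8, 0.3],
}

def retrieve(query_vec, top_k=1):
    """Return the IDs of the top_k documents most similar to the query."""
    ranked = sorted(index, key=lambda d: cosine(index[d], query_vec), reverse=True)
    return ranked[:top_k]

def build_prompt(question, doc_ids):
    """Ground the model's answer in the retrieved documents."""
    context = "\n".join(f"[{d}]" for d in doc_ids)
    return f"Answer using only the context below.\n{context}\n\nQ: {question}"
```

The same three steps scale from this toy to InpharmD's 30 million documents; only the index (a managed vector database instead of a dict) and the embeddings (a learned model instead of hand-set vectors) change.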
Yoodli is an AI-powered experiential learning platform that helps enterprise sales teams practice high-stakes conversations before they happen. By integrating Claude into its real-time roleplay engine, Yoodli delivers realistic AI personas that coach reps at scale — helping customers like Snowflake and Google Cloud achieve measurable performance gains.
Slack partnered with Anthropic to integrate Claude's AI models into its platform, enabling intelligent search, conversation summaries, and automated recaps. The collaboration saves the average user 97 minutes per week while unlocking organizational knowledge across billions of daily messages and files.
Thomson Reuters integrated Claude via Amazon Bedrock into its AI platform, CoCounsel, to make the expertise of 3,000+ subject matter experts and 150 years of authoritative content accessible to legal and tax professionals. The solution combines Retrieval-Augmented Generation (RAG) architecture with multi-model deployment to deliver comprehensive, accurate professional analysis. Early adopters report dramatic efficiency gains, with some estimating task time cut in half or more.
CANCOM, a leading EMEA digital solutions provider, built an AI-powered assistant using ServiceNow's Now Assist and generative AI to unify internal and customer-facing services. The CANCOM Assistant deflects 80% of support tickets across all departments and doubled adoption within a single year. The solution connects IT, HR, finance, supply chain, and sales onto one unified platform powered by agentic AI.
Attention built an AI-powered sales platform using Claude as its core reasoning engine, automating post-call admin work and delivering actionable sales intelligence at scale. By replacing manual CRM updates, follow-up emails, and coaching reviews with Claude-driven agents, Attention has saved over 1.6 million hours of admin work. Customers report up to 40% improvements in win rates thanks to AI outputs accurate enough to trust in live deals.
Replit on Microsoft Azure democratized software development: 75% of its enterprise users are non-engineers, and natural-language prompting compresses development timelines from weeks to minutes.
Cox Automotive deployed 17 production AI agent solutions using Amazon Bedrock AgentCore, reducing estimate completion from 48 hours to 30 minutes, achieving 3x consumer response rates, and projecting 17,000 hours saved.
Pinterest built an AI-powered discovery engine on AWS processing 18TB daily, delivering 10 million AI recommendations per second across 10,000+ GPU instances, driving 17% revenue growth and 70% AI-driven discovery.
Postman selected Claude Opus 4.6 as the default model for Agent Mode, saving developers up to 1,150 hours per year and nearly $1M annually for a 10-person team in API development automation.
Super-Pharm leveraged Google Vertex AI for ML-powered demand forecasting, improving inventory accuracy from 50% to 90% and making forecasting 10x more efficient.
EVERSANA built the first AI-powered marketing agency platform on Google Cloud using three master AI agents that create complete pharma campaigns in 30 minutes.
UFC partnered with IBM to build the UFC Insights Engine using IBM watsonx Orchestrate, tripling insight volume while reducing query generation time by 40% across 40+ annual live events for 700 million fans worldwide.
Scuderia Ferrari HP partnered with IBM Consulting to reimagine their mobile fan app using IBM watsonx AI, delivering AI-generated race summaries, personalized content, and interactive features that doubled daily active users and increased in-app engagement by 35%.
Blue Origin deployed 2,700+ AI agents with 70% company-wide adoption, achieving a 90% reduction in hardware development time using Amazon Bedrock.