How Morrisons Reduced Data Reporting Lag by 99% with BigQuery and Looker
Morrisons, one of the UK’s largest supermarkets serving nine million customers weekly across 500 stores, migrated its on-premises data warehouse to BigQuery and Looker, cutting reporting lag by 98.96%, from one day to 15 minutes. Real-time data now powers Vertex AI demand forecasting models and a customer-facing Product Finder app that receives 50,000 hits per day during peak periods.
Impact
- 98.96% reduction in data reporting lag
- 50,000 hits per day on the Product Finder app at peak
- Daily reporting frequency before migration
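The headline figure follows directly from the two latencies in the story: a once-daily report versus a pipeline that refreshes within 15 minutes. A quick check of the arithmetic:

```python
# Reporting lag before and after the migration, in minutes.
before_minutes = 24 * 60   # one daily report: 24 hours
after_minutes = 15         # Looker dashboards refresh within 15 minutes

reduction = (before_minutes - after_minutes) / before_minutes
print(f"{reduction:.2%}")  # 98.96%
```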
Challenge
Morrisons’ on-premises data warehouse could not connect to cloud systems, forcing daily manual exports that left reports and ML forecasting models always working one day behind real-world operations at a supermarket serving nine million customers weekly.
Solution
Morrisons migrated all operational data to BigQuery, integrated Looker for self-service reporting, and built Vertex AI forecasting models and a Gemini-powered customer Product Finder that delivers real-time shelf location data across 500 stores.
What Leaders Say
“By migrating our data to Google Cloud, we’ve gone from being able to report daily to having real-time reports to ensure our operation is running as effectively as possible for our customers.”
“With Looker, anybody in the business can access the data they need without needing to understand where the data is, how it works, or how to query it. The business insights we can now give our employees are endless.”
“The difference between then and now is like night and day. When we started, our data scientists had to wait to access the data each day. Now, not only can the data scientist access it in real time, our business colleagues can too. That’s all been made possible by Google Cloud.”
Full Story
Morrisons has operated across the UK since 1899 and today serves nine million customers a week across 500 supermarkets and 1,600 Morrisons Daily convenience stores. Its supply chain is unusually complex: the company operates its own farms and abattoirs, enabling it to control freshness from field to shelf. That level of vertical integration means Morrisons’ operations depend on precise, timely data — accurate demand forecasting, real-time inventory levels, and rapid customer feedback loops are all essential to getting the right product to the right shelf at the right time.
The problem was the data stack. Morrisons stored its operational data in an on-premises database that could not be connected to cloud systems for security reasons. Before any analysis or ML model could run, data had to be manually exported. Reports were produced daily at best. The Chief Data Officer, Peter Laflin, described the fundamental limitation: if you can’t access data in real time, you’re always building algorithms that are a day behind the real world.
Morrisons migrated to BigQuery as its central data warehouse, selecting it specifically for its native integration with Vertex AI, where the company’s ML engineers build demand forecasting models. Every sale across all Morrisons stores now flows directly into BigQuery, with data pipelines automatically updating Looker dashboards within 15 minutes. Looker replaced a legacy BI tool because its semantic layer significantly reduced the data modeling burden on analysts, allowing new dashboards to be shipped rapidly. A store manager can now look at a dashboard to see exactly what products are available at that moment and what needs to be ordered to maintain stock. Employees can even convert reports to audio files using NotebookLM with Gemini 2.0 to consume them while on the move.
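A store manager’s “what needs reordering right now” view boils down to a live query over the inventory tables in BigQuery. The sketch below shows roughly what such a query could look like; the dataset, table, and column names are illustrative assumptions, not Morrisons’ actual schema, and in practice Looker’s semantic layer would generate the SQL rather than hand-written strings:

```python
# Hypothetical sketch of a per-store low-stock query of the kind that could
# sit behind the real-time dashboards. All identifiers (retail_ops,
# live_inventory, column names) are illustrative, not the real schema.
def low_stock_query(store_id: int, reorder_threshold: int = 10) -> str:
    """Build a BigQuery SQL string listing products below a reorder threshold."""
    return f"""
    SELECT product_code, product_name, units_on_shelf
    FROM `retail_ops.live_inventory`
    WHERE store_id = {store_id}
      AND units_on_shelf < {reorder_threshold}
    ORDER BY units_on_shelf ASC
    """

print(low_stock_query(store_id=42))
```

In a real deployment this string would be executed with the `google-cloud-bigquery` client and the results surfaced through a Looker dashboard rather than printed.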
With BigQuery fully populated in real time, Morrisons built a customer-facing Product Finder feature in its app. At Easter, when seasonal promotions temporarily rearrange store layouts, the Product Finder received 50,000 hits per day. Customers type a product name, Gemini 1.5 Pro interprets the search and matches it to a product code, BigQuery returns availability and aisle location, and Cloud Run delivers the result in seconds. The same Gemini and BigQuery stack now processes all incoming customer feedback in real time, summarizing sentiment trends across the business so teams can respond immediately.
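The Product Finder flow described above is a short orchestration: interpret the free-text search, resolve it to a product code, look up availability and location, return the result. The sketch below shows that control flow with the Gemini and BigQuery steps replaced by stubbed stand-in functions; all names and data are illustrative assumptions, not the production API:

```python
# Minimal sketch of the Product Finder request flow. The Gemini call and the
# BigQuery lookup are stubbed with plain dictionaries so the orchestration is
# visible and runnable; nothing here is the real service interface.

def interpret_search(query: str) -> str:
    """Stand-in for the Gemini step: map a free-text search to a product code."""
    catalogue = {"hot cross buns": "P-1041", "easter eggs": "P-2210"}
    return catalogue.get(query.strip().lower(), "UNKNOWN")

def lookup_location(product_code: str, store_id: int) -> dict:
    """Stand-in for the BigQuery step: return availability and aisle location."""
    inventory = {("P-1041", 42): {"in_stock": True, "aisle": "Bakery, aisle 3"}}
    return inventory.get((product_code, store_id), {"in_stock": False, "aisle": None})

def product_finder(query: str, store_id: int) -> dict:
    """What a Cloud Run handler for this flow might orchestrate end to end."""
    code = interpret_search(query)
    result = lookup_location(code, store_id)
    return {"query": query, "product_code": code, **result}

print(product_finder("Hot Cross Buns", store_id=42))
```

The same shape (model call to normalize input, warehouse lookup, lightweight service to serve the answer) would also fit the real-time customer-feedback summarization the article mentions.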
Laflin describes the before-and-after as night and day. ML engineers no longer maintain infrastructure — Vertex AI’s managed service model lets them focus entirely on model development. The data team can now explore new forecasting problems in days rather than weeks, whether predicting next week’s sales volumes, identifying customers likely to purchase a given product, or optimizing the delivery network to reduce mileage.