List of Google BigQuery Customers
Mountain View, CA 94043, United States
Since 2010, our global team of researchers has been studying Google BigQuery customers around the world, aggregating the massive volume of data points that forms the basis of our forecast assumptions and of our quarterly tracking of the rise and fall of vendors and their products.
Each quarter our research team identifies companies that have purchased Google BigQuery for Data Warehouse from public sources (press releases, customer references, testimonials, case studies and success stories) and proprietary sources, capturing customer size, industry, location, implementation status, partner involvement, LOB key stakeholders and contact details of the related IT decision-makers.
Companies using Google BigQuery for Data Warehouse include: Home Depot, a United States based Retail organisation with 470,000 employees and revenues of $159.51 billion; Lowe's, a United States based Retail organisation with 270,000 employees and revenues of $83.67 billion; Air France-KLM, a France based Transportation organisation with 78,399 employees and revenues of $36.48 billion; PayPal, a United States based Banking and Financial Services organisation with 24,400 employees and revenues of $31.80 billion; Morrisons, a United Kingdom based Retail organisation with 101,138 employees and revenues of $22.35 billion; and many others.
Contact us if you need a complete and verified list of companies using Google BigQuery, including the breakdown by Industry (21 verticals), Geography (region, country, state, city), Company Size (revenue, employees, assets) and the related IT decision-makers, key stakeholders, and business and technology executives responsible for the software purchases.
The Google BigQuery customer wins are incorporated into our Enterprise Applications Buyer Insight and Technographics Customer Database, which has over 100 data fields detailing each company's usage of software systems and its digital transformation initiatives. Apps Run The World wants to become your No. 1 technographic data source!
Apply Filters For Customers
| Logo | Customer | Industry | Empl. | Revenue | Country | Vendor | Application | Category | When | SI | Insight |
|---|---|---|---|---|---|---|---|---|---|---|---|
| | Activision Blizzard | Professional Services | 13000 | $7.5B | United States | Google | Google BigQuery | Data Warehouse | 2020 | n/a | |
In 2020, Activision Blizzard deployed Google BigQuery as a central Data Warehouse within a broader Google Cloud Platform resource delivery program. The Google BigQuery implementation was provisioned to support enterprise application teams and cross-functional business units, aligning the Data Warehouse with application development, analytics, and operations needs.
The implementation delivered a suite of GCP services including Compute Engine, Cloud Storage, Cloud Volumes, Google BigQuery, Cloud SQL, and Dataflow to enterprise application teams. Architects participated in project architecture decisions focused on IAM access, security controls, and network design, ensuring that Google BigQuery and adjacent services conformed to corporate identity and segmentation policies.
Standards for logging and monitoring were identified and established across cloud and on-premise datacenter environments, leveraging GCP-native Stackdriver and SolarWinds for centralized observability. The program emphasized closing gaps in telemetry and traceability so operational teams could monitor workloads running in Google BigQuery and other GCP services alongside on-prem systems.
Operationally the IT organization maintained a global virtualized infrastructure stack including hypervisor platforms, compute hardware, and SAN storage, and led implementation of hyperconverged infrastructure at remote sites. Server infrastructure lifecycle management covered initial deployment, access controls, optimization, maintenance, vulnerability management, patching, and decommissioning. IT services delivered to partner studios and business units included file sharing platforms, identity management, and SSL certificate administration, integrating these operational responsibilities with the Google BigQuery Data Warehouse and broader GCP estate.
| | Air France-KLM | Transportation | 78399 | $36.5B | France | Google | Google BigQuery | Data Warehouse | 2024 | n/a | |
In 2024 Air France-KLM implemented Google BigQuery as a central Data Warehouse component of a broader multi-cloud data and analytics strategy. The program focuses on consolidating group-level data into a common lakehouse architecture while leveraging Google BigQuery for scalable analytical processing and query federation across cloud-hosted datasets, enabling the Air France-KLM Google BigQuery Data Warehouse to serve as the primary analytical store.
The implementation layers Google Cloud data and analytics tooling alongside a dedicated, secured generative AI cloud instance to support gen AI and multimodal workloads. Functional capabilities highlighted include high-volume analytics, near real-time query performance through BigQuery, gen AI-enabled customer agent support with automatic documentation, and analytics-driven predictive aircraft maintenance that, according to the customer, reduced analysis time from hours to minutes.
Operational scope covers the entire Air France-KLM group, including commercial and cargo operations, engineering and maintenance teams, and product teams across the organization, reflecting the scale of three major airline companies, 551 aircraft, and the group passenger base. The deployment follows a multicloud posture and the group has emphasized maintaining full ownership and control of its data while using Google Cloud security and privacy controls to govern access and processing.
Governance and capability building are being addressed through training and support from Google Cloud, including data science and engineering curricula, security and infrastructure upskilling, hackathon-like events, and on-site and online sessions. These activities aim to democratize data access, embed Cloud and Data Warehouse practices into product teams, and create operational workflows that surface analytics and gen AI outputs into customer experience, maintenance, and airport and flight operations.
| | ANWB | Professional Services | 3948 | $1.2B | Netherlands | Google | Google BigQuery | Data Warehouse | 2019 | n/a | |
In 2019 ANWB implemented Google BigQuery as its Data Warehouse and established a centralized analytics store for web and experimentation data. Google BigQuery was positioned to receive event-level exports and to serve as the SQL backbone for conversion research and A/B test analysis, supporting both Universal Analytics and the later GA4 event model.
The implementation covered schema design, event-level export pipelines, and measurement plan alignment. Workstreams included managing Universal Analytics, migrating to Google Analytics 4, configuring Google Tag Manager, implementing Google Consent Mode, and piloting GTM server-side tagging to improve data quality and consent-aware collection.
Integrations were explicitly built between Google BigQuery and Google Analytics exports, Google Tag Manager instrumentation, and reporting layers in Google Data Studio. Teams used queries against Google BigQuery for conversion research, to feed experimentation workflows, and to populate dashboards in Google Data Studio for stakeholders.
Operational scope focused on analytics and experimentation functions within ANWB, with processes centered on measurement plan governance and tagging standards. The rollout emphasized tagging governance, consent controls, and a server-side tagging pilot to reshape collection workflows and dataset access procedures.
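The GA4 BigQuery export described above lands one row per event in daily-partitioned tables (`events_YYYYMMDD`), which teams query with a wildcard plus a `_TABLE_SUFFIX` filter. A minimal sketch of the kind of conversion-research query such a setup supports — the project, dataset, parameter key and event names below are hypothetical, not taken from ANWB's implementation:

```python
# Sketch: build a per-variant conversion query over a GA4 BigQuery event
# export. Dataset, parameter key and event names are hypothetical examples.

def conversion_query(project: str, dataset: str, start: str, end: str,
                     goal_event: str = "purchase") -> str:
    """Return SQL computing users and converters per experiment variant.

    GA4 exports use daily sharded tables (events_YYYYMMDD); the wildcard
    plus _TABLE_SUFFIX filter restricts the scan to the date range.
    """
    return f"""
    SELECT
      ep.value.string_value AS variant,  -- A/B test variant label
      COUNT(DISTINCT user_pseudo_id) AS users,
      COUNT(DISTINCT IF(event_name = '{goal_event}', user_pseudo_id, NULL)) AS converters
    FROM `{project}.{dataset}.events_*`,
      UNNEST(event_params) AS ep
    WHERE _TABLE_SUFFIX BETWEEN '{start}' AND '{end}'
      AND ep.key = 'experiment_variant'
    GROUP BY variant
    """

sql = conversion_query("anwb-analytics", "ga4_export", "20190101", "20190131")
print(sql)
```

The query string would be submitted through the BigQuery client or console; the `UNNEST(event_params)` join is the standard way to filter GA4's repeated key-value event parameters.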
| | ANZ Bank | Banking and Financial Services | 43094 | $13.4B | Australia | Google | Google BigQuery | Data Warehouse | 2019 | n/a | |
In 2019, ANZ Bank implemented Google BigQuery as its Data Warehouse to accelerate analytics across its Institutional Banking division, which operates in 34 markets globally. The initiative aimed to deliver rapid, meaningful insights to institutional customers on issues such as liquidity, risk, cash management, store location strategy, inventory and market positioning.
Google BigQuery was provisioned to support heavy computational queries and data science workloads, processing aggregated, de-identified datasets to produce customer recommendations and strategic analysis. According to the implementation notes, analysis that previously required five days to complete on a single table was reduced to 20 seconds, and bankers received meaningful business insights up to 250 times faster.
The deployment architecture combined Google BigQuery with Google Cloud Composer for orchestration of data movement and transformation, managing dependencies and multiple layers of the data pipeline, and Google Kubernetes Engine to host customized data services and data visualization experiences for customers. This stack separated responsibilities, with BigQuery handling analytic processing, Composer orchestrating ETL and workflows, and GKE providing a containerized platform for bespoke services and visualization components.
Rollout progressed from a proof of concept to production, with a governance emphasis on using aggregated, de-identified datasets to address regulatory requirements in a highly regulated industry. Operational coverage prioritized institutional bankers and customer-facing analytics workflows, automating previously manual tasks such as aggregated credit card data analysis and embedding faster analytical responses into institutional customer engagements.
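The governance pattern described — analysis only ever touches aggregated, de-identified data — can be sketched in a few lines of Python. The record fields and category labels here are illustrative assumptions, not details of ANZ's pipeline:

```python
# Sketch: roll raw card transactions up to (merchant_category, month)
# aggregates so that direct identifiers never reach the analytics layer.
# Field names and values are hypothetical.
from collections import defaultdict

def aggregate_deidentified(transactions):
    """Return {(merchant_category, 'YYYY-MM'): {'count', 'total'}} buckets.

    card_number and customer_id are read from the input but never emitted;
    only the aggregates cross into the Data Warehouse.
    """
    buckets = defaultdict(lambda: {"count": 0, "total": 0.0})
    for t in transactions:
        key = (t["merchant_category"], t["date"][:7])  # e.g. ('grocery', '2019-03')
        buckets[key]["count"] += 1
        buckets[key]["total"] += t["amount"]
    return dict(buckets)

txns = [
    {"card_number": "4111-…", "customer_id": "c1",
     "merchant_category": "grocery", "date": "2019-03-05", "amount": 42.5},
    {"card_number": "4222-…", "customer_id": "c2",
     "merchant_category": "grocery", "date": "2019-03-12", "amount": 17.5},
]
print(aggregate_deidentified(txns))
# {('grocery', '2019-03'): {'count': 2, 'total': 60.0}}
```

In a production setting the same shape of aggregation would typically run as a BigQuery `GROUP BY` so raw records never leave the governed dataset; the point of the sketch is the boundary, not the mechanism.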
| | AssemblyAI | Professional Services | 120 | $12M | United States | Google | Google BigQuery | Data Warehouse | 2025 | n/a | |
In 2025 AssemblyAI moved its data lake and analytics into Google BigQuery as the core of a Google Cloud Data Warehouse deployment. Google BigQuery sits alongside Google Cloud Storage and Bigtable within a petabyte-scale AI lakehouse, with Looker used for visualization and interactive analysis to support research and product teams.
The deployment uses a Kubernetes based orchestration fabric on Google Kubernetes Engine, and AssemblyAI built an orchestration layer using Cloud Composer so researchers can author scalable ETL and feature pipelines autonomously. Data processing pipelines leverage Dataflow and Dataproc for large scale transformation and ingestion, while Vertex AI and TPU pods provide on demand training and evaluation capacity; Google BigQuery is used for analytical queries, model telemetry, and aggregated training metadata.
Operational scope centers on AssemblyAI research and data infrastructure teams, enabling parallel experiments at scale and autonomous pipeline execution by research engineers. The environment expanded processing capacity from a few nodes and GPUs to thousands of nodes and GPUs and TPUs, and increased audio training volume from roughly 1M hours to over 12M hours while growing storage from about 100 TB to more than 1 PB.
Governance and workflow changes include centralized audit, logging, and data lineage in Google Cloud to trace which datasets trained which models, addressing privacy and data ownership requirements. The Cloud Composer orchestration plus Vertex AI evaluation loop shortened model evaluation cycles, enabling teams to run many experiments concurrently without deep engineering support.
Explicit outcomes reported by AssemblyAI include a 75% reduction in data storage and infrastructure costs, model evaluation time reduced from 24 hours to 20 minutes, and over 10 times the prior data storage capacity. By consolidating storage, processing, orchestration, model training, and analysis on Google Cloud and Vertex AI, AssemblyAI accelerated complex speech model development and compressed go to market cycles for new models.
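Cloud Composer, the orchestration layer mentioned above, is managed Apache Airflow: pipelines are declared as DAGs of dependent tasks and executed in dependency order. The core idea can be sketched without Airflow using Python's standard library — the task names below are made up for illustration, not AssemblyAI's actual pipeline:

```python
# Sketch: dependency-ordered execution of an ETL/feature pipeline graph,
# the scheduling idea behind Cloud Composer (Apache Airflow).
# Task names are hypothetical.
from graphlib import TopologicalSorter

# task -> set of upstream tasks that must finish first
pipeline = {
    "ingest_audio": set(),
    "transcode": {"ingest_audio"},
    "extract_features": {"transcode"},
    "load_bigquery": {"extract_features"},
    "train_eval": {"extract_features"},
}

def run(dag):
    """Execute every task in an order that respects all dependencies."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        # Real code would launch Dataflow/Dataproc or Vertex AI jobs here.
        print(f"running {task}")
    return order

order = run(pipeline)
```

Airflow adds retries, scheduling, and parallel execution of independent branches (here, `load_bigquery` and `train_eval` could run concurrently), which is what lets researchers author pipelines without deep engineering support.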
| | | Retail | 700 | $320M | France | Google | Google BigQuery | Data Warehouse | 2024 | n/a | |
| | | Banking and Financial Services | 1000 | $150M | Mexico | Google | Google BigQuery | Data Warehouse | 2021 | n/a | |
| | | Banking and Financial Services | 490 | $92M | Indonesia | Google | Google BigQuery | Data Warehouse | 2024 | n/a | |
| | | Banking and Financial Services | 2357 | $400M | Malaysia | Google | Google BigQuery | Data Warehouse | 2024 | n/a | |
| | | Banking and Financial Services | 400 | $432M | United States | Google | Google BigQuery | Data Warehouse | 2018 | n/a | |
Buyer Intent: Companies Evaluating Google BigQuery
- Stelo, a United States based Professional Services organization with 30 Employees
- Blackstone, a United States based Banking and Financial Services company with 4895 Employees
- Copel, a Brazil based Utilities organization with 6600 Employees
Discover Software Buyers actively Evaluating Enterprise Applications
| Logo | Company | Industry | Employees | Revenue | Country | Evaluated |
|---|---|---|---|---|---|---|
| No data found | | | | | | |