AI Buyer Insights:

Moog, a UKG AutoTime customer evaluated Workday Time and Attendance

Citigroup, a VestmarkONE customer evaluated BlackRock Aladdin Wealth

Swedbank, a Temenos T24 customer evaluated Oracle Flexcube

Michelin, an e2open customer evaluated Oracle Transportation Management

Cantor Fitzgerald, a Kyriba Treasury customer evaluated GTreasury

Westpac NZ, an Infosys Finacle customer evaluated nCino Bank OS

Wayfair, a Korber HighJump WMS customer just evaluated Manhattan WMS

List of Google Cloud AI Infrastructure Customers

Logo Customer Industry Empl. Revenue Country Vendor Application Category When SI Insight
Avathon Professional Services 251 $35M United States Google Google Cloud AI Infrastructure AI infrastructure 2025 n/a
In 2025, Avathon implemented Google Cloud AI Infrastructure to expand its Autonomy Platform and accelerate autonomous operations across the energy sector. The deployment integrates the Autonomy Platform, a unified AI solution, to connect data, decisions, and execution for oil and gas, utilities, and renewable energy providers. The implementation positions Avathon to deliver AI infrastructure for energy operations, tying platform capabilities to asset management and supply chain business functions.

The implementation configures modules for AI-powered asset management, supply chain orchestration, real-time monitoring, predictive analytics, anomaly detection, predictive maintenance, and energy forecasting. Vertex AI is used to train and serve machine learning models to improve precision in anomaly detection, predictive maintenance, and forecasting. Gemini Enterprise enables the creation of agentic operational agents that automate complex workflows, deliver on-demand insights, and produce prescriptive recommendations embedded in operational processes.

Architecturally, the solution leverages Google Cloud's scalable cloud-native infrastructure for model training, inference, and agent orchestration, unifying disparate software systems and asset telemetry into a common operational data layer. The integrated platform supports real-time video intelligence; telemetry ingestion from compressor stations, pipelines, conventional power plants, wind turbines, solar inverters, and battery energy storage systems; and decisioning orchestrated across geographically dispersed assets. Integrations explicitly include Vertex AI and Gemini Enterprise alongside cloud services for scalable compute and data management to support continuous model lifecycle operations.

Governance emphasis centers on operationalizing data-driven decision-making and embedding autonomous, semi-autonomous, and adaptive workflows into maintenance, operations, and supply chain processes. The Avathon Autonomy Platform powered by Google Cloud delivers real-time insights, predictive analytics, and prescriptive recommendations to optimize energy production, minimize downtime, extend asset lifespan, improve safety, and enhance compliance readiness. This expanded collaboration builds on Avathon's prior deployments with leading energy providers and aligns Google Cloud AI Infrastructure with enterprise operational workflows and resilience objectives.
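Avathon's actual models are proprietary; as a minimal sketch of what a served anomaly-detection model for asset telemetry might compute, a rolling z-score detector (an assumption, not the vendor's implementation) can illustrate the idea:

```python
def detect_anomalies(readings, window=20, threshold=3.0):
    """Flag indices whose value deviates more than `threshold`
    standard deviations from the trailing-window mean.
    Simplified sketch of telemetry anomaly detection."""
    anomalies = []
    for i in range(window, len(readings)):
        hist = readings[i - window:i]
        mean = sum(hist) / window
        std = (sum((x - mean) ** 2 for x in hist) / window) ** 0.5
        if std > 0 and abs(readings[i] - mean) > threshold * std:
            anomalies.append(i)
    return anomalies

# Steady (hypothetical) compressor telemetry with one injected spike
telemetry = [100.0 + 0.1 * (i % 5) for i in range(50)]
telemetry[40] = 150.0
print(detect_anomalies(telemetry))  # → [40]
```

In a production deployment such logic would typically be replaced by a trained model served behind a Vertex AI endpoint, with the same ingest-score-alert flow.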
Bandai Namco Entertainment America Media 220 $40M United States Google Google Cloud AI Infrastructure AI infrastructure 2024 n/a
In 2024, Bandai Namco Entertainment America deployed Google Cloud AI Infrastructure to support Tekken 8, released in January 2024, using the platform as its AI infrastructure backbone for live game operations. The implementation was explicitly oriented to sustain tens of thousands of concurrent players and to strengthen online multiplayer, community features, and competitive modes across global launches.

The technical implementation centered on Google Kubernetes Engine and Spanner combined with Google Cloud secure networking, leveraging autoscaling and Google Cloud's global infrastructure to manage live game workloads at planet scale. Google Cloud AI Infrastructure was configured to host multiplayer backend services, a large-scale online visual lobby, and high-speed matching pipelines, using Spanner for globally consistent game state and GKE for container orchestration and elastic capacity. Integrations included a collaboration with Diarkis, whose proprietary matchmaking algorithms were paired with Google Cloud capacity to enable fast opponent matching and reliable session connectivity.

Operational coverage focused on online game services for Tekken 8, encompassing matchmaking, lobby communications, multiplayer session management, and community interaction features, with a deployment topology designed to minimize network latency and preserve a consistent player experience worldwide. Governance and rollout were executed as a multi-party collaboration between Bandai Namco Entertainment America, Google Cloud, and Diarkis, aligning live service operational responsibilities and playbook procedures for peak load events and launch windows. Outcomes documented in the deployment include reliable performance during peak usage, consistent and reliable gameplay access for players around the world, and enhanced multiplayer and community capabilities delivered via Google Cloud AI Infrastructure and related platform services.
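Diarkis's matchmaking algorithms are proprietary and not described here; as a simplified, hypothetical sketch of what rating-proximity matchmaking does (pairing queued players with the closest skill ratings), consider:

```python
def match_players(queue, max_gap=100):
    """Greedily pair queued players with adjacent skill ratings.

    `queue` is a list of (player_id, rating); a player whose nearest
    available opponent is more than `max_gap` rating points away stays
    unmatched. Simplified sketch - real matchmakers also weigh
    latency, region, and queue wait time.
    """
    pool = sorted(queue, key=lambda p: p[1])  # sort by rating
    matches, unmatched = [], []
    i = 0
    while i < len(pool):
        if i + 1 < len(pool) and pool[i + 1][1] - pool[i][1] <= max_gap:
            matches.append((pool[i][0], pool[i + 1][0]))
            i += 2
        else:
            unmatched.append(pool[i][0])
            i += 1
    return matches, unmatched

# Hypothetical queue: ids and ratings are illustrative only
queue = [("kazuya", 1500), ("jin", 1540), ("paul", 2100), ("king", 980)]
print(match_players(queue))  # → ([('kazuya', 'jin')], ['king', 'paul'])
```

At scale, the matched-session state would live in a globally consistent store such as Spanner, with the matching service itself running as autoscaled pods on GKE.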
Lockheed Martin Aerospace and Defense 121000 $71.0B United States Google Google Cloud AI Infrastructure AI infrastructure 2025 n/a
In 2025 Lockheed Martin implemented Google Cloud AI Infrastructure into its AI Factory ecosystem, integrating AI infrastructure capabilities to support national security, aerospace, and scientific applications. The Google Cloud AI Infrastructure deployment is intended to expand the company's ability to train, deploy, and sustain high-performance AI models as part of a multi-provider AI Factory approach, preserving traceability, reliability, and monitoring across model lifecycles.

The implementation centers on Vertex AI capabilities for model training, deployment, and customization of large language models, combined with AI Factory functions for traceability and monitoring. Functional modules include model training pipelines, deployment pipelines, model registry and versioning, runtime monitoring, and provenance recording to enable high-assurance model operations. These components align with the AI infrastructure category and support secure customization and lifecycle management of generative AI models. Google Cloud AI Infrastructure integrates directly with Lockheed Martin's AI Factory ecosystem and coexists with other leading AI providers, enabling hybrid workflows and model interchange. The architecture is described as supporting deployment and sustainment across global operations including air-gapped environments, applying capabilities to advanced intelligence analysis, real-time decision-making, predictive aerospace maintenance, optimized engineering designs, supply chain optimization, secure software development, customized workforce training, and accelerated scientific discovery.

Governance and operational changes emphasize secure, trustworthy AI practices, with built-in monitoring, traceability, and adherence to high standards of security and reliability. Process restructuring includes embedding generative AI outputs into decision-making workflows, formalizing model provenance and audit trails, and aligning development practices with supply chain and software security requirements. These governance measures are intended to enable controlled rollouts and high-assurance operations across relevant engineering and mission support functions. The announced collaboration states the integration will enhance Lockheed Martin's ability to train, deploy, and sustain models and accelerate AI-driven capabilities across targeted business functions. Lockheed Martin's use of Google Cloud AI Infrastructure is positioned as a component of a broader, multi-provider AI Factory architecture that prioritizes trust, security, and operational sustainment for mission-critical applications.
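The registry, versioning, and provenance functions described above are not publicly specified; as an illustrative sketch of the underlying idea (not Lockheed Martin's or Vertex AI's actual API), a minimal model registry recording an auditable lineage chain might look like:

```python
import hashlib
import time

class ModelRegistry:
    """Minimal illustrative registry: each version records an
    artifact fingerprint, its training-data identifier, and its
    parent version, yielding a walkable provenance chain."""

    def __init__(self):
        self._versions = {}  # model name -> list of version records

    def register(self, name, artifact: bytes, training_data_id, parent=None):
        record = {
            "version": len(self._versions.get(name, [])) + 1,
            "sha256": hashlib.sha256(artifact).hexdigest(),
            "training_data_id": training_data_id,
            "parent": parent,
            "registered_at": time.time(),
        }
        self._versions.setdefault(name, []).append(record)
        return record["version"]

    def lineage(self, name, version):
        """Walk parent links back to the first registered version."""
        chain = []
        while version is not None:
            rec = self._versions[name][version - 1]
            chain.append(version)
            version = rec["parent"]
        return chain

# Hypothetical usage: names and data ids are illustrative only
reg = ModelRegistry()
v1 = reg.register("maint-model", b"weights-v1", "telemetry-2024Q4")
v2 = reg.register("maint-model", b"weights-v2", "telemetry-2025Q1", parent=v1)
print(reg.lineage("maint-model", v2))  # → [2, 1]
```

In a Vertex AI deployment, the managed Model Registry would play this role, with the provenance and monitoring layers supplied by the surrounding AI Factory tooling.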
Professional Services 400 $220M Australia Google Google Cloud AI Infrastructure AI infrastructure 2024 n/a
Utilities 16800 $24.8B United States Google Google Cloud AI Infrastructure AI infrastructure 2025 n/a
Communications 2400 $600M United States Google Google Cloud AI Infrastructure AI infrastructure 2025 n/a

Buyer Intent: Companies Evaluating Google Cloud AI Infrastructure

ARTW Buyer Intent uncovers actionable customer signals, identifying software buyers actively evaluating Google Cloud AI Infrastructure. Gain ongoing access to real-time prospects and uncover hidden opportunities.

Discover Software Buyers actively Evaluating Enterprise Applications

Logo Company Industry Employees Revenue Country Evaluated
No data found
FAQ - APPS RUN THE WORLD Google Cloud AI Infrastructure Coverage

Google Cloud AI Infrastructure is an AI infrastructure solution from Google.

Companies worldwide use Google Cloud AI Infrastructure, from small firms to large enterprises across 21+ industries.

Organizations such as Lockheed Martin, NextEra Energy, Resemble AI, New Aim and Bandai Namco Entertainment America are recorded users of Google Cloud AI Infrastructure for AI infrastructure.

Companies using Google Cloud AI Infrastructure are most concentrated in Aerospace and Defense, Utilities and Communications, with adoption spanning over 21 industries.

Companies using Google Cloud AI Infrastructure are most concentrated in United States and Australia, with adoption tracked across 195 countries worldwide. This global distribution highlights the popularity of Google Cloud AI Infrastructure across Americas, EMEA, and APAC.

Companies using Google Cloud AI Infrastructure range from small businesses with 0-100 employees (0%) to mid-sized firms with 101-1,000 employees (50%), large organizations with 1,001-10,000 employees (16.67%), and global enterprises with 10,000+ employees (33.33%).

Customers of Google Cloud AI Infrastructure include firms across all revenue levels — from $0-100M, to $101M-$1B, $1B-$10B, and $10B+ global corporations.

Contact APPS RUN THE WORLD to access the full verified Google Cloud AI Infrastructure customer database with detailed Firmographics such as industry, geography, revenue, and employee breakdowns as well as key decision makers in charge of AI infrastructure.