AI Buyer Insights:

Citigroup, a VestmarkONE customer evaluated BlackRock Aladdin Wealth

Swedbank, a Temenos T24 customer evaluated Oracle Flexcube

Michelin, an e2open customer evaluated Oracle Transportation Management

Moog, a UKG AutoTime customer evaluated Workday Time and Attendance

Cantor Fitzgerald, a Kyriba Treasury customer evaluated GTreasury

Wayfair, a Korber HighJump WMS customer just evaluated Manhattan WMS

Westpac NZ, an Infosys Finacle customer evaluated nCino Bank OS


List of Google Cloud AI Infrastructure Customers


Customer Industry Employees Revenue Country Vendor Application Category When SI Insight
Avathon Professional Services 251 $35M United States Google Google Cloud AI Infrastructure AI infrastructure 2025 n/a
In 2025, Avathon implemented Google Cloud AI Infrastructure to expand its Autonomy Platform and accelerate autonomous operations across the energy sector. The deployment integrates the Autonomy Platform, a unified AI solution, to connect data, decisions, and execution for oil and gas, utilities, and renewable energy providers. The implementation positions Avathon to deliver AI infrastructure for energy operations, tying platform capabilities to asset management and supply chain business functions. The deployment configures modules for AI-powered asset management, supply chain orchestration, real-time monitoring, predictive analytics, anomaly detection, predictive maintenance, and energy forecasting. Vertex AI is used to train and serve machine learning models to improve precision in anomaly detection, predictive maintenance, and forecasting. Gemini Enterprise enables the creation of operational agents that automate complex workflows, deliver on-demand insights, and produce prescriptive recommendations embedded in operational processes. Architecturally, the solution leverages Google Cloud’s scalable cloud-native infrastructure for model training, inference, and agent orchestration, unifying disparate software systems and asset telemetry into a common operational data layer. The integrated platform supports real-time video intelligence, telemetry ingestion from compressor stations, pipelines, conventional power plants, wind turbines, solar inverters, and battery energy storage systems, and orchestrates decisioning across geographically dispersed assets. Integrations explicitly include Vertex AI and Gemini Enterprise alongside cloud services for scalable compute and data management to support continuous model lifecycle operations.
Governance emphasis centers on operationalizing data-driven decision-making and embedding autonomous, semi-autonomous, and adaptive workflows into maintenance, operations, and supply chain processes. The Avathon Autonomy Platform powered by Google Cloud delivers real-time insights, predictive analytics, and prescriptive recommendations to optimize energy production, minimize downtime, extend asset lifespan, improve safety, and enhance compliance readiness. This expanded collaboration builds on Avathon’s prior deployments with leading energy providers and aligns Google Cloud AI Infrastructure with enterprise operational workflows and resilience objectives.
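The entry cites anomaly detection over asset telemetry but gives no implementation detail. As a minimal illustrative sketch of the decision shape such a pipeline might serve, here is a rolling z-score detector; the function, window, threshold, and telemetry values are hypothetical, not Avathon's:

```python
from statistics import mean, stdev

def detect_anomalies(readings, window=10, threshold=3.0):
    """Flag indices whose reading deviates more than `threshold`
    standard deviations from the trailing window's mean."""
    anomalies = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Hypothetical compressor-station vibration telemetry: steady, then a spike.
telemetry = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1, 0.9, 1.0, 1.0, 5.0, 1.0]
print(detect_anomalies(telemetry))  # the spike at index 11 is flagged
```

In the deployment described above, comparable logic would run as Vertex AI-trained models over streaming telemetry rather than a local function; the sketch only shows the underlying idea.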
Bandai Namco Entertainment America Media 220 $40M United States Google Google Cloud AI Infrastructure AI infrastructure 2024 n/a
In 2024, Bandai Namco Entertainment America deployed Google Cloud AI Infrastructure to support Tekken 8, released in January 2024, using the platform as its AI infrastructure backbone for live game operations. The implementation was explicitly oriented to sustain tens of thousands of concurrent players and to strengthen online multiplayer, community features, and competitive modes across global launches. The technical implementation centered on Google Kubernetes Engine and Spanner combined with Google Cloud secure networking, leveraging autoscaling and Google Cloud's global infrastructure to manage live game workloads at planet scale. Google Cloud AI Infrastructure was configured to host multiplayer backend services, a large scale online visual lobby, and high speed matching pipelines, using Spanner for globally consistent game state and GKE for container orchestration and elastic capacity. Integrations included a collaboration with Diarkis, whose proprietary matchmaking algorithms were paired with Google Cloud capacity to enable fast opponent matching and reliable session connectivity. Operational coverage focused on online game services for Tekken 8, encompassing matchmaking, lobby communications, multiplayer session management, and community interaction features, with a deployment topology designed to minimize network latency and preserve consistent player experience worldwide. Governance and rollout were executed as a multi‑party collaboration between Bandai Namco Entertainment America, Google Cloud, and Diarkis, aligning live service operational responsibilities and playbook procedures for peak load events and launch windows. Outcomes documented in the deployment include reliable performance during peak usage, consistent and reliable gameplay access for players around the world, and enhanced multiplayer and community capabilities delivered via Google Cloud AI Infrastructure and related platform services.
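Diarkis's actual matchmaking algorithms are proprietary and not described in the announcement. As a hedged sketch of what skill-based opponent matching over a player queue can look like in principle (player names, ratings, and the rating-gap rule below are invented for illustration):

```python
def match_players(queue, max_gap=100):
    """Pair queued players whose skill ratings are within `max_gap`.
    Greedy single pass over a rating-sorted queue; unpaired players wait.
    (Illustrative only; not Diarkis's proprietary algorithm.)"""
    ordered = sorted(queue, key=lambda p: p[1])  # (player_id, rating)
    matches, waiting = [], []
    i = 0
    while i < len(ordered):
        if i + 1 < len(ordered) and ordered[i + 1][1] - ordered[i][1] <= max_gap:
            matches.append((ordered[i][0], ordered[i + 1][0]))
            i += 2
        else:
            waiting.append(ordered[i][0])
            i += 1
    return matches, waiting

queue = [("kazuya", 1510), ("jin", 1480), ("paul", 2100), ("king", 2140)]
print(match_players(queue))
```

In the live service this pairing step would sit behind GKE-hosted backend services, with Spanner holding globally consistent session and player state.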
Lockheed Martin Aerospace and Defense 121000 $71.0B United States Google Google Cloud AI Infrastructure AI infrastructure 2025 n/a
In 2025, Lockheed Martin integrated Google Cloud AI Infrastructure into its AI Factory ecosystem, adding AI infrastructure capabilities to support national security, aerospace, and scientific applications. The Google Cloud AI Infrastructure deployment is intended to expand the company’s ability to train, deploy, and sustain high-performance AI models as part of a multi-provider AI Factory approach, preserving traceability, reliability, and monitoring across model lifecycles. The implementation centers on Vertex AI capabilities for model training, deployment, and customization of large language models, combined with AI Factory functions for traceability and monitoring. Functional modules include model training pipelines, deployment pipelines, model registry and versioning, runtime monitoring, and provenance recording to enable high-assurance model operations. These components align with the AI infrastructure category and support secure customization and lifecycle management of generative AI models. Google Cloud AI Infrastructure integrates directly with Lockheed Martin’s AI Factory ecosystem and coexists with other leading AI providers, enabling hybrid workflows and model interchange. The architecture is described as supporting deployment and sustainment across global operations, including air-gapped environments, applying capabilities to advanced intelligence analysis, real-time decision-making, predictive aerospace maintenance, optimized engineering designs, supply chain optimization, secure software development, customized workforce training, and accelerated scientific discovery. Governance and operational changes emphasize secure, trustworthy AI practices, with built-in monitoring, traceability, and adherence to high standards of security and reliability.
Process restructuring includes embedding generative AI outputs into decision-making workflows, formalizing model provenance and audit trails, and aligning development practices with supply chain and software security requirements. These governance measures are intended to enable controlled rollouts and high-assurance operations across relevant engineering and mission support functions. The announced collaboration states the integration will enhance Lockheed Martin’s ability to train, deploy, and sustain models and accelerate AI-driven capabilities across targeted business functions. Lockheed Martin’s use of Google Cloud AI Infrastructure is positioned as a component of a broader, multi-provider AI Factory architecture that prioritizes trust, security, and operational sustainment for mission-critical applications.
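The internals of the AI Factory's model registry and provenance recording are not public. As a minimal sketch of the general technique the entry describes — versioned model records whose lineage is protected by a content hash so audit trails can be verified — with all names, versions, and data references invented for illustration:

```python
import hashlib
import json
from datetime import datetime, timezone

def register_model(registry, name, version, training_data_ref, parent=None):
    """Append a model version to the registry with a provenance hash,
    so a later audit can verify the lineage entry was not altered.
    (Illustrative sketch; not Lockheed Martin's actual registry.)"""
    record = {
        "name": name,
        "version": version,
        "training_data_ref": training_data_ref,
        "parent": parent,
        "registered_at": datetime.now(timezone.utc).isoformat(),
    }
    stable_fields = {k: record[k] for k in ("name", "version", "training_data_ref", "parent")}
    payload = json.dumps(stable_fields, sort_keys=True)
    record["provenance_hash"] = hashlib.sha256(payload.encode()).hexdigest()
    registry.setdefault(name, []).append(record)
    return record

registry = {}
base = register_model(registry, "mx-analyst", "1.0", "gs://corpus/v1")   # hypothetical names
tuned = register_model(registry, "mx-analyst", "1.1", "gs://corpus/v2",
                       parent=base["provenance_hash"])
```

Chaining each version to its parent's hash gives the traceability property the entry emphasizes: any tampering with an earlier record invalidates every descendant's lineage.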
New Aim Professional Services 400 $220M Australia Google Google Cloud AI Infrastructure AI infrastructure 2024 n/a
In 2024, New Aim deployed Google Cloud AI Infrastructure to consolidate its digital infrastructure and centralize AI and big data capabilities as part of an AI infrastructure initiative. The deployment began in March 2024 to streamline multiple cloud and on-premises workloads into a single Google Cloud environment for the 400-employee professional services company serving Australian retailers. The scope explicitly covered New Aim’s proprietary platforms and its subsidiary Dropshipzone, supporting everyone from small challenger e-tailers to major national brands. New Aim configured AimCore on BigQuery to structure and analyze data from product sourcing, end-to-end logistics, and warehousing, establishing a managed data foundation on Google Cloud AI Infrastructure. AirOxy.AI was developed out of AimCore to operationalize generative AI workflows, leveraging Vertex AI and selected models from the Model Garden to produce pricing insights, surface market trends, and optimize product listing images. The implementation emphasizes model selection, data preparation pipelines, and inference orchestration native to Google Cloud. All AirOxy.AI capabilities and underlying data reside on Google Cloud, with native integrations reducing the compatibility issues and complexity previously encountered when combining open source and proprietary systems. Operational coverage spans New Aim’s e-commerce platforms, its Dropshipzone marketplace with more than 2,500 active retailers, and customer integrations that include national retailers such as Bunnings, Woolworths, Big W, and Baby Bunting. Data flows into BigQuery from sourcing, logistics, and warehousing services to feed real-time analytics and model training workloads. Governance focused on centralizing cloud operations and building skills internally, with Google Cloud providing engineering support and education resources to upskill New Aim’s engineers and address local requirements across its international offices.
New Aim implemented automated data optimization pipelines and layered security controls to prepare data for AI adoption and to reduce outage and threat risks. Internal processes were restructured around a single cohesive cloud environment and centralized AI infrastructure governance to simplify operations for a lean IT organization. New Aim reports reduced overall IT costs, strengthened cybersecurity, and an increase in service uptime from 97% to 99% following consolidation onto Google Cloud, and AirOxy.AI’s pilot launch in July 2024 has attracted strong industry demand. Google Cloud AI Infrastructure now underpins productized AI capabilities delivered through AirOxy.AI and Dropshipzone, extending pricing and market insight functions to SME retailers.
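AirOxy.AI's pricing models are not public. As a hedged sketch of the kind of pricing-insight rule a system like this might expose over BigQuery-structured sourcing and market data (the SKU, cost, competitor prices, and undercut/margin heuristics below are all invented for illustration):

```python
def pricing_insight(product, competitor_prices, margin_floor=0.15):
    """Suggest a listing price: undercut the median competitor price
    slightly, but never drop below cost plus a minimum margin.
    (Hypothetical logic; not AirOxy.AI's actual models.)"""
    prices = sorted(competitor_prices)
    median = prices[len(prices) // 2]
    suggested = round(median * 0.97, 2)               # undercut median by 3%
    floor = round(product["cost"] * (1 + margin_floor), 2)
    return max(suggested, floor)

item = {"sku": "NA-1042", "cost": 18.00}              # hypothetical SKU
print(pricing_insight(item, [29.95, 27.50, 31.00]))
```

In practice the competitor prices and unit costs would be queried from the BigQuery data foundation and the heuristic replaced by Vertex AI-served models; the sketch only shows the shape of the output.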
NextEra Energy Utilities 16800 $24.8B United States Google Google Cloud AI Infrastructure AI infrastructure 2025 n/a
In 2025, NextEra Energy implemented Google Cloud AI Infrastructure as part of an enterprise-wide digital transformation and to support development of multiple gigawatt-scale data center campuses paired with accompanying generation and capacity. The deployment positions Google Cloud AI Infrastructure as the core AI infrastructure platform for NextEra Energy, aligning AI platform, infrastructure, and models with the company’s strategic energy and data center buildouts. The implementation configures Google Cloud AI Infrastructure to provide model training and inference capacity, data pipeline orchestration, and scalable compute provisioning for energy planning and data center operations. Google Cloud AI Infrastructure is being used to centralize AI workloads, standardize model lifecycle processes, and enable platform services such as model hosting and automated data ingestion across enterprise analytics and engineering workflows. Operationally, the work links AI infrastructure to NextEra Energy’s data center campus architecture and the accompanying generation and capacity projects, creating a co-engineered relationship between compute demand and energy supply planning. The arrangement frames integrations between Google Cloud platform services and NextEra Energy’s enterprise systems for planning, construction oversight, and operational orchestration, supporting both core cloud and edge compute patterns across the campuses. Governance is structured as a joint NextEra Energy and Google Cloud collaboration to sequence campus buildouts, AI deployment, and go-to-market acceleration, with cross-functional coordination across engineering, operations, and commercial teams. The stated goals include accelerating AI deployment at scale and reimagining how energy companies operate, while aligning AI infrastructure investments with physical energy infrastructure buildouts.
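The "co-engineered relationship between compute demand and energy supply planning" can be made concrete with the standard facility-power arithmetic used in data center planning. The rack counts, per-rack power, and PUE below are assumptions for illustration, not NextEra Energy figures:

```python
def campus_power_mw(racks, kw_per_rack, pue=1.2):
    """Estimate total facility power for a data center campus:
    IT load (racks x kW per rack) scaled by power usage
    effectiveness (PUE) to cover cooling and distribution losses.
    (Illustrative planning arithmetic only.)"""
    it_load_kw = racks * kw_per_rack
    return round(it_load_kw * pue / 1000, 1)  # facility demand in MW

# A hypothetical campus of 10,000 dense AI racks at 100 kW each
# already lands at gigawatt scale, which is why campus buildouts
# are sequenced together with generation and capacity projects.
print(campus_power_mw(10_000, 100))
```

This is the sense in which gigawatt-scale campuses must be paired with accompanying generation: the facility-side demand is fixed by compute plans well before the first workload runs.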
Communications 2400 $600M United States Google Google Cloud AI Infrastructure AI infrastructure 2025 n/a
Showing 1 to 6 of 6 entries

Buyer Intent: Companies Evaluating Google Cloud AI Infrastructure

ARTW Buyer Intent uncovers actionable customer signals, identifying software buyers actively evaluating Google Cloud AI Infrastructure. Gain ongoing access to real-time prospects and uncover hidden opportunities.

Discover Software Buyers actively Evaluating Enterprise Applications

Logo Company Industry Employees Revenue Country Evaluated
No data found
FAQ - APPS RUN THE WORLD Google Cloud AI Infrastructure Coverage

Google Cloud AI Infrastructure is an AI infrastructure solution from Google.

Companies worldwide use Google Cloud AI Infrastructure, from small firms to large enterprises across 21+ industries.

Organizations such as Lockheed Martin, NextEra Energy, Resemble AI, New Aim and Bandai Namco Entertainment America are recorded users of Google Cloud AI Infrastructure for AI infrastructure.

Companies using Google Cloud AI Infrastructure are most concentrated in Aerospace and Defense, Utilities, and Communications, with adoption spanning over 21 industries.

Companies using Google Cloud AI Infrastructure are most concentrated in United States and Australia, with adoption tracked across 195 countries worldwide. This global distribution highlights the popularity of Google Cloud AI Infrastructure across Americas, EMEA, and APAC.

Companies using Google Cloud AI Infrastructure range from small businesses with 0-100 employees (0%) to mid-sized firms with 101-1,000 employees (50%), large organizations with 1,001-10,000 employees (16.67%), and global enterprises with 10,000+ employees (33.33%).

Customers of Google Cloud AI Infrastructure include firms across all revenue levels — from $0-100M, to $101M-$1B, $1B-$10B, and $10B+ global corporations.

Contact APPS RUN THE WORLD to access the full verified Google Cloud AI Infrastructure customer database with detailed Firmographics such as industry, geography, revenue, and employee breakdowns as well as key decision makers in charge of AI infrastructure.