List of Apache Hudi Customers
Wilmington, DE 19801, United States
Since 2010, our global team of researchers has been studying Apache Hudi customers around the world, aggregating massive numbers of data points that form the basis of our forecast assumptions and track the rise and fall of vendors and their products on a quarterly basis.
Each quarter our research team identifies companies that have purchased Apache Hudi for Data Warehouse from public sources (press releases, customer references, testimonials, case studies and success stories) and proprietary sources, capturing customer size, industry, location, implementation status, partner involvement, line-of-business key stakeholders and contact details for the related IT decision-makers.
Companies using Apache Hudi for Data Warehouse include: Walmart, a United States-based Retail organisation with 2,100,000 employees and revenues of $681.00 billion; Uber, a United States-based Transportation organisation with 31,100 employees and revenues of $43.98 billion; Disney+ Hotstar India, an India-based Media organisation with 1,700 employees and revenues of $500.0 million; and many others.
Contact us if you need a complete and verified list of companies using Apache Hudi, including breakdowns by industry (21 verticals), geography (region, country, state, city), company size (revenue, employees, assets) and the related IT decision-makers, key stakeholders, and business and technology executives responsible for the software purchases.
The Apache Hudi customer wins are being incorporated into our Enterprise Applications Buyer Insight and Technographics Customer Database, which has over 100 data fields detailing company usage of software systems and digital transformation initiatives. Apps Run The World wants to become your No. 1 technographic data source!
| Logo | Customer | Industry | Empl. | Revenue | Country | Vendor | Application | Category | When | SI | Insight | Insight Source |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | Disney+ Hotstar India | Media | 1,700 | $500M | India | Apache Software | Apache Hudi | Data Warehouse | 2021 | n/a | In 2021, Disney+ Hotstar India implemented Apache Hudi, adopting it to stabilize and scale its analytics data lake for streaming ad and behavioral analytics. The implementation targeted high-read-volume use cases and aimed to simplify operational complexity in production analytics pipelines. The deployment focused on Apache Hudi incremental ingestion and CDC-style pipelines as described in public presentations, using Hudi capabilities for record-level upserts, incremental reads, and file-level compaction to support continuous ingestion and queryable change capture. Functional configuration emphasized incremental pull and compacted storage layouts to reduce read amplification and support ad hoc and analytical query patterns. Architecturally, the work involved migrating portions of the analytics data lake from HBase to Apache Hudi, integrating Hudi tables into the existing analytics storage and query layers within India. The scope included the streaming ad technology stack and behavioral analytics pipelines, with Hudi functioning as the table storage and incremental consumption layer for downstream analytics and reporting. Governance and rollout were executed as phased migrations of data domains, with pipeline orchestration adjusted to CDC-style flows and operational controls added for schema evolution and compaction policy management. The stated outcome was improved scale and query performance for Disney+ Hotstar India's streaming ad and behavioral analytics platform while addressing very high read volumes and previous operational complexity. | |
| | Uber | Transportation | 31,100 | $44.0B | United States | Apache Software | Apache Hudi | Data Warehouse | 2016 | n/a | In 2016, Uber implemented Apache Hudi to build a large-scale transactional data lake supporting trip and order analytics and finance and settlements pipelines, establishing a data engineering and analytics foundation for analytics and financial reporting workloads. The implementation centered on Apache Hudi as the application layer for transactional ingestion and incremental processing, deployed in the United States and scoped to analytics and finance teams responsible for settlement and trip-level reporting. Configuration leveraged Apache Hudi table types, using copy-on-write and merge-on-read patterns for incremental ingestion, compaction, and upsert semantics, and applied Hudi transactional write capabilities to enable consistent incremental pipelines. Operational design included scheduled compaction policies and incremental ingestion workflows, plus pipeline orchestration and partitioned table organization to support continuous ingest and incremental query performance. The deployment powered trip and order analytics and finance and settlements pipelines, and produced explicit runtime and cost outcomes reported by Uber, reducing batch pipeline runtime from approximately 20 hours to approximately 4 hours and cutting single-run costs by about 60 percent. Governance emphasized compaction scheduling, incremental processing controls, and monitoring of transactional consistency across Hudi tables to sustain operational reliability for analytics and financial workflows. | |
| | Walmart | Retail | 2,100,000 | $681.0B | United States | Apache Software | Apache Hudi | Data Warehouse | 2022 | n/a | In 2022, Walmart implemented Apache Hudi as the table format for its lakehouse initiative to support near-real-time retail analytics across supply chain and store transaction workloads. The decision followed internal evaluation and benchmarking, and Walmart reported improved ingestion performance and better schema enforcement in internal tests. The deployment positioned Apache Hudi as the canonical table format within the enterprise data lakehouse architecture, leveraging Hudi row-level upserts alongside merge-on-read storage with compaction to enable low-latency updates and efficient storage management. CDC ingestion patterns were used to capture transactional change streams, and Hudi configuration emphasized compaction scheduling and write optimization consistent with near-real-time ingestion requirements. Operational coverage was based in the United States and focused on supply chain analytics and store transaction workloads, with implementation scope spanning data engineering and analytics functions that consume lakehouse tables for reporting and downstream analytics. The implementation integrated Hudi-managed tables into existing ingestion pipelines and analytics reporting stacks, supporting both incremental update workflows and historical query patterns. Governance and operational practices were adapted to enforce Hudi table schemas and compaction policies, introducing routines for CDC handling, schema evolution controls, and compaction orchestration to maintain read performance. Outcomes reported by Walmart included improved ingestion performance and enhanced schema enforcement in internal benchmarks, as documented in Walmart technical communications. | |
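The deployments above all lean on the same two Hudi mechanisms: record-level upserts (with a table type chosen between copy-on-write and merge-on-read) and incremental queries for CDC-style downstream pipelines. As a minimal sketch of how those options are typically assembled for Spark's Hudi datasource, the following helpers build the relevant configuration dictionaries. The table name, key fields, and paths are illustrative assumptions, not details from any customer deployment described here.

```python
# Sketch of the Hudi upsert + incremental-read configuration pattern.
# Field names ("trip_id", "updated_at", "city") are hypothetical examples.

def hudi_write_options(table_name, record_key, precombine_field,
                       partition_field, table_type="MERGE_ON_READ"):
    """Datasource options for a Hudi upsert write.

    MERGE_ON_READ favours write-heavy, near-real-time ingestion
    (deltas land in log files that compaction merges later);
    COPY_ON_WRITE favours read-heavy analytics at the cost of
    rewriting base files on every update.
    """
    return {
        "hoodie.table.name": table_name,
        "hoodie.datasource.write.recordkey.field": record_key,
        "hoodie.datasource.write.precombine.field": precombine_field,
        "hoodie.datasource.write.partitionpath.field": partition_field,
        "hoodie.datasource.write.table.type": table_type,
        "hoodie.datasource.write.operation": "upsert",
    }

def hudi_incremental_read_options(begin_instant):
    """Options for an incremental query: only records changed after the
    given commit instant are returned, enabling CDC-style downstream
    pipelines instead of full-table rescans."""
    return {
        "hoodie.datasource.query.type": "incremental",
        "hoodie.datasource.read.begin.instanttime": begin_instant,
    }

# With a live Spark session and the Hudi bundle on the classpath,
# usage would look like (not executed here):
#   df.write.format("hudi") \
#       .options(**hudi_write_options("trips", "trip_id",
#                                     "updated_at", "city")) \
#       .mode("append").save("/data/lake/trips")
#   spark.read.format("hudi") \
#       .options(**hudi_incremental_read_options("20240101000000")) \
#       .load("/data/lake/trips")

opts = hudi_write_options("trips", "trip_id", "updated_at", "city")
print(opts["hoodie.datasource.write.operation"])  # upsert
```

Choosing MERGE_ON_READ here mirrors the near-real-time ingestion emphasis in the Walmart and Disney+ Hotstar accounts; a read-heavy reporting table would instead pass `table_type="COPY_ON_WRITE"`.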
Buyer Intent: Companies Evaluating Apache Hudi
Discover software buyers actively evaluating enterprise applications.
| Logo | Company | Industry | Employees | Revenue | Country | Evaluated |
|---|---|---|---|---|---|---|
| No data found | | | | | | |