AI Buyer Insights:

Swedbank, a Temenos T24 customer, evaluated Oracle Flexcube

Cantor Fitzgerald, a Kyriba Treasury customer, evaluated GTreasury

Michelin, an e2open customer, evaluated Oracle Transportation Management

Westpac NZ, an Infosys Finacle customer, evaluated nCino Bank OS

Citigroup, a VestmarkONE customer, evaluated BlackRock Aladdin Wealth

Moog, a UKG AutoTime customer, evaluated Workday Time and Attendance

Wayfair, a Korber HighJump WMS customer, recently evaluated Manhattan WMS

Experiment Tech Stack and Enterprise Applications

Experiment HCM
Vendor: Clara Labs · Previous System: Legacy · Application: Clara Interview Scheduling · Category: Interview Scheduling · Market: HCM · VAR/SI: n/a · When: 2017 · Live: 2017
Insight:
In 2017, Experiment implemented Clara Interview Scheduling as its Interview Scheduling solution to coordinate interviews and meetings for the small professional services firm. The deployment centered on the CEO as the primary user: Cindy Wu used Clara Interview Scheduling while traveling across America to arrange meetings with scientists. The implementation prioritized automated, assistant-like scheduling workflows that presented an always-on scheduler to external interviewees. Functional use emphasized availability polling, automated meeting confirmations, and the natural-language scheduling flows typical of the Interview Scheduling category, configured for a single executive workflow rather than enterprise role-based access. Operational coverage was narrow and practical, focused on CEO-level calendar and external interview coordination for a five-person company, and the rollout was effectively a single-user production deployment with minimal governance. Cindy Wu reported that Clara was most useful while on the road and that Clara Interview Scheduling made it seem like she had an assistant without hiring one, indicating the implementation stressed personal productivity and lightweight operational controls.
Experiment Collaboration
Vendor: Google · Previous System: Legacy · Application: Google Workspace (Formerly Google G-Suite) · Category: Collaboration · Market: Collaboration · VAR/SI: n/a · When: 2013 · Live: 2013
Insight:
In 2013, Experiment implemented Google Workspace (Formerly Google G-Suite) as its primary Collaboration platform. The deployment covers the full five-person professional services firm in the United States and serves as the central system for corporate email, calendar, document collaboration and cloud storage. Google Workspace was configured to support company-domain email hosting, shared Drive folders for client deliverables, collaborative editing with Docs and Sheets, and Meet for virtual meetings, aligning with standard Collaboration category capabilities. Configuration and administration are managed through the Google Admin console for user provisioning, role-based access control and domain management, with shared-folder permission models governing client- and project-level access. Operational scope is company-wide across all departments, consolidating communications and document management into Google Workspace and positioning it as the primary platform supporting communications, document collaboration and project coordination.
Experiment CRM
Vendor: Mixpanel · Previous System: Legacy · Application: Mixpanel · Category: Marketing Analytics · Market: CRM · VAR/SI: n/a · When: 2014 · Live: 2014
Insight:
In 2014, Experiment implemented Mixpanel on its public website. Mixpanel is deployed as the Marketing Analytics solution to instrument event-level user behavior and session interactions across the Experiment.com site. The implementation uses the Mixpanel JavaScript tracking library to capture custom events, properties, timestamps, and user identifiers in support of product and marketing analytics. Instrumentation and configuration emphasized the event taxonomy, client-side SDK tagging, and dashboarding for funnel and retention analysis, aligning the Mixpanel deployment with product and marketing workflows. Operational coverage is limited to the Experiment website and associated front-end pages, with analytics responsibilities concentrated in the small internal team. Governance and rollout practices included centralized event-naming conventions and a staged tagging rollout to establish consistent data feeds for Mixpanel dashboards and analysis.
Category: Marketing Automation · Market: CRM · When: 2016 · Live: 2016
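The centralized event-naming conventions described in the Mixpanel insight above can be sketched as a small validation layer sitting in front of the tracking call. The event names, the naming pattern, and the trackEvent wrapper below are illustrative assumptions for a setup like this, not Experiment's actual Mixpanel configuration; in production, send would delegate to a real SDK call such as mixpanel.track.

```typescript
// Hypothetical sketch of a centralized event taxonomy with a validating
// wrapper around the tracking call. All names here are assumptions.

// Convention: lowercase snake_case, e.g. "project_page_viewed".
const EVENT_NAME_PATTERN = /^[a-z]+(_[a-z]+)*$/;

// A small allow-list standing in for a centrally maintained event taxonomy.
const EVENT_TAXONOMY = new Set([
  "project_page_viewed",
  "signup_started",
  "pledge_completed",
]);

// An event name is valid only if it matches the convention AND is registered.
function isValidEventName(name: string): boolean {
  return EVENT_NAME_PATTERN.test(name) && EVENT_TAXONOMY.has(name);
}

// Thin wrapper that rejects non-conforming events before they reach the
// analytics feed; `send` stands in for the underlying tracking SDK call.
function trackEvent(
  name: string,
  props: Record<string, unknown>,
  send: (name: string, props: Record<string, unknown>) => void,
): boolean {
  if (!isValidEventName(name)) return false; // drop non-conforming events
  send(name, { ...props, tracked_at: new Date().toISOString() });
  return true;
}
```

A staged tagging rollout then amounts to growing the allow-list incrementally while the validator keeps ad-hoc or misspelled event names out of the dashboards.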
Experiment ITSM
Category: Application Performance Management · Market: ITSM · When: 2014 · Live: 2014
Experiment PaaS
Category: Apps Development · Market: PaaS · When: 2014 · Live: 2014
Category: Transactional Email · Market: PaaS · When: 2014 · Live: 2014
Experiment IaaS
Category: Application Hosting and Computing Services · Market: IaaS · When: 2014 · Live: 2014
Category: Cloud Storage · Market: IaaS · When: 2017 · Live: 2017
Category: Content Delivery Network · Market: IaaS · When: 2018 · Live: 2018

IT Decision Makers and Key Stakeholders at Experiment

First Name Last Name Title Function Department Email Phone
No data found

Apps Being Evaluated by Experiment Executives

APPS RUN THE WORLD tracks software evaluation trends across 2 million companies worldwide, including buyer insights from Experiment IT executives and key decision makers. As part of ARTW Buyer Intent and technographics insights, these findings provide useful visibility into Experiment's digital transformation priorities and AI adoption trends.
Date Company Status Vendor Product Category Market
No data found
FAQ - APPS RUN THE WORLD Experiment Technographics
Experiment is a Professional Services organization based in the United States, with around 5 employees and annual revenues of $1.0 million.
Experiment operates a diverse technology stack with applications such as Clara Interview Scheduling, Google Workspace (Formerly Google G-Suite) and Mixpanel, covering areas like Interview Scheduling, Collaboration and Marketing Analytics.
Experiment has invested in cloud applications and AI-driven platforms to optimize efficiency and growth, collaborating with vendors such as Clara Labs, Google and Mixpanel.
Experiment recently adopted applications including Amazon CloudFront in 2018, Clara Interview Scheduling in 2017 and Amazon S3 in 2017, highlighting its ongoing modernization strategy.
APPS RUN THE WORLD maintains an up-to-date database of Experiment’s key decision makers and IT executives, available to Premium subscribers.
Our research team continuously updates Experiment’s profile with verified software purchases, vendor relationships, and digital initiatives identified from public and proprietary sources.
Subscribe to APPS RUN THE WORLD to access the complete Experiment technographics profile, including detailed breakdowns by category, vendor, and IT decision makers.