What large-scale experience data reveals, and how leaders use it to calibrate decisions.
Most large organizations now collect some form of IT experience data. Surveys, feedback tools, and operational metrics are commonplace. Yet senior leaders still struggle to judge which signals matter, which are persistent, and which represent real risk.
This benchmark report exists to provide that sense of proportion.
Based on five years of consistent sentiment measurement, it offers a comparative view of how enterprise IT experience has evolved from 2020 to 2024 — and where frustration, confidence, and expectation continue to concentrate across core IT services.
Why benchmarks matter at executive level
Internal experience scores are easy to misread in isolation.
Without external context, leaders risk overreacting to noise or overlooking structural issues that persist year after year.
Benchmarks allow leaders to:
- Distinguish recurring experience patterns from localised complaints
- Understand which frustrations are industry-wide versus organization-specific
- Calibrate attention and investment rather than simply reacting
This report is designed to support that calibration.
The data behind the benchmarks
The analysis draws on a large, longitudinal dataset representing real enterprise environments:
- Experience feedback from approximately 25,000 employees
- Over 60,000 verbatim comments analysed using natural language processing
- Coverage across 1,000 locations and 20 industries
- Consistent measurement across 13 IT service domains
- Data collected continuously between 2020 and 2024
Automated analysis was combined with human review to ensure patterns reflect lived experience rather than isolated sentiment spikes.
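As a purely illustrative sketch, not the methodology behind this report, the general pattern of pairing automated sentiment scoring with targeted human review can be expressed as routing low-confidence classifications to a reviewer. The threshold, model choice, and comments below are hypothetical.

```python
# Illustrative only: not the report's analysis pipeline. Sketches the pattern of
# combining automated sentiment classification with human review of low-confidence
# cases. Assumes the Hugging Face transformers library is installed.
from transformers import pipeline

REVIEW_THRESHOLD = 0.75  # hypothetical confidence cutoff for routing to a human

classifier = pipeline("sentiment-analysis")  # downloads a default sentiment model

comments = [
    "The VPN drops constantly when I work from home.",
    "New laptop rollout went smoothly, no complaints.",
]

for comment in comments:
    result = classifier(comment)[0]  # e.g. {'label': 'NEGATIVE', 'score': 0.98}
    if result["score"] < REVIEW_THRESHOLD:
        # Low model confidence: flag the verbatim comment for human review
        print(f"HUMAN REVIEW: {comment!r} ({result['label']}, {result['score']:.2f})")
    else:
        print(f"AUTO: {comment!r} -> {result['label']} ({result['score']:.2f})")
```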
What the benchmarks show
Rather than focusing on headline scores, the report highlights patterns that remain stable across time and context:
- How experience pressure has shifted since 2020, including the sustained prominence of collaboration tools, rising frustration with security and authentication, and the growing gap between home and in-office technology.
- Where frustration concentrates: across devices, support models, enterprise applications, and identity controls, often in ways leaders underestimate because issues appear operational rather than strategic.
- Which signals are often over-weighted, and which tend to be early indicators of broader experience risk, confidence erosion, or disengagement.
- Where leadership attention is commonly misaligned, based on repeated patterns in how experience issues are surfaced, discussed, and acted upon.
How leaders use this report in practice
This report is typically used to:
- Provide external context during planning and prioritisation cycles
- Challenge or validate internal narratives about IT performance
- Support executive and finance discussions with comparative evidence
- Establish a shared reference point across IT, workplace, and leadership teams
It is not a one-time read, but a reference leaders return to as conditions change.
Download the report
The full report includes detailed domain-level benchmarks, trend analysis, and interpretive commentary designed for senior decision-makers.

