REX Analytics captures the perceived end-user experience. No other benchmarking or monitoring product on the market records the whole test sequence end to end from both the telemetry and the visual standpoint.
Today, it is hard to measure the quality of the remote end-user experience in an objective, reproducible way. Local bare-metal systems and cloud-based remoting infrastructures have fundamentally different architectures, and adequate benchmarking methodologies and services are simply lacking.
At its core, REX uses predefined short test sequences in which synthetic users interact with a range of standard applications. The framework is flexible: you can easily add your own custom application sequences. During each test sequence, REX records screen output, user-interface timings, selected system performance counters, and network statistics. It then presents the user-experience data alongside the telemetry so you can see how the two relate. Results produced by the framework are vendor-independent, reproducible, and easy to interpret.
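To make the idea concrete, the loop of running a synthetic interaction sequence and recording per-step timings can be sketched as below. This is a minimal illustration only, not REX's actual API; the function names, step names, and record fields are hypothetical.

```python
import time

def run_sequence(steps):
    """Run a synthetic interaction sequence and record per-step timings.

    Hypothetical sketch of the concept, not REX's actual API. A real
    framework would also capture screen output, system performance
    counters, and network statistics alongside each step.
    """
    results = []
    for name, action in steps:
        start = time.perf_counter()
        action()  # simulate the synthetic user's interaction
        elapsed_ms = (time.perf_counter() - start) * 1000
        results.append({"step": name, "ui_latency_ms": round(elapsed_ms, 2)})
    return results

# A toy "application sequence": each step stands in for a user interaction
# such as opening a document or scrolling a page.
sequence = [
    ("open_document", lambda: time.sleep(0.01)),
    ("scroll_page",   lambda: time.sleep(0.005)),
]
report = run_sequence(sequence)
```

A custom application sequence would be added in the same way: a named list of steps, each driving the target application and yielding a timing record that can be correlated with the telemetry captured over the same interval.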