Automated Five-Step Approach for Quick, Actionable Results

VMware® IT Benchmarking™ products simplify the process of benchmarking the overall cost and quality of IT services. Their powerful Automated Analysis Engine automates data collection and validation, completes both industry and functional peer selection, and performs all benchmarking calculations to produce a final set of actionable benchmarking reports and data that can be analyzed within the VMware® IT Business Management Suite. VMware streamlines IT Benchmarking in five intuitive steps:

Phase 1: Select Towers (Standard Modules) for Benchmarking – Simply select the Towers you want to include in the benchmark and move immediately to data collection. Select any combination of modules and even perform multiple benchmarks on the same module. Available benchmarking Towers include:

  • Distributed Computing
  • Midrange Computing (UNIX, Linux, Mac OS, Windows NT/legacy OS, Windows 2000–2003, OS/400, Windows 2008+)
  • IT Help Desk
  • Mainframe Computing
  • Applications Development
  • Applications Support
  • Telecommunications - PBX
  • Telecommunications - Wireline Services
  • Telecommunications - Wireless Services
  • Wide Area Data Networking
  • End-User Survey*
  • Business Unit Survey*

Phase 2: Data Collection – In this phase, the most critical of the benchmarking process, you define the inputs to the benchmark that drive the final analysis and generated reports. Simply enter the information into each field, covering workload, cost, complexity, staffing, and aging. Offline data collection documents and tools are available to help you prepare, and VMware Professional Services can provide scoping and data collection support during this phase.
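For illustration only, the sketch below shows one way per-Tower inputs might be organized along the five dimensions named above (workload, cost, complexity, staffing, aging). The TowerDataCollection class, its field names, and all values are hypothetical; the product defines its own data collection fields.

```python
from dataclasses import dataclass, field

@dataclass
class TowerDataCollection:
    """Hypothetical per-Tower input record; real field names are product-defined."""
    tower: str                                      # e.g. "Midrange Computing"
    workload: dict = field(default_factory=dict)    # units of work, e.g. OS instances
    cost: dict = field(default_factory=dict)        # annual spend by category
    complexity: dict = field(default_factory=dict)  # environment attributes
    staffing: dict = field(default_factory=dict)    # FTEs by role
    aging: dict = field(default_factory=dict)       # asset age profile

# Example entry (values are illustrative, not product defaults)
midrange = TowerDataCollection(
    tower="Midrange Computing",
    workload={"os_instances": 420},
    cost={"hardware": 1_200_000, "software": 650_000, "personnel": 2_400_000},
    complexity={"sites": 12, "os_families": 5},
    staffing={"admin_ftes": 18.5},
    aging={"servers_over_5_years_pct": 22},
)
```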

Phase 3: Data Validation – To ensure quality input, the Automated Analysis Engine includes a robust validation process that performs multiple levels of data integrity checks to help ensure the data is correct before the final analysis begins. Complex validations are performed, and warnings flag potential issues. Once all data updates prompted by the validation results are complete, simply freeze the entered data in preparation for the analysis phase.
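A minimal sketch of the kind of multi-level validation described above, distinguishing hard errors from warnings and gating the freeze step on their resolution. The validate_tower function, its thresholds, and its field names are illustrative assumptions, not the Engine's actual rules.

```python
def validate_tower(tower: str, data: dict) -> tuple[list[str], list[str]]:
    """Illustrative multi-level validation; returns (errors, warnings)."""
    errors, warnings = [], []

    # Level 1: completeness - required dimensions must be present
    for dim in ("workload", "cost", "staffing"):
        if not data.get(dim):
            errors.append(f"{tower}: missing {dim} data")

    # Level 2: consistency - cross-field sanity checks
    if data.get("cost") and sum(data["cost"].values()) <= 0:
        errors.append(f"{tower}: total cost must be positive")

    # Level 3: plausibility - flag unusual ratios as warnings, not hard failures
    instances = data.get("workload", {}).get("os_instances", 0)
    ftes = data.get("staffing", {}).get("admin_ftes", 0)
    if ftes and instances / ftes > 200:                 # assumed threshold
        warnings.append(f"{tower}: unusually high OS instances per FTE")

    return errors, warnings

errors, warnings = validate_tower("Midrange Computing", {
    "workload": {"os_instances": 420},
    "cost": {"hardware": 1_200_000, "personnel": 2_400_000},
    "staffing": {"admin_ftes": 18.5},
})
can_freeze = not errors   # freeze data for analysis only once errors are resolved
```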

Phase 4: Data Analysis – The Engine automatically performs all critical statistical generation, peer selection, and analysis processing, giving each module its own focus, statistics, peers, and metrics. Analysis compares your metrics against peer group averages, industry averages, and database averages, producing multiple views into resource investments across the IT organization.
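As a rough illustration of this comparison step only: the sketch below assumes a simple percentage variance of your metric from each reference average. The function name, sample values, and formula are assumptions; the product's actual statistical processing is not shown here.

```python
from statistics import mean

def compare_metric(your_value, peer_values, industry_values, database_values):
    """Illustrative comparison of one metric against three reference averages."""
    def variance_pct(reference):
        avg = mean(reference)
        return avg, (your_value - avg) / avg * 100

    return {
        "peer_group": variance_pct(peer_values),
        "industry": variance_pct(industry_values),
        "database": variance_pct(database_values),
    }

# e.g. cost per OS instance (all values illustrative)
result = compare_metric(
    4_850,
    peer_values=[5_200, 4_700, 5_600],
    industry_values=[5_900, 5_100, 6_300, 4_950],
    database_values=[6_100, 5_400, 5_800, 5_200, 6_000],
)
for view, (avg, pct) in result.items():
    print(f"{view}: average {avg:,.0f}, you are {pct:+.1f}% vs. average")
```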

Phase 5: Report Generation – The report generation process uses all entered data, generated statistics, comparatives, and charts to create a final benchmark report. Each report is specific to your data and follows a standardized, easy-to-understand format across all modules.
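A toy sketch of the standardized per-module report layout described above; the section headings, keys, and values are assumptions for illustration and do not reflect the product's actual report format.

```python
def build_report(module_results: dict[str, dict]) -> str:
    """Illustrative assembly of a standardized report: one section per module,
    each combining entered data, generated statistics, and comparatives."""
    lines = []
    for module, content in module_results.items():
        lines.append(f"== {module} ==")
        for heading in ("Entered Data", "Statistics", "Comparatives"):
            lines.append(f"-- {heading} --")
            for key, value in content.get(heading, {}).items():
                lines.append(f"{key}: {value}")
    return "\n".join(lines)

print(build_report({
    "Midrange Computing": {
        "Statistics": {"cost_per_os_instance": "4,850"},
        "Comparatives": {"vs_peer_group": "-6.1%"},
    },
}))
```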

* These Towers are qualitative and involve surveying end users of IT and line-of-business decision makers. The only data required for these Towers is the list of respondents to include in the surveys and a review of the survey questions.