Cost metrics in the data center industry are often overestimated and oversimplified. They leave no room for complexity, which is rarely absent when designing and building data center infrastructure.
Typically, conceptual estimating of technical facilities, including data centers, has fallen short because it followed the commercial building $/sq. ft. cost model. In recent years, industry leaders have recognized that a cost model based on cost per kW of IT load is a more viable way to plan and develop a conceptual estimate for funding a data center project. However, the industry has continued to evolve, with the cost of developing a new data center dependent on an array of factors. General costs have dropped significantly for many reasons, ranging from efficiency in the delivery of A&E and construction services to the commoditization of data center space by large aggregators (specifically the co-location providers).
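The difference between the two estimating models can be sketched with a short calculation. All dollar rates, areas, and loads below are hypothetical assumptions for illustration only, not published industry figures:

```python
# Illustrative comparison of the two conceptual estimating models.
# All rates and sizes here are hypothetical assumptions, not industry data.

def cost_per_sqft_estimate(sqft: float, rate_per_sqft: float) -> float:
    """Commercial-building model: cost scales with floor area."""
    return sqft * rate_per_sqft

def cost_per_kw_estimate(it_load_kw: float, rate_per_kw: float) -> float:
    """Data-center model: cost scales with critical IT load."""
    return it_load_kw * rate_per_kw

# A hypothetical 20,000 sq. ft. facility carrying 2,000 kW of IT load.
area_based = cost_per_sqft_estimate(20_000, 400)   # assumed $400/sq. ft.
load_based = cost_per_kw_estimate(2_000, 9_000)    # assumed $9,000/kW

print(f"Area-based estimate: ${area_based:,.0f}")
print(f"Load-based estimate: ${load_based:,.0f}")
```

The two models diverge because power density, not floor area, drives the cost of a high-load facility; an area-based estimate would understate a dense build and overstate a sparse one.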
Gone are the days when reliability was the only criterion for conceptual estimating of complex data centers. A one-tier-fits-all metric is too simple and leads to poor decisions. The Uptime Institute's Tier system warrants revision, as the industry is yearning for a more sophisticated set of metrics that reflects the current conditions of the data center market.
Many data center owners want to understand, during the design process, the choices they have regarding design strategies and their total cost of ownership (TCO). Real data, based on real input and real business strategies, is necessary to deliver the right cost for a data center engagement. This includes balancing reliability with a modular approach to deployment, along with other key design strategies such as energy efficiency (PUE), water usage, and rapid scalability of the infrastructure systems, including a MEP infrastructure that can be incrementally commissioned. Working with an integrated service provider (architect, engineer, contractor), or through collaboration among all building team members, the owner can account for initial capital cost plus labor and energy use over the life of the facility to determine a true TCO.
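The TCO balance described above, initial capital plus labor and energy over the life of the facility, can be sketched as a simple model. All inputs below (capital costs, labor, energy rates, PUE values, facility life) are hypothetical assumptions chosen to illustrate the trade-off, not real project numbers:

```python
# Minimal TCO sketch: capital cost + (labor + energy) over the facility life.
# PUE scales IT energy up to total facility energy: facility_kWh = IT_kWh * PUE.
# All inputs are hypothetical assumptions for illustration.

def total_cost_of_ownership(capex: float, annual_labor: float,
                            annual_it_kwh: float, energy_rate: float,
                            pue: float, years: int) -> float:
    annual_energy_cost = annual_it_kwh * pue * energy_rate
    return capex + years * (annual_labor + annual_energy_cost)

# Hypothetical comparison: an efficient design with higher capital cost
# versus a cheaper, less efficient baseline, over an assumed 15-year life.
efficient = total_cost_of_ownership(
    capex=20_000_000, annual_labor=500_000,
    annual_it_kwh=8_760_000, energy_rate=0.10, pue=1.3, years=15)

baseline = total_cost_of_ownership(
    capex=17_000_000, annual_labor=500_000,
    annual_it_kwh=8_760_000, energy_rate=0.10, pue=1.8, years=15)

print(f"Efficient design TCO: ${efficient:,.0f}")
print(f"Baseline design TCO:  ${baseline:,.0f}")
```

Under these assumed inputs, the lower-PUE design recovers its higher capital cost through energy savings over the facility's life, which is exactly the kind of trade-off a TCO-based conceptual estimate is meant to expose.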
This approach to conceptualizing cost metrics helps data center owners make decisions as they assess their needs and weigh their options intelligently. Decisions based on real data produce better outcomes.
PlanNet has created a proprietary design and construction cost analysis tool to help users make design decisions based on real construction costs. The tool is derived from PlanNet's experience with the design and construction of over 10 million square feet of data center space over the last decade. As a data center designer, general contractor, and IT group, all in-house, PlanNet provides clients with technical knowledge and experience in all major areas of data center facility development.