At DCD Connect London, a panel on using data to drive better operational decisions stood out: it addressed a crucial step in optimizing data center operations that most data centers can implement.
Omdia view
Summary
The DCD Connect London event, held in early October, focused on future-proofing data centers. The atmosphere was optimistic, and most talks were at full capacity. A panel on using data to drive better operational decisions stood out because it addressed a crucial optimization step that most data centers can implement: realizing the value of operational data.
The power of data in making better operational decisions
On the first day, there were many panel discussions on topics ranging from heat reuse to the current talent shortage. However, one panel, titled “Data, data everywhere – but how do we actually use it to drive better operational decisions?”, offered more cutting-edge, forward-looking insights than the others. The speakers were Brian Kortendick, director of strategy and growth at MCIM; Lex Coors, chief data center technology and engineering officer at Digital Realty; John Shingler, executive vice president at T5 Data Centers; and Colm Shorten, senior director of data centers at JLL. Emma Brookes from DCD moderated the panel.
Operational data
At the start, the panelists agreed that companies and data centers mostly use operational data for internal purposes only. Operational data yields many valuable insights: up-to-date information on equipment performance, the basis for predictive maintenance strategies, informed decisions about resource allocation, and much more. Brian Kortendick of MCIM stated: “Unfortunately, data centers have no incentive to share operational data with the public or other data centers, such as where technical failures have occurred or potential system optimization opportunities, as those data centers want this information to be private.” Data centers withhold this information to maintain a competitive advantage and to ensure network resiliency, because operational details could give attackers insight into network and system vulnerabilities. The panel's consensus was that there is a growing trend of releasing such data to the public and local authorities. This newfound openness is a positive development, but it brings a need for the released data to be accurate.
Data collection and standardization
The discussion moved on to accuracy and security concerns when collecting personal data from members of the public, such as contact information, individual habits, and opinions. In most cases, companies use anonymized data to mitigate the associated risks. One crucial flaw of anonymized data is its tendency to contain inaccuracies, including systematic and random errors. The panel agreed that regulating data collection techniques and ensuring a standardized format for gathering personal data would be a step in the right direction. Standardization allows an accurate comparison across multiple data sources, while anonymization maintains individuals' privacy.
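The combination the panel describes can be sketched in a few lines. This is an illustrative sketch only, not anything shown at the event: the field names, schema mapping, and salt are hypothetical. It pseudonymizes a direct identifier with a one-way hash and maps records from two differently named sources into one standard schema so they become comparable.

```python
import hashlib

# Hypothetical mapping from source-specific field names to one
# standardized schema (names invented for illustration).
FIELD_MAP = {
    "email":         "contact_id",
    "contact_email": "contact_id",
    "temp_f":        "temperature_c",
    "temperature":   "temperature_c",
}

def pseudonymize(value: str, salt: str = "site-secret") -> str:
    """Replace a direct identifier with a salted one-way hash."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def standardize(record: dict) -> dict:
    """Rename fields to the standard schema, converting units and
    pseudonymizing identifiers along the way."""
    out = {}
    for field, value in record.items():
        std = FIELD_MAP.get(field, field)
        if std == "contact_id":
            value = pseudonymize(value)
        elif field == "temp_f":  # convert to the standard unit (Celsius)
            value = round((value - 32) * 5 / 9, 1)
        out[std] = value
    return out

a = standardize({"email": "jane@example.com", "temp_f": 77.0})
b = standardize({"contact_email": "jane@example.com", "temperature": 25.0})
assert a["temperature_c"] == b["temperature_c"]  # comparable across sources
assert "email" not in a  # identifier no longer stored in the clear
```

Note this is pseudonymization rather than full anonymization; real deployments would also need to consider re-identification from the remaining fields.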
The panel then shifted to data centers’ own data reporting and collection. Lex Coors of Digital Realty noted that data collection and reporting can be challenging “due to the overlapping nomenclature used in different fields, which can cause misinterpretation of data, even if researchers use the same terms.” The panel added that collection techniques themselves can skew results: different techniques can produce dissimilar results when gathering the same data. The data center industry needs a standard nomenclature for data collection and reporting to address these issues. Once data is standardized, data centers can use their operational data to make educated decisions, such as when to replace components based on past trends. Lex Coors added, “Having a strategy before analyzing the data is vital, as the same data can give many meaningful insights.” The panel concluded that, once a strategy is in place, generative artificial intelligence (AI) can surface valuable trends and insights from multiple operational data sources. This was previously impractical, both because humans take too long to process large amounts of data and because many data sources were incompatible owing to differing nomenclature. It is essential to focus on relevant data and disregard the rest; otherwise, data clutter can occur.
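The component-replacement decision mentioned above can be illustrated with a minimal sketch. All component names, ages, and the threshold are invented for illustration; the idea is simply that once failure logs share a standard nomenclature, past failure ages per component model can flag in-service units approaching their typical lifetime.

```python
from statistics import mean

# Hypothetical failure log: component model -> ages (hours) at which
# past units of that model failed.
past_failures = {
    "fan-A": [9000, 11000, 10000],
    "psu-B": [20000, 22000],
}

# In-service units: (unit id, model, current age in hours).
in_service = [
    ("fan-A-07", "fan-A", 9500),
    ("psu-B-02", "psu-B", 8000),
]

def replacement_candidates(threshold: float = 0.9) -> list[str]:
    """Return unit ids whose age exceeds threshold * mean failure age
    of their model, i.e. units worth replacing proactively."""
    flagged = []
    for unit, model, age in in_service:
        if age >= threshold * mean(past_failures[model]):
            flagged.append(unit)
    return flagged

print(replacement_candidates())  # prints ['fan-A-07']
```

A real predictive maintenance system would use survival models or failure-rate curves rather than a simple mean, but the dependence on consistent, comparable logs is the same.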
Data accuracy
The panel’s final topic was the importance of data accuracy. John Shingler of T5 Data Centers observed, “A data set’s accuracy is crucial when analyzing the most important metrics, such as a data center’s climate change emission metrics.” He then explained that data centers failing to report these metrics accurately could give the industry a bad public image, with detrimental consequences. These repercussions can materialize as “not in my backyard” (NIMBY) sentiment, prompting local laws or regulations that hinder the construction of new data centers. The panel agreed that the current data center model works well from an operational perspective, so many owners and operators want to stay within it, making change hard. To address this, data centers must improve the accuracy of operational data for critical metrics and make the infrastructure changes necessary to meet and exceed those metrics. Once most data centers have achieved this, they can move on to other metrics over time. This approach will help ensure the industry moves in the right direction, with high-quality data used to make informed decisions and drive innovation.
Analyst’s view
Leveraging operational data to enhance decision making is a practice that data centers, regardless of their size or budget constraints, can readily implement. Doing so offers a pathway to optimizing performance and efficiency, even for those with limited resources.
Establishing a standardized nomenclature for collecting operational data brings consistency, deepens understanding of familiar data, and opens the door to novel insights. Data centers can gain a more comprehensive perspective by combining data from multiple sources. Omdia sees this shift as a positive progression in data center operations: the emphasis lies on making the most of existing operational data, maximizing its value with minimal additional effort. This approach aligns with data centers’ broader goals of refining processes, reducing costs, and improving service delivery.
Appendix
Author
Aaron Lewis, Analyst, Cloud and Data Center