During a presentation at the Data Center Dynamics (DCD) Virtual conference earlier this year, Rhonda Ascierto, Vice President of Research at the Uptime Institute, said that 70 percent of IT professionals believe artificial intelligence (AI) will be used to control and operate Data Centers, while only 30 percent believe it will reduce staffing. While the prevailing view is that AI may not yield substantial staff savings, many data center operators are deploying AI in their data centers to autonomously handle tasks like server optimization and equipment monitoring.
Artificial intelligence generally describes situations where a computer running a software program (a "virtual engineer") can perform tasks that typically require human intelligence; the term also encompasses machine learning and deep learning. Machine learning involves computers using algorithms and statistical models to carry out tasks without explicit instructions, learning from experience and acquiring skills without human involvement. Deep learning is a subset of machine learning: an information processing model that, like a human brain, processes nonlinear inputs and outputs in parallel. Similar to how humans learn from experience, a deep learning algorithm performs a task repeatedly, each time making minor adjustments to improve the outcome.
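The "repeat and adjust" loop described above can be sketched in a few lines. This toy example (illustrative only, not a production model) trains a single weight to learn the relationship y = 2x by repeatedly shrinking its prediction error:

```python
# Minimal sketch of iterative learning: a one-weight model learns
# y = 2*x by repeatedly nudging its weight to reduce the error.
# All numbers here are illustrative.
def train(samples, lr=0.05, epochs=200):
    w = 0.0  # initial guess
    for _ in range(epochs):
        for x, y in samples:
            error = w * x - y   # how far off the prediction is
            w -= lr * error * x # small correction toward the target
    return w

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = train(data)  # converges close to 2.0
```

Each pass makes only a small correction, but over many repetitions the model settles on the weight that best fits the data, which is the same principle a deep network applies across millions of weights.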
The use of AI (broadly speaking) in Data Centers requires a number of preconditions. First, there needs to be a solid understanding of the business rules for operations and the desired outcomes. Second, the quality of AI responses depends on the amount and accessibility of data: for good accuracy, AI typically needs at least 100,000 observations, and the data needs to be accessible (i.e., not siloed).
Data Center Infrastructure Management (DCIM) vendors have been talking about adding AI and especially predictive-analytics dimensions to their Data Center management tools for years. One of the interim challenges was not having enough data to inform the AI and ML models. Schneider Electric reports that it is now collecting data from between 250,000 and 300,000 devices deployed in customer Data Centers. According to Ascierto, “DCIM can now be considered a mainstream technology.”
The diagram below from Schneider Electric is a great representation of this lifecycle. As Frank Panza, Secure Power, Director of Business Development & Strategy at Schneider Electric, explains:
“We offer a full suite of solutions that range from license-based to Software as a Service that provides for: i) the collection of the data being generated by the DC Hardware (ie. UPS, HVAC, etc.); ii) the ability to analyze the collected data & iii) then work with clients to implement the Business Processes that meet the client’s Hardware Servicing needs.”
The other requirement for effective deployment and utilization of AI is involving qualified data scientists and professionals who understand the business requirements, the intricacies of the sophisticated models, and how to improve the models over time.
What are some of the use cases?
Efficiency. Businesses can deploy AI in Data Centers for energy savings. AI can learn and analyze temperature set points, test flow rates, and evaluate cooling equipment. Organizations can also train their AI by collecting critical data through smart sensors. With this approach, AI can identify sources of energy inefficiency and autonomously correct them to reduce energy consumption.
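As a rough illustration of how such an optimization might work, the sketch below searches cooling set points for the one that minimizes modeled total power. The power model is a made-up stand-in for one a real system would learn from sensor data; no vendor's actual algorithm is implied:

```python
# Illustrative sketch: choose the cooling set point that minimizes
# modeled total power. The model below is invented for illustration,
# standing in for one learned from smart-sensor data.
def total_power_kw(setpoint_c):
    cooling = 120.0 - 3.0 * setpoint_c       # warmer set point -> less chiller power
    it_fans = 0.5 * (setpoint_c - 18.0) ** 2 # but server fans ramp up when too warm
    return cooling + it_fans

candidates = [c / 2 for c in range(36, 55)]  # 18.0 .. 27.0 C in 0.5-degree steps
best = min(candidates, key=total_power_kw)
```

The real gain comes from letting the system relearn the power model continuously as conditions change, rather than relying on a fixed set point chosen at commissioning.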
Equipment monitoring. Artificial intelligence can identify defects in Data Center equipment using pattern-based learning through smart sensors installed in the hardware. If the AI system detects excessive or abnormally low vibration, or unwanted sounds, it notifies data center engineers about possible defects. With this approach, implementing AI in the data center can predict potential equipment failures and avoid downtime.
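A heavily simplified stand-in for this kind of pattern-based detection is a statistical outlier check. The sketch below flags vibration readings that deviate sharply from the baseline of the series (a z-score test, not any vendor's actual model):

```python
# Toy defect detector: flag sensor readings that deviate sharply
# from the rest of the series. A z-score test stands in for the
# pattern-based models described in the text.
from statistics import mean, stdev

def anomalies(readings, threshold=3.0):
    mu, sigma = mean(readings), stdev(readings)
    return [i for i, r in enumerate(readings)
            if sigma > 0 and abs(r - mu) / sigma > threshold]
```

In practice a trained model would replace the z-score, but the workflow is the same: learn what "normal" vibration looks like, then alert engineers when a reading falls outside it.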
Server optimization. Deploying AI in Data Centers can help distribute the workload across various servers with the help of predictive analytics. AI-powered load-balancing algorithms can learn from past data to distribute workloads efficiently, while AI-based server optimization can help find possible flaws in Data Centers, reduce processing times and resolve risk factors more quickly than traditional approaches. With this approach, organizations can maximize server performance.
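One minimal way to picture predictive load balancing (an illustrative sketch, not a specific product's algorithm) is to forecast each server's load from its recent samples and route new work to the server with the lowest forecast:

```python
# Sketch of predictive load balancing: forecast each server's load
# as a moving average of recent samples, then send new work to the
# server with the lowest forecast. Names and window size are
# assumptions for illustration.
from collections import deque

class PredictiveBalancer:
    def __init__(self, servers, window=5):
        self.history = {s: deque(maxlen=window) for s in servers}

    def record(self, server, load):
        self.history[server].append(load)

    def forecast(self, server):
        h = self.history[server]
        return sum(h) / len(h) if h else 0.0

    def pick(self):
        return min(self.history, key=self.forecast)
```

A production system would use a richer forecast than a moving average, but the decision loop is the same: predict where capacity will be, not just where it is now.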
Network Security. Identifying and analyzing security threats in a data center is extremely labor-intensive. Companies can use AI in Data Centers to learn normal network behavior and alert operators to abnormal activity, detect malware and security loopholes in data, and analyze incoming and outgoing data for security threats. AI can also be used for physical screening and monitoring, with facial recognition and video monitoring of equipment that triggers an alarm if something operates out of the ordinary.
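A toy version of "learn normal network behavior and alert on abnormal activity" might maintain a smoothed traffic baseline and alert on large spikes. Everything here (class name, thresholds, smoothing weight) is assumed for illustration:

```python
# Illustrative traffic monitor: keep an exponentially weighted
# baseline of requests per minute and raise an alert when traffic
# jumps far above it. Parameters are invented for the sketch.
class TrafficMonitor:
    def __init__(self, alpha=0.2, factor=3.0):
        self.baseline = None
        self.alpha = alpha    # smoothing weight for the baseline
        self.factor = factor  # how far above baseline counts as abnormal

    def observe(self, requests_per_min):
        if self.baseline is None:
            self.baseline = float(requests_per_min)
            return False
        alert = requests_per_min > self.factor * self.baseline
        if not alert:  # only learn from traffic that looks normal
            self.baseline += self.alpha * (requests_per_min - self.baseline)
        return alert
```

Real intrusion detection looks at far more than volume, but the shape is the same: the system learns "normal" from history and escalates only the deviations, sparing operators the labor-intensive manual review.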
Reduce IT-level downtime. AI can monitor server performance, network congestion and disk utilization to detect and predict data outages. More advanced systems can automatically take corrective action, notify affected users and help data center operations self-heal and recover from an outage.
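For example, a simple trend check can turn raw disk-utilization samples into an outage forecast. The sketch below (hypothetical; hourly samples assumed) fits a linear trend and estimates the hours remaining until the disk fills:

```python
# Illustrative outage prediction: fit a linear trend to disk-usage
# samples (one per hour, oldest first) and estimate hours until the
# disk is full, so operators or automation can act early.
def hours_until_full(usage_pct, capacity_pct=100.0):
    n = len(usage_pct)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(usage_pct) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, usage_pct)) \
            / sum((x - mean_x) ** 2 for x in xs)
    if slope <= 0:
        return None  # usage flat or shrinking: no predicted outage
    return (capacity_pct - usage_pct[-1]) / slope
```

The same pattern (trend the metric, project it forward, act before the threshold) applies to memory, connection counts and other leading indicators of an outage.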
Concerns over AI
While there are many benefits to AI for Data Centers, there are concerns to address. Automated modeling places some systems on autopilot, so IT operators need a thorough understanding of the range of outcomes the models can yield. It is not always possible to understand why a machine made a given decision, and some operators are concerned about losing the skills to operate their own Data Centers as more and more AI is employed. There are also legal and regulatory risks if things go wrong: where does the liability sit? With the client? With the AI vendor?
For more information on Cushman & Wakefield’s Global Data Center Advisory Group, contact us today.