Federal data center teams will have a new member suiting up this summer to help drive efficiency and optimization in their strategic plans. And this team member doesn’t take vacations or lunch breaks.
Nlyte Software has partnered with IBM’s Watson IoT group to develop a cognitive data center infrastructure management (DCIM) solution that can tap into the power of advanced analytics and artificial intelligence to make data center operations more resilient and efficient.
Today’s data centers are powered by an ecosystem of power distribution devices, cooling technologies, data backup applications, security software, backup generators, and batteries. And these data center environments keep getting more and more complex.
“The scale, complexity, and optimization in modern data centers requires analytics” for data center managers to make informed decisions, said Enzo Greco, chief strategy officer of Nlyte. “There are so many variables and inputs that it is very difficult to manage them traditionally. Analytics are a perfect use case for a complex environment like the data center,” Greco told MeriTalk.
Tomorrow’s efficient data center services will be “always-available, always-healing,” optimized for predictive maintenance, in which impending faults are flagged before they occur and workloads can be placed optimally based on informed data.
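Neither company has published the models behind that vision, but the predictive-maintenance idea itself can be sketched in a few lines: trend a known failure-precursor signal and estimate when it will cross a replacement threshold. The battery-resistance series, threshold, and cadence below are all hypothetical, purely for illustration:

```python
import numpy as np

# Hypothetical telemetry: weekly internal-resistance readings (milliohms)
# from a UPS battery string. Rising resistance is a common failure precursor.
weeks = np.arange(12)
resistance = np.array([4.1, 4.1, 4.2, 4.3, 4.3, 4.5,
                       4.6, 4.8, 4.9, 5.1, 5.3, 5.6])

FAILURE_THRESHOLD = 7.0  # assumed replace-before level, in milliohms

# Fit a linear trend and extrapolate to estimate weeks until the threshold.
slope, intercept = np.polyfit(weeks, resistance, 1)
if slope > 0:
    weeks_to_threshold = (FAILURE_THRESHOLD - resistance[-1]) / slope
    print(f"Trend: +{slope:.2f} mΩ/week; "
          f"estimated {weeks_to_threshold:.0f} weeks until replacement threshold")
else:
    print("No degradation trend detected")
```

A production system would use far richer models than a straight-line fit, but the output is the same in kind: a maintenance action scheduled before the fault, not after.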
This is why IBM Watson IoT is being embedded into Nlyte’s DCIM solution. Watson, IBM’s AI platform, provides cognitive search and analytics that connect and analyze a rich set of distributed data to improve decision-making and business outcomes, informing plans for data center operations.
As part of Nlyte’s DCIM family, Nlyte Energy Optimizer (NEO) provides real-time monitoring, alarming, trending, and power systems analysis for both IT and facilities infrastructure. “Nlyte Software is combining NEO with IBM Watson’s AI abilities to provide data centers with a new level of operational comprehensiveness. It is in the form of a cognitive solution that provides current analysis of total operations and also future insights into device failures,” Amy Bennett, a manager with IBM Watson IoT’s marketing team, wrote in IBM’s blog.
The union is accomplished by embedding IBM’s Predictive Maintenance and Optimization (PMO) solution within NEO. PMO enables asset-intensive organizations to apply machine learning and analytics to improve maintenance strategies and data center automation. “PMO will take data streamed from NEO and apply pre-determined patterns. The resulting analysis will be used by NEO to produce data center-specific reports or take action, such as controlling set points on thermal equipment,” according to Bennett.
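Bennett doesn’t specify what those patterns look like, but a minimal sketch of the closed loop she describes, telemetry in, setpoint recommendation out, might look like the following. Every name, threshold, and temperature band here is an assumption made for illustration (the target band loosely follows ASHRAE’s recommended inlet range), not Nlyte’s or IBM’s actual logic:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    device_id: str
    inlet_temp_c: float  # rack inlet temperature

# Hypothetical "pre-determined pattern": if the average inlet temperature
# across a zone drifts above a target band, lower the cooling setpoint a
# step; if it runs cold, raise it to save cooling energy.
TARGET_LOW, TARGET_HIGH = 22.0, 25.0
STEP_C = 0.5

def recommend_setpoint(readings: list[Reading], current_setpoint_c: float) -> float:
    avg = sum(r.inlet_temp_c for r in readings) / len(readings)
    if avg > TARGET_HIGH:
        return current_setpoint_c - STEP_C  # zone running hot: cool it down
    if avg < TARGET_LOW:
        return current_setpoint_c + STEP_C  # over-cooled: reclaim energy
    return current_setpoint_c

stream = [Reading("rack-07", 25.8), Reading("rack-08", 26.1), Reading("rack-09", 25.5)]
print(recommend_setpoint(stream, current_setpoint_c=20.0))  # -> 19.5
```

The interesting part of the real product is the “pre-determined patterns” themselves, learned rather than hard-coded, but the plumbing follows this shape: stream, analyze, act.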
Nlyte will provide implementation and support expertise for the combined NEO and PMO product. Nlyte is already testing the DCIM offering with customers, and the general rollout for agencies is slated for the end of June, Greco said.
This is all good news for Federal data center managers struggling to meet requirements mandated by the Federal government’s Data Center Optimization Initiative (DCOI). More than 30 Federal agencies deploy Nlyte DCIM tools to help them with data center management, optimization, cost reductions, and data center consolidation, said Andrew Ryan, Nlyte’s VP-federal accounts. “We’ve built a secure architecture for agency headquarters to take data from sub-agencies and consolidate it and report up to OMB [Office of Management and Budget],” Ryan said. He noted that Nlyte was one of the first DCIM vendors to provide all DCOI metrics and report on them within a single platform.
Other DCIM vendors, such as Vigilent, are applying machine learning capabilities to help organizations achieve greater efficiency in data center operations, said Rhonda Ascierto, research director for data centers and critical infrastructure at 451 Research.
“However, what Nlyte, APC Schneider Electric, and Eaton to a limited degree, are doing is offering analytics as-a-service,” Ascierto said. To do that, you need big data. Nlyte can pull data from large data lakes, which include information from many different data centers, locations, configurations, and equipment. “The greater the pool of data, the more effective that analysis can be rather than if you are just using your own data sets,” Ascierto added. “That’s the key.”
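A toy example makes Ascierto’s point concrete: an estimate built from one site’s data is noisier than the same estimate built from a pool of many sites. The failure rate and sample sizes below are invented for illustration:

```python
import random
import statistics

random.seed(0)

# Toy model: each "site" contributes noisy observations of the same
# underlying equipment failure rate. Pooling sites shrinks the estimate's error.
TRUE_RATE = 0.03

def site_sample(n: int = 200) -> list[int]:
    return [1 if random.random() < TRUE_RATE else 0 for _ in range(n)]

one_site = site_sample()
pooled = [x for _ in range(50) for x in site_sample()]  # a 50-site data lake

print(f"true failure rate:    {TRUE_RATE}")
print(f"single-site estimate: {statistics.mean(one_site):.4f}")
print(f"pooled estimate:      {statistics.mean(pooled):.4f}")
```

With 50 times the observations, the pooled estimate’s sampling error shrinks by roughly a factor of seven (the square root of 50), which is the statistical core of the analytics-as-a-service pitch.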