Hewlett Packard Enterprise (HPE) has enhanced its AIOps network management capabilities by integrating multiple generative AI (GenAI) large language models (LLMs) into HPE Aruba Networking Central, its cloud-native network management offering hosted on the HPE GreenLake Cloud Platform. The service has also been adopted by Verizon Business as part of its expanded managed services portfolio.
Explaining how the enhancements will give the company an edge in an increasingly competitive marketplace, HPE said that, unlike other GenAI networking approaches that simply send API calls to public LLMs, Aruba Networking Central’s self-contained set of LLMs was designed with “innovative” pre-processing and guardrails to improve user experience and operational efficiency, with a focus on search response times, accuracy and data privacy.
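As an illustration only, the contrast between forwarding raw queries to a public LLM and applying local pre-processing and guardrails could be sketched as below; the topic filter and the local_llm interface are assumptions made for the example, not HPE’s published design.

```python
# Illustrative sketch: validate a query before it reaches a self-contained
# model, rather than sending the raw text to a public LLM API.
# ALLOWED_TOPICS and local_llm are hypothetical, not HPE's implementation.

ALLOWED_TOPICS = ("access point", "switch", "client", "ssid", "vlan", "firmware")

def passes_guardrails(query: str) -> bool:
    """Accept only queries that look like network operations questions."""
    q = query.lower()
    return any(topic in q for topic in ALLOWED_TOPICS)

def answer(query: str, local_llm) -> str:
    """Run a validated query against a locally hosted, self-contained model."""
    if not passes_guardrails(query):
        return "Query is outside the scope of network operations."
    return local_llm.generate(query)  # no call to an external, public LLM
```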
Will Townsend, vice-president and principal analyst at Moor Insights & Strategy, added: “Today, customers are searching for practical AI applications that can make meaningful improvements to their business. When it comes to effective network management, data and models matter. AI is not new to networking and security operations, but generative AI is, given its natural language interface. The challenge lies in tightly aligning queries with AI algorithms to deliver relevant and meaningful business outcomes. A set of purpose-built LLMs delivered in a self-contained sandbox delivers the best of both privacy and performance.”
Claiming to have one of the largest data lakes in the industry, HPE Aruba Networking has collected telemetry from nearly four million network-managed devices and more than one billion unique customer endpoints, and this data powers the service’s machine learning models for predictive analytics and recommendations.
The GenAI LLM functionality will be incorporated into HPE Aruba Networking Central’s AI Search feature, complementing the existing machine learning (ML)-based AI found throughout the platform to provide deeper insights, better analytics and more proactive capabilities.
As part of the expanded capabilities, the training sets for the platform’s GenAI models are up to 10 times larger than those of other cloud-based platforms, and include tens of thousands of publicly available HPE Aruba Networking-sourced documents, as well as more than three million questions captured from the customer base over many years of operations.
The launch also highlights what the company calls a commitment to taking advantage of AI safely, with a security-first approach to personally identifiable information (PII) and customer-identifiable information (CII): the LLMs are “sandboxed” in HPE Aruba Networking Central, running on the HPE GreenLake Cloud Platform. The platform also safeguards customer data with proprietary, purpose-built LLMs that remove PII and CII and improve search accuracy, all while delivering sub-second responses to network operations questions.
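For a rough sense of what removing PII and CII before inference can involve, the sketch below redacts recognisable identifiers such as email, IP and MAC addresses from a query; the patterns are assumptions for illustration and not HPE’s actual implementation.

```python
import re

# Hypothetical example of PII/CII redaction ahead of LLM inference;
# the patterns and placeholder tokens are illustrative assumptions.
PII_PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "IPV4":  re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "MAC":   re.compile(r"\b(?:[0-9A-Fa-f]{2}[:-]){5}[0-9A-Fa-f]{2}\b"),
}

def redact(query: str) -> str:
    """Replace likely PII/CII with placeholder tokens before the model sees it."""
    for label, pattern in PII_PATTERNS.items():
        query = pattern.sub(f"<{label}>", query)
    return query

print(redact("Why did client aa:bb:cc:dd:ee:ff at 10.1.2.3 drop off the corp SSID?"))
# -> "Why did client <MAC> at <IPV4> drop off the corp SSID?"
```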