HPE introduces AI Cloud for large language models

By IANS | Published: June 21, 2023, 11:36 AM IST | Updated: June 21, 2023, 11:55 AM IST


Las Vegas, June 21: Hewlett Packard Enterprise (HPE) has entered the AI cloud market through the expansion of its HPE GreenLake portfolio to offer large language models (LLMs) for any enterprise.

The new offering is the first in a series of industry- and domain-specific AI applications, with future support planned for climate modelling, healthcare and life sciences, financial services, manufacturing, and transportation, the company said during its event here.

With the introduction of HPE GreenLake for LLMs, enterprises can privately train, tune, and deploy large-scale AI using a sustainable supercomputing platform that combines HPE's AI software and market-leading supercomputers.

"We have reached a generational market shift in AI that will be as transformational as the web, mobile, and cloud," said Antonio Neri, president and CEO, at HPE.

Now, organisations can embrace AI to drive innovation, disrupt markets, and achieve breakthroughs with an on-demand cloud service that trains, tunes, and deploys models at scale and responsibly, Neri added.

HPE GreenLake for LLMs will be delivered in partnership with Aleph Alpha, a German AI startup and HPE's first partner for the offering, to provide users with a field-proven, ready-to-use LLM for use cases requiring text and image processing and analysis.

Unlike general-purpose cloud offerings that run multiple workloads in parallel, HPE GreenLake for LLMs runs on an AI-native architecture uniquely designed to run a single large-scale AI training and simulation workload at full computing capacity.

The offering will support AI and HPC jobs on hundreds or thousands of CPUs or GPUs at once, the company said.
