Red Hat Enterprise Linux AI

Red Hat® Enterprise Linux® AI is a foundation model platform to seamlessly develop, test, and run Granite family large language models (LLMs) for enterprise applications.

Deploy with partners

Artificial intelligence (AI) model training requires optimized hardware and powerful computation capabilities. Get more from Red Hat Enterprise Linux AI by extending it with other integrated services and products.

Dell Technologies

Dell and Red Hat deliver a consistent AI experience through an optimized and cost-effective single-server environment.

Lenovo

Lenovo and Red Hat deliver optimized performance, helping customers quickly put AI initiatives into production.

NVIDIA

NVIDIA and Red Hat offer customers a scalable platform that accelerates a diverse range of AI use cases with unparalleled flexibility.

How Red Hat Enterprise Linux AI works

Red Hat Enterprise Linux AI brings together:

  • The Granite family of open source-licensed LLMs, distributed under the Apache-2.0 license with complete transparency on training datasets.
  • InstructLab model alignment tools, which open the world of community-developed LLMs to a wide range of users.
  • A bootable image of Red Hat Enterprise Linux, including popular AI libraries such as PyTorch and hardware-optimized inference for NVIDIA, Intel, and AMD accelerators (see the sketch after this list).
  • Enterprise-grade technical support and Open Source Assurance legal protections. 
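
For a concrete sense of how these pieces fit together, here is a minimal local-inference sketch that loads a Granite checkpoint with PyTorch and Hugging Face Transformers. The model ID, generation settings, and prompt are assumptions for illustration only, not a prescribed Red Hat Enterprise Linux AI workflow; in practice the platform's InstructLab tooling and serving stack handle much of this for you.

```python
# Minimal sketch: local inference with a Granite checkpoint via PyTorch and
# Hugging Face Transformers. The model ID below is an assumption; substitute
# whichever Granite model is available in your environment.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-3.0-8b-instruct"  # illustrative model ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # lower memory footprint on supported accelerators
    device_map="auto",           # spread layers across available devices (needs accelerate)
)

prompt = "Summarize the benefits of openly licensed large language models."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128)

print(tokenizer.decode(output[0], skip_special_tokens=True))
```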

Red Hat Enterprise Linux AI is portable across hybrid cloud environments, letting you scale your AI workflows with Red Hat OpenShift® AI and advance to IBM watsonx.ai for additional capabilities in enterprise AI development, data management, and model governance.

Take control of LLMs with open source

Generative AI (gen AI) is a catalyst for groundbreaking change, disrupting everything from how software is made to how we communicate. But the LLMs used for gen AI are often tightly controlled and cannot be evaluated or improved without specialized skills and significant cost.

The future shouldn’t be in the hands of the few. With Red Hat Enterprise Linux AI and its open source approach, you can encourage gen AI innovation with trust and transparency, while lowering costs and removing barriers to entry.

You can even contribute directly to AI model development with InstructLab, a community-driven solution for enhancing LLM capabilities.
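
As a rough illustration of what such a contribution looks like: InstructLab organizes community knowledge and skills as seed question-and-answer examples in a taxonomy of YAML files. The Python sketch below writes one such seed file; the field names and schema version shown are assumptions that may differ from the current taxonomy format, so check the InstructLab documentation before contributing.

```python
# Illustrative sketch of an InstructLab-style skill seed file, generated with
# Python for consistency with the other examples. Field names and the schema
# version are assumptions; consult the taxonomy docs for the current format.
import yaml

skill_entry = {
    "version": 2,                          # assumed schema version
    "created_by": "example-contributor",   # hypothetical contributor handle
    "task_description": "Explain container image tagging conventions.",
    "seed_examples": [
        {
            "question": "What does the 'latest' tag on a container image guarantee?",
            "answer": (
                "Nothing beyond being the default tag; it points to whatever "
                "build the publisher last assigned it to."
            ),
        },
    ],
}

with open("qna.yaml", "w", encoding="utf-8") as f:
    yaml.safe_dump(skill_entry, f, sort_keys=False)
```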

LLMs for the enterprise

Open source-licensed IBM Granite LLMs are included under the Apache-2.0 license and are fully supported and indemnified by Red Hat.

Community collaboration

InstructLab simplifies generative AI model experimentation and alignment tuning.

Cloud-native scalability

Red Hat Enterprise Linux image mode lets you manage your AI platform as a container image, streamlining your approach to scaling.

Acceleration and AI tooling

Open source support for hardware accelerators, plus the optimized deep learning features of PyTorch, delivers faster results.
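
As a small illustration of how a PyTorch workload can adapt to whichever accelerator is present, the sketch below selects a device at runtime. NVIDIA and AMD (ROCm) GPUs both surface through the CUDA backend, while Intel GPUs use the XPU backend in recent PyTorch releases; the fallback order is an assumption for this example, not a Red Hat Enterprise Linux AI requirement.

```python
# Minimal device-selection sketch for a PyTorch workload. NVIDIA and AMD
# (ROCm) GPUs are both exposed through torch.cuda; Intel GPUs appear via the
# torch.xpu backend in recent PyTorch releases. Falls back to CPU otherwise.
import torch

def pick_device() -> torch.device:
    if torch.cuda.is_available():
        return torch.device("cuda")
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return torch.device("xpu")
    return torch.device("cpu")

device = pick_device()
x = torch.randn(4, 4, device=device)
print(f"Running on {device}; tensor lives on {x.device}")
```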

Explore related resources

  • Article: RAG vs. fine-tuning
  • Article: What is InstructLab?
  • Product: Red Hat OpenShift AI
  • Overview: Red Hat Consulting: MLOps Foundation

Contact Sales

Talk to a Red Hatter

Reach out to our sales team below for Red Hat Enterprise Linux AI pricing information. 
To learn more about our partnerships, visit our catalog page.