Dell Starter AI Bundle: The perfect test environment for AI workloads
This bundle is a compact, AI-optimized test environment: it lets you evaluate models on dedicated hardware and prepare them for later integration.
The Dell PowerEdge server forms the high-performance hardware foundation and is equipped with two NVIDIA L4 GPUs. Red Hat Enterprise Linux AI provides an AI-optimized operating system, including InstructLab tools for model adaptation and the Red Hat AI Inference Server for professional model deployment. It is ideal for testing your own LLMs locally and exposing them as an inference endpoint.
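Because the Red Hat AI Inference Server is OpenAI compatible, a model served from this bundle can be addressed with the standard OpenAI Python client. The following is a minimal sketch, not part of the bundle documentation: the endpoint URL, API key, and model ID (a Granite model is used as a placeholder) depend on your own deployment.

```python
# Minimal sketch: query an OpenAI-compatible inference endpoint (e.g. the
# vLLM-based Red Hat AI Inference Server) with the standard OpenAI client.
# base_url, api_key, and the model ID are placeholders; adjust them to
# match your own deployment.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",      # assumed local inference endpoint
    api_key="not-needed-for-local-testing",   # many local deployments accept any token
)

response = client.chat.completions.create(
    model="granite-3.1-8b-instruct",  # placeholder; use the model ID your server reports
    messages=[
        {"role": "user", "content": "Summarize the benefits of running LLMs on-premises."}
    ],
    max_tokens=200,
)

print(response.choices[0].message.content)
```

Since the API follows the OpenAI conventions, existing applications can typically be pointed at the local endpoint by changing only the base URL and the model name.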
Configuration Details:
- Compute: 1 x Dell PowerEdge R660 (incl. 3 years of support)
- Software & Services: 1 x Red Hat Enterprise Linux AI (incl. 3 years of support)
Benefits at a Glance:
- Two NVIDIA L4 GPUs for AI workloads
- AI-optimized RHEL image with integrated Red Hat AI Inference Server (vLLM-based, OpenAI compatible)
- Validated open-source models (IBM Granite) ready to use (see the quick check after this list)
- InstructLab tools for model adaptation
- Container-native with GPU-optimized images for NVIDIA, Intel, and AMD
- Easy upgrade to OpenShift AI for MLOps
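As a quick check before wiring up applications, you can list the models the endpoint actually serves, for example to confirm that a bundled Granite model is available. This is again a hedged sketch against an assumed local deployment; the model-listing route is part of the standard OpenAI-compatible API, but the URL and credentials below are placeholders.

```python
# Minimal sketch: list the models exposed by an OpenAI-compatible endpoint.
# base_url and api_key are placeholders for your own deployment.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",      # assumed local inference endpoint
    api_key="not-needed-for-local-testing",
)

for model in client.models.list():
    print(model.id)  # use one of these IDs as the `model` argument in chat requests
```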

