Intel has introduced the first update to the software side of Project Battlematrix. The company's new inference-optimized software stack orchestrates AI workloads across its Arc Pro B-series GPUs in multi-GPU workstations. The suite includes a Linux-based LLM Scaler for AI inference workflows.
Project Battlematrix is Intel's AI-focused initiative to bring highly capable Intel-powered AI workstations to market. The project combines Intel hardware and software into a cohesive workstation solution built around multiple Arc Pro B-series GPUs in a single system. Project Battlematrix workstations will come with Xeon CPUs, up to eight GPUs, and up to 192GB of total VRAM, with pricing ranging from $5,000 to $10,000.
Supporting Project Battlematrix workstations is a validated, full-stack containerized Linux solution that includes everything needed to get these systems up and running quickly and easily. The LLM Scaler is just one of several containers Intel is developing for this stack.