IT Brief Canada - Technology news for CIOs & IT decision-makers

MSI unveils XpertStation WS300 for deskside AI power

Tue, 17th Mar 2026

MSI has launched the XpertStation WS300, a deskside system based on Nvidia DGX Station architecture, as companies look for local compute options for large language models and other data-heavy AI work.

The system uses the Nvidia GB300 Grace Blackwell Ultra Desktop Superchip and is positioned as a compact alternative to data centre infrastructure for development and deployment. XpertStation WS300 is available to order immediately.

"MSI has a strategic vision to advance AI-first computing," said Danny Hsu, General Manager of MSI's Enterprise Platform Solutions. "With NVIDIA, we are defining the next era of AI infrastructure, bridging centralized performance and distributed innovation, and enabling organizations to move from experimentation to production with greater speed, scale, and confidence."

Deskside compute

XpertStation WS300 includes up to 784GB of what MSI calls large coherent memory. The design combines HBM3e GPU memory and LPDDR5X CPU memory into a unified memory domain, improving CPU-GPU data sharing for model training and fine-tuning.

Networking is central to the WS300 specification. The system includes dual 400GbE connections based on Nvidia ConnectX-8 SuperNICs, for up to 800Gb/s aggregate bandwidth aimed at distributed AI workloads and multi-node scaling.

The machine also supports high-speed PCIe Gen5 and Gen6 NVMe storage for dataset ingestion and AI data pipelines during training and inference. It supports the Nvidia AI Software Stack.

Workflow scope

MSI is positioning the WS300 as a single system spanning development and deployment, covering model training, data-intensive analytics, and real-time inference. It also cited physical AI and robotics workloads.

The system can also serve as a central compute node for collaborative fine-tuning and on-demand deployment, an option for teams that want to keep proprietary data and intellectual property under their own control rather than running workloads in the cloud.

Agent tooling

Alongside the launch, MSI said it is collaborating with Nvidia on OpenShell, an open-source runtime for autonomous agents. It also referenced NemoClaw, described as an open-source stack that installs the OpenShell runtime and applies a policy-controlled sandbox.

OpenShell is intended for building and deploying autonomous, self-evolving agents with a stronger safety posture. It is optimised for dedicated AI systems and can be deployed on platforms based on DGX Spark and DGX Station architectures.

Developers can run OpenShell on the XpertStation WS300 with trillion-parameter models hosted locally, drawing on up to 20 petaFLOPS of AI compute and 784GB of memory. MSI described this as a route to always-on AI agents running at the desk without relying on cloud infrastructure.

EdgeXpert line

MSI also highlighted EdgeXpert, a separate platform line it said will support OpenShell. EdgeXpert is based on the Nvidia GB10 Grace Blackwell Superchip, with up to 1 petaFLOP of AI compute and 128GB of coherent unified memory.

The EdgeXpert configuration can host models with up to 200 billion parameters. MSI also described clustering via Nvidia ConnectX-7 networking, with support for up to four EdgeXpert units per cluster for larger workloads.

MSI also outlined how OpenShell could fit into existing development workflows. It said OpenShell lets AI agents, including coding agents, run inside a secure development sandbox without code changes, citing Claude Code, Codex, Cursor, and OpenCode as examples.

MSI said it will show live OpenShell demonstrations and display XpertStation and EdgeXpert systems at its booth during Nvidia GTC 2026.