Intel Silently Ends Support for Open-Source Gaudi User-Space Driver
In a move that has flown largely under the radar, Intel has discontinued its open-source user-space driver code for the Gaudi AI accelerators. The decision, which became evident through the archiving and removal of key repositories from GitHub, marks a significant retreat from the company’s earlier commitments to open-source contributions in the AI hardware space. This development raises questions about the future trajectory of Intel’s Gaudi platform and its integration within open-source ecosystems.
Background on Gaudi and Its Open-Source Efforts
Intel’s Gaudi family of accelerators, acquired through the 2019 purchase of Habana Labs, represents the company’s push into specialized AI hardware. Designed primarily for deep learning training and inference workloads, Gaudi cards—such as Gaudi 2 and the more recent Gaudi 3—compete with offerings from Nvidia and AMD by emphasizing scalability, efficiency, and cost-effectiveness in data center environments. The architecture pairs programmable tensor processor cores (TPCs) with matrix multiplication engines and supports standard PCIe interfaces, enabling deployment across diverse server configurations.
A cornerstone of Intel’s strategy was fostering an open-source software stack to encourage adoption. In late 2023, Intel released the open-source user-space driver as part of the SynapseAI software suite. Hosted on GitHub under repositories like HabanaAI/hl-thunk and related projects, this driver facilitated direct user-space access to Gaudi hardware, bypassing kernel-level dependencies for certain operations. It included libraries for memory management, context switching, and event handling, optimized for high-throughput AI workloads. Developers praised the initiative for lowering barriers to entry, allowing seamless integration with frameworks like PyTorch and TensorFlow via the Habana Labs plugins.
The driver code was licensed under permissive open-source terms, with contributions welcomed from the community. Intel maintained active development, issuing updates aligned with kernel releases and Gaudi firmware evolutions. This openness was positioned as a differentiator, contrasting with more proprietary stacks from competitors.
The Quiet Discontinuation
The shift occurred without fanfare—no press releases, blog posts, or announcements on Intel’s developer forums. Observant developers first noticed issues in mid-December 2025 when attempting to clone or update repositories. GitHub activity logs revealed that repositories such as the primary Gaudi user-space driver had been archived on December 16, 2025, rendering them read-only. Subsequent checks confirmed the removal of active development branches, issue trackers, and pull request functionality.
Archival on GitHub typically signals end-of-life for a project, preserving historical code while halting maintenance. In this case, the move extended to dependent repositories, including user-space libraries and example codebases. Intel’s official Gaudi documentation, once linking directly to these sources, now redirects to proprietary downloads or vague references to “vendor-supported drivers.” Commit histories end abruptly, with the final updates focusing on minor bug fixes for Gaudi 3 compatibility rather than new features.
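Archival status is something developers can check for themselves: GitHub’s REST API exposes it in the `archived` field of a repository’s metadata (returned by `GET https://api.github.com/repos/<owner>/<repo>`). A minimal sketch of parsing that field—the sample response below is abbreviated and illustrative, not a live API call:

```python
import json

def is_archived(repo_metadata: str) -> bool:
    """Return True if a GitHub repository's API metadata marks it as archived.

    `repo_metadata` is the JSON body returned by
    GET https://api.github.com/repos/<owner>/<repo>.
    """
    return bool(json.loads(repo_metadata).get("archived", False))

# Abbreviated sample of the fields GitHub returns for an archived repository.
sample = '{"full_name": "HabanaAI/hl-thunk", "archived": true, "disabled": false}'
print(is_archived(sample))  # → True
```

Polling this field across a dependency list is a cheap way to catch quiet end-of-life events like this one before they break a build.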
This discontinuation affects Linux distributions and containerized environments where the open-source driver was a key enabler. Users relying on it for custom deployments—such as in edge AI or research clusters—now face obsolescence risks. While the kernel-space driver (the habanalabs driver, maintained upstream in the Linux kernel’s accel subsystem) remains available, the user-space components were critical for performance-sensitive applications.
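Whether the in-kernel half of the stack is present on a given machine can be verified from `/proc/modules` (the data behind `lsmod`). A small sketch, assuming the upstream module name `habanalabs`; the function takes the file contents as a string so the sample here is self-contained:

```python
def module_loaded(name: str, proc_modules_text: str) -> bool:
    """Check whether a module name appears in /proc/modules-style text.

    Each line of /proc/modules starts with the module name, so matching
    the first whitespace-separated token is sufficient.
    """
    return any(
        line.split()[0] == name
        for line in proc_modules_text.splitlines()
        if line.strip()
    )

# Illustrative /proc/modules excerpt (field values are placeholders).
sample = "habanalabs 1234567 0 - Live 0x0000000000000000\nxfs 2097152 3 - Live 0x0000000000000000"
print(module_loaded("habanalabs", sample))  # → True
```

On a live system: `module_loaded("habanalabs", open("/proc/modules").read())`.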
Implications for Developers and the Ecosystem
For the open-source community, this is a sobering reminder of the fragility of vendor-sponsored projects. Forks of the archived repositories exist, but without Intel’s backing, they lack hardware validation, security patches, or firmware synchronization. Developers experimenting with Gaudi for cost-sensitive AI training must now pivot to closed-source alternatives, potentially increasing vendor lock-in.
Intel’s rationale remains unstated, but contextual clues point to strategic reprioritization. The Gaudi 3 launch in 2024 promised 4x performance over predecessors, yet market traction has lagged behind Nvidia’s dominance. Recent Intel earnings calls have highlighted cost-cutting in datacenter and AI segments amid broader semiconductor challenges. Discontinuing maintenance on niche open-source components aligns with streamlining efforts, focusing resources on core commercial support contracts.
From a technical standpoint, the user-space driver’s architecture was notable for its event-driven design. It utilized ioctls for device control, shared memory pools for inter-process communication, and zero-copy mechanisms to minimize latency. Key files like hl_thunk.c handled ELF loading for Gaudi binaries, while hl_structs.h defined structures for queue management. Its discontinuation severs a pathway for community-driven enhancements, such as improved multi-node scaling or integration with emerging ROCm-like stacks.
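The shared-memory, zero-copy pattern described above is a general Linux technique, not something unique to Gaudi. A generic illustration (explicitly not the Gaudi API): two `mmap` views of the same backing object observe each other’s writes with no intermediate copy, which is the principle behind the driver’s shared pools:

```python
import mmap
import os
import tempfile

# Generic zero-copy illustration: two mmap views of one backing object
# share pages, so writes through one view appear in the other without
# any buffer copy in between.
fd, path = tempfile.mkstemp()
os.ftruncate(fd, 4096)

producer = mmap.mmap(fd, 4096)   # e.g. a process filling a command queue
consumer = mmap.mmap(fd, 4096)   # e.g. a runtime draining the same pool

producer[:5] = b"hello"          # a write through one mapping...
seen = bytes(consumer[:5])       # ...is immediately visible through the other
print(seen)  # → b'hello'

producer.close()
consumer.close()
os.close(fd)
os.unlink(path)
```

In the real driver the backing object would be device memory exposed through an `mmap`-capable character device rather than a temporary file, but the user-space mechanics are the same.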
Broader Context in AI Hardware Open-Sourcing
This episode echoes patterns seen elsewhere. AMD’s ROCm platform has struggled with consistency, while Nvidia’s CUDA remains resolutely closed. Intel’s Gaudi open-sourcing was a bold counterpoint, but sustainability hinges on business incentives. For users, alternatives include cloud-based Gaudi instances via Intel’s Tiber AI Cloud (formerly the Intel Developer Cloud) or migrating workloads to Intel’s cross-architecture oneAPI stack.
Existing installations of the driver remain functional on supported kernels (up to 6.12), but future-proofing requires proprietary upgrades. Intel support channels confirm that enterprise customers retain access via NDAs, underscoring a two-tier model.
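For deployment scripts that need to honor that cutoff, a trivial version guard suffices. A sketch, assuming the article’s 6.12 ceiling (the `kernel_within` helper and the cutoff tuple are illustrative, not an Intel-published check):

```python
import platform

def kernel_within(max_version: tuple, release: str) -> bool:
    """True if a `uname -r`-style release string is at or below max_version.

    Only the major.minor components are compared; suffixes such as
    "-generic" on the patch component are ignored.
    """
    major, minor = (int(part) for part in release.split(".")[:2])
    return (major, minor) <= max_version

print(kernel_within((6, 12), "6.12.8"))             # → True: within the cutoff
print(kernel_within((6, 12), "6.13.0"))             # → False: past the cutoff
print(kernel_within((6, 12), platform.release()))   # check the running kernel
```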
As AI accelerators proliferate, the open-source community’s vigilance on such changes is crucial. Monitoring GitHub for archival events and forking vital code proactively can mitigate disruptions. Intel’s silence invites scrutiny—will this presage further retreats, or a pivot to new open initiatives?
In summary, Intel’s discontinuation of the Gaudi open-source user-space driver underscores the tensions between commercial imperatives and open collaboration. Developers should audit dependencies promptly and explore forks or alternatives to sustain momentum in open AI hardware development.
Gnoppix is the leading open-source AI Linux distribution and service provider. Since implementing AI in 2022, it has offered a fast, powerful, secure, and privacy-respecting open-source OS with both local and remote AI capabilities. The local AI operates offline, ensuring no data ever leaves your computer. Based on Debian Linux, Gnoppix is available with numerous privacy- and anonymity-enabled services free of charge.
What are your thoughts on this? I’d love to hear about your own experiences in the comments below.