
Huawei Commits to Open-Source AI Stack by End of 2025


By Kevin Bryant

Kevin Bryant is a technology correspondent for Neurozzio, focusing on the intersection of software development, artificial intelligence, and business strategy. He reports on developer tools, APIs, and the companies building foundational AI infrastructure.


Huawei has announced a comprehensive plan to make its entire artificial intelligence software stack open-source by December 31, 2025. The move, detailed at Huawei Connect 2025, includes the company's foundational CANN toolkit, the Mind series developer tools, and its openPangu foundation models. This strategy is designed to address developer feedback and improve the usability of its Ascend AI hardware.

The company's leadership acknowledged previous challenges faced by developers using its ecosystem. Eric Xu, Huawei’s Deputy Chairman and Rotating Chairman, stated that the decision was heavily influenced by customer feedback, signaling a strategic shift toward greater transparency and community collaboration to accelerate AI development on its platform.

Key Takeaways

  • Full Stack Release: Huawei will open-source its AI software, including the CANN toolkit, Mind series tools, and openPangu models.
  • Firm Deadline: The company has set a target date of December 31, 2025, for the public release of the software.
  • Developer-Focused: The strategy directly addresses past developer friction and aims to build a community around Huawei's Ascend hardware.
  • Compatibility: The plan includes support for major AI frameworks like PyTorch and integration with existing operating systems to lower adoption barriers.

Addressing Past Challenges with an Open Approach

During his keynote at Huawei Connect 2025, Eric Xu openly discussed the difficulties developers have encountered with the company's Ascend AI infrastructure. He noted that feedback from customers revealed numerous issues and expectations that prompted a strategic review.

"Our customers have raised many issues and expectations they’ve had with Ascend. And they keep giving us great suggestions," Xu stated, framing the open-source initiative as a direct response to the developer community.

This admission of past friction provides important context for the company's new direction. By making its software stack public, Huawei aims to leverage community contributions to improve tooling, documentation, and overall platform maturity. The goal is to bridge the gap between the technical power of Ascend hardware and its practical usability for developers.

A Detailed Look at the Open-Source Components

Huawei's plan involves a multi-layered release of its core AI software. The strategy distinguishes between providing open interfaces for some components and fully open-sourcing others, giving developers varying levels of access and control.

The Foundational CANN Toolkit

The Compute Architecture for Neural Networks (CANN) is the foundational layer that connects AI frameworks to Ascend hardware. According to the plan, Huawei will open the interfaces for its compiler and virtual instruction set. This allows developers to see how their code is translated and optimized for Ascend processors.

CANN Open-Source Details

  • Open Interfaces: Compiler and virtual instruction set.
  • Full Open-Source: All other CANN software components.
  • Hardware Basis: The release will be based on the existing Ascend 910B and 910C chip designs.

While the compiler implementation may remain partially proprietary, open interfaces provide crucial visibility for performance tuning. This is especially important for developers working on applications where latency and efficiency are critical. The release is scheduled for December 31, 2025.

Mind Series Tools and openPangu Models

In contrast to CANN's tiered approach, Huawei has committed to fully open-sourcing its Mind series application development kits and toolchains. This includes the SDKs, libraries, debuggers, and profilers that developers use in their daily workflows.

A full open-source release means the community can inspect, modify, and extend the entire toolchain. This could lead to community-driven improvements, such as enhanced debugging features or libraries optimized for specific use cases.

What are Foundation Models?

Foundation models like openPangu are large-scale AI models trained on vast amounts of data. By open-sourcing them, companies allow developers to build specialized applications on top of a pre-trained base, significantly reducing the computational cost and time required to develop new AI tools.
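To make the idea concrete, the sketch below shows how developers typically build on an open-weight foundation model using the Hugging Face Transformers library. This is illustrative only: the model identifier is a placeholder, and Huawei has not announced how or where the openPangu weights will be distributed or whether they will be compatible with this API.

```python
# Minimal sketch: reusing a pre-trained open-weight model for a downstream task
# instead of training from scratch. The model identifier is hypothetical.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "example-org/open-foundation-model"  # placeholder, not a real release

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

# Use the pre-trained base directly for generation; fine-tuning or adapters
# could be layered on top for specialized applications.
prompt = "Summarize the maintenance report:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```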

Additionally, Huawei will fully open-source its openPangu foundation models. This move places Huawei in direct competition with other open-source model providers like Meta (Llama) and Mistral AI. However, key details regarding the models' parameter counts, training data, and licensing terms have not yet been announced.

Focus on Integration and Compatibility

A significant part of Huawei's strategy is to ensure its AI ecosystem can be easily adopted by developers and organizations. This involves prioritizing compatibility with widely used tools and operating systems.

Operating System Flexibility

Huawei announced that its UB OS Component, which manages interconnectivity in large AI clusters, will be made open-source. This allows it to be integrated into existing operating systems like openEuler, Ubuntu, or Red Hat Enterprise Linux.

Organizations can choose to integrate the component's source code directly or use it as a plug-in. This modular design removes the need to migrate to a proprietary Huawei operating system, lowering a major barrier to adoption. However, organizations that integrate the source code will be responsible for its maintenance and updates.

Support for Major AI Frameworks

Recognizing the dominance of existing AI frameworks, Huawei is focusing on ensuring its hardware works seamlessly with popular tools. The company confirmed it is prioritizing support for open-source communities like PyTorch and vLLM.

PyTorch compatibility is crucial, as it is one of the most widely used frameworks in both research and production. If developers can run their existing PyTorch code on Ascend hardware with minimal changes, it dramatically simplifies the process of evaluating and adopting the platform.
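As a rough illustration of what "minimal changes" means in practice, the sketch below shows a standard PyTorch workload where only the device string selects the accelerator. The torch_npu import and the "npu" device name reflect Huawei's existing Ascend adapter for PyTorch as currently documented; the exact package name, device string, and availability check in the open-sourced stack may differ.

```python
# Illustrative only: existing PyTorch code targeting Ascend via a device swap.
import torch
import torch_npu  # Ascend Extension for PyTorch (assumed installed)

# Fall back to CPU when no Ascend NPU is present.
device = torch.device("npu:0" if torch.npu.is_available() else "cpu")

model = torch.nn.Linear(128, 10).to(device)
x = torch.randn(32, 128, device=device)

logits = model(x)    # runs on the Ascend NPU when available
print(logits.shape)  # torch.Size([32, 10])
```

The point is that application code written against the generic PyTorch device API carries over largely unchanged, which is what lowers the evaluation cost Huawei is targeting.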

Support for vLLM specifically targets the growing demand for efficient large language model (LLM) inference. This indicates Huawei is focused on addressing practical deployment challenges faced by businesses building LLM-powered applications.
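For context, the snippet below shows vLLM's standard offline inference API, which application code would use regardless of the hardware backend underneath. The model name is a placeholder, and Huawei's announcement does not detail how an Ascend backend would plug into vLLM; this is a sketch of the existing vLLM interface, not of Huawei's integration.

```python
# Minimal sketch of vLLM offline inference with a placeholder model id.
from vllm import LLM, SamplingParams

llm = LLM(model="example-org/open-foundation-model")  # hypothetical model id
params = SamplingParams(temperature=0.7, max_tokens=64)

prompts = ["Explain what an inference server does in one sentence."]
for output in llm.generate(prompts, params):
    print(output.outputs[0].text)
```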

Unanswered Questions and the Path Forward

While Huawei has provided a clear timeline and detailed the components to be released, several critical questions remain unanswered. The answers will likely determine the long-term success of this open-source initiative.

The choice of software licenses will be a key factor. Permissive licenses like Apache 2.0 would encourage broad commercial adoption, while copyleft licenses like the GPL would have different implications for businesses building proprietary products. Huawei has not yet specified which licenses will apply to the different components.

The governance structure for these new open-source projects is also unclear. Questions remain about whether an independent foundation will be established to oversee development, how community contributions will be managed, and how roadmap decisions will be made. A transparent and community-driven governance model is often essential for attracting external contributors.

The initial quality of the release on December 31, 2025, will be critical. A successful launch requires comprehensive documentation, functional examples, and stable tools. The period following the release, through mid-2026, will serve as an evaluation window for the global developer community to determine if Huawei's open-source platform is a viable ecosystem for long-term investment.