
South Korea's National AI Computing Center, the country's largest artificial intelligence infrastructure project, is set to incorporate domestically produced neural processing units (NPUs). The move is interpreted as a strategy to reduce dependence on Nvidia, which dominates the market with its graphics processing units (GPUs), and to diversify AI infrastructure. Industry observers expect that the participation of capable domestic NPU companies in this major national project will contribute to the development of Korea's AI semiconductor ecosystem.
An official from the Ministry of Science and ICT told The Seoul Economic Daily on the 9th, "The Samsung SDS consortium is reviewing plans to install domestic NPUs at the National AI Computing Center." The Samsung SDS consortium is currently discussing supply arrangements with domestic NPU companies. Specific installation plans are expected to take shape once the private operator selection process is completed.

The National AI Computing Center, to be built in Haenam, South Jeolla Province, is a major project backed by more than 2 trillion won in government and private investments as well as policy loans. Construction is targeted to begin in July. The center will provide AI semiconductor infrastructure in cloud form to domestic companies, startups, and research institutions. The public and private sectors plan to secure 15,000 advanced GPUs by 2028, and the consortium intends to gradually expand GPU capacity through 2030.
The Samsung SDS consortium aims to reduce dependence on Nvidia, the U.S. company that effectively monopolizes GPU manufacturing, by introducing domestic NPUs. This aligns with the government project's objective of enhancing Korea's AI competitiveness. The government's AI ecosystem development strategy encompasses not only AI model and service companies but also semiconductor design firms.
Economic viability is also a key factor. While Nvidia GPUs are recognized for their superior performance, they come with high prices and substantial power consumption. NPUs, by contrast, can deliver high performance at relatively low power in AI inference tasks, resulting in better performance-per-watt efficiency. Some NPUs designed by domestic companies such as Rebellions and FuriosaAI are evaluated to be more than three times as power-efficient as Nvidia GPUs.
However, there are limitations as NPU optimization levels have not been sufficiently verified in large-scale data center environments. The Samsung SDS consortium is proceeding with adoption under the assumption that NPU operating environments will be stabilized by the official opening of the National AI Computing Center in 2029. With companies experienced in data center operations, such as Naver Cloud, participating in the consortium, designs for parallel operation of GPUs and NPUs are also under review. Discussions on utilizing tensor processing units (TPUs), a type of NPU, are also reportedly underway.
An official familiar with the project said, "Early in the project, the Ministry of Science and ICT considered mandating a certain ratio of domestic NPU usage at the National AI Computing Center, but that provision was excluded over concerns it could cause the bidding to fail." The official added, "Recently, however, there is growing momentum to actively pursue NPU installation as a way to revitalize the domestic AI semiconductor industry."
