Tencent boosts AI clusters to 100,000-GPU scale with network optimization — Xingmai 2.0 increases communication efficiency by 60% and LLM training efficiency by 20%

(Image credit: Tencent)

Tencent Holdings has significantly upgraded its high-performance computing network with Xingmai 2.0, reports the South China Morning Post. The upgrade boosts the company's AI capabilities and improves large language model (LLM) training efficiency. The development aligns with China's efforts to advance its AI prowess despite U.S. restrictions on shipments of advanced processors, such as Nvidia's H100, to the country.

The Xingmai 2.0 network supports over 100,000 GPUs in a single computing cluster, doubling the capacity of the initial network launched in 2023. According to the report, the upgraded Xingmai 2.0 network increases network communication efficiency by 60% and LLM training efficiency by 20%. Tencent achieved these gains by optimizing existing infrastructure rather than investing in new processors, which are difficult (practically impossible for a Chinese entity) to obtain due to U.S. export rules.

Tencent's efforts are part of a strategic move to strengthen its position in China's rapidly evolving AI sector. The Shenzhen-based high-tech giant has been promoting the use of its proprietary LLMs in enterprise applications and offering services to assist other businesses in developing their own AI models.

China's AI industry is currently in a price war, with major companies like Alibaba, Baidu, and ByteDance slashing prices to promote their AI technologies. In May, Tencent made the lite version of its Hunyuan LLM available for free and reduced prices for its standard versions. This competitive pricing strategy aims to increase commercial adoption of its AI technologies.

Tencent's approach reflects a broader push by Chinese companies to enhance their technological capabilities using available resources. For example, Baidu has reported significant efficiency improvements in its Ernie LLM, with a fivefold increase in training efficiency and a 99% reduction in inference costs. These gains highlight ongoing efforts by Chinese tech companies to make AI training more efficient and cost-effective, advancements that are crucial amid the price war as they make AI technologies more accessible and affordable.

By improving the efficiency of AI training and reducing costs, Chinese tech companies are positioning themselves to compete more effectively globally while advancing their technological self-reliance. Along with other Chinese tech giants, Tencent strives to compete with Western counterparts by leveraging efficiency improvements rather than relying on advanced processors. 

Anton Shilov
Contributing Writer

Anton Shilov is a contributing writer at Tom’s Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.

  • bit_user
    It's kind of disappointing to get zero technical details. However, the tidbits about the AI market in China were an unexpected surprise that displaced at least some of my disappointment.