
On December 8 (local time), U.S. President Donald Trump announced that Nvidia will be allowed to ship its H200 AI accelerators to approved customers in China and other regions, on one notable condition: the U.S. government will take a 25% cut of Nvidia's H200 sales revenue from those markets.
This is not the first time Nvidia has made such a deal. Earlier this August, the company agreed to hand over 15% of its China revenue from the H20 chip to the U.S. government in exchange for approval to resume exports of that model. However, almost immediately after shipments restarted, Chinese state media raised concerns about the H20's security risks and dated performance. As a result, demand weakened sharply. Nvidia CEO Jensen Huang later admitted that the company's AI chip market share in China had fallen to zero, and he expects it to remain there for at least the next two quarters.
To reverse the situation, Huang has been pushing hard to bring more advanced products into China, including the H200 and even the newer Blackwell GPUs. He has openly said he hopes both governments can find common ground and allow Nvidia back into one of the world's biggest AI markets.
Huang estimates the China AI chip market is worth around $50 billion today and could grow to $200 billion by the end of the decade. "Not being part of that is painful," he said. "That revenue matters. It drives faster innovation and larger investment. Right now, we have to assume our revenue there is zero."
Despite this, the Trump administration remains firmly opposed to exporting Nvidia's Blackwell GPUs to China. Instead, officials are considering approval for the H200, which uses the same Hopper architecture as the H20 but delivers much stronger performance.
Technically, both the H20 and the H200 are based on Nvidia's Hopper architecture, but the H20 is a significantly cut-down version of the H100. The H200, by contrast, is far more powerful: it pairs 141GB of HBM3e memory with up to 4.8 TB/s of memory bandwidth, a major jump over the H100. Industry estimates suggest the H200 delivers roughly double the performance of the H20.
However, even if exports are approved, the H200 will not arrive unchanged. Under existing U.S. export controls that limit compute performance and bandwidth, Nvidia is expected to ship a restricted version to China. That likely means reduced core counts and lower memory bandwidth. Even so, the downgraded H200 is still expected to significantly outperform the H20, which was limited to 96GB of HBM3.
For Nvidia, the H200 may be its last realistic pathway back into China's AI hardware market — at least under the current political climate. Whether the deal moves forward will shape not just Nvidia's future in China, but also the global balance of AI computing power.