AI token volumes set to explode; TSMC counts on process efficiency to meet the wave of computing demand

Technology | 8:33am, 22 October 2025

With the explosion of large language model applications, the number of tokens that global AI workloads consume is rising exponentially. TSMC Chairman C.C. Wei pointed out at the conference that token volume is currently "nearly doubling every three months," a surge that shows computing demand for generative AI is still far from peaking.

Faced with this dramatic expansion of computing demand, Wei emphasized that TSMC will be the core force supporting this wave of AI growth. Although token growth on the AI application side has far outpaced the roughly 45% compound annual growth of foundry revenue, TSMC's advances in process nodes and energy efficiency let customers process more tokens on the same chip area, achieving higher computing density and better energy efficiency.
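To see the size of that gap, here is a minimal back-of-the-envelope sketch, assuming the quoted rates compound uniformly and using wafer demand tracking revenue as a rough proxy (both assumptions are illustrative, not figures from TSMC): doubling every three months amounts to roughly a 16x increase in token volume per year, against about 1.45x for revenue.

```python
# Back-of-the-envelope comparison (illustrative assumptions, not TSMC data):
# token demand doubling every 3 months vs. ~45% compound annual revenue growth.

token_growth_per_year = 2 ** (12 / 3)   # doubling every 3 months -> ~16x per year
revenue_growth_per_year = 1.45          # ~45% compound annual growth

# How much more tokens-per-wafer throughput is needed each year for wafer
# demand to grow in line with revenue rather than with raw token demand.
required_efficiency_gain = token_growth_per_year / revenue_growth_per_year

print(f"Annual token growth factor:   {token_growth_per_year:.1f}x")
print(f"Annual revenue growth factor: {revenue_growth_per_year:.2f}x")
print(f"Tokens-per-wafer gain needed to bridge the gap: {required_efficiency_gain:.1f}x per year")
```

Under these assumptions, tokens processed per wafer would have to improve on the order of 11x per year for wafer demand to track revenue rather than raw token demand, which illustrates the gap the article attributes to node advances and energy efficiency improvements.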

Wei noted that as customers migrate to the next process node, the computing workload each chip can carry increases significantly. Even as token volumes keep doubling, overall wafer demand can therefore expand at a steady pace, with process efficiency improvements absorbing much of the increase.

He emphasized: "TSMC works closely with its customers to keep improving chip efficiency, ensuring that we can continue to meet the real needs of AI training and inference amid this wave of token growth."