Intel may have pulled a fast one on its industry rivals by buying up a majority of the 3nm node capacity at TSMC in order to fabricate its new GPU and a number of server chips, something that could inhibit AMD and Apple from ramping up production of their own next-gen chips in 2022.
Production with the 3nm node is expected to start in Q2 2022, with mass production following in mid 2022, according to Wccftech. Production capacity would reach about 4,000 wafers a month in May of next year, with mass production ramping up to 10,000 wafers a month.
Intel, unlike AMD or Apple, has its own fabrication plants that it uses for most of its chip production, though it has struggled in recent years to hit its own development roadmap targets. None of the 3nm node process orders will be for its consumer market processors, namely Raptor Lake, which is expected to launch in mid-to-late 2022.
Instead, reports indicate that the product lines in the order will be coming out of its graphics and server units, specifically a new GPU and three new server processors, most likely next-gen Xeon processors meant for data centers.
Analysis: Is the Intel Iris Xe graphics card finally making it to production?
We don't know much about any of these chips yet, though the GPU could be the long-awaited Intel discrete graphics card based on the company's Iris graphics processor.
The Intel Iris Xe graphics card has been in the works for a long time now, but we've yet to see it beyond some prototypes and presentation material. If Intel is making a substantial investment in its graphics unit, however – and eating up a substantial chunk of TSMC's 3nm node capacity in the process – then we certainly hope this is an indication that Intel's discrete graphics card is on its way to customers.
While we're not expecting it to immediately dethrone AMD and Nvidia in the graphics card space, turning things into a three-way fight rather than a head-to-head matchup will push the three companies to innovate even more. This can only be good for gamers and other PC enthusiasts in the end, assuming we're ever able to get our hands on any of these graphics cards in the first place.