Karlston Posted October 9, 2020

AMD could pay billions for this little-known company to close the gap with Nvidia and Intel

Xilinx could be at the centre of an EPYC battle

(Image credit: Shutterstock)

AMD is getting ready for its biggest acquisition to date, worth in the region of $30 billion. According to the Wall Street Journal, the semiconductor company is in advanced talks over the purchase of little-known chip designer Xilinx.

Why would AMD buy Xilinx? It's all about the data center market. Xilinx specializes in Field Programmable Gate Arrays (FPGAs): semiconductors that can be reprogrammed even after they have been deployed, which often makes them slower than GPUs or CPUs but also far more versatile.

That versatility means Xilinx technology can be used almost anywhere in a data center, which is valuable when workloads vary widely: a cloud storage provider has a very different workload from a VPN service, a website builder or a video conferencing vendor. As such, Xilinx complements rather than competes with AMD, whose chiplet design strategy seems a natural fit to expand Xilinx's footprint in the data center ecosystem.

Nvidia's acquisitions of Mellanox and Arm may have pushed AMD and Xilinx to move sooner rather than later, to avoid a future in which the data center is dominated by Intel (which acquired Xilinx rival Altera in 2015) and Nvidia.

Xilinx shares were up marginally in pre-market trading, but remain well off their all-time high from April 2019. The Wall Street Journal expects AMD to finance the acquisition through a cash-and-stock transaction, as its share price has more than quadrupled over the past two years. The company is now worth more than $100 billion, up almost 50x in only a handful of years.

The deal, if it happens, would come almost 14 years after AMD's acquisition of Canadian graphics powerhouse, and former Nvidia nemesis, ATI Technologies.