2025-04-06 Update From: SLTechnology News&Howtos
Shulou(Shulou.com)11/24 Report--
CTOnews.com, April 18 news: AI server shipments are set to strongly boost HBM demand. According to TrendForce research, the 2022 market shares of the three major HBM suppliers were roughly 50% for SK Hynix, about 40% for Samsung, and 10% for Micron. In addition, higher-spec AI GPUs for deep learning are driving changes in HBM products. In the second half of 2023, as NVIDIA's H100 and AMD's MI300 ship, all three suppliers have planned mass production of the corresponding HBM3 parts. With more customers expected to adopt HBM3 this year, SK Hynix, currently the only supplier mass-producing new-generation HBM3, is expected to raise its overall HBM market share to 53%, while Samsung and Micron, which are expected to begin mass production between the end of this year and early next year, are projected at 38% and 9%, respectively. (CTOnews.com note: HBM stands for High Bandwidth Memory, a type of memory for CPUs/GPUs that stacks multiple DRAM dies and packages them together with the GPU to form a large-capacity, high-bit-width memory array.)
AI server shipments are estimated to grow 15.4% year over year in 2023.
At present, the DL/ML AI servers as defined by NVIDIA carry an average of four or eight high-end GPUs paired with two mainstream x86 server CPUs, and the main demand pull comes from the US cloud operators Google, AWS, Meta, and Microsoft. According to TrendForce, shipments of high-end servers equipped with GPGPUs grew about 9% year over year in 2022, with nearly 80% concentrated in the top eight cloud operators in China and the United States. Looking ahead to 2023, Microsoft, Meta, Baidu, and ByteDance have successively launched products and services built on generative AI and actively increased their orders. AI server shipments this year are estimated to grow 15.4% year over year, and the compound annual growth rate of AI server shipments from 2023 to 2027 is projected at about 12.2%.
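The growth figures above can be sanity-checked with a short compounding sketch (the 15.4% and 12.2% rates are the article's estimates; the cumulative multiplier is derived, not stated in the source):

```python
# Compound the reported AI server shipment growth rates.
yoy_2023 = 0.154          # estimated 2023 year-over-year growth
cagr_2023_2027 = 0.122    # estimated 2023-2027 compound annual growth rate

growth_2023 = 1 + yoy_2023
# 2023 -> 2027 spans four compounding steps at the CAGR.
cumulative_2023_2027 = (1 + cagr_2023_2027) ** 4

print(f"2023 shipments vs 2022: {growth_2023:.3f}x")
print(f"2027 shipments vs 2023: {cumulative_2023_2027:.3f}x")
```

At a 12.2% CAGR, shipments in 2027 would be roughly 1.58 times the 2023 level.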
AI servers drive simultaneous demand growth for Server DRAM, SSDs, and HBM
According to the survey, AI servers are expected to drive memory demand growth. At this stage, a general server is configured with roughly 500~600GB of Server DRAM, while an AI server uses larger 64~128GB modules, for an average capacity of about 1.2~1.7TB. As for enterprise SSDs, because AI servers prioritize speed, the requirements fall first on DRAM and HBM, so expanding SSD capacity is not essential; on the transmission interface, however, PCIe 5.0 is preferred to meet high-speed computing needs. Compared with general servers, AI servers make heavier use of GPGPUs: based on configurations of four or eight NVIDIA A100 80GB cards, HBM usage is about 320~640GB per server. As AI models grow more complex, they will drive higher memory usage and simultaneously boost demand for Server DRAM, SSDs, and HBM.
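The HBM range quoted above follows directly from the per-GPU memory size; a minimal sketch of that arithmetic, using the article's A100 80GB configurations:

```python
# HBM capacity per AI server, per the article's A100-based configurations.
hbm_per_gpu_gb = 80            # NVIDIA A100 80GB variant
for gpu_count in (4, 8):       # typical 4- or 8-GPU AI server configurations
    total_gb = hbm_per_gpu_gb * gpu_count
    print(f"{gpu_count} x A100 80GB -> {total_gb} GB HBM per server")
```

Four GPUs give 320GB and eight give 640GB, matching the 320~640GB range cited.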
© 2024 shulou.com SLNews company. All rights reserved.