
Microsoft and Google clash over AI, and Nvidia's market value soars past $580 billion, roughly five Intels

2025-03-26 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)11/24 Report--

Microsoft, Google, Meta and other large technology companies are racing to seize the AI market, and the competition has sent Nvidia's market capitalization soaring to roughly five times Intel's.

ChatGPT can answer any question put to it, but the computational cost of every conversation is painful: analysts previously estimated that a single ChatGPT reply costs about 2 cents.

The computing power behind artificial-intelligence chatbots comes from GPUs, and that has earned chip companies like Nvidia a great deal of money.

On Feb. 23, Nvidia shares soared, adding more than $70 billion in a day and lifting its market capitalization above $580 billion, about five times Intel's.

Outside Nvidia, AMD is the second-largest vendor in the graphics processor industry, with a market share of about 20%, while Intel holds less than 1%.

ChatGPT is running, and Nvidia is making money. As ChatGPT unlocks potential applications, it could usher in another inflection point for artificial intelligence.

Why is that?

John Hennessy, chairman of Google's parent company Alphabet, told Reuters that a conversation with an artificial intelligence such as a large language model could cost more than 10 times as much as a traditional search.

Morgan Stanley analysts estimate that Google handled 3.3 trillion searches last year, at a cost of just 0.2 cents each.

If Google's chatbot Bard were built into the search engine and handled half of Google's queries with 50-word answers, it is estimated the company's costs could rise by $6 billion in 2024.
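The Morgan Stanley figures above can be sanity-checked with simple arithmetic. The sketch below back-solves the implied extra cost per AI-assisted query from the article's totals; it is illustrative arithmetic, not sourced data.

```python
# Inputs taken from the figures quoted in the article.
ANNUAL_SEARCHES = 3.3e12          # Google searches last year (Morgan Stanley)
BASELINE_COST_PER_QUERY = 0.002   # ~0.2 cents per traditional search, in dollars

def added_annual_cost(ai_share: float, extra_cost_per_query: float) -> float:
    """Extra yearly spend if `ai_share` of all searches each cost
    `extra_cost_per_query` dollars more than a traditional search."""
    return ANNUAL_SEARCHES * ai_share * extra_cost_per_query

# Back-solve: what extra per-query cost yields the quoted $6 billion
# increase if Bard handled half of all searches?
implied_extra = 6e9 / (ANNUAL_SEARCHES * 0.5)
ratio = implied_extra / BASELINE_COST_PER_QUERY

print(f"implied extra cost per AI query: ${implied_extra:.4f}")
print(f"that is ~{ratio:.1f}x the baseline query cost")
```

The implied surcharge of roughly a third of a cent per query, i.e. close to 2x the baseline cost, is broadly consistent with Hennessy's "more than 10 times" remark only if the AI path fully replaces rather than supplements the traditional one; the article's figures leave that ambiguous.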

Similarly, SemiAnalysis, a consultancy specialising in chip technology, estimates that adding chatbots to search could cost Google an extra $3 billion, even accounting for Google's in-house Tensor Processing Units (TPUs).

Hennessy believes Google must reduce the operating costs of this kind of artificial intelligence, but the process will not be easy and could, in the worst case, take years.

The underlying reason is that searching through an AI language model requires far more computing power than traditional search.

Analysts say the additional costs over the next few years could be billions of dollars.

Gartner predicts that the share of specialised chips such as GPUs used in data centres will rise to more than 15% by 2026, up from less than 3% in 2020.

While it is hard to pinpoint exactly how much of Nvidia's revenue now comes from artificial intelligence, that share has the potential to grow exponentially as large technology companies compete to develop similar AI applications.

On Wednesday, Nvidia also announced an artificial intelligence cloud service, partnering with Oracle, Microsoft and Google to give customers the ability to use Nvidia DGX systems for AI processing through simple browser access.

The new platform will be offered through these cloud service providers and will help technology companies that lack the ability to build their own infrastructure.

Jensen Huang said: "People's enthusiasm for ChatGPT has shown business leaders the power of artificial intelligence. But for now it is mainly general-purpose software. Realizing its real value requires tailoring it to a company's own needs, so that the company can improve its own services and products."

Nvidia holds 95% of the graphics processor market, according to New Street Research, making it the clear leader.

Nvidia shares have risen 42% this year, the best performance in the Philadelphia Stock Exchange Semiconductor Index.

Investors are flocking to Nvidia, betting that demand for artificial intelligence systems such as ChatGPT will push up orders for its products, making it once again the world's most valuable chipmaker.

For a long time, the computing power behind top models, whether ChatGPT, Bard or Stable Diffusion, has been provided by Nvidia's A100, a chip that costs about $10,000.

The Nvidia A100 can perform many simple calculations at the same time, which is very important for training and using neural network models.
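To see why "many simple calculations at the same time" matters, note that a neural-network layer boils down to a matrix multiply, where every output element is an independent dot product. The toy sketch below (plain Python, not Nvidia code) makes that independence explicit; a GPU's thousands of cores compute these cells in parallel rather than one at a time.

```python
def matmul(a, b):
    """Multiply matrix a (m x k) by matrix b (k x n).
    Every output cell is an independent multiply-accumulate chain,
    which is exactly the kind of work parallel hardware excels at."""
    m, k, n = len(a), len(b), len(b[0])
    return [[sum(a[i][p] * b[p][j] for p in range(k)) for j in range(n)]
            for i in range(m)]

x = [[1, 2], [3, 4]]   # a tiny "activation" matrix
w = [[5, 6], [7, 8]]   # a tiny "weight" matrix
print(matmul(x, w))    # [[19, 22], [43, 50]]

# The multiply-add count for one dense layer grows as m*k*n, so a chip
# that runs many multiply-adds concurrently pays off quickly at scale.
```

A CPU executes these cells largely sequentially; an A100 schedules thousands of them at once, which is why the same arithmetic finishes orders of magnitude faster during training and inference.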

The technology behind the A100 was originally used to render complex 3D graphics in games. Now, the goal is to handle machine learning tasks and run in the data center.

Investor Nathan Benaich said the A100 has become the "workhorse" of artificial-intelligence professionals; his report also lists some of the companies that use A100 supercomputers.

Machine learning tasks may take up the processing power of the entire computer, sometimes for hours or days.

This means that companies with a best-selling AI product often need to buy more GPUs to cope with peak traffic or to improve their models.

Beyond single A100 cards that can be slotted into existing servers, many data centers use a system containing eight A100 GPUs. The system, known as the Nvidia DGX A100, costs as much as $200,000.

Nvidia said on Wednesday that it would sell cloud access to DGX systems directly, which could reduce entry costs for researchers.

So what is the cost of running the new version of Bing?

An assessment by New Street Research found that the OpenAI-based ChatGPT model in Bing search may require eight GPUs to deliver an answer in under a second.

At that rate, Microsoft would need more than 20,000 eight-GPU servers to deploy the model to everyone, which could mean spending $4 billion on infrastructure.

And that is just Microsoft: matching Google's scale of 8 billion to 9 billion queries a day would cost $80 billion.
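The New Street Research numbers above hang together as back-of-the-envelope arithmetic. The sketch below reconstructs them from the article's own figures (the $200,000 DGX A100 price and the 20,000-server estimate); it is a plausibility check, not sourced data.

```python
# Figures quoted in the article.
SERVER_PRICE = 200_000    # one eight-GPU DGX A100 system, in dollars
BING_SERVERS = 20_000     # servers estimated for Bing's ChatGPT deployment

# Microsoft's estimated infrastructure bill for Bing.
bing_capex = BING_SERVERS * SERVER_PRICE
print(f"Bing estimate: ${bing_capex / 1e9:.0f} billion")  # matches the $4B figure

# Scaling the same hardware cost up to the quoted $80 billion figure for
# Google's ~8-9 billion queries a day implies this many times the hardware:
google_capex = 80e9
scale_factor = google_capex / bing_capex
print(f"Google-scale would need ~{scale_factor:.0f}x the servers")
```

The 20x scale factor implies roughly 400,000 eight-GPU servers at Google's query volume, which is why analysts frame AI search as a billion-dollar cost problem rather than a one-off purchase.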

Training is expensive too: the latest version of Stable Diffusion, for example, was trained for 200,000 compute hours on 256 A100 GPUs, equivalent to 32 DGX A100 systems.

Emad Mostaque, chief executive of Stability AI, said that training the model alone would cost $600,000 at market prices, which is very cheap compared with competitors. That figure does not include the cost of inference or deployment.
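If the 200,000 hours above are read as GPU-hours (an assumption; the article does not say), Mostaque's $600,000 figure implies a straightforward rental rate. The sketch below back-solves it:

```python
# Figures quoted in the article; treating the 200,000 hours as GPU-hours
# is an assumption, not something the article states explicitly.
GPU_HOURS = 200_000      # Stable Diffusion training compute
TOTAL_COST = 600_000     # Mostaque's market-price estimate, in dollars
NUM_GPUS = 256           # A100s used, per the article

rate = TOTAL_COST / GPU_HOURS          # implied price per A100-hour
wall_clock_days = GPU_HOURS / NUM_GPUS / 24  # implied calendar time

print(f"implied price: ${rate:.2f} per A100-hour")
print(f"implied wall-clock time: ~{wall_clock_days:.0f} days on 256 GPUs")

# Cross-check the hardware description: 256 A100s = 32 eight-GPU DGX systems.
assert NUM_GPUS // 8 == 32
```

The implied $3 per A100-hour and roughly month-long run are in line with cloud GPU pricing of the period, which supports reading the 200,000 hours as aggregate GPU-hours rather than wall-clock time.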

Jensen Huang said in an interview: "In terms of the amount of computation such models require, Stability AI's products are actually not expensive. We took a data center that would have cost $1 billion to run on CPUs and shrank it to a $100 million data center. And now it doesn't matter, because that $100 million data center can be shared by 100 companies in the cloud. Nvidia GPUs allow startups to train models at a much lower cost. You can now build a large language model like GPT for about $10 million to $20 million. It's really, really affordable."

According to the State of AI Report 2022, as of December 2022 more than 21,000 open-source AI papers used Nvidia chips.

Most researchers tracked in the report's State of AI Compute Index use Nvidia's V100 chip, launched in 2017, but use of the A100 grew rapidly in 2022, making it the third most frequently used chip.

Perhaps the fiercest competition for the A100 comes from its own successor: the H100, launched in 2022, has begun mass production. In fact, Nvidia said on Wednesday that the H100 brought in more revenue than the A100 in the quarter ended in January.

For now, Nvidia is riding the AI express and cashing in.

Reference:

https://www.reuters.com/technology/tech-giants-ai-like-bing-bard-poses-billion-dollar-search-problem-2023-02-22/

https://www.businessinsider.com/ai-expensive-google-chatbot-bard-may-cost-company-billions-dollars-2023-2

https://www.bloomberg.com/news/articles/2023-02-22/nvidia-gives-strong-forecast-as-ai-helps-fuel-computing-demand?srnd=technology-vp

https://www.reuters.com/technology/nvidia-results-show-its-growing-lead-ai-chip-race-2023-02-23/

https://www.cnbc.com/2023/02/23/nvidias-a100-is-the-10000-chip-powering-the-race-for-ai-.html

This article comes from the official account of Wechat: Xin Zhiyuan (ID:AI_era)
