The price of the most important "raw material" underpinning the current artificial intelligence boom is falling rapidly, according to a report published on March 10. That will help the technology enter the mainstream more quickly, but it also threatens the finances of startups hoping to profit from the boom and could concentrate industry dominance in the hands of a small number of companies.
The "raw material" in question is the processing power consumed by large language models (LLMs), the models behind services such as ChatGPT and Microsoft's new Bing search.
Running these models was initially so computationally expensive that the cost threatened to seriously hinder their wide adoption. Richard Socher, chief executive of search engine You.com and a prominent computer scientist, says that just a few weeks ago the artificial-intelligence search You.com offered cost about 50 per cent more than a traditional internet search. By the end of last month, that gap had shrunk to only about 5 per cent, thanks to fierce competition among the large language model companies OpenAI, Anthropic and Cohere.
A few days later, OpenAI released a new service that lets developers call ChatGPT directly, cutting the price of using the technology by 90 per cent.
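For developers, that service is exposed as an ordinary web API. The snippet below is a minimal sketch of such a call, assuming the openai Python package as it existed around the time of the report and the gpt-3.5-turbo chat model; the prompt, token limit and environment-variable name are placeholders for illustration, not details taken from the article.

    # Minimal sketch: calling the ChatGPT developer API with the openai
    # Python package (API key assumed to be in the OPENAI_API_KEY variable).
    import os
    import openai

    openai.api_key = os.environ["OPENAI_API_KEY"]

    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",  # the ChatGPT model exposed to developers
        messages=[{"role": "user",
                   "content": "Explain in two sentences why LLM inference costs are falling."}],
        max_tokens=150,  # illustrative limit on the length of the reply
    )

    print(response["choices"][0]["message"]["content"])

Because the API is billed per token processed, a 90 per cent cut in the per-token price flows directly through to the cost of every call like this one.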
That is good for customers, but it could be devastating for OpenAI's competitors. Many companies, including Anthropic and Inflection, have completed or are in the middle of funding rounds to support the development of their own large language models.
Few technologies have moved from laboratory research to large-scale commercial use as quickly as large language models, which has pushed researchers to "industrialize" what was once a lab-bound development process. Most of the gains in performance and reductions in cost come from optimizing the underlying computing platforms that run the models and from improving how the models are trained and operated.
In one sense, cheaper hardware benefits every market participant. That includes high-performance chips designed for the latest artificial intelligence models, such as Nvidia's H100 GPU. Microsoft runs OpenAI's models on its Azure cloud computing platform and makes the same cost-effective hardware available to other large language model companies.
Building large language models, however, is as much an art as a science. OpenAI says that since December last year it has made "a series of system-wide optimizations" to how ChatGPT handles queries, cutting its own costs by 90 per cent and ultimately enabling the sharp price cut for users.
Training a large language model costs tens of millions of dollars, and the techniques involved are changing rapidly. At least in the short term, the small number of people with experience developing and training such models will hold an outsized advantage.
Even once the best techniques are widely understood and adopted, early participants may retain a first-mover advantage. Scott Guthrie, head of cloud computing and artificial intelligence at Microsoft, points to new services such as GitHub Copilot, launched last summer to suggest code to software developers. Services like this improve quickly once they are widely used. He told a Morgan Stanley investor conference this week that the "signals" coming from users of such services will soon become an important point of product differentiation.
OpenAI's competitors are pinning their hopes on offering additional services that make large language models easier for developers and large enterprise customers to use, and on building models tailored to the needs of specific market segments.
For example, Israeli startup AI21 Labs this week released its latest large language model along with a set of APIs (application programming interfaces) offering higher-level services such as text summarization and rewriting.
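The appeal of such task-specific APIs is that a caller sends raw text and gets back a finished result, a summary or a rewrite, instead of engineering prompts against a general-purpose model. The sketch below illustrates that pattern with Python's requests library; the endpoint URL, header and field names are assumptions made for this example, not AI21's documented interface.

    # Illustrative sketch of a task-specific summarization API call.
    # The endpoint and payload fields are assumed for illustration only;
    # the provider's documentation defines the real interface.
    import os
    import requests

    API_KEY = os.environ["AI21_API_KEY"]  # hypothetical environment variable
    ENDPOINT = "https://api.ai21.com/studio/v1/summarize"  # assumed endpoint

    payload = {
        "source": "Long report text to be condensed into a short summary...",
        "sourceType": "TEXT",  # assumed field: tells the service the input is raw text
    }

    resp = requests.post(
        ENDPOINT,
        json=payload,
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json())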
Ori Goshen, co-chief executive of AI21, says most companies cannot simply use a generic model like ChatGPT; they need models trained for industries such as finance or healthcare, or on a company's own data.
He believes large language models are still in their infancy and that much work remains, such as reducing models' tendency to tell lies and preventing them from "hallucinating", that is, producing plausible-sounding answers that have nothing to do with the facts. Artificial intelligence companies will need to keep pushing the cutting edge if they are to succeed.
For now, though, the underlying cost of generative artificial intelligence is falling sharply. OpenAI's price cut is a sign that the technology will reach large-scale commercial use very quickly. It is also a warning that, in the end, the industry may be left with only a handful of players.