CTOnews.com, Sept. 12 -- Microsoft Research yesterday released a new pre-trained language model called phi-1.5, with a total of 1.3 billion parameters, suited to question answering, chat, code, and other scenarios.
phi-1.5 was trained on a variety of data sources, including Python Q&A content from the StackOverflow platform, competition code from code_contests, synthetic Python textbooks and exercises generated by gpt-3.5-turbo-0301, as well as new data sources composed of various synthetic NLP texts.
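For readers who want to try the model in the QA/chat/code scenarios mentioned above, here is a minimal text-completion sketch using the Hugging Face transformers library. It assumes the checkpoint is published on the Hub under the identifier microsoft/phi-1_5; that identifier, the prompt, and the generation settings are illustrative assumptions, not details stated in this report.

    # Minimal sketch: load phi-1.5 and run plain text completion.
    # Assumption: the weights are available on the Hugging Face Hub as "microsoft/phi-1_5".
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "microsoft/phi-1_5"  # assumed Hub identifier, not given in the article
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float32)

    # phi-1.5 is a base (non-instruction-tuned) model, so a simple QA-style
    # completion prompt is used rather than a chat template.
    prompt = "Question: Why is the sky blue?\nAnswer:"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Older versions of transformers may additionally require trust_remote_code=True when loading the model; newer releases support the Phi architecture natively.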
Microsoft says that on benchmarks testing common sense, language understanding, and logical reasoning, phi-1.5 outperforms most models with fewer than 10 billion parameters. phi-1.5 surpasses Meta's 7-billion-parameter llama-2 on the AGIEval score, and is comparable to the 7-billion-parameter llama-2 on the GPT4All benchmark suite run with LM-Eval Harness.
CTOnews.com attached a link in the original report; interested users can click through to read it.