CTOnews.com, July 25 — AI startup MosaicML has released MPT-7B-8K, a 7-billion-parameter model. The model can reportedly handle 8,000 tokens of text at a time, performs well on long-document summarization and question answering, and can be further fine-tuned on the MosaicML platform for specific tasks.
▲ Source: Hugging Face
The series was reportedly trained on 1.5 trillion tokens, with training taking 3 days on 256 NVIDIA H100 GPUs. MosaicML has released three versions of the model: MPT-7B-8k, MPT-7B-8k-Instruct, and MPT-7B-8k-Chat.
▲ Source: Hugging Face
The first version, MPT-7B-8k, is based on a Transformer decoder and uses FlashAttention and FasterTransformer to speed up training and inference; it can process 8,000 tokens of text at a time. MosaicML says the model is open source and licensed for commercial use.
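For readers who want to try the base model, here is a minimal sketch of loading it with the Hugging Face transformers library. The repository id mosaicml/mpt-7b-8k and the max_seq_len override are assumptions based on how MosaicML's MPT checkpoints are usually published, not details confirmed in this report.

```python
# Minimal sketch: loading MPT-7B-8k from Hugging Face (assumed repo id: mosaicml/mpt-7b-8k).
# The custom MPT architecture ships with the checkpoint, so trust_remote_code=True is required.
import torch
import transformers

model_id = "mosaicml/mpt-7b-8k"  # assumed Hugging Face repository name

config = transformers.AutoConfig.from_pretrained(model_id, trust_remote_code=True)
config.max_seq_len = 8192  # the 8k-token context window this release advertises

model = transformers.AutoModelForCausalLM.from_pretrained(
    model_id,
    config=config,
    torch_dtype=torch.bfloat16,  # half precision to fit the 7B weights in GPU memory
    trust_remote_code=True,
)
tokenizer = transformers.AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

prompt = "Summarize the following document:\n..."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```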
The second version, MPT-7B-8k-Instruct, is fine-tuned from MPT-7B-8k. MosaicML says the model can follow long instructions, with a particular focus on generating summaries and answering questions; it is likewise open source and licensed for commercial use.
The third version, MPT-7B-8k-Chat, is a conversational AI model. MosaicML says it was trained on top of MPT-7B-8k with an additional 1.5 billion tokens of chat data; it is open source but not licensed for commercial use.
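MosaicML's chat checkpoints are typically prompted with a ChatML-style template. The exact template for MPT-7B-8k-Chat is not given in this report, so the format below is an assumption for illustration only.

```python
# Hypothetical ChatML-style prompt builder for MPT-7B-8k-Chat (format assumed, not confirmed here).
def build_chat_prompt(system: str, user: str) -> str:
    return (
        "<|im_start|>system\n" + system + "<|im_end|>\n"
        "<|im_start|>user\n" + user + "<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

prompt = build_chat_prompt(
    "You are a helpful assistant.",
    "Summarize the main points of this 8,000-token report.",
)
```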
CTOnews.com previously reported that MosaicML also launched the open-source MPT-30B model, which the company claims cost only a fraction as much to train as comparable models. MosaicML has released all of its open-source models on the Hugging Face platform; interested readers can download them, run them on their own hardware, and fine-tune them on their own data.
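As a rough illustration of fine-tuning a downloaded checkpoint on local data, here is a sketch using the Hugging Face Trainer. The dataset file my_corpus.txt, the output directory, and the training hyperparameters are all hypothetical placeholders, not values from MosaicML.

```python
# Minimal fine-tuning sketch with Hugging Face Trainer (hypothetical dataset and settings).
import transformers
from datasets import load_dataset

model_id = "mosaicml/mpt-7b-8k"  # assumed repository name
tokenizer = transformers.AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
tokenizer.pad_token = tokenizer.eos_token  # the tokenizer may not define a pad token
model = transformers.AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Hypothetical local text file with one training example per line.
dataset = load_dataset("text", data_files={"train": "my_corpus.txt"})["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=2048)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# Causal language modeling: labels are the input ids shifted inside the collator.
collator = transformers.DataCollatorForLanguageModeling(tokenizer, mlm=False)

trainer = transformers.Trainer(
    model=model,
    args=transformers.TrainingArguments(
        output_dir="mpt-7b-8k-finetuned",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        bf16=True,
    ),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
```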