
How to use Flow Forecast for transfer learning in time series prediction and classification


This article introduces the relevant knowledge of "how to use Flow Forecast for transfer learning in time series prediction and classification". Many people run into difficulties with such cases in practice, so let the editor walk you through how to handle these situations. I hope you read it carefully and come away with something useful!

ImageNet was first published in 2009, and over the next four years it became the basis of most computer vision models. To this day, whether you are training a model to detect pneumonia or to classify car models, you will likely start from a model pre-trained on ImageNet or another large, general image dataset.

Some recent papers, such as ELMo and BERT (2018), use transfer learning to effectively improve performance on several NLP tasks. These models create useful, context-sensitive word representations, which can then be reused for a variety of tasks such as question answering, named entity recognition, and so on.

In addition, at the macro level, transfer learning paves the way for progress in any field with limited data. By helping research groups and companies with limited data make effective use of deep learning, it helps democratize the technology. It is therefore very important to be able to use transfer learning in the time series domain, where many phenomena have only a finite history.

Transfer learning for time series

At present, there is no standard pre-trained model for time series transfer learning and no established starting point, and research on the subject is relatively scarce. One paper, by Fawaz et al. (https://arxiv.org/pdf/1811.01533.pdf), discusses transfer learning for time series classification. Their conclusion is:

These experiments show that transfer learning can either improve or degrade the model's predictions, depending on the dataset used for transfer.

From this we know that, for time series, the similarity between the source dataset and the target dataset matters in many ways more than it does in CV or NLP. The authors therefore develop a technique for forming a time series representation in order to find the most similar time series to transfer from. Although this is an interesting preliminary exploration, it leaves many questions unanswered. What about multivariate time series? (The authors focus only on the univariate case.) Would a different architecture help facilitate transfer between different time series? Similarly, a few other articles have explored limited cases where transfer can be effective for time series, but there is still no general framework for transfer learning, especially in the multivariate setting.

How transfer learning works in other areas

Before delving into the challenges of transfer learning for time series forecasting, let's look at how it works in other areas. In computer vision, transfer learning relies on the hierarchical way models learn: the "early" layers learn general patterns (shapes, contours, edges), while later layers learn more task-specific features (the shape of a cat's whiskers or a car's headlights). After pre-training on ImageNet, this property has even been used successfully to help with medical diagnosis and staging.
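As a concrete illustration (not from the original article), this standard recipe looks roughly like the following in PyTorch; the choice of ResNet-18 and a two-class head is arbitrary:

# Minimal sketch of CV transfer learning: keep the pre-trained early layers,
# retrain only a fresh task-specific head. (Illustrative; ResNet-18 and the
# two-class output are arbitrary choices.)
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)

# Freeze all pre-trained layers so the general features (edges, shapes) are kept.
for param in model.parameters():
    param.requires_grad = False

# Replace the final classifier with a new head for the target task.
model.fc = nn.Linear(model.fc.in_features, 2)

# Only the new head's parameters are updated during fine-tuning.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)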

This is also common in NLP, although it requires a different architecture. Specifically, models such as BERT and ELMo paved the way for transfer learning in sequence-to-sequence domains. The Transformer architecture in particular has worked well for transfer learning, and the same applies to other sequence problems, such as time series.

Specific challenges of time series prediction

Time series prediction has several specific core challenges. One of the biggest is that, for time series, it is hard to find a useful hierarchy or set of intermediate representations that generalizes across problems. We do have some components: time series are traditionally decomposed into seasonality, trend, and residuals. However, a model that effectively learns decoupled intermediate representations remains elusive. The authors of "Reconstruction and Regression Loss for Time-Series Transfer Learning" explore a special loss function to help facilitate positive transfer through this decoupling process. They recommend using an initial model (with a reconstruction loss) to extract general features before using a task-specific time series model for prediction. Although the article is limited to univariate forecasting use cases, the technique does seem to help improve performance.
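For reference, the classical seasonality/trend/residual decomposition mentioned above can be computed with statsmodels; the monthly series below is synthetic and purely illustrative:

# Classical additive decomposition of a synthetic monthly series into
# trend, seasonal, and residual components using statsmodels.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

idx = pd.date_range("2015-01-01", periods=120, freq="MS")
values = (np.linspace(10, 30, 120)                       # upward trend
          + 5 * np.sin(2 * np.pi * np.arange(120) / 12)  # yearly seasonality
          + np.random.normal(0, 1, 120))                 # noise
series = pd.Series(values, index=idx)

result = seasonal_decompose(series, model="additive", period=12)
print(result.trend.dropna().head())
print(result.seasonal.head())
print(result.resid.dropna().head())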

The second challenge is multivariate time series forecasting. Many problems involve different feature time series. For example, for COVID-19 we may have 7 features: mobility data (3 features), new infections (1 feature), and weather (3 features). For flu forecasting, however, we may only have new infections and weather, for a total of 4 features (for example, no mobility data is collected for influenza). In our experiments we usually find it helpful to use a dataset-specific initial "embedding_layer" followed by a transferable middle tier, as sketched below.
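Here is a minimal sketch of that idea in plain PyTorch (illustrative only, not Flow Forecast's actual code): each dataset gets its own small embedding layer that maps its feature count into a shared hidden size, while the middle layers are shared and transferable.

# Illustrative model: a dataset-specific embedding layer feeding shared,
# transferable middle layers. (Not Flow Forecast's actual implementation.)
import torch
import torch.nn as nn

class TransferableForecaster(nn.Module):
    def __init__(self, n_features, hidden_size=64, forecast_len=1):
        super().__init__()
        # Dataset-specific: maps this dataset's feature count (e.g. 7 for COVID-19,
        # 4 for flu) into a common hidden size; re-initialized for each new dataset.
        self.embedding_layer = nn.Linear(n_features, hidden_size)
        # Shared, transferable core: operates only on the common hidden size.
        self.core = nn.LSTM(hidden_size, hidden_size, num_layers=2, batch_first=True)
        # Output head producing the forecast.
        self.dense_shape = nn.Linear(hidden_size, forecast_len)

    def forward(self, x):  # x: (batch, time, n_features)
        h = self.embedding_layer(x)
        out, _ = self.core(h)
        return self.dense_shape(out[:, -1])

covid_model = TransferableForecaster(n_features=7)  # source task: 7 features
flu_model = TransferableForecaster(n_features=4)    # target task: 4 features
# Transfer only the shared core; the embedding and output layers stay fresh.
flu_model.core.load_state_dict(covid_model.core.state_dict())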

How to use Flow Forecast for transfer learning

Flow Forecast is an open-source deep learning framework for time series forecasting (https://github.com/AIStream-Peelout/flow-forecast).

To facilitate transfer learning for time series forecasting, Flow Forecast has several features that make it easy to pre-train and reuse pre-trained time series models. In the model parameters section, you can use a parameter named excluded_layers. When you load a pre-trained model, the weights of those layers are not loaded; if they exist in the new model, they are instantiated fresh instead:

"excluded_layers": ["embedding_layer.weight", "embedding_lay.bias", "dense_shape.weight", "dense_shape.bias"]

This makes it easy to leverage a pre-trained model's weights even when some layers do not exist in the new model or do not match in shape. Take a look at this notebook example (https://colab.research.google.com/drive/169NO9B_il-E34Kdos1hxunuDFuD6rjuW#scrollTo=dx-tmLRain2Z).

Second, Flow Forecast makes it easy to track the datasets a model was pre-trained on. This means you can easily trace the complete history of the other time series data your model has been trained on, which can help you find the best pre-training dataset.

Finally, Flow Forecast is working on additional features, such as making it easy to use different learning rates and selectively freeze different layers, and designing an auto-encoder module to find the most similar time series dataset. We believe simple transfer learning is one of the highest-priority features of our framework.
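For reference, selective freezing and per-layer learning rates can already be expressed in plain PyTorch; the sketch below reuses the illustrative TransferableForecaster from earlier and is not a Flow Forecast API:

# Illustrative: freeze the transferred core first, then unfreeze it with a
# smaller learning rate. (Plain PyTorch; not a Flow Forecast API.)
import torch

model = TransferableForecaster(n_features=4)  # the illustrative class defined above

# Stage 1: freeze the transferred core; train only the fresh layers at a higher rate.
for param in model.core.parameters():
    param.requires_grad = False

optimizer = torch.optim.Adam([
    {"params": model.embedding_layer.parameters(), "lr": 1e-3},
    {"params": model.dense_shape.parameters(), "lr": 1e-3},
])

# Stage 2 (later): unfreeze the core and fine-tune it with a smaller learning rate.
for param in model.core.parameters():
    param.requires_grad = True
optimizer.add_param_group({"params": model.core.parameters(), "lr": 1e-4})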

What have we found in our research?

So far, we have found that generalized transfer learning is useful for small datasets such as COVID-19 forecasting. We have not yet tested it extensively enough on larger datasets to draw conclusions there. We also believe transfer learning is very effective when incorporating metadata into forecasts: the model needs to see many different kinds of metadata and temporal data to learn how to merge them effectively. We can also design a transfer learning protocol: first run a sweep to find the best static hyperparameters (such as forecast length and number of layers), use those to pre-train the model, and then run a final sweep over the non-static parameters (such as batch size and learning rate).

Summary

Transfer learning for time series has made some progress, but it is not yet widely used. This may be due to problems with differing feature counts, the usefulness of intermediate representations, and seasonal differences (which can lead to more negative transfer). However, the emergence of frameworks like Flow Forecast provides easier-to-use modules that make successful transfer learning in the time domain more achievable. We believe transfer learning will play a greater role in time series going forward.

So much for the introduction to "how to use Flow Forecast for transfer learning in time series prediction and classification". Thank you for reading. If you want to learn more about the industry, you can follow the website; the editor will keep publishing more high-quality practical articles for you!
