
How to use Jupyter Notebook to do preliminary analysis

2025-01-17 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/02 Report--

This article introduces how to use Jupyter Notebook to do a preliminary analysis. The walkthrough is fairly detailed and should be a useful reference; interested readers are encouraged to follow along!

Recently I have been using Jupyter Notebook for the first draft of strategy design: importing market data and plotting it.

Once the data is in a DataFrame, the analysis itself is easy; what follows is a simplified version.

First, create a DataAnalyzer class. This keeps things simple: it supports importing market data from csv or MongoDB, and aggregates 1-minute K-lines into K-lines of other minute intervals.

1. Import 1-minute rebar (deformed steel bar) data and aggregate it into 5-minute K-lines

from pymongo import MongoClient, ASCENDING
import pandas as pd
import numpy as np
from datetime import datetime
import talib
import matplotlib.pyplot as plt
import scipy.stats as st

%matplotlib inline
%config InlineBackend.figure_format = 'retina'

class DataAnalyzer(object):
    """Import market data from csv or MongoDB and aggregate minute bars."""

    def __init__(self, exportpath="C:\\Project\\",
                 datformat=['datetime', 'high', 'low', 'open', 'close', 'volume']):
        self.mongohost = None
        self.mongoport = None
        self.db = None
        self.collection = None
        self.df = pd.DataFrame()
        self.exportpath = exportpath
        self.datformat = datformat
        self.startBar = 2
        self.endBar = 12
        self.step = 2
        self.pValue = 0.015

    def db2df(self, db, collection, start, end,
              mongohost="localhost", mongoport=27017, export2csv=False):
        """Read MongoDB market records into a DataFrame."""
        self.mongohost = mongohost
        self.mongoport = mongoport
        self.db = db
        self.collection = collection
        dbClient = MongoClient(self.mongohost, self.mongoport, connectTimeoutMS=500)
        db = dbClient[self.db]
        cursor = db[self.collection].find(
            {'datetime': {'$gte': start, '$lt': end}}).sort("datetime", ASCENDING)
        self.df = pd.DataFrame(list(cursor))
        self.df = self.df[self.datformat]
        self.df = self.df.reset_index(drop=True)
        path = self.exportpath + self.collection + ".csv"
        if export2csv:
            self.df.to_csv(path, index=True, header=True)
        return self.df

    def csv2df(self, csvpath, dataname="csv_data", export2csv=False):
        """Read csv market data into a DataFrame."""
        csv_df = pd.read_csv(csvpath)
        self.df = csv_df[self.datformat]
        self.df["datetime"] = pd.to_datetime(self.df['datetime'])
        # self.df["high"] = self.df['high'].astype(float)
        # self.df["low"] = self.df['low'].astype(float)
        # self.df["open"] = self.df['open'].astype(float)
        # self.df["close"] = self.df['close'].astype(float)
        # self.df["volume"] = self.df['volume'].astype(int)
        self.df = self.df.reset_index(drop=True)
        path = self.exportpath + dataname + ".csv"
        if export2csv:
            self.df.to_csv(path, index=True, header=True)
        return self.df

    def df2Barmin(self, inputdf, barmins, crossmin=1, export2csv=False):
        """Combine 1-minute bars into X-minute bars (e.g. 3-minute, 5-minute).
        If the session starts at 9:01 use crossmin=0; if at 9:00 use crossmin=1."""
        dfbarmin = pd.DataFrame()
        highBarMin = 0
        lowBarMin = 0
        openBarMin = 0
        volumeBarmin = 0
        datetime = 0
        for i in range(0, len(inputdf) - 1):
            bar = inputdf.iloc[i, :].to_dict()
            if openBarMin == 0:
                openBarMin = bar["open"]
            if highBarMin == 0:
                highBarMin = bar["high"]
            else:
                highBarMin = max(bar["high"], highBarMin)
            if lowBarMin == 0:
                lowBarMin = bar["low"]
            else:
                lowBarMin = min(bar["low"], lowBarMin)
            closeBarMin = bar["close"]
            datetime = bar["datetime"]
            volumeBarmin += int(bar["volume"])
            # X minutes are over: minute index divisible by X
            if not (bar["datetime"].minute + crossmin) % barmins:
                # emit the finished X-minute bar, stamped with its last minute
                barMin = {'datetime': datetime, 'high': highBarMin, 'low': lowBarMin,
                          'open': openBarMin, 'close': closeBarMin, 'volume': volumeBarmin}
                dfbarmin = dfbarmin.append(barMin, ignore_index=True)
                highBarMin = 0
                lowBarMin = 0
                openBarMin = 0
                volumeBarmin = 0
        if export2csv:
            dfbarmin.to_csv(self.exportpath + "bar" + str(barmins) + str(self.collection) + ".csv",
                            index=True, header=True)
        return dfbarmin

exportpath = "C:\\Project\\"
DA = DataAnalyzer(exportpath)
# database import
start = datetime.strptime("20190920", '%Y%m%d')
end = datetime.now()
dfrb8888 = DA.db2df(db="VnTrader_1Min_Db", collection="rb8888",
                    start=start, end=end, export2csv=True)
dfrb5min = DA.df2Barmin(dfrb8888, 5, crossmin=1, export2csv=True)
dfrb5min.tail()
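As an aside, the same 1-minute-to-5-minute aggregation can be done with pandas' built-in resampling, which avoids the hand-rolled loop in `df2Barmin`. The sketch below is an alternative, not the original code: the function name `resample_bars` is mine, and it assumes a DataFrame with the same `datetime`/`open`/`high`/`low`/`close`/`volume` columns.

```python
import pandas as pd

def resample_bars(df1min, rule="5min"):
    """Aggregate 1-minute OHLCV bars into larger bars with pandas resample.

    Bins are right-closed and right-labeled, so a session starting at 9:01
    groups 9:01-9:05 into the bar stamped 9:05 (matching df2Barmin above).
    """
    bars = df1min.set_index(pd.DatetimeIndex(df1min["datetime"]))
    out = bars.resample(rule, label="right", closed="right").agg(
        {"open": "first", "high": "max", "low": "min",
         "close": "last", "volume": "sum"})
    # drop empty bins (e.g. lunch break, overnight gaps)
    return out.dropna().reset_index()
```

For example, `resample_bars(df1min, "5min")` on ten 1-minute bars starting at 09:01 yields two 5-minute bars stamped 09:05 and 09:10.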

2. Calculate indicators on the 5-minute K-lines, including the 20-period standard deviation, 30-period RSI, 5-period SMA, and 40-period SMA

logdata = pd.DataFrame()
logdata['close'] = dfrb5min['close']
# logdata['tr'] = talib.ATR(np.array(dfrb8888['high']), np.array(dfrb8888['low']), np.array(dfrb8888['close']), 1)
# logdata['atr'] = talib.ATR(np.array(dfrb8888['high']), np.array(dfrb8888['low']), np.array(dfrb8888['close']), 20)
logdata['std20'] = talib.STDDEV(np.array(dfrb5min['close']), 20)
logdata['rsi30'] = talib.RSI(np.array(dfrb5min['close']), 30)
logdata['sma5'] = talib.SMA(np.array(dfrb5min['close']), 5)
logdata['sma40'] = talib.SMA(np.array(dfrb5min['close']), 40)
logdata.plot(subplots=True, figsize=(18, 16))
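If TA-Lib is not available, the same indicators can be approximated with plain pandas rolling windows. This is a minimal sketch (the helper name `add_indicators` is hypothetical): `talib.STDDEV` uses the population standard deviation, hence `ddof=0`, and the RSI uses Wilder-style exponential smoothing, which approximates `talib.RSI` once the series is long enough.

```python
import pandas as pd

def add_indicators(close):
    """Rolling indicators comparable to the talib calls above,
    computed with plain pandas. `close` is a pd.Series."""
    out = pd.DataFrame({"close": close})
    out["std20"] = close.rolling(20).std(ddof=0)  # population std, like STDDEV
    out["sma5"] = close.rolling(5).mean()
    out["sma40"] = close.rolling(40).mean()
    # Wilder-style RSI over 30 periods (approximates talib.RSI)
    delta = close.diff()
    gain = delta.clip(lower=0).ewm(alpha=1 / 30, adjust=False).mean()
    loss = (-delta.clip(upper=0)).ewm(alpha=1 / 30, adjust=False).mean()
    out["rsi30"] = 100 - 100 / (1 + gain / loss)
    return out
```

The rolling-window columns are NaN until enough bars have accumulated (4 bars for `sma5`, 39 for `sma40`), just as with the talib versions.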

3. Use a fast/slow SMA crossover strategy to mark buy/sell points

closeArray = np.array(logdata['close'])
listup, listdown = [], []
for i in range(1, len(logdata['close'])):
    if logdata.loc[i, 'sma5'] > logdata.loc[i, 'sma40'] and logdata.loc[i-1, 'sma5'] < logdata.loc[i-1, 'sma40']:
        listup.append(i)
    elif logdata.loc[i, 'sma5'] < logdata.loc[i, 'sma40'] and logdata.loc[i-1, 'sma5'] > logdata.loc[i-1, 'sma40']:
        listdown.append(i)

fig = plt.figure(figsize=(18, 6))
plt.plot(closeArray, color='y', lw=2.)
plt.plot(closeArray, '^', markersize=5, color='r', label='UP signal', markevery=listup)
plt.plot(closeArray, 'v', markersize=5, color='g', label='DOWN signal', markevery=listdown)
plt.legend()
plt.show()
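The crossover loop above can also be vectorized with NumPy, which is noticeably faster on long bar series. This is a sketch, not the original code: the name `crossover_points` is mine, and it treats "not above" as `<=`, a slight relaxation of the strict `<`/`>` comparisons in the loop (the results differ only when the two averages are exactly equal on a bar).

```python
import numpy as np

def crossover_points(fast, slow):
    """Indices where the fast average crosses above (ups) or
    below (downs) the slow average."""
    fast = np.asarray(fast, dtype=float)
    slow = np.asarray(slow, dtype=float)
    above = fast > slow
    # a cross happens where the above/below state flips between bars
    ups = (np.where(above[1:] & ~above[:-1])[0] + 1).tolist()
    downs = (np.where(~above[1:] & above[:-1])[0] + 1).tolist()
    return ups, downs
```

The returned index lists can be passed straight to `markevery=` in the plotting calls above, in place of `listup` and `listdown`.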

That is all the content of "How to use Jupyter Notebook to do preliminary analysis". Thank you for reading! I hope the content shared here helps; for more on related topics, welcome to follow the industry information channel!

