2025-01-21 Update From: SLTechnology News&Howtos
Shulou (Shulou.com) 06/01 Report --
This article introduces how to implement common neural networks in C#. Many people have questions about this in day-to-day work, so the editor has consulted a range of material and put together a simple, practical walkthrough. I hope it helps resolve your doubts about building common neural networks in C#. Please follow along!
WeaveNet: a computational-graph framework for neural networks written in C#
WeaveNet is a neural-network framework written in C#. You can inspect every internal detail of the implementation, which makes it useful for learning how neural networks work and how their computations are carried out. The framework deliberately omits automatic backward propagation (autograd): every Backward call is written out by hand, so more of the computational detail stays visible.
The source code contains examples such as CNN, BP, FCN, LSTM, ConvLSTM, and GRU, along with the data used by each example.
Loss functions: MSELoss, cross-entropy
Activation functions: ReLU, Tanh, Sigmoid (the SigmodLayer class), Softmax
Data types: float[][] and float[][][,] (2D and 4D)
Pooling: average pooling (Averpooling) and max pooling (Maxpooling)
Other layers: ConvLayer, Conv2DLayer, MulLayer, ConvTranspose2DLayer
Each supporting class provides a Forward method for forward propagation and a Backward method for backward propagation.
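The Forward/Backward contract can be illustrated with a Python sketch of a sigmoid activation layer (this is a hypothetical analogue, not WeaveNet's API): Forward caches its output, and Backward multiplies the incoming gradient by the local derivative s * (1 - s) computed from that cached output.

```python
import math

class SigmoidLayer:
    def forward(self, x):
        # cache the output; the sigmoid derivative is cheapest to compute from it
        self.out = [1.0 / (1.0 + math.exp(-v)) for v in x]
        return self.out
    def backward(self, grad):
        # d(sigmoid)/dx = s * (1 - s), using the cached forward output
        return [g * s * (1.0 - s) for g, s in zip(grad, self.out)]

layer = SigmoidLayer()
y = layer.forward([0.0, 2.0])
dx = layer.backward([1.0, 1.0])
print(y[0], dx[0])   # 0.5 0.25
```

Every activation, pooling, and convolution class in the framework follows this same cache-then-differentiate pattern.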
Here are a few examples.
CNN training example
public class CNN
{
    Conv2DLayer cl;
    Conv2DLayer cl2;
    Conv2DLayer cl3;
    Maxpooling ap1;
    Maxpooling ap2;
    SigmodLayer sl = new SigmodLayer();
    SigmodLayer sl2 = new SigmodLayer();
    Softmax sl3 = new Softmax();
    // Alternatives kept from the original source: TanhLayer activations and Averpooling.

    public CNN()
    {
        cl = new Conv2DLayer(1, 0, 5, 1, 6);
        ap1 = new Maxpooling(2);    // ap1 = new Averpooling(2);
        cl2 = new Conv2DLayer(1, 0, 5, 6, 12);
        ap2 = new Maxpooling(2);    // ap2 = new Averpooling(2);
        cl3 = new Conv2DLayer(in_channels: 12, out_channels: 10, _inSize: 4);
    }

    public dynamic Forward(float[][][,] matrices)
    {
        dynamic data = cl.Forward(matrices);
        data = sl.Forward(data);
        data = ap1.Forward(data);
        data = cl2.Forward(data);
        data = sl2.Forward(data);
        data = ap2.Forward(data);
        data = cl3.Forward(data);
        data = sl3.Forward(data);
        return data;
    }

    dynamic cl3grid;
    dynamic cl2grid;
    dynamic clgrid;

    public void backward(dynamic grid)
    {
        dynamic grid2 = sl3.Backward(grid);
        cl3grid = cl3.backweight(grid2);   // weight gradient of cl3
        grid2 = cl3.Backward(grid2);
        grid2 = ap2.Backward(grid2);
        grid2 = sl2.Backward(grid2);
        cl2grid = cl2.backweight(grid2);   // weight gradient of cl2
        grid2 = cl2.Backward(grid2);
        grid2 = ap1.Backward(grid2);
        grid2 = sl.Backward(grid2);
        clgrid = cl.backweight(grid2);     // weight gradient of cl
    }

    float lr = 1.0f;

    public void update()
    {
        cl3.wdata = Matrix.MatrixSub(cl3.wdata, Matrix.multiply(cl3grid.grid, lr));
        cl3.basicData = Matrix.MatrixSub(cl3.basicData, Matrix.multiply(cl3grid.basic, lr));
        cl2.weights = Matrix.MatrixSub(cl2.weights, Matrix.multiply(cl2grid.grid, lr));
        cl2.basicData = Matrix.MatrixSub(cl2.basicData, Matrix.multiply(cl2grid.basic, lr));
        cl.weights = Matrix.MatrixSub(cl.weights, Matrix.multiply(clgrid.grid, lr));
        cl.basicData = Matrix.MatrixSub(cl.basicData, Matrix.multiply(clgrid.basic, lr));
    }
}
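The update() method in the CNN above is plain stochastic gradient descent: each weight and bias is replaced by w - lr * grad, element-wise. A scalar Python sketch of that step (illustrative only, not WeaveNet's Matrix API):

```python
def sgd_step(weights, grads, lr):
    # w <- w - lr * grad, applied element-wise, as in CNN.update()
    return [w - lr * g for w, g in zip(weights, grads)]

w = [1.0, -0.5]
g = [0.25, -0.25]
print(sgd_step(w, g, lr=0.5))   # [0.875, -0.375]
```

Note the learning rate of 1.0f in the C# example is unusually large; smaller values (e.g. 0.1 or 0.01) are more typical starting points.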
LSTM implementation example
public class LSTMCELL
{
    ConvLayer convLayerih;
    ConvLayer convLayerhh;
    int input_size;
    int hidden_size;

    public LSTMCELL(int _input_size, int _hidden_size)
    {
        input_size = _input_size;
        hidden_size = _hidden_size;
        convLayerih = new ConvLayer(input_size, hidden_size * 4);
        // Optionally load pretrained weights, e.g.:
        // convLayerih.weights = JsonConvert.DeserializeObject(util.getstr("D:\\lstmihw.json"));
        convLayerhh = new ConvLayer(hidden_size, hidden_size * 4);
    }

    SigmodLayer input_gate_s = new SigmodLayer();
    SigmodLayer forget_gate_s = new SigmodLayer();
    SigmodLayer output_gate_s = new SigmodLayer();
    TanhLayer cell_memory_tl = new TanhLayer();
    TanhLayer cell_tl = new TanhLayer();
    MulLayer c_next_mul = new MulLayer();
    MulLayer mulin_gate_mul = new MulLayer();
    MulLayer h_next_mul = new MulLayer();

    public dynamic Forward(float[][] input, float[][] h_prev, float[][] c_prev)
    {
        // a_vector = x · weight_ih^T + h_prev · weight_hh^T (+ bias_ih + bias_hh)
        Xinput = input;
        xh_prev = h_prev;
        xc_prev = c_prev;
        var ih = convLayerih.Forward(input);
        var hh = convLayerhh.Forward(h_prev);
        var a_vector = Matrix.MatrixAdd(ih, hh);
        var liast = Matrix.chunk(a_vector, 4);  // split into the four gate pre-activations (call reconstructed)
        var ahumi = liast[0];
        var ahumf = liast[1];
        var aqunc = liast[2];
        var ahumo = liast[3];
        input_gate = input_gate_s.Forward(ahumi);
        forget_gate = forget_gate_s.Forward(ahumf);
        cell_memory = cell_memory_tl.Forward(aqunc);
        output_gate = output_gate_s.Forward(ahumo);
        var c_next_temp = c_next_mul.Forward(forget_gate, c_prev);
        var mulin_gate = mulin_gate_mul.Forward(input_gate, cell_memory);
        var c_next = Matrix.MatrixAdd(c_next_temp, mulin_gate);
        var h_next = h_next_mul.Forward(output_gate, cell_tl.Forward(c_next));
        return (h_next, c_next);  // next hidden state, next cell memory
    }

    dynamic Xinput, xh_prev, xc_prev, input_gate, forget_gate, cell_memory, output_gate;
    dynamic ihweight, hhweight;

    public dynamic backward(dynamic grid)
    {
        var dh = h_next_mul.BackwardY(grid);
        var d_tanh_c = cell_tl.Backward(dh);
        var d_input_gate = mulin_gate_mul.Backward(d_tanh_c);
        var d_forget_gate = c_next_mul.Backward(d_tanh_c);
        var d_cell_memory = mulin_gate_mul.BackwardY(d_tanh_c);
        var d_output_gate = h_next_mul.Backward(grid);
        var d_ai = input_gate_s.Backward(d_input_gate);
        var d_af = forget_gate_s.Backward(d_forget_gate);
        var d_ao = output_gate_s.Backward(d_output_gate);
        var d_ac = cell_memory_tl.Backward(d_cell_memory);
        var temp = Matrix.cat(d_ai, d_af, 1);
        var temp2 = Matrix.cat(d_ac, d_ao, 1);
        var da = Matrix.cat(temp, temp2, 1);
        ihweight = convLayerih.backweight(da);
        hhweight = convLayerhh.backweight(da);
        return convLayerih.backward(da);
    }

    float lr = 0.1f;

    public void update()
    {
        convLayerih.weights = Matrix.MatrixSub(convLayerih.weights, Matrix.multiply(ihweight.grid, lr));
        convLayerih.basicData = Matrix.MatrixSub(convLayerih.basicData, Matrix.multiply(ihweight.basic, lr));
        convLayerhh.weights = Matrix.MatrixSub(convLayerhh.weights, Matrix.multiply(hhweight.grid, lr));
        convLayerhh.basicData = Matrix.MatrixSub(convLayerhh.basicData, Matrix.multiply(hhweight.basic, lr));
    }
}
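The LSTM Forward pass above follows the standard LSTM cell equations: sigmoid gates i, f, o, a tanh candidate memory g, then c_next = f * c_prev + i * g and h_next = o * tanh(c_next). A scalar Python sketch of those equations (illustrative, not WeaveNet code):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_cell(a_i, a_f, a_c, a_o, c_prev):
    i = sigmoid(a_i)            # input gate
    f = sigmoid(a_f)            # forget gate
    g = math.tanh(a_c)          # candidate cell memory
    o = sigmoid(a_o)            # output gate
    c_next = f * c_prev + i * g
    h_next = o * math.tanh(c_next)
    return h_next, c_next

# With all pre-activations at 0, every gate is 0.5 and the candidate is 0,
# so the cell simply halves the previous memory.
h, c = lstm_cell(0.0, 0.0, 0.0, 0.0, 1.0)
print(round(c, 3), round(h, 3))   # 0.5 0.231
```

The four pre-activations correspond to the four chunks split out of a_vector in the C# code, which is why both ConvLayers have an output size of hidden_size * 4.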
FCN implementation example
public class FCN
{
    Conv2DLayer cl;
    Conv2DLayer cl2;
    Conv2DLayer cl3;
    ConvTranspose2DLayer Tcl1;
    Maxpooling mpl = new Maxpooling();
    Maxpooling mpl2 = new Maxpooling();
    SigmodLayer sl = new SigmodLayer();
    SigmodLayer sl2 = new SigmodLayer();
    SigmodLayer sl3 = new SigmodLayer();
    Softmax sl4 = new Softmax();

    public FCN(int weightssize)
    {
        cl = new Conv2DLayer(1, weightssize / 2, weightssize, 1, 6, bias: false);
        cl2 = new Conv2DLayer(1, weightssize / 2, weightssize, 6, 12, bias: false);
        cl3 = new Conv2DLayer(1, weightssize / 2, weightssize, 12, 24, bias: false);
        Tcl1 = new ConvTranspose2DLayer(2, 1, weightssize + 1, 24, 1, bias: false);
    }

    public dynamic Forward(dynamic data)
    {
        dynamic data2 = cl.Forward(data);
        data2 = sl.Forward(data2);
        data2 = mpl.Forward(data2);
        data2 = cl2.Forward(data2);
        data2 = sl2.Forward(data2);
        data2 = mpl2.Forward(data2);
        data2 = cl3.Forward(data2);
        data2 = sl3.Forward(data2);
        data2 = Tcl1.Forward(data2);
        data2 = sl4.Forward(data2);
        return data2;
    }

    public dynamic backward(dynamic grid)
    {
        var grid2 = sl4.Backward(grid);
        grid2 = Tcl1.Backward(grid2);
        grid2 = sl3.Backward(grid2);
        grid2 = cl3.Backward(grid2);
        grid2 = mpl2.Backward(grid2);
        grid2 = sl2.Backward(grid2);
        grid2 = cl2.Backward(grid2);
        grid2 = mpl.Backward(grid2);
        grid2 = sl.Backward(grid2);
        grid2 = cl.Backward(grid2);
        return grid2;
    }
}

At this point, the study of how to implement common neural networks in C# is complete. I hope it has resolved your doubts. Pairing theory with practice is the best way to learn, so go and try it! If you want to keep learning more on this topic, please continue to follow the site; the editor will keep working to bring you more practical articles!