2025-02-22 Update From: SLTechnology News&Howtos
Shulou(Shulou.com) 06/01 Report --
This article introduces how to optimize a BP (back-propagation) neural network with a genetic algorithm in MATLAB, walking through an actual case. The idea is simple and practical: the genetic algorithm searches for good initial weights and thresholds (biases), and the BP network is then trained from that starting point. I hope this article helps you solve the problem.
%% Clear environment variables
clc
clear

%% Build the network structure
% read data (data.mat contains the variables "input" and "output")
load data input output

% number of nodes in each layer
inputnum=2;
hiddennum=5;
outputnum=1;

% training data and test data (one sample per row, transposed to columns)
input_train=input(1:1900,:)';
input_test=input(1901:2000,:)';
output_train=output(1:1900)';
output_test=output(1901:2000)';

% normalize the selected input and output samples
[inputn,inputps]=mapminmax(input_train);
[outputn,outputps]=mapminmax(output_train);

% build the network
net=newff(inputn,outputn,hiddennum);
%% genetic algorithm parameter initialization
Maxgen=10;% evolutionary algebra, that is, the number of iterations
Sizepop=10;% population size
Pcross=0.3;% crossover probability selection, between 0 and 1
Pmutation=0.1;% mutation probability selection, between 0 and 1
% Total Node
Numsum=inputnum*hiddennum+hiddennum+hiddennum*outputnum+outputnum
Lenchrom=ones (1m numsum)
Bound= [- 3*ones (numsum,1) 3*ones (numsum,1)];% data range
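As a quick sanity check on the chromosome length, the gene count for the script's 2-5-1 network works out by hand as follows (a worked example, not part of the original script):

```matlab
% 2*5 input-to-hidden weights, 5 hidden biases,
% 5*1 hidden-to-output weights, 1 output bias
numsum = 2*5 + 5 + 5*1 + 1;   % = 21 genes per chromosome
```

So each individual in the population is a vector of 21 real numbers, each constrained to [-3, 3] by bound.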
%% population initialization
Individuals=struct ('fitness',zeros (1memsizepop),' chrom', [])
% defines the population information as a structure
Avgfitness= []
% average fitness of each generation
Bestfitness= []
% the best fitness of each generation.
Bestchrom= []
% chromosomes with the best fitness
% initialize the population
For i=1:sizepop
% randomly generate a population
Individuals.chrom (iMagna:) = Code (lenchrom,bound)
% Encoding
X=individuals.chrom (iMagna:)
% calculate fitness
Individuals.fitness (I) = fun (XRecience inputnumrecoverhiddennumrecoveroutputnumrecovernetbookinputn outputn); fitness of% chromosomes
End
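Code and fun are helper functions the article calls but does not list. A minimal sketch of what they might look like, assuming real-valued encoding and a fitness equal to the network's error on the training set with the decoded weights (the original helpers may differ, e.g. by briefly training the network inside fun):

```matlab
function ret = Code(lenchrom, bound)
% Code: generate one random real-valued chromosome within bound.
% Sketch of the unlisted helper.
ret = bound(:,1)' + (bound(:,2) - bound(:,1))' .* rand(1, length(lenchrom));
end

function err = fun(x, inputnum, hiddennum, outputnum, net, inputn, outputn)
% fun: fitness of chromosome x = prediction error of the BP network
% whose weights and biases are decoded from x (lower is better).
w1 = x(1:inputnum*hiddennum);
b1 = x(inputnum*hiddennum+1:inputnum*hiddennum+hiddennum);
w2 = x(inputnum*hiddennum+hiddennum+1:inputnum*hiddennum+hiddennum+hiddennum*outputnum);
b2 = x(inputnum*hiddennum+hiddennum+hiddennum*outputnum+1:end);
net.iw{1,1} = reshape(w1, hiddennum, inputnum);
net.lw{2,1} = reshape(w2, outputnum, hiddennum);
net.b{1}   = reshape(b1, hiddennum, 1);
net.b{2}   = b2;
an  = sim(net, inputn);          % network output with these weights
err = sum(abs(an - outputn));    % fitness: sum of absolute errors
end
```

Because the genetic algorithm minimizes fitness here, a smaller error means a better chromosome.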
% find the best chromosome
[bestfitness,bestindex]=min(individuals.fitness);
bestchrom=individuals.chrom(bestindex,:);       % the best chromosome
avgfitness=sum(individuals.fitness)/sizepop;    % average fitness of the chromosomes
% trace records the best and average fitness of each generation
trace=[avgfitness bestfitness];
%% Iterate to find the optimal initial weights and thresholds
% evolution starts
for i=1:maxgen
    % selection
    individuals=Select(individuals,sizepop);
    avgfitness=sum(individuals.fitness)/sizepop;
    % crossover
    individuals.chrom=Cross(pcross,lenchrom,individuals.chrom,sizepop,bound);
    % mutation
    individuals.chrom=Mutation(pmutation,lenchrom,individuals.chrom,sizepop,i,maxgen,bound);
    % compute fitness
    for j=1:sizepop
        x=individuals.chrom(j,:);   % decoding
        individuals.fitness(j)=fun(x,inputnum,hiddennum,outputnum,net,inputn,outputn);
    end
    % find the chromosomes with minimum and maximum fitness and their positions in the population
    [newbestfitness,newbestindex]=min(individuals.fitness);
    [worstfitness,worstindex]=max(individuals.fitness);
    % update the best chromosome found so far if it improved
    if bestfitness>newbestfitness
        bestfitness=newbestfitness;
        bestchrom=individuals.chrom(newbestindex,:);
    end
    % elitism: replace the worst individual with the best chromosome
    individuals.chrom(worstindex,:)=bestchrom;
    individuals.fitness(worstindex)=bestfitness;
    avgfitness=sum(individuals.fitness)/sizepop;
    trace=[trace;avgfitness bestfitness];   % record the best and average fitness of each generation
end
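Select, Cross, and Mutation are also unlisted helpers. As an illustration of the selection step, here is a minimal roulette-wheel sketch, assuming lower fitness (error) should win a larger share of the wheel; the article's actual implementation may differ:

```matlab
function individuals = Select(individuals, sizepop)
% Select: roulette-wheel selection on inverted fitness, so that
% individuals with smaller error are picked more often.
% Sketch of the unlisted helper.
fitness1   = 1 ./ individuals.fitness;   % invert: this is a minimization problem
sumfitness = sum(fitness1);
index = zeros(1, sizepop);
for i = 1:sizepop
    pick = rand * sumfitness;            % spin the wheel
    acc = 0;
    for j = 1:sizepop
        acc = acc + fitness1(j);
        if acc >= pick
            index(i) = j;                % individual j is selected
            break
        end
    end
end
individuals.chrom   = individuals.chrom(index, :);
individuals.fitness = individuals.fitness(index);
end
```

Cross and Mutation would then recombine pairs of selected chromosomes with probability pcross and perturb individual genes within bound with probability pmutation.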
%% Analyze the genetic algorithm results
figure(1)
[r,c]=size(trace);
plot([1:r]',trace(:,1),'b--',[1:r]',trace(:,2),'r-')
title(['fitness curve, final generation = ' num2str(maxgen)])
xlabel('generation'); ylabel('fitness')
legend('average fitness','best fitness')
x=bestchrom;
%% Assign the optimal initial weights and thresholds to the network
%% value prediction with the GA-optimized BP network
w1=x(1:inputnum*hiddennum);
b1=x(inputnum*hiddennum+1:inputnum*hiddennum+hiddennum);
w2=x(inputnum*hiddennum+hiddennum+1:inputnum*hiddennum+hiddennum+hiddennum*outputnum);
b2=x(inputnum*hiddennum+hiddennum+hiddennum*outputnum+1:inputnum*hiddennum+hiddennum+hiddennum*outputnum+outputnum);

net.iw{1,1}=reshape(w1,hiddennum,inputnum);
net.lw{2,1}=reshape(w2,outputnum,hiddennum);
net.b{1}=reshape(b1,hiddennum,1);
net.b{2}=b2;
%% Train the BP network
% training parameters
net.trainParam.epochs=100;
net.trainParam.lr=0.1;
% net.trainParam.goal=0.00001;

% network training
[net,per2]=train(net,inputn,outputn);

%% BP network prediction
% normalize the test data
inputn_test=mapminmax('apply',input_test,inputps);
an=sim(net,inputn_test);
test_simu=mapminmax('reverse',an,outputps);
error=test_simu-output_test;
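The script stops at the raw prediction error. To judge the result at a glance, one could summarize and plot it like this (not part of the original script; the metric and figure are illustrative):

```matlab
% Quantify and visualize the prediction quality on the test set
mse_test = mean(error.^2);              % mean squared error
fprintf('test MSE: %f\n', mse_test);

figure(2)
plot(test_simu,'r-'); hold on
plot(output_test,'b--')
legend('GA-BP prediction','actual value')
xlabel('test sample'); ylabel('output')
```

Comparing this MSE against a BP network trained from random initial weights is the usual way to verify that the genetic algorithm actually helped.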
This concludes the walkthrough of how to optimize a BP neural network with a genetic algorithm in MATLAB. Thank you for reading. If you want to learn more about the industry, you can follow the industry information channel, where new knowledge points are posted every day.