2025-02-28 Update, from SLTechnology News & Howtos (shulou.com)
This article mainly shows how to implement simulated annealing with Python and Matlab. The content is straightforward and clearly organized; hopefully it helps resolve your doubts as you study "how to use Python and Matlab to implement simulated annealing" along with the editor.
1 Python implementation
1.1 Source code implementation
I have previously published the complete knowledge points and a from-scratch implementation of simulated annealing: intelligent optimization algorithms - ant colony algorithm (Python implementation).
Like a Monte Carlo experiment, simulated annealing searches globally at random. Because it has no adaptive mechanism (such as moving toward the current best solution, or gradient-style weight updates), it struggles to find the true optimum of complex functions and generally returns only an approximate optimum. By contrast, algorithms such as the bat algorithm and particle swarm optimization move directionally toward the best solutions found so far and adjust their parameters step by step; although on a multimodal function they easily fall into a local optimum, their optimization accuracy is comparatively high. Once you understand this, you should also understand why a neural network trains much better, and reaches the target error faster, if it starts from a good initial set of parameters before training.
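The accept-worse-moves-with-probability mechanism described above is the heart of simulated annealing. Below is a minimal, self-contained sketch in plain Python (not the scikit-opt implementation used later); the geometric cooling schedule and uniform neighbor step are illustrative choices.

```python
import math
import random

def simulated_annealing(f, x0, t_max=1.0, t_min=1e-9, alpha=0.99, step=0.5, seed=0):
    """Minimize f over a list of floats with geometric cooling t <- alpha * t."""
    rng = random.Random(seed)
    x = list(x0)
    fx = f(x)
    best_x, best_f = list(x), fx
    t = t_max
    while t > t_min:
        # propose a random neighbor of the current point
        y = [xi + rng.uniform(-step, step) for xi in x]
        fy = f(y)
        delta = fy - fx
        # always accept improvements; accept worse moves with prob exp(-delta/t)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x, fx = y, fy
            if fx < best_f:
                best_x, best_f = list(x), fx
        t *= alpha  # cool down
    return best_x, best_f

best_x, best_f = simulated_annealing(lambda x: x[0] ** 2 + x[1] ** 2, [3.0, -2.0])
print(best_x, best_f)
```

At high temperature almost any move is accepted (global random exploration); as the temperature falls, the acceptance rule becomes effectively greedy, which is why the result is an approximate rather than exact optimum.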
1.2 sko.SA implementation

```python
# ==1 imports==
import matplotlib.pyplot as plt
import pandas as pd
from sko.SA import SA

# ==2 define the problem==
fun = lambda x: x[0] ** 2 + x[1] ** 2 + x[2] ** 2

# ==3 run the simulated annealing algorithm==
sa = SA(func=fun, x0=[1, 1, 1], T_max=1, T_min=1e-9, L=300, max_stay_counter=150)
best_x, best_y = sa.run()
print('best_x:', best_x, 'best_y:', best_y)

# ==4 plot the result==
plt.plot(pd.DataFrame(sa.best_y_history).cummin(axis=0))
plt.show()
```

scikit-opt also provides three simulated annealing schools: Fast, Boltzmann, and Cauchy.

```python
from sko.SA import SAFast, SABoltzmann, SACauchy

demo_func = lambda x: x[0] ** 2 + x[1] ** 2 + x[2] ** 2

# ==1.1 Fast Simulated Annealing==
sa_fast = SAFast(func=demo_func, x0=[1, 1, 1], T_max=1, T_min=1e-9,
                 q=0.99, L=300, max_stay_counter=150)
sa_fast.run()
print('Fast Simulated Annealing: best_x is', sa_fast.best_x,
      'best_y is', sa_fast.best_y)

# ==1.2 Fast Simulated Annealing with bounds==
sa_fast = SAFast(func=demo_func, x0=[1, 1, 1], T_max=1, T_min=1e-9,
                 q=0.99, L=300, max_stay_counter=150,
                 lb=[-1, 1, -1], ub=[2, 3, 4])
sa_fast.run()
print('Fast Simulated Annealing with bounds: best_x is', sa_fast.best_x,
      'best_y is', sa_fast.best_y)

# ==2.1 Boltzmann Simulated Annealing==
sa_boltzmann = SABoltzmann(func=demo_func, x0=[1, 1, 1], T_max=1, T_min=1e-9,
                           q=0.99, L=300, max_stay_counter=150)
sa_boltzmann.run()
print('Boltzmann Simulated Annealing: best_x is', sa_boltzmann.best_x,
      'best_y is', sa_boltzmann.best_y)

# ==2.2 Boltzmann Simulated Annealing with bounds==
sa_boltzmann = SABoltzmann(func=demo_func, x0=[1, 1, 1], T_max=1, T_min=1e-9,
                           q=0.99, L=300, max_stay_counter=150,
                           lb=-1, ub=[2, 3, 4])
sa_boltzmann.run()
print('Boltzmann Simulated Annealing with bounds: best_x is', sa_boltzmann.best_x,
      'best_y is', sa_boltzmann.best_y)

# ==3.1 Cauchy Simulated Annealing==
sa_cauchy = SACauchy(func=demo_func, x0=[1, 1, 1], T_max=1, T_min=1e-9,
                     q=0.99, L=300, max_stay_counter=150)
sa_cauchy.run()
print('Cauchy Simulated Annealing: best_x is', sa_cauchy.best_x,
      'best_y is', sa_cauchy.best_y)

# ==3.2 Cauchy Simulated Annealing with bounds==
sa_cauchy = SACauchy(func=demo_func, x0=[1, 1, 1], T_max=1, T_min=1e-9,
                     q=0.99, L=300, max_stay_counter=150,
                     lb=[-1, 1, -1], ub=[2, 3, 4])
sa_cauchy.run()
print('Cauchy Simulated Annealing with bounds: best_x is', sa_cauchy.best_x,
      'best_y is', sa_cauchy.best_y)
```
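The three schools differ mainly in how they lower the temperature. As a rough sketch, here are the classical textbook schedules by the same names, alongside the geometric schedule used in the Matlab section; scikit-opt's internal formulas may differ in detail, so treat these as illustrative.

```python
import math

T0 = 1.0  # initial temperature

def boltzmann_T(k):
    """Classical Boltzmann schedule: very slow, logarithmic cooling."""
    return T0 / math.log(1 + k)

def cauchy_T(k):
    """Classical Cauchy (fast) schedule: cools as 1/k."""
    return T0 / (1 + k)

def geometric_T(k, q=0.99):
    """Geometric schedule T_k = T0 * q^k, as in the Matlab code below."""
    return T0 * q ** k

for k in (1, 10, 100):
    print(k, boltzmann_T(k), cauchy_T(k), geometric_T(k))
```

Slower cooling (Boltzmann) keeps exploration alive longer, which helps on rugged functions at the cost of more iterations; faster cooling (Cauchy, geometric) converges sooner but is more prone to getting stuck.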
2 Matlab implementation
2.1 Simulated annealing

```matlab
clear
clc
T = 1000;          % initial temperature
T_min = 1;         % temperature lower bound
alpha = 0.99;      % cooling rate
num = 1000;        % total number of particles
n = 2;             % number of independent variables
sub = [-5, -5];    % lower limits of the variables
up = [5, 5];       % upper limits of the variables

% random initial population
for i = 1:num
    for j = 1:n
        x(i, j) = (up(j) - sub(j)) * rand + sub(j);
    end
    fx(i, 1) = fun(x(i, 1), x(i, 2));   % fun is the objective, defined elsewhere
end

% take minimization as the example
[bestf, a] = min(fx);
bestx = x(a, :);
trace(1) = bestf;

while T > T_min
    for i = 1:num
        for j = 1:n
            xx(i, j) = (up(j) - sub(j)) * rand + sub(j);
        end
        ff(i, 1) = fun(xx(i, 1), xx(i, 2));
        delta = ff(i, 1) - fx(i, 1);
        % accept improvements, or worse moves with probability exp(-delta/T)
        if delta < 0 || exp(-delta / T) > rand
            fx(i, 1) = ff(i, 1);
            x(i, :) = xx(i, :);
        end
    end
    if min(fx) < bestf
        [bestf, a] = min(fx);
        bestx = x(a, :);
    end
    T = alpha * T;          % cool down
    trace(end + 1) = bestf;
end
```
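The Matlab loop above can be mirrored in NumPy so the two implementations are easy to compare. This is a sketch under assumptions: the article does not show the objective `fun`, so a hypothetical sphere function stands in for it, and the proposal step (fresh uniform resampling of every particle, as in the Matlab code) is kept as-is.

```python
import numpy as np

def anneal_population(fun, lb, ub, num=1000, t=1000.0, t_min=1.0, alpha=0.99, seed=0):
    """Population-style SA mirroring the Matlab loop: each of `num` particles
    is resampled uniformly in the box and accepted by the Metropolis rule."""
    rng = np.random.default_rng(seed)
    n = len(lb)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    x = rng.uniform(lb, ub, size=(num, n))
    fx = fun(x)
    best_i = fx.argmin()
    best_x, best_f = x[best_i].copy(), fx[best_i]
    while t > t_min:
        xx = rng.uniform(lb, ub, size=(num, n))  # fresh uniform proposals
        ff = fun(xx)
        delta = ff - fx
        # accept improvements, or worse moves with probability exp(-delta/t)
        accept = (delta < 0) | (rng.random(num) < np.exp(-np.clip(delta, 0, None) / t))
        x[accept] = xx[accept]
        fx[accept] = ff[accept]
        if fx.min() < best_f:
            best_i = fx.argmin()
            best_x, best_f = x[best_i].copy(), fx[best_i]
        t *= alpha  # cool down
    return best_x, best_f

# hypothetical objective standing in for the article's fun
sphere = lambda x: (x ** 2).sum(axis=1)
bx, bf = anneal_population(sphere, lb=[-5, -5], ub=[5, 5])
print(bx, bf)
```

Because the proposals here ignore the current position, this variant behaves like annealed random search; replacing the uniform resampling with a temperature-scaled perturbation around `x` would make it a more conventional simulated annealing.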