
How to implement support vector machine regression fitting with MATLAB



This article mainly introduces how to implement support vector machine (SVM) regression fitting in MATLAB. Many people have questions about this in their daily work, so the editor has consulted various sources and put together a simple, easy-to-follow procedure. I hope it helps answer your questions about SVM regression fitting in MATLAB. Now, please follow along and study!

When SVM is applied to regression fitting, the basic idea is no longer to find an optimal separating hyperplane that divides two classes of samples, but to find an optimal hyperplane (regression function) such that the deviation of all training samples from it is minimized.
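As a point of reference (this is the standard epsilon-SVR formulation from the literature, not something spelled out in the code below), the fitted function f(x) = w^T phi(x) + b is found by solving

\min_{w,b,\xi,\xi^*} \; \frac{1}{2}\|w\|^2 + C \sum_{i=1}^{l} (\xi_i + \xi_i^*)

subject to

y_i - w^T\phi(x_i) - b \le \varepsilon + \xi_i, \qquad w^T\phi(x_i) + b - y_i \le \varepsilon + \xi_i^*, \qquad \xi_i, \xi_i^* \ge 0,

so that only samples deviating from f(x) by more than the tolerance epsilon are penalized. In the libsvm calls below, C corresponds to the -c option, epsilon to the -p option, and the kernel phi is the RBF kernel selected with -t 2.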

%% Clear environment variables
clear
clc

%% Import data
load concrete_data.mat

% Randomly split into training and test sets
n = randperm(size(attributes,2));
% Training set: 80 samples
p_train = attributes(:,n(1:80))';
t_train = strength(:,n(1:80))';
% Test set: 23 samples
p_test = attributes(:,n(81:end))';
t_test = strength(:,n(81:end))';

%% Data normalization
% Input features
[pn_train,inputps] = mapminmax(p_train');
pn_train = pn_train';
pn_test = mapminmax('apply',p_test',inputps);
pn_test = pn_test';
% Target values
[tn_train,outputps] = mapminmax(t_train');
tn_train = tn_train';
tn_test = mapminmax('apply',t_test',outputps);
tn_test = tn_test';
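For reference, mapminmax rescales each row x of its input to the interval [-1, 1] (its default range) via

y = \frac{(y_{max} - y_{min})(x - x_{min})}{x_{max} - x_{min}} + y_{min},

and the settings structures inputps and outputps store the training-set extrema, so the same mapping can be applied to the test data ('apply') and inverted later ('reverse').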

%% SVM model creation/training (libsvm)
% Grid search for the best c and g parameters
[c,g] = meshgrid(-10:0.5:10,-10:0.5:10);
[m,n] = size(c);
cg = zeros(m,n);
eps = 10^(-4);
v = 5;                               % 5-fold cross-validation
bestc = 0;
bestg = 0;
error = Inf;
for i = 1:m
    for j = 1:n
        % -s 3: epsilon-SVR, -t 2: RBF kernel, -c: penalty parameter,
        % -g: kernel parameter, -p: epsilon, -v: cross-validation
        % (with -v, svmtrain returns the cross-validation mean squared error)
        cmd = ['-v ',num2str(v),' -t 2',' -c ',num2str(2^c(i,j)),' -g ',num2str(2^g(i,j)),' -s 3 -p 0.1'];
        cg(i,j) = svmtrain(tn_train,pn_train,cmd);
        if cg(i,j) < error
            error = cg(i,j);
            bestc = 2^c(i,j);
            bestg = 2^g(i,j);
        end
        % Prefer a smaller c when the cross-validation error is (almost) equal
        if abs(cg(i,j) - error) <= eps && bestc > 2^c(i,j)
            error = cg(i,j);
            bestc = 2^c(i,j);
            bestg = 2^g(i,j);
        end
    end
end
% Create/train the SVM with the best parameters
cmd = ['-t 2',' -c ',num2str(bestc),' -g ',num2str(bestg),' -s 3 -p 0.01'];
model = svmtrain(tn_train,pn_train,cmd);

%% SVM simulation prediction
[Predict_1,error_1] = svmpredict(tn_train,pn_train,model);
[Predict_2,error_2] = svmpredict(tn_test,pn_test,model);
% De-normalization
predict_1 = mapminmax('reverse',Predict_1,outputps);
predict_2 = mapminmax('reverse',Predict_2,outputps);
% Result comparison
result_1 = [t_train predict_1];
result_2 = [t_test predict_2];

%% Plotting
figure(1)
plot(1:length(t_train),t_train,'r-*',1:length(t_train),predict_1,'b:o')
grid on
legend('real value','predicted value')
xlabel('sample number')
ylabel('compressive strength')
string_1 = {'Comparison of training set prediction results';
            ['mse = ' num2str(error_1(2)) ' R^2 = ' num2str(error_1(3))]};
title(string_1)
figure(2)
plot(1:length(t_test),t_test,'r-*',1:length(t_test),predict_2,'b:o')
grid on
legend('real value','predicted value')
xlabel('sample number')
ylabel('compressive strength')
string_2 = {'Comparison of test set prediction results';
            ['mse = ' num2str(error_2(2)) ' R^2 = ' num2str(error_2(3))]};
title(string_2)

%% BP neural network (for comparison)
% Data transposition (columns as samples)
pn_train = pn_train';
tn_train = tn_train';
pn_test = pn_test';
tn_test = tn_test';
% Create a BP (feed-forward) neural network with 10 hidden neurons
net = newff(pn_train,tn_train,10);
% Set training parameters
net.trainParam.epochs = 1000;
net.trainParam.goal = 1e-3;
net.trainParam.show = 10;
net.trainParam.lr = 0.1;
% Train the network
net = train(net,pn_train,tn_train);
% Simulation test
tn_sim = sim(net,pn_test);
% Mean squared error
E = mse(tn_sim - tn_test);
% Determination coefficient (written out after the listing below)
N = size(t_test,1);
R2 = (N*sum(tn_sim.*tn_test) - sum(tn_sim)*sum(tn_test))^2 / ((N*sum(tn_sim.^2) - sum(tn_sim)^2)*(N*sum(tn_test.^2) - sum(tn_test)^2));

% De-normalization
T_sim = mapminmax('reverse',tn_sim,outputps);
% Plotting
figure(3)
plot(1:length(t_test),t_test,'r-*',1:length(t_test),T_sim,'b:o')
grid on
legend('real value','predicted value')
xlabel('sample number')
ylabel('compressive strength')
string_3 = {'Comparison of test set prediction results (BP neural network)';
            ['mse = ' num2str(E) ' R^2 = ' num2str(R2)]};
title(string_3)
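Written out, the determination coefficient computed in the BP section above is the squared sample correlation between the network output tn_sim and the normalized test targets tn_test:

R^2 = \frac{\left(N\sum_i \hat{y}_i y_i - \sum_i \hat{y}_i \sum_i y_i\right)^2}{\left(N\sum_i \hat{y}_i^2 - \left(\sum_i \hat{y}_i\right)^2\right)\left(N\sum_i y_i^2 - \left(\sum_i y_i\right)^2\right)},

where \hat{y}_i denotes tn_sim(i), y_i denotes tn_test(i), and N is the number of test samples. This matches the R^2 reported by svmpredict, so the SVM and BP results can be compared directly.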

At this point, the study of how to implement support vector machine regression fitting in MATLAB is over. I hope this has resolved your doubts. Combining theory with practice is the best way to learn, so go and try it! If you want to continue learning more related knowledge, please keep following the website; the editor will keep working hard to bring you more practical articles!
