SLTechnology News & Howtos — Internet Technology
2025-02-25 Update
Shulou (Shulou.com) 06/01 Report --
This article explains how to optimize support vector machine (SVM) classification in MATLAB. Many people have questions about this in everyday work, so we have collected material from various sources and organized it into a simple, easy-to-use procedure. We hope it helps answer your doubts about optimizing SVM classification in MATLAB. Please follow along and learn together!
Cross-validation uses the training set to find the best classification parameters, so that the classifier not only predicts the training set with high accuracy but also predicts the test set reasonably well. The classification accuracy on the test set is kept at a high level, meaning the learning ability and generalization ability of the resulting SVM classifier stay in balance, avoiding both over-learning and under-learning.
In the cross-validation method, the penalty parameter c and the kernel parameter g each vary over a given range. For every candidate pair (c, g), K-fold cross-validation (K-CV) on the training set yields a validation classification accuracy, and the pair with the highest validation accuracy is taken as the best parameters. One problem remains: several pairs of c and g may share the highest validation accuracy. How should this be handled? The approach used here is to pick, among all pairs that achieve the highest validation accuracy, the one with the smallest penalty parameter c; if several values of g correspond to that smallest c, the first pair (c, g) found in the search is used. The reasoning is that an overly large c causes over-learning, i.e., high classification accuracy on the training set but low accuracy on the test set (the generalization ability of the classifier decreases), so among all pairs reaching the highest validation accuracy, a smaller penalty parameter c is considered the better choice.
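The selection rule above (highest validation accuracy wins; near-ties within a tolerance eps go to the smaller penalty c; a further tie keeps the first pair found) can be sketched in isolation. This is a minimal illustrative sketch in Python, not the article's MATLAB code: the accuracy grid and parameter values below are made-up toy data, and `select_best_params` is a hypothetical helper name.

```python
def select_best_params(c_values, g_values, acc, eps=1e-4):
    """Pick (c, g) with the highest CV accuracy; break near-ties
    (within eps) in favor of the smaller penalty c; on a further
    tie, keep the first pair found during the scan."""
    best_c, best_g, best_acc = None, None, -1.0
    for i, c in enumerate(c_values):
        for j, g in enumerate(g_values):
            a = acc[i][j]
            if a > best_acc + eps:                 # strictly better accuracy wins
                best_c, best_g, best_acc = c, g, a
            elif abs(a - best_acc) <= eps and c < best_c:
                best_c, best_g, best_acc = c, g, a  # tie: prefer smaller c

    return best_c, best_g, best_acc

# Toy accuracy grid: (c=1.0, g=0.5) and (c=4.0, g=0.1) tie at 0.95;
# the rule keeps the pair with the smaller c.
c_values = [1.0, 4.0]
g_values = [0.1, 0.5]
acc = [[0.90, 0.95],
       [0.95, 0.80]]
print(select_best_params(c_values, g_values, acc))  # -> (1.0, 0.5, 0.95)
```

In the article's script the accuracies are produced by libsvm's `svmtrain` with the `-v` option inside the double loop; here they are simply given, so only the tie-breaking logic is shown.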
%% Clear environment variables
clear
clc
%% Import Data
load BreastTissue_data.mat
% Random generation of training and test sets
n = randperm(size(matrix,1));
% Training Set--80 samples
train_matrix = matrix(n(1:80),:);
train_label = label(n(1:80),:);
% Test Set--26 samples
test_matrix = matrix(n(81:end),:);
test_label = label(n(81:end),:);
%% Data Normalization
[Train_matrix,PS] = mapminmax(train_matrix');
Train_matrix = Train_matrix';
Test_matrix = mapminmax('apply',test_matrix',PS);
Test_matrix = Test_matrix';
%% SVM Creation/Training (RBF Kernel)
% Finding the Best c/g Parameter--Cross Validation Method
[c,g] = meshgrid(-10:0.2:10,-10:0.2:10);
[m,n] = size(c);
cg = zeros(m,n);
eps = 10^(-4);
v = 5;
bestc = 1;
bestg = 0.1;
bestacc = 0;
for i = 1:m
    for j = 1:n
        cmd = ['-v ',num2str(v),' -t 2',' -c ',num2str(2^c(i,j)),' -g ',num2str(2^g(i,j))];
        cg(i,j) = svmtrain(train_label,Train_matrix,cmd);
        if cg(i,j) > bestacc
            bestacc = cg(i,j);
            bestc = 2^c(i,j);
            bestg = 2^g(i,j);
        end
        % Tie within eps: prefer the smaller penalty parameter c
        if abs(cg(i,j)-bestacc) <= eps && bestc > 2^c(i,j)
            bestacc = cg(i,j);
            bestc = 2^c(i,j);
            bestg = 2^g(i,j);
        end
    end
end
cmd = [' -t 2',' -c ',num2str(bestc),' -g ',num2str(bestg)];
% Create/train SVM models
model = svmtrain(train_label,Train_matrix,cmd);
%% SVM Simulation Test
[predict_label_1,accuracy_1] = svmpredict(train_label,Train_matrix,model);
[predict_label_2,accuracy_2] = svmpredict(test_label,Test_matrix,model);
result_1 = [train_label predict_label_1];
result_2 = [test_label predict_label_2];
%% Plot
figure
plot(1:length(test_label),test_label,'r-*')
hold on
plot(1:length(test_label),predict_label_2,'b:o')
grid on
legend ('true category','predicted category')
xlabel ('test set sample number')
ylabel ('Test Set Sample Category')
string = {'Test Set SVM Prediction Results Comparison (RBF Kernel Function)';
['accuracy = ' num2str(accuracy_2(1)) '%']};
title(string)
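The `mapminmax` pattern used in the script (learn the scaling parameters `PS` on the training set only, then apply the same map to the test set) can be sketched outside MATLAB. This is a minimal Python sketch under the assumption of `mapminmax`'s default [-1, 1] per-feature range; `fit_minmax` and `apply_minmax` are hypothetical helper names, not part of any library.

```python
def fit_minmax(train_cols):
    """train_cols: list of feature columns (training set only).
    Returns per-feature (lo, hi), analogous to mapminmax's PS."""
    return [(min(col), max(col)) for col in train_cols]

def apply_minmax(cols, params):
    """Map each feature to [-1, 1] using parameters learned on the
    training set, analogous to mapminmax('apply', ..., PS)."""
    scaled = []
    for col, (lo, hi) in zip(cols, params):
        span = hi - lo if hi > lo else 1.0   # guard against constant features
        scaled.append([2.0 * (x - lo) / span - 1.0 for x in col])
    return scaled

train = [[0.0, 5.0, 10.0]]          # one feature, three training samples
ps = fit_minmax(train)              # fit on the training set only
print(apply_minmax(train, ps))      # -> [[-1.0, 0.0, 1.0]]
print(apply_minmax([[20.0]], ps))   # -> [[3.0]]: test values may fall outside [-1, 1]
```

The key point, as in the MATLAB script, is that the test set is scaled with the training set's parameters rather than its own; otherwise the two sets would live on inconsistent scales and the learned model would not transfer.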
At this point, our study of how to optimize SVM classification in MATLAB is complete; we hope it has resolved your doubts. Combining theory with practice is the best way to learn, so go and try it! If you want to keep learning more on this topic, please continue to follow this site, where more practical articles will follow.