
How to Use MATLAB's Support Vector Machine (SVM)

2025-03-28 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

This article introduces the basics of using MATLAB's support vector machine (SVM), walking through the key concepts and a worked example. The steps are simple, fast, and practical; I hope this article helps you solve your problem.

The name Support Vector Machine (SVM) sounds flashy, and its capabilities are flashy too, but the formulas behind it can make your head spin. So let's explain how SVM works without formulas. Four key terms are enough to understand SVM: the separating hyperplane, the maximal-margin hyperplane, the soft margin, and the kernel function.

Separating hyperplane: when dealing with a classification problem, we need a decision boundary, like the Chu-Han boundary in Chinese chess, on either side of which we call a point class A or class B. This decision boundary separates the two classes of things, and when it is linear it is called a separating hyperplane.

Maximal-margin hyperplane: there can be many separating hyperplanes, so how do we find the best one? SVM's answer is to find the one "most in the middle." In other words, the plane should stay as far as possible from both classes, leaving enough margin to reduce generalization error and ensure robustness; in Chinese terms, "holding to the middle." When a river is the boundary, the center line of the channel is the border, and that is the idea of the maximal-margin hyperplane. Mathematically, finding this maximal-margin hyperplane is a quadratic programming problem.

Soft margin: but nothing in the world is that tidy. In many cases the two classes are in a mixed state of "you have me, I have you," and it is unlikely that any plane can separate them perfectly. When the data are not linearly separable, a soft margin must be used: it allows individual samples to stray onto the other class's territory as exceptions. A parameter balances the two goals, keeping the separation margin as large as possible while keeping these exceptions from going too far off the mark. This parameter is the misclassification penalty C.

Kernel function: for data that cannot be perfectly separated in the original space, SVM offers another idea: map the original data into a higher-dimensional space. Intuitively, data become sparser in high dimensions, which makes it easier to "tell friend from foe." The mapping is done through a kernel function. If this "kernel trick" is chosen well, the data become easily linearly separable in the high-dimensional space; moreover, it can be shown that there always exists a kernel function that maps a dataset into separable high-dimensional data. But don't get too excited when you read this: mapping into high-dimensional space is not free. The danger of too many dimensions is overfitting.

Therefore, choosing an appropriate kernel function and the soft-margin parameter C is the key to training an SVM. In general, the more complex the kernel, the more prone the model is to overfitting. The parameter C can be seen as the inverse of lambda in the LASSO algorithm: the larger C is, the more the model tends to overfit; the smaller it is, the more it underfits. How do you choose in practice? By humanity's oldest method: trial and error.
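As a minimal sketch of this trial-and-error process, the snippet below compares a few candidate values of C by 10-fold cross-validation, using the same iris data that appears later in this article. It assumes the Statistics and Machine Learning Toolbox; in `fitcsvm`, C is set through the `'BoxConstraint'` name-value pair.

```matlab
% Compare candidate values of the misclassification penalty C
% by 10-fold cross-validation (illustrative values of C).
load fisheriris
inds = ~strcmp(species,'setosa');   % drop setosa, keep two classes
X = meas(inds,3:4);
y = species(inds);

for C = [0.01 1 100]
    Mdl = fitcsvm(X,y,'KernelFunction','rbf','BoxConstraint',C);
    CVMdl = crossval(Mdl);          % 10-fold partition by default
    fprintf('C = %6.2f  CV error = %.3f\n', C, kfoldLoss(CVMdl));
end
```

The value of C with the lowest cross-validation error is then a reasonable choice for the final model.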

Common kernel functions include the following:

linear: using it gives a linear support vector machine, roughly equivalent to logistic regression, but it handles very high-dimensional data well, such as in text mining.

polynomial: the polynomial kernel, suitable for problems such as image processing.

rbf: the radial basis (Gaussian) kernel, the most popular and easiest to use. Its parameter is sigma; if sigma is set too small, the model will overfit.

sigmoid: the hyperbolic tangent kernel; the same function is widely used as a neural network activation function.
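In MATLAB these kernels are selected through the `'KernelFunction'` option of `fitcsvm`, which supports `'linear'`, `'gaussian'` (alias `'rbf'`), and `'polynomial'` out of the box; a sigmoid kernel would require supplying a custom kernel function. A short sketch on the iris data:

```matlab
% Selecting a kernel via the 'KernelFunction' name-value pair.
load fisheriris
inds = ~strcmp(species,'setosa');
X = meas(inds,3:4);
y = species(inds);

linMdl  = fitcsvm(X,y,'KernelFunction','linear');
rbfMdl  = fitcsvm(X,y,'KernelFunction','rbf','KernelScale','auto');
polyMdl = fitcsvm(X,y,'KernelFunction','polynomial','PolynomialOrder',3);
```

For the RBF kernel, `'KernelScale'` plays the role of sigma discussed above; `'auto'` lets MATLAB pick a scale heuristically.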

Finally, explain "support vector" and "machine" respectively.

(1) "Machine": the classification machine, i.e. the classifier itself.

(2) "Support vectors": the points lying on the maximal margin are called support vectors, and the final classifier's expression involves only these support vectors.

%% Keep petal length and petal width, and remove the setosa class

load fisheriris

inds = ~strcmp(species,'setosa');

X = meas(inds,3:4);

y = species(inds);

%% Train an SVM classifier on the processed dataset

SVMModel = fitcsvm(X,y);

% Display the class names

disp(SVMModel.ClassNames)

%% Scatter plot of the data, with support vectors circled

sv = SVMModel.SupportVectors;

figure

gscatter(X(:,1),X(:,2),y)

hold on

plot(sv(:,1),sv(:,2),'ko','MarkerSize',10)

legend('versicolor','virginica','Support Vector')

hold off

The data used here are MATLAB's built-in Fisher iris sample.
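Once trained, the model can classify new observations with `predict`. The sketch below repeats the training from the example and then classifies two hypothetical flowers (the petal measurements are illustrative values, not from the dataset):

```matlab
% Train as in the example above, then classify new observations.
load fisheriris
inds = ~strcmp(species,'setosa');
X = meas(inds,3:4);
y = species(inds);
SVMModel = fitcsvm(X,y);

% Two new flowers: [petal length, petal width] in cm (illustrative)
[label,score] = predict(SVMModel,[4.7 1.4; 5.5 2.1]);
disp(label)   % predicted class for each row
disp(score)   % per-class classification scores
```

The sign and magnitude of `score` indicate on which side of the separating hyperplane each observation falls, and how far from it.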

That concludes this introduction to "how to use MATLAB's support vector machine SVM." Thank you for reading.
