

How to Implement a One-Class Support Vector Machine in MATLAB

2025-01-15 Update From: SLTechnology News&Howtos


Shulou(Shulou.com)06/01 Report--

This article introduces the relevant knowledge of "how to implement a one-class support vector machine". The editor walks you through the process with an actual MATLAB example; the method is simple, fast, and practical. I hope this article helps you solve your problem.

One-class problems typically arise when the training samples must be screened at a certain proportion, or when the known training samples are all positive and negative samples are very scarce.

In such cases, the goal is to train a compact classification boundary around the training samples; anything that falls outside that boundary can then be treated as a negative sample. A simple practical example: when a factory inspects products for quality, the parameters of qualified products are usually well known, while the parameters of unqualified products either vary too widely or are barely known at all. Here, a one-class classifier can be trained on the known qualified-product parameters to obtain a compact classification boundary, and anything beyond that boundary is considered an unqualified product.
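The factory-inspection idea above can be sketched in Python with scikit-learn's OneClassSVM (an assumed equivalent of the MATLAB workflow used later in this article, not the author's own code). The synthetic "qualified product" measurements and the two test points are made up for illustration:

```python
# Hypothetical QC sketch: train a one-class SVM on measurements of
# known-good products, then flag parts outside the learned boundary.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# Assumed synthetic data: two measured parameters of qualified products,
# clustered around (10.0, 5.0).
good_parts = rng.normal(loc=[10.0, 5.0], scale=[0.2, 0.1], size=(200, 2))

# nu bounds the fraction of training points allowed outside the boundary.
clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05)
clf.fit(good_parts)

# +1 means "inside the boundary" (accept), -1 means "outside" (reject).
labels = clf.predict([[10.0, 5.0],   # near the training distribution
                      [12.0, 7.0]])  # far outside it
print(labels)
```

A point near the center of the training cloud is accepted, while a point many standard deviations away is rejected, even though no "unqualified" examples were ever seen during training.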

% Load the Fisher iris dataset, keep only sepal length and width,
% and treat all irises as a single class.
load fisheriris
X = meas(:,1:2);
y = ones(size(X,1),1);

% Train an SVM classifier on the processed data,
% assuming that 5% of the observations are outliers.
rng(1); % for reproducibility
SVMModel = fitcsvm(X,y,'KernelScale','auto','Standardize',true, ...
    'OutlierFraction',0.05);

% SVMModel is the trained classifier. By default, the software uses
% one-class learning based on the Gaussian kernel.

% Plot the observations and the detection boundary; mark the support
% vectors and possible outliers.
svInd = SVMModel.IsSupportVector;
h = 0.02; % grid step
[X1,X2] = meshgrid(min(X(:,1)):h:max(X(:,1)), ...
    min(X(:,2)):h:max(X(:,2)));
[~,score] = predict(SVMModel,[X1(:),X2(:)]);
scoreGrid = reshape(score,size(X1,1),size(X2,2));

figure
plot(X(:,1),X(:,2),'k.')
hold on
plot(X(svInd,1),X(svInd,2),'ro','MarkerSize',10)
contour(X1,X2,scoreGrid)
colorbar
title('{\bf Iris Outlier Detection via One-Class SVM}')
xlabel('Sepal Length (cm)')
ylabel('Sepal Width (cm)')
legend('Observation','Support Vector')
hold off

% In the plot, the zero contour separates the outliers from the rest
% of the data.

% Cross-validate: the proportion of observations with negative scores
% should be close to 5%.
CVSVMModel = crossval(SVMModel);
[~,scorePred] = kfoldPredict(CVSVMModel);
outlierRate = mean(scorePred < 0)
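The same consistency check can be sketched in Python with scikit-learn's OneClassSVM, whose `nu` parameter plays a role analogous to `'OutlierFraction'` in `fitcsvm` (this is an assumed equivalent, not the MATLAB code itself, and the Gaussian sample data stands in for the iris measurements):

```python
# Sketch: verify that the fraction of points scored as outliers is
# close to the requested 5%.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 2))  # assumed stand-in for the measurements

# nu upper-bounds the fraction of training errors, analogous to
# 'OutlierFraction' = 0.05 in the MATLAB example above.
clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(X)

# A negative decision score means the point lies outside the boundary.
scores = clf.decision_function(X)
outlier_rate = float(np.mean(scores < 0))
print(f"fraction of negative scores: {outlier_rate:.3f}")  # typically near 0.05
```

As with the cross-validated MATLAB model, the observed rate will not be exactly 5%, but it should be close, which is a quick sanity check that the boundary is as tight as requested.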

