NSVM

A MATLAB code for Single Versus Union: Non-parallel Support Vector Machine Frameworks. (Right-click [Code] and choose "Save" to download the complete MATLAB code.)


Reference

Chun-Na Li, Yuan-Hai Shao, Naihua Xiu, Huajun Wang, Yu-Ting Zhao, Ling-Wei Huang, and Nai-Yang Deng, "Single Versus Union: Non-parallel Support Vector Machine Frameworks," submitted, 2019.

Main Function

function [PredictY,obj,t] = NSVM_G(TestX,DataTrain,FunPara)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Ref: "Single Versus Union: Non-parallel Support Vector Machine
%       Frameworks"
%
% Usage: [PredictY] = NSVM_G(TestX,DataTrain,FunPara)
%
% Input:
%    TestX:     input test data matrix
%    DataTrain: DataTrain.X is the input training matrix, DataTrain.Y is
%               the training label vector
%    FunPara:   FunPara.C1 is the parameter C1, FunPara.C2 is the parameter
%               C2, FunPara.kerfPara is the kernel parameter, FunPara.L is L>0
%
% Output:
%    PredictY:  the predicted label of TestX
%
% Reference:
%    Chun-Na Li, Yuan-Hai Shao, Naihua Xiu, Huajun Wang, Yu-Ting Zhao,
%    Ling-Wei Huang and Nai-Yang Deng "Single Versus Union: Non-parallel
%    Support Vector Machine Frameworks" Submitted 2019
%
%    Version 1.0 --Oct/2019
%
%    Written by Chun-Na Li (na1013na@163.com)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
kerfPara = FunPara.kerfPara;
C1 = FunPara.C1;
C2 = FunPara.C2;
L = FunPara.L;
maxit = 20;        % max iteration number
eps = 10^(-5);     % breaking condition 10^-5

% Reorder the training data so that samples of each class are contiguous
X = (DataTrain.X)';
mmm1 = size(DataTrain.X,1);
mmm2 = size(TestX,1);
TestXback = TestX;
TestX = TestX';
classLabel = unique(DataTrain.Y);
nClass = length(classLabel);
Xnew = [];
for y = 1:nClass
    Index(y,:) = (DataTrain.Y == classLabel(y));
    nSmpClass(y+1) = size((X(:,Index(y,:))),2);
    endind = cumsum(nSmpClass);
    Xsubclass{y} = X(:,Index(y,:));
    Xnew(:,endind(length(nSmpClass)-1)+1:endind(length(nSmpClass))) = X(:,Index(y,:));
    Ysubclass{y} = y*ones(nSmpClass(y+1),1);
    Ynew(endind(length(nSmpClass)-1)+1:endind(length(nSmpClass)),:) = y*ones(nSmpClass(y+1),1);
end
clear X

% Kernel case: represent training and test data through the (reduced) kernel matrix
if kerfPara.type == 'rbf'
    if (mmm1+mmm2) >= 1000
        Xnew = Xnew';
        Xk = Xnew(crossvalind('Kfold',Xnew(:,1),10)==1,:);   % reduced kernel: keep about 10% of the samples
        Xk = Xk';
        Xnew = Xnew';
    else
        Xk = Xnew;
    end
    TestX = kernelfun(TestXback,kerfPara,Xk');
    TestX = TestX';
    Xnew = kernelfun(Xnew',kerfPara,Xk');
    Xnew = Xnew';
    [~,nSmp1] = size(Xk);
end

theta0 = 1;
theta1 = 1;
[nFea,nSmp] = size(Xnew);
if kerfPara.type == 'lin'
    A = [eye(nFea),zeros(nFea,1);1,zeros(1,nFea)];
    Xnew = [Xnew;ones(1,nSmp)];
    Wb_all = rand((nFea+1)*nClass,1);
else
    A = kernelfun(Xk',kerfPara,Xk');
    Wb_all = rand(nFea*nClass,1);
end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Algorithm
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Initializations
for t = 1:maxit
    if t == 1
        Wb{1} = Wb_all;    % Wb{1} represents w^{t-1} in Algorithm 1
        Wb{2} = Wb_all;    % Wb{2} represents w^t in Algorithm 1
    end
    if t == 2
        Wb{1} = Wb_all;
    end
    F = zeros(nSmp,nClass);    % Note: Fi=F(i,:) here is a row vector
    if kerfPara.type == 'lin'
        H = zeros(nClass*(nFea+1),nClass*(nFea+1));
    else
        H = zeros(nClass*nFea,nClass*nFea);
    end
    for i = 1:nSmp
        Bi = [];
        yi = Ynew(i);
        for y = 1:nClass
            if kerfPara.type == 'lin'
                Bi(y) = (((Wb{2}((nFea+1)*(yi-1)+1:(nFea+1)*yi))'*(Xnew(:,i)))^2 - ((Wb{2}((nFea+1)*(y-1)+1:(nFea+1)*y))'*Xnew(:,i))^2);
            else
                Bi(y) = ((Wb{2}((nFea*(yi-1)+1):nFea*yi))'*(Xnew(:,i)+ones(nFea,1))*(Xnew(:,i)+ones(nFea,1))'*Wb{2}((nFea*(yi-1)+1):nFea*yi))^2 - ...
                        ((Wb{2}((nFea*(y-1)+1):nFea*y))'*(Xnew(:,i)+ones(nFea,1))*(Xnew(:,i)+ones(nFea,1))'*Wb{2}((nFea*(y-1)+1):nFea*y))^2;
            end
        end
        [maxBi,ji] = max(Bi);
        if maxBi >= 0
            F(i,ji) = 1;
        end
        if kerfPara.type == 'lin'
            H((nFea+1)*(ji-1)+1:(nFea+1)*ji,(nFea+1)*(ji-1)+1:(nFea+1)*ji) = H((nFea+1)*(ji-1)+1:(nFea+1)*ji,(nFea+1)*(ji-1)+1:(nFea+1)*ji) + Xnew(:,i)*Xnew(:,i)';
        else
            H(nFea*(ji-1)+1:nFea*ji,nFea*(ji-1)+1:nFea*ji) = H(nFea*(ji-1)+1:nFea*ji,nFea*(ji-1)+1:nFea*ji) + (Xnew(:,i)+ones(nFea,1))*(Xnew(:,i)+ones(nFea,1))';
        end
    end
    H = C1*H;

    % Compute G
    D = C1 + C2;
    if kerfPara.type == 'lin'
        G = zeros(nClass*(nFea+1),nClass*(nFea+1));
    else
        G = zeros(nClass*nFea,nClass*nFea);
    end
    for y = 1:nClass
        if kerfPara.type == 'lin'
            Gy{y} = zeros(nFea+1,nFea+1);
        else
            Gy{y} = zeros(nFea,nFea);
        end
        for i = 1:nSmp
            if Ynew(i) == y
                if kerfPara.type == 'lin'
                    Gy{y} = Gy{y} + D*Xnew(:,i)*Xnew(:,i)';
                else
                    Gy{y} = Gy{y} + D*(Xnew(:,i)+ones(nFea,1))*(Xnew(:,i)+ones(nFea,1))';
                end
            end
        end
        Gy{y} = 0.5*A + Gy{y};
        if kerfPara.type == 'lin'
            G((nFea+1)*(y-1)+1:(nFea+1)*y,(nFea+1)*(y-1)+1:(nFea+1)*y) = Gy{y};
        else
            G(nFea*(y-1)+1:nFea*y,nFea*(y-1)+1:nFea*y) = Gy{y};
        end
    end
    [~,lab] = eig(G);
    lab = min(diag(lab));
    beta = min((theta0 - 1)/theta1,sqrt((2*lab)/(2*lab+L)));
    theta0 = theta1;
    theta1 = 0.5*(1+sqrt(1+4*theta0^2));

    % Main part of Algorithm 1
    if t == 1
        xi = 2*H*Wb{2};    % xi
        u = Wb{2} + beta*(Wb{2} - Wb{1});
        Wb_alltPlus1(:,t) = (0.5*L*eye(size(G)) + G)\(L*u + xi);    % Wb_alltPlus1 represents w^{t+1} in Algorithm 1
        Wb{2} = Wb_alltPlus1(:,t);
    elseif t == 2
        xi = 2*H*Wb{2};
        u = Wb{2} + beta*(Wb{2} - Wb{1});
        Wb_alltPlus1(:,t) = (0.5*L*eye(size(G)) + G)\(L*u + xi);
    else
        Wb{1} = Wb{2};
        Wb{2} = Wb_alltPlus1(:,t-1);
        xi = 2*H*Wb{2};
        u = Wb{2} + beta*(Wb{2} - Wb{1});
        Wb_alltPlus1(:,t) = (0.5*L*eye(size(G)) + G)\(L*u + xi);
    end
    if t>1 && norm(Wb_alltPlus1(:,t)-Wb{2}) < eps
        break;
    end
    obj(t) = Wb_alltPlus1(:,t)'*G*Wb_alltPlus1(:,t) - Wb_alltPlus1(:,t)'*H*Wb_alltPlus1(:,t);
end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Prediction and output
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
[~,ind_opt] = min(obj);
Wb_opt = Wb_alltPlus1(:,ind_opt);
m = size(TestX,2);
for y = 1:nClass
    if kerfPara.type == 'lin'
        temp = Wb_opt((nFea+1)*(y-1)+1:(nFea+1)*y);
        W_all{y} = temp(1:(length(temp)-1));
        b_all{y} = temp(length(temp));
        dis(:,y) = abs(TestX'*W_all{y} + b_all{y}*ones(m,1))/sqrt(W_all{y}'*W_all{y});
    else
        alpha{y} = Wb_opt((nFea*(y-1)+1):nFea*y);
        dis(:,y) = abs((TestX+ones(nFea,1))'*alpha{y})/sqrt(alpha{y}'*(A+ones(nSmp1,nSmp1))*alpha{y});
    end
end
[~,PredictY] = min(dis');
PredictY = PredictY';
end
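Example

For reference, below is a minimal usage sketch of NSVM_G on a small synthetic three-class problem with the linear model. The parameter values (C1, C2, L) and the synthetic data are illustrative placeholders, not recommended settings. For the kernel model ('rbf'), the downloaded package (including its kernelfun helper and the Statistics Toolbox function crossvalind) must be on the MATLAB path, and kerfPara additionally carries the kernel parameter expected by that helper; the exact field name depends on the helper.

% Minimal usage sketch (illustrative values, not tuned settings)
rng(1);                                          % make the random initialization reproducible
X1 = randn(30,2) + 2;                            % three synthetic classes in 2-D
X2 = randn(30,2) - 2;
X3 = [randn(30,1)+2, randn(30,1)-2];
DataTrain.X = [X1; X2; X3];                      % rows are training samples
DataTrain.Y = [ones(30,1); 2*ones(30,1); 3*ones(30,1)];

TestX = [randn(5,2) + 2; randn(5,2) - 2];        % test samples, same column layout as DataTrain.X

FunPara.C1 = 1;                                  % trade-off parameters C1 and C2
FunPara.C2 = 1;
FunPara.L  = 1;                                  % L > 0 in Algorithm 1
FunPara.kerfPara.type = 'lin';                   % 'lin' for the linear model, 'rbf' for the kernel model

[PredictY,obj,t] = NSVM_G(TestX,DataTrain,FunPara);
disp(PredictY');                                 % predicted labels of TestX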
Contacts


For any questions or suggestions, please email na1013na@163.com or shaoyuanhai21@163.com.