## Abstract

With advances in information technology, datasets containing enormous amounts of data have become available. The classification task on these datasets grows more time- and memory-consuming as the number of data points increases. The support vector machine (SVM), arguably the most popular classification technique, performs poorly on large datasets because of its constrained optimization problem. To address this challenge, the variant SVM (VSVM) has been used, which includes the term (1/2)b² in its primal objective function, where b is the bias of the desired hyperplane. The VSVM can be solved with different optimization techniques in a more time- and memory-efficient fashion. However, there is no guarantee that its optimal solution coincides with that of the standard SVM. In this paper, we introduce the generalized VSVM (GVSVM), whose primal objective function contains the term (1/(2t))b², for a fixed positive scalar t. We then present thorough theoretical insights showing that the optimal solution of the GVSVM tends to the optimal solution of the standard SVM as t → ∞. One vital corollary is a closed-form formula for the bias term of the standard SVM. Such a formula obviates the need to approximate it, which has been the modus operandi to date. An efficient neural network is then proposed to solve the GVSVM dual problem; it is asymptotically stable in the sense of Lyapunov and converges globally and exponentially to the exact solution of the GVSVM. The proposed neural network has a simpler architecture and requires fewer computations per iteration than the existing neural solutions. Experiments confirm the efficacy of the proposed recurrent neural network and the proximity of the GVSVM and standard SVM solutions for larger values of t.
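The GVSVM primal described above differs from the standard SVM only in the added bias penalty (1/(2t))b². A minimal sketch of that objective, using a hypothetical subgradient solver on toy data (the paper itself solves the dual with a recurrent neural network, not this method):

```python
import numpy as np

def gvsvm_subgradient(X, y, C=1.0, t=1e4, lr=0.01, epochs=2000):
    """Minimise (1/2)||w||^2 + (1/(2t)) b^2 + C * sum(max(0, 1 - y(w.x + b))).

    Illustrative subgradient descent on the GVSVM primal; as t grows,
    the bias penalty b^2/(2t) vanishes and the objective approaches
    the standard soft-margin SVM primal.
    """
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1                       # margin-violating points
        gw = w - C * (y[viol, None] * X[viol]).sum(axis=0)
        gb = b / t - C * y[viol].sum()           # b/t comes from (1/(2t)) b^2
        w -= lr * gw
        b -= lr * gb
    return w, b

# Toy separable data: class +1 near (2, 2), class -1 near (-2, -2).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(2, 0.3, (20, 2)), rng.normal(-2, 0.3, (20, 2))])
y = np.hstack([np.ones(20), -np.ones(20)])

w, b = gvsvm_subgradient(X, y)
```

Setting t to a large value makes the extra gradient term b/t negligible, which is the intuition behind the paper's result that the GVSVM solution tends to the standard SVM solution as t → ∞.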

| Original language | English |
| --- | --- |
| Article number | 8730505 |
| Pages (from-to) | 2798-2809 |
| Number of pages | 12 |
| Journal | IEEE Transactions on Systems, Man, and Cybernetics: Systems |
| Volume | 51 |
| Issue number | 5 |
| DOIs | |
| Publication status | Published - May 2021 |

## Keywords

- Convex programming
- exponential convergence
- generalized VSVM (GVSVM)
- recurrent neural network (RNN)
- support vector machine (SVM)