Research Article Open Access

Escalating SVM-Efficiency by Sequential QP LP and Slack Variable Analysis

Amit Kumar Kundu1, Rezaul Karim1 and Ali Ahmed Ave1
  • 1 Department of Electrical and Electronic Engineering, Uttara University, Dhaka, Bangladesh


The Support Vector Machine (SVM) is a highly attractive algorithm among machine learning models owing to its generalization power and classification performance, both grounded in a sound convex mathematical formulation that guarantees a global minimum. However, despite its sparsity, the high classification cost of evaluating the kernel against all Support Vectors (SVs) limits its appeal when the application imposes hard computational constraints, especially for large and difficult data. To the best of our knowledge, among the many existing works that address this problem, some are interesting but heavyweight and become less attractive owing to training difficulties such as excessive cost and memory requirements, initialization issues, and parameter-selection trouble caused by the non-convexity of their formulations, while the few that avoid these problems cannot reduce the sparsity and complexity of the final discriminator to a satisfactory level for very large and tricky data. In this direction, we propose a novel algorithm, Efficiency Escalated SVM (EESVM), that solves two convex problems in sequence using Quadratic Programming (QP) and Linear Programming (LP), followed by a computational analysis of the remaining smallest set of slack variables. Together these steps endow the machine with two essential properties: (i) high efficiency, through heavy sparsity and optimal complexity, and (ii) the ability to handle very large, noise-affected, complicated data. Benchmarking shows that EESVM requires as little as 6.8% of the kernel computation of the standard QP SVM while achieving almost the same classification accuracy on test data, and 42.7, 27.7, and 46.6% of that of three other implemented state-of-the-art heavily sparse machines at similar classification accuracy.
It attains the lowest Machine Accuracy Cost (MAC) value among all of these machines while showing very similar generalization performance, evaluated numerically using the Generalization Failure Rate (GFR). Being quite pragmatic for modern technological advancement, it is well suited to the optimal handling of troublesome massive and difficult data.
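The abstract outlines a two-stage pipeline: a convex QP stage (the standard SVM dual) followed by a convex LP stage that sparsifies the kernel expansion. The sketch below is only an illustrative reconstruction of that QP-then-LP idea, not the authors' EESVM: it trains a standard QP SVM (via scikit-learn's `SVC`), then refits an L1-penalized hinge-loss kernel expansion over the QP support vectors as a linear program, keeping only the expansion vectors with nonzero weight. The dataset, kernel width, and regularization constant are arbitrary choices for the demonstration.

```python
import numpy as np
from scipy.optimize import linprog
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y01 = make_moons(n_samples=200, noise=0.2, random_state=0)
y = 2 * y01 - 1  # labels in {-1, +1}

gamma = 1.0
def rbf(A, B):
    # RBF kernel matrix between the rows of A and the rows of B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Stage 1: standard QP SVM (sklearn solves the dual QP internally)
qp = SVC(kernel="rbf", gamma=gamma, C=10.0).fit(X, y)
sv = qp.support_vectors_          # support vectors kept by the QP stage

# Stage 2: LP sparsification -- L1-penalized hinge-loss refit of the
# kernel expansion over the QP support vectors, with beta = bp - bm
K = rbf(X, sv)                    # (m, n) kernel design matrix
m, n = K.shape
C = 10.0
# variables: [bp (n), bm (n), b_pos, b_neg, xi (m)], all >= 0
c = np.concatenate([np.ones(2 * n), [0.0, 0.0], C * np.ones(m)])
# constraint y_j (K_j @ beta + b) >= 1 - xi_j, rewritten as A_ub x <= b_ub
A_ub = np.hstack([-y[:, None] * K, y[:, None] * K,
                  -y[:, None], y[:, None], -np.eye(m)])
b_ub = -np.ones(m)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
beta = res.x[:n] - res.x[n:2 * n]
bias = res.x[2 * n] - res.x[2 * n + 1]

keep = np.abs(beta) > 1e-6        # expansion vectors surviving the LP stage
pred = np.sign(rbf(X, sv[keep]) @ beta[keep] + bias)
acc = (pred == y).mean()
print(f"QP SVs: {len(sv)}, LP-kept: {keep.sum()}, train acc: {acc:.3f}")
```

Because the L1 penalty drives most expansion coefficients exactly to zero, the LP stage typically retains far fewer kernel centers than the QP stage, which is the sparsity-versus-accuracy trade the abstract quantifies with its kernel-computation percentages.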

Journal of Computer Science
Volume 19 No. 10, 2023, 1253-1262


Submitted On: 16 March 2023 Published On: 6 September 2023

How to Cite: Kundu, A. K., Karim, R. & Ave, A. A. (2023). Escalating SVM-Efficiency by Sequential QP LP and Slack Variable Analysis. Journal of Computer Science, 19(10), 1253-1262.



  • SVM
  • Generalization
  • Sparse
  • Classification
  • Non-Convex