
Math Behind Support Vector Machines
Simple to use yet efficient and reliable, support vector machines (SVMs) are supervised learning methods widely used for classification tasks. This course uncovers the math behind SVMs, focusing on how an optimum SVM hyperplane for classification is computed.
Explore how data are represented in a feature space and how a hyperplane can be found to separate the data linearly. Then, learn techniques for classifying data that are not linearly separable. Investigate the optimization problem for SVM classifiers, looking at how the model's weights are adjusted during training to obtain the best hyperplane separating the data points. Finally, apply gradient descent to solve this optimization problem, as sketched below.
When you're done, you'll have the foundational knowledge you need to start building and applying SVMs for machine learning.
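
To make that last step concrete, here is a minimal sketch, in Python with NumPy, of solving the soft-margin SVM objective by full-batch subgradient descent on the hinge loss. This is an illustrative outline rather than the course's own code; the regularization constant C, the learning rate, and the epoch count are arbitrary assumed values.

    import numpy as np

    def train_linear_svm(X, y, C=1.0, lr=0.01, epochs=1000):
        # Minimize (1/2)||w||^2 + C * sum_i max(0, 1 - y_i * (w.x_i + b))
        # by full-batch subgradient descent; labels must be in {-1, +1}.
        n_samples, n_features = X.shape
        w = np.zeros(n_features)
        b = 0.0
        for _ in range(epochs):
            margins = y * (X @ w + b)   # functional margin of every point
            violators = margins < 1     # inside the margin or misclassified
            grad_w = w - C * (y[violators, None] * X[violators]).sum(axis=0)
            grad_b = -C * y[violators].sum()
            w -= lr * grad_w
            b -= lr * grad_b
        return w, b

    # Toy usage: two well-separated Gaussian blobs.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2.0, 0.5, (20, 2)),
                   rng.normal(2.0, 0.5, (20, 2))])
    y = np.array([-1] * 20 + [1] * 20)
    w, b = train_linear_svm(X, y)
    print("training accuracy:", np.mean(np.sign(X @ w + b) == y))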

Objectives

Support Vector Machine (SVM) Math: A Conceptual Look at Support Vector Machines

  • discover the key concepts covered in this course
  • recognize the place of support vector machines (SVMs) in the machine learning landscape
  • outline how SVMs can be used to classify data, how hyperplanes are defined, and the qualities of an optimum hyperplane
  • recall the qualities of an optimum hyperplane, outline how scaling works with SVM, distinguish soft and hard margins, and recognize when and how to use either margin
  • recall the techniques that can be applied to classify data that are not linearly separable
  • formulate the optimization problem for support vector machines (the standard formulation is sketched after this list)
  • apply the gradient descent algorithm to solve for the optimum hyperplane
  • summarize the key concepts covered in this course
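
As a point of reference for the optimization objectives above, the standard soft-margin SVM optimization problem is conventionally written as follows (a textbook formulation, not quoted from the course materials):

    \min_{w,\,b,\,\xi}\ \frac{1}{2}\lVert w \rVert^2 + C \sum_{i=1}^{n} \xi_i
    \quad \text{subject to} \quad y_i\,(w \cdot x_i + b) \ge 1 - \xi_i,
    \qquad \xi_i \ge 0,\ \ i = 1, \dots, n

Here the slack variables \xi_i permit margin violations and C trades off margin width against training error; forcing every \xi_i = 0 recovers the hard-margin case, and the gradient descent sketch above minimizes the equivalent unconstrained hinge-loss form of this problem.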