Support Vector Machines (2)
SVMs are based on the guaranteed risk bounds of statistical learning theory, i.e., on the so-called structural risk minimization (SRM) principle.
They implement the function, from a given set, that best approximates the supervisor's response, with an expected risk bounded by the sum of the empirical risk and the Vapnik-Chervonenkis (VC) confidence.
The VC confidence is a term that bounds the generalization ability of the learning machine; it depends on the so-called VC dimension of the set of functions implemented by the machine.
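As a sketch, one standard form of this bound (following Vapnik's formulation, where $h$ is the VC dimension of the function set, $N$ the number of training examples, and the bound holds with probability at least $1-\eta$) is:

\[
R(f) \;\le\; R_{\mathrm{emp}}(f) \;+\; \sqrt{\frac{h\left(\ln\frac{2N}{h} + 1\right) - \ln\frac{\eta}{4}}{N}}
\]

The second term is the VC confidence: it grows with $h$ and shrinks with $N$, so SRM selects the function class that minimizes the sum of both terms rather than the empirical risk alone.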