Statistical learning theory derives necessary and sufficient conditions for the consistency and fast rate of convergence of the empirical risk minimization (ERM) principle, which underlies most traditional learning algorithms. It also provides the theoretical foundation for support vector algorithms: the support vector learning algorithm is based on the structural risk minimization (SRM) principle.
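For concreteness, the two principles can be stated in their standard textbook form (the notation below follows the usual presentation in statistical learning theory; the confidence term is the common VC bound for 0/1-type loss):

% Expected (true) risk over the unknown distribution P(x, y)
\[ R(w) = \int L\bigl(y, f(x, w)\bigr)\, \mathrm{d}P(x, y) \]

% ERM principle: minimize the empirical risk over the n training samples
% as a surrogate for the unknown expected risk
\[ R_{\mathrm{emp}}(w) = \frac{1}{n} \sum_{i=1}^{n} L\bigl(y_i, f(x_i, w)\bigr) \]

% SRM principle: with probability at least 1 - \eta, the expected risk is
% bounded by the empirical risk plus a confidence term that grows with the
% VC dimension h of the hypothesis class; SRM minimizes this bound rather
% than the empirical risk alone
\[ R(w) \le R_{\mathrm{emp}}(w)
   + \sqrt{\frac{h\bigl(\ln(2n/h) + 1\bigr) - \ln(\eta/4)}{n}} \]

Minimizing the right-hand side trades off training error against the capacity of the function class, which is the rationale behind the support vector learning algorithm.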