Abstract
The main contribution of this dissertation is a method for training a Support Vector Regression (SVR) model in the large-scale case, where the number of training samples exceeds the available computational resources. The proposed scheme poses the SVR problem entirely as a Linear Programming (LP) problem and develops a sequential optimization method based on variable decomposition, constraint decomposition, and primal-dual interior point methods. Experimental results demonstrate that the proposed approach performs comparably to other SV-based classifiers. In particular, the experiments show that as the problem size increases, the solution becomes sparser and greater computational efficiency is gained relative to other methods. To reduce the LP-SVR training time, a method is developed that exploits the fact that the support vectors (SVs) are likely to lie on the convex hull of each class. The algorithm uses the Mahalanobis distance from the class sample mean to rank each sample in the training set; the samples with the largest distances are then used as part of the initial working set. Experimental results show a reduction in the total training time as well as a significant decrease in the total number of iterations. The results also suggest that with this speedup strategy the SVs are found earlier in the learning process. In addition, this research introduces a method for finding the set of LP-SVR hyper-parameters; experimental results show that the algorithm yields hyper-parameters that minimize an estimate of the true generalization error on test data. Finally, the SVR scheme achieves state-of-the-art performance in applications such as power load forecasting, texture-based image segmentation, and classification of remotely sensed imagery.
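The working-set speedup described above can be sketched in a few lines. This is a minimal illustration, not the dissertation's implementation: the function names (`rank_by_mahalanobis`, `initial_working_set`) and the fraction of samples kept per class are assumptions made here for the example.

```python
import numpy as np

def rank_by_mahalanobis(X):
    # Rank the samples of one class by Mahalanobis distance from
    # the class sample mean, farthest first (hypothetical helper).
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    inv_cov = np.linalg.pinv(cov)          # pseudo-inverse for numerical stability
    diff = X - mu
    d2 = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)  # squared distances
    return np.argsort(-d2)                 # indices, largest distance first

def initial_working_set(X, y, frac=0.1):
    # Seed the working set with the top-`frac` farthest samples of each
    # class, i.e., those most likely to lie on the class convex hull.
    idx = []
    for c in np.unique(y):
        cls = np.where(y == c)[0]
        order = rank_by_mahalanobis(X[cls])
        k = max(1, int(frac * len(cls)))
        idx.extend(cls[order[:k]])
    return np.array(idx)
```

The returned indices would then initialize the decomposition solver's first working set, with the remaining samples brought in over subsequent iterations.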
This demonstrates that the proposed learning scheme and the LP-SVR model are robust and efficient compared with other methodologies for large-scale problems.

Buy my dissertation here
But if it seems too expensive ($37 for the PDF, $59 for the soft-cover), you can contact me for a free copy; i.e., if you want to torture yourself with 335 pages, just let me know.