
Local Approaches for Fast, Scalable and Accurate Learning with Kernels / Segata, Nicola. - (2009), pp. 1-174.

Local Approaches for Fast, Scalable and Accurate Learning with Kernels

Segata, Nicola
2009-01-01

Abstract

This thesis addresses the fundamental machine learning problems of increasing the accuracy of learning systems and improving their computational performance. The key concept exploited throughout the thesis is the tunable trade-off between local and global approaches to learning, integrating the effective setting of Instance-Based Learning with the sound foundations of Statistical Learning Theory. The thesis makes four main contributions in this context: (i) a theoretical analysis and empirical evaluation of the Local SVM approach, (ii) a family of operators on kernels that yield Quasi-Local kernels, (iii) the framework of Local Kernel Machines, and (iv) a local maximal-margin approach to noise reduction. In our analysis of Local SVM, we derive a new learning bound starting from the theory of Local Learning Algorithms, and we show that Local SVM achieves statistically significantly higher classification accuracy than SVM in a number of scenarios. The novel family of operators integrates local feature-space information into any kernel to obtain Quasi-Local kernels, mixing the effect of the input kernel with a kernel that is local in the feature space of the input one. With Local Kernel Machines we show that locality can be exploited to obtain fast and scalable kernel machines, whereas existing fast approximate SVM solvers smooth the decision function globally. Fast Local Kernel SVM (FaLK-SVM) trains a set of local SVMs on redundant neighbourhoods of the training set, selecting at testing time the most appropriate model for each query point. Theoretically supported by a recent result relating consistency and localizability, our approach decomposes the separation function into the solutions of local optimization problems that can be handled very efficiently. For this novel approach, we derive a fast local model selection strategy, theoretical learning bounds and favourable complexity bounds.
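The local prediction scheme can be illustrated with a minimal sketch, assuming scikit-learn's `SVC` and `NearestNeighbors` as dependencies: one SVM is trained on the k-neighbourhood of every training point, and each query is answered by the model centred on its nearest training point. This is an illustrative simplification, not the thesis's FaLK-SVM implementation, which avoids training one model per point by covering the training set with fewer, redundant neighbourhoods.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC


class LocalSVMSketch:
    """Train one SVM on the k-neighbourhood of every training point;
    answer each query with the model centred on its nearest training point."""

    def __init__(self, k=20, **svm_params):
        self.k = k
        self.svm_params = svm_params

    def fit(self, X, y):
        self.X_, self.y_ = np.asarray(X), np.asarray(y)
        self.nn_ = NearestNeighbors(n_neighbors=self.k).fit(self.X_)
        self.models_ = []
        for i in range(len(self.X_)):
            # Indices of the k nearest neighbours (the point itself included).
            idx = self.nn_.kneighbors(self.X_[i:i + 1], return_distance=False)[0]
            yi = self.y_[idx]
            if len(np.unique(yi)) == 1:
                self.models_.append(yi[0])  # single-class neighbourhood: constant model
            else:
                self.models_.append(SVC(**self.svm_params).fit(self.X_[idx], yi))
        return self

    def predict(self, X):
        X = np.asarray(X)
        # For each query, pick the local model centred on its nearest training point.
        centres = self.nn_.kneighbors(X, n_neighbors=1, return_distance=False)[:, 0]
        preds = []
        for x, c in zip(X, centres):
            m = self.models_[c]
            preds.append(m.predict(x[None, :])[0] if hasattr(m, "predict") else m)
        return np.array(preds)
```

The redundancy is visible here: each training point anchors its own neighbourhood, so neighbouring models largely overlap, which is exactly what the thesis's covering strategy exploits to reduce the number of local problems.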
Local Kernel Machines can also be applied to detecting and removing noisy examples from datasets, both to enhance the generalization ability of Instance-Based Learning and for data cleansing. The local maximal-margin principle provides a more robust alternative to the majority rule on which almost all existing noise reduction techniques are based, and a scalable version of the approach extends the feasibility of noise reduction to large datasets such as genomic-scale biological data. Extensive evaluations of the proposed techniques, carried out on more than 100 datasets with up to 3 million examples, show with statistical significance that Quasi-Local kernels are more accurate than the corresponding input kernels when used with SVM, and that Local Kernel Machines can improve the generalization ability of exact and approximate SVM solvers and of traditional noise reduction techniques, with much faster training and testing times and better scalability.
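The local maximal-margin editing rule can be sketched as follows, again assuming scikit-learn as a dependency: a point is flagged as noisy when the maximal-margin classifier trained on its k nearest neighbours (the point itself excluded) disagrees with its label, in place of the k-NN majority vote of classical editing techniques. This is an illustrative sketch, not the thesis algorithm.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC


def local_margin_noise_mask(X, y, k=15, **svm_params):
    """Flag X[i] as noisy when the SVM trained on its k nearest
    neighbours (X[i] excluded) predicts a label different from y[i]."""
    X, y = np.asarray(X), np.asarray(y)
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    noisy = np.zeros(len(X), dtype=bool)
    for i in range(len(X)):
        idx = nn.kneighbors(X[i:i + 1], return_distance=False)[0]
        idx = idx[idx != i][:k]  # neighbourhood without the point itself
        yi = y[idx]
        if len(np.unique(yi)) == 1:
            pred = yi[0]  # unanimous neighbourhood: no SVM needed
        else:
            pred = SVC(**svm_params).fit(X[idx], yi).predict(X[i:i + 1])[0]
        noisy[i] = pred != y[i]
    return noisy
```

Unlike a majority vote, the local maximal-margin decision weighs where the point falls relative to the neighbourhood's separating surface, which is what makes the rule more robust near class boundaries.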
2009
XXII
2009-2010
Ingegneria e Scienza dell'Informaz (cess.4/11/12)
Information and Communication Technology
Blanzieri, Enrico
no
English
Settore INF/01 - Informatica
Files in this record:

PhD-Thesis.pdf (open access)
Type: Doctoral Thesis (Tesi di dottorato)
Licence: All rights reserved
Size: 4.43 MB
Format: Adobe PDF

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/369130