Applications of Functional Analysis in Machine Learning

Postgraduate Thesis uoadl:1317477

Unit:
Specialization in Applied Mathematics
Library of the School of Science
Deposit date:
2012-07-20
Year:
2012
Author:
Μητσάκος Νικόλαος
Supervisors info:
Δάλλα Λεώνη, Assoc. Prof.
Original Title:
Εφαρμογές της Συναρτησιακής Ανάλυσης στη Μηχανική Μάθηση
Languages:
Greek
Summary:
The thesis presents applications of Reproducing Kernel Hilbert Space (RKHS)
theory to two problems of Machine Learning: classification using Support
Vector Machines (SVMs) and the construction of an Adaptive Learning algorithm,
based on the LMS algorithm, for dealing with non-linear learning problems.
The First Chapter begins with a presentation of the basic definitions of RKHS
theory, including several examples such as the Hardy space of the unit disk,
Sobolev spaces on [0,1], Bergman spaces on domains of the complex plane,
weighted Hardy spaces, as well as multivariable examples. Next, the
complexification of such spaces is explained, followed by the general theory.
The characterization of reproducing kernel functions comes next, and the
chapter concludes with a description of the Kernel Trick.
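As an illustration of the reproducing-kernel machinery summarized above, the following minimal sketch (not taken from the thesis; the Gaussian kernel and the bandwidth `sigma` are assumptions) builds a Gram matrix and checks the positive semi-definiteness that characterizes a valid reproducing kernel:

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    return np.exp(-np.linalg.norm(x - y) ** 2 / (2 * sigma ** 2))

def gram_matrix(X, kernel):
    """Gram matrix K[i, j] = k(x_i, x_j); symmetric and positive
    semi-definite whenever `kernel` is a valid reproducing kernel."""
    n = len(X)
    return np.array([[kernel(X[i], X[j]) for j in range(n)] for i in range(n)])

X = np.array([[0.0], [1.0], [2.0]])
K = gram_matrix(X, gaussian_kernel)
eigvals = np.linalg.eigvalsh(K)  # all non-negative up to rounding error
```

The kernel trick rests on exactly this object: any algorithm that touches the data only through inner products can replace them with entries of `K`.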
The Second Chapter deals with the solution of supervised-learning
classification problems using Support Vector Machines (SVMs). The usual route
is followed: starting from the simplified case of linearly separable data and
passing through the dual formulation of the SVM, we arrive at the 1-Norm
Soft-Margin SVM and its dual. A description follows of how the Kernel Trick is
applied in this setting, including guidance mainly on selecting an appropriate
kernel function for a given problem. At the end of each section the
corresponding algorithms are also formulated.
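The dual route described above can be sketched in a few lines. The toy implementation below (not the thesis's formulation) uses naive projected gradient ascent on the 1-Norm Soft-Margin dual, and absorbs the bias by adding 1 to the linear kernel so that the equality constraint sum(alpha_i * y_i) = 0 can be dropped; step size and epoch count are arbitrary assumptions:

```python
import numpy as np

def svm_dual_train(X, y, C=1.0, lr=0.01, epochs=500):
    """Projected-gradient ascent on the 1-Norm Soft-Margin SVM dual:
    maximize sum(alpha) - 0.5 * alpha' Q alpha  s.t.  0 <= alpha <= C,
    where Q[i, j] = y_i y_j k(x_i, x_j). The bias is absorbed into the
    kernel (x.x' + 1), a common simplification."""
    K = X @ X.T + 1.0                       # linear kernel plus implicit bias
    Q = (y[:, None] * y[None, :]) * K
    alpha = np.zeros(len(X))
    for _ in range(epochs):
        grad = 1.0 - Q @ alpha              # gradient of the dual objective
        alpha = np.clip(alpha + lr * grad, 0.0, C)  # project onto the box
    return alpha

def svm_predict(alpha, X_train, y_train, X_test):
    K = X_test @ X_train.T + 1.0
    return np.sign(K @ (alpha * y_train))

# toy linearly separable data
X = np.array([[2.0], [3.0], [-2.0], [-3.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
alpha = svm_dual_train(X, y)
preds = svm_predict(alpha, X, y, X)
```

Replacing `X @ X.T` with any PSD kernel matrix gives the kernelized classifier without touching the optimization code, which is the point of the kernel trick in this chapter.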
The Third Chapter opens with a description of Learning Problems and the Least
Mean Squares (LMS) algorithm used to address them. Next, the kernel LMS (KLMS)
algorithm is described, which results from applying the Kernel Trick to LMS,
together with several techniques for sparsifying the solution: Platt's novelty
criterion, the Coherence-Based Sparsification Strategy, the Surprise Criterion
and, last but not least, the Quantization technique that leads to the
Quantized Kernel LMS. (To be precise, a normalized version of each of the
above algorithms is used.) The chapter closes with three simulations: a
Nonlinear Channel Equalization, a Chaotic Time Series Prediction and a
prediction task on real data.
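The quantization idea can be sketched as follows. The code below is an illustrative, simplified (unnormalized, unlike the thesis's versions) Quantized KLMS with a Gaussian kernel; the step size `eta` and quantization radius `delta` are arbitrary assumptions:

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    return np.exp(-np.linalg.norm(np.asarray(x) - np.asarray(y)) ** 2
                  / (2 * sigma ** 2))

def qklms(inputs, targets, eta=0.5, delta=0.3, sigma=1.0):
    """Quantized Kernel LMS sketch: instead of adding every input as a new
    kernel center (plain KLMS), an input within distance `delta` of an
    existing center merges its update into that center's coefficient,
    keeping the dictionary sparse."""
    centers, coeffs = [], []
    for x, d in zip(inputs, targets):
        y_hat = sum(a * gaussian_kernel(c, x, sigma)
                    for c, a in zip(centers, coeffs))
        e = d - y_hat                         # instantaneous error
        if centers:
            dists = [abs(float(c) - float(x)) for c in centers]
            j = int(np.argmin(dists))
            if dists[j] <= delta:
                coeffs[j] += eta * e          # quantize: reuse nearest center
                continue
        centers.append(x)                     # novel input: grow dictionary
        coeffs.append(eta * e)
    return centers, coeffs

# online learning of y = sin(x) from 200 streaming samples
rng = np.random.default_rng(0)
xs = rng.uniform(-3, 3, 200)
centers, coeffs = qklms(xs, np.sin(xs))
```

With `delta = 0.3` over the interval [-3, 3], the dictionary stays far smaller than the 200 centers plain KLMS would store, which is the sparsification effect the chapter's criteria all aim at.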
Keywords:
Functional Analysis, Machine Learning, Kernel
Index:
No
Number of index pages:
0
Contains images:
Yes
Number of references:
12
Number of pages:
74
document.pdf (3 MB)