Publication Activity
Record type:
proceedings paper (D)
Home department:
Institute for Research and Applications of Fuzzy Modeling (94410)
Title:
Generalized Eigenvector Problem as a Foundation for Dimensionality Reduction
Citation:
JANEČEK, J. and Perfiljeva, I. Generalized Eigenvector Problem as a Foundation for Dimensionality Reduction.
In:
ISCAMI 2018: Proceedings of the 19th International Student Conference on Applied Mathematics and Informatics, 2018-05-10, Malenovice.
Ostrava: University of Ostrava, 2018. pp. 35-36. ISBN 978-80-7464-112-1.
Subtitle:
Year of publication:
2018
Field:
Informatics
Number of pages:
2
Page from:
35
Page to:
36
Form of publication:
Print version
ISBN:
978-80-7464-112-1
ISSN:
Proceedings title:
Proceedings of the 19th International Student Conference on Applied Mathematics and Informatics
Proceedings:
International
Publisher:
University of Ostrava
Place of publication:
Ostrava
Country of publication:
Proceedings published in the Czech Republic
Conference name:
ISCAMI 2018
Conference venue:
Malenovice
Conference start date:
Event type by nationality of participants:
Worldwide event
UT WoS code:
EID:
Keywords in English:
dimensionality reduction, weighted graph, Laplacian matrix, eigenvalue problem
Description in the original language:
The focus of our research is the dimensionality reduction problem, which can be described as follows: the input data are characterized by a large number of parameters, and the aim is to find a map that transforms the data into a form described by substantially fewer parameters while keeping the resulting loss of information minimal in a specified sense. In other words, the points representing the input data are given, and their images are to be found according to a specified criterion. One of the most widely used methods for dimensionality reduction is Principal Component Analysis.

Our approach differs from the standard approach of data analysis. To obtain more stable results for the dimensionality reduction problem, we decided to change the notion of closeness: we require that the images of close input points are close as well.

As a first step, we embed the input data into a weighted connected graph represented by an incidence matrix whose elements characterize the closeness between the input points. We then proceed with the Laplacian matrix and solve the generalized eigenvalue problem. It is known that an eigenvector corresponding to the minimal eigenvalue of this problem minimizes the quadratic form specified by the Laplacian matrix, which ensures that the initial requirement on closeness is satisfied. The Laplacian matrix is positive semi-definite, and this case is more difficult than that of positive definite matrices.

In our work we focus on the mathematical justification of the heuristic claims encountered in the reference literature. The main tool is the theory of matrices and their particular forms. We also proved important facts about eigenvectors of positive semi-definite matrices. This research has large potential in many practical applications, such as clustering and the image segmentation problem, which we plan to analyze further.
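The pipeline described above (a weighted neighborhood graph over the input points, the graph Laplacian, and the generalized eigenvalue problem L v = λ D v whose smallest non-trivial eigenvectors give the low-dimensional images) can be sketched in NumPy. This is only an illustrative sketch of the standard Laplacian-eigenmaps construction, not the authors' implementation: the function name, the heat-kernel weights, the k-nearest-neighbor graph, and all parameters below are assumptions.

```python
import numpy as np

def laplacian_embedding(X, n_neighbors=4, n_components=2, sigma=1.0):
    """Illustrative sketch: embed rows of X (n x d) into n_components
    dimensions via the generalized eigenproblem L v = lambda * D v."""
    n = X.shape[0]
    # Pairwise squared Euclidean distances between input points.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    # Heat-kernel weights characterizing closeness of input points.
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Keep only the k nearest neighbors of each point (symmetrized),
    # so the graph is sparse; column 0 of argsort is the point itself.
    idx = np.argsort(d2, axis=1)[:, 1:n_neighbors + 1]
    mask = np.zeros((n, n), dtype=bool)
    mask[np.repeat(np.arange(n), n_neighbors), idx.ravel()] = True
    mask |= mask.T
    W = np.where(mask, W, 0.0)
    deg = W.sum(axis=1)               # vertex degrees, diagonal of D
    L = np.diag(deg) - W              # graph Laplacian, positive semi-definite
    # Reduce L v = lam * D v to an ordinary symmetric eigenproblem
    # via the normalization D^{-1/2} L D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(deg)
    L_sym = d_inv_sqrt[:, None] * L * d_inv_sqrt[None, :]
    vals, vecs = np.linalg.eigh(L_sym)   # eigenvalues in ascending order
    # Skip the trivial eigenvector (eigenvalue 0, constant on a
    # connected graph); the next smallest ones give the embedding.
    return d_inv_sqrt[:, None] * vecs[:, 1:n_components + 1]
```

Reducing the generalized problem to a symmetric one via D^{-1/2} is one common way to sidestep the semi-definiteness difficulty mentioned above; it keeps the computation within `numpy.linalg.eigh`, which is restricted to symmetric matrices.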
Description in English:
List of responses:
Response:
R01: