Asymmetric linear dimension reduction for classification.
Journal of Computational and Graphical Statistics
This article discusses methods for projecting a p-dimensional dataset with classified points from s known classes onto a lower-dimensional hyperplane so that the classes appear optimally separated. Such projections can be used, for example, for data visualization and for classification in lower dimensions. New methods, which are asymmetric with respect to the numbering of the groups, are introduced for s = 2. They aim at generating data projections in which one class is homogeneous and optimally separated from the other class, while the other class may be widespread. They are compared to classical discriminant coordinates and other symmetric methods from the literature in a simulation study, in an application to a 12-dimensional dataset of 74,159 spectra of stellar objects, and in an application to land snail distribution data. Neighborhood-based methods, in which local information about the separation of the classes is averaged, are also investigated. The use of robust MCD covariance matrices is suggested.
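For the symmetric baseline mentioned above, the first discriminant coordinate for s = 2 is the Fisher direction: the vector maximizing the ratio of between-class to within-class variance of the projected data. The abstract does not specify the asymmetric criteria, so the following is only a minimal sketch of the classical case, using a pooled within-class covariance matrix; the robust variant suggested in the article would replace this matrix with an MCD estimate (e.g. scikit-learn's `MinCovDet`).

```python
import numpy as np

def discriminant_coordinate(X1, X2):
    """First discriminant coordinate (Fisher direction) for two classes.

    X1, X2: (n_i, p) arrays of observations from each class.
    Returns a unit vector w maximizing the between- to within-class
    variance ratio of the projections X @ w.
    """
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    n1, n2 = len(X1), len(X2)
    # Pooled within-class covariance; swap in a robust MCD estimate
    # here for the robustified version suggested in the article.
    Sw = ((n1 - 1) * np.cov(X1, rowvar=False)
          + (n2 - 1) * np.cov(X2, rowvar=False)) / (n1 + n2 - 2)
    w = np.linalg.solve(Sw, m1 - m2)   # direction of Sw^{-1}(m1 - m2)
    return w / np.linalg.norm(w)
```

Projecting both classes onto the returned direction gives a one-dimensional view in which the class means are maximally separated relative to the within-class spread.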
|Title:||Asymmetric linear dimension reduction for classification|
|Additional information:||Imported via OAI, 7:29:01 24th Apr 2008|
|Keywords:||visualization, discriminant coordinates, canonical coordinates, nearest neighbor, projection pursuit, cluster validation, MCD estimator, quasars|
|UCL classification:||UCL > School of BEAMS > Faculty of Maths and Physical Sciences
UCL > School of BEAMS > Faculty of Maths and Physical Sciences > Statistical Science