Civil-Comp Proceedings
ISSN 1759-3433
CCP: 94
PROCEEDINGS OF THE SEVENTH INTERNATIONAL CONFERENCE ON ENGINEERING COMPUTATIONAL TECHNOLOGY
Paper 108

A Robust Two-Dimensional Principal Component Analysis for Classification

D.E. Herwindiati

Faculty of Information Technology, Tarumanagara University, Jakarta, Indonesia

Full Bibliographic Reference for this paper
D.E. Herwindiati, "A Robust Two-Dimensional Principal Component Analysis for Classification", in , (Editors), "Proceedings of the Seventh International Conference on Engineering Computational Technology", Civil-Comp Press, Stirlingshire, UK, Paper 108, 2010. doi:10.4203/ccp.94.108
Keywords: outlier, principal component analysis, 2DPCA, robust, vector variance.

Summary
Classification is the grouping together of similar things [1]. This paper proposes a new approach to classification that combines the advantages of two techniques: two-dimensional projection and robust estimation. The classification process is not 'acceptable' when the dispersion of the data is close to singularity. Singularity arises when the component variables of the system have near-linear dependencies, so that one variable can be written as a near-linear combination of the others. Dimension reduction is one technique for eliminating such 'redundant' information. One of the most common forms of dimensionality reduction is principal component analysis (PCA) [2], which transforms the original set of variables into a smaller set of linear combinations that account for most of the variance of the original set. One disadvantage of PCA is its heavy computation.
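For reference, this transformation can be sketched in a few lines of NumPy; the function name pca_scores and its interface are illustrative and not taken from the paper:

    import numpy as np

    def pca_scores(X, k):
        # Center the variables, then project onto the k leading
        # eigenvectors of the sample covariance matrix.
        Xc = X - X.mean(axis=0)
        C = np.cov(Xc, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(C)   # eigenvalues in ascending order
        W = eigvecs[:, ::-1][:, :k]            # k leading principal axes
        return Xc @ W                          # n x k component scores

Note that for an image of h x w pixels the vectorised covariance matrix C is (hw) x (hw), which is the source of the computational burden mentioned above.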

Yang et al. [3] proposed two-dimensional principal component analysis (2DPCA) to reduce the computational time of standard PCA in face recognition. 2DPCA is often called a variant of PCA. In 2DPCA the image matrices are treated directly as two-dimensional matrices; the images need not be transformed into vectors, so the image covariance matrix can be constructed directly from the original image matrices. 2DPCA has two important benefits over PCA: the covariance matrix is easier to evaluate, and less time is required to determine its eigenvectors. This attractive property of 2DPCA inspired the author to experiment with the classification of general matrix data using a robust 2DPCA.
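A minimal sketch of this construction, assuming NumPy and following the spirit of Yang et al. [3] (the function name two_dpca_features is illustrative):

    import numpy as np

    def two_dpca_features(images, d):
        # images: m matrices stacked into shape (m, h, w); no vectorisation needed.
        A = np.asarray(images, dtype=float)
        A_bar = A.mean(axis=0)                 # mean image
        # w x w image covariance built directly from the image matrices
        G = sum((Ai - A_bar).T @ (Ai - A_bar) for Ai in A) / len(A)
        eigvals, eigvecs = np.linalg.eigh(G)
        X = eigvecs[:, ::-1][:, :d]            # d leading projection axes
        return A @ X                           # m feature matrices of shape (h, d)

Because G is only w x w rather than (hw) x (hw), its eigendecomposition is far cheaper than in standard PCA.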

Robust 2DPCA is primarily a robust approach that describes the variance-covariance structure through a linear transformation of the original variables. Classification is not effective when outliers are 'hidden' in a data set: the decomposed covariance matrix is very sensitive to outlying observations, and the first component, which carries the greatest variation, is often pulled toward the anomalous observations.

The objective of this paper is to propose the use of the robust minimum vector variance (MVV) criterion in the two-dimensional projection process for the classification of arbitrary matrix data. Minimum vector variance is a robust method that determines the location estimator and covariance matrix from the data subset, covering approximately half of the data, that gives the minimum vector variance [4]. The MVV robust 2DPCA algorithm is composed of three stages: it starts with the two-dimensional projection stage, the second stage is the robust estimation process, and the final stage is the classification process. The outcomes of all experiments show that MVV 2DPCA is a powerful approach for classifying several kinds of objects. The computational cost of MVV robust 2DPCA is much lower than that of MVV robust PCA; however, the breakdown point of MVV robust 2DPCA is less than 0.5.
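A hedged sketch of the MVV idea, assuming the vector variance of a covariance matrix S is Tr(S^2) and using a concentration-style subset search; the exact search strategy of the MVV algorithm in [4] may differ:

    import numpy as np

    def mvv_estimates(X, h=None, n_iter=25, seed=0):
        # X: n x p data matrix. Search for a subset of roughly half the rows
        # whose covariance S has small vector variance Tr(S @ S), then return
        # the subset mean, covariance, and vector variance.
        n, p = X.shape
        h = h or (n + p + 1) // 2                   # subset size ~ half the data
        rng = np.random.default_rng(seed)
        idx = rng.choice(n, size=h, replace=False)  # random initial subset
        for _ in range(n_iter):
            mu = X[idx].mean(axis=0)
            S = np.cov(X[idx], rowvar=False)
            # Mahalanobis distances of every point to the current estimates
            diff = X - mu
            d2 = np.einsum('ij,jk,ik->i', diff, np.linalg.pinv(S), diff)
            new_idx = np.argsort(d2)[:h]            # keep the h closest points
            if np.array_equal(np.sort(new_idx), np.sort(idx)):
                break                               # subset has converged
            idx = new_idx
        S = np.cov(X[idx], rowvar=False)
        return X[idx].mean(axis=0), S, np.trace(S @ S)

The robust location and scatter estimates obtained this way can then replace the classical mean and covariance in the 2DPCA projection, so that outliers no longer pull the leading components.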

References
[1] S. Kotz, N.L. Johnson, "Encyclopedia of Statistical Sciences", 6, 110-122, John Wiley, New York, 1985.
[2] I.T. Jolliffe, "Principal Component Analysis", Springer-Verlag, 1986.
[3] J. Yang, D. Zhang, A.F. Frangi, J.-Y. Yang, "Two-Dimensional PCA: A New Approach to Appearance-Based Face Representation and Recognition", IEEE Transactions on Pattern Analysis and Machine Intelligence, 26(1), 131-137, 2004. doi:10.1109/TPAMI.2004.10004
[4] D.E. Herwindiati, M.A. Djauhari, M. Mashuri, "Robust Multivariate Outlier Labeling", Communications in Statistics - Simulation and Computation, 36(6), 2007. doi:10.1080/03610910701569044
