lmyammai.blogspot.com
aMMAI: Eigenfaces for Recognition
http://lmyammai.blogspot.com/2009/03/eigenfaces-for-reconition.html
"Eigenfaces for Recognition," M. Turk and A. Pentland, Journal of Cognitive Neuroscience, 1991. This paper proposes a feature for dimension reduction. The significant features are known as "eigenfaces", and the central idea is to project face images onto the eigenfaces obtained via PCA. The main procedure: given a set of face images Γ_1, ..., Γ_M, define the average face Ψ = (1/M) Σ_i Γ_i; each face differs from the average face by Φ_i = Γ_i − Ψ. With A = [Φ_1 ... Φ_M], the covariance matrix C = AA^T is too large to diagonalize directly, so we can instead use the small M×M matrix L = A^T A to find the eigenfaces.
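The procedure in the excerpt above can be sketched in NumPy. This is an illustrative implementation of the L = A^T A trick, not the paper's code; the function and variable names are my own:

```python
import numpy as np

def eigenfaces(faces, k):
    """faces: (M, D) array, one flattened face image per row.
    Returns the average face psi and the top-k eigenfaces as columns of U."""
    psi = faces.mean(axis=0)             # average face
    A = faces - psi                      # rows are Phi_i = Gamma_i - psi
    L = A @ A.T                          # small M x M matrix instead of D x D
    vals, vecs = np.linalg.eigh(L)       # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]   # pick the k largest
    U = A.T @ vecs[:, order]             # map eigenvectors of L back to image space
    U /= np.linalg.norm(U, axis=0)       # normalize: each column is an eigenface
    return psi, U
```

A face is then represented by its projection coefficients `U.T @ (face - psi)`, which is the low-dimensional feature the paper uses for recognition.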
lmyammai.blogspot.com
aMMAI: Rapid object detection using a boosted cascade of simple features
http://lmyammai.blogspot.com/2009/05/rapid-object-detection-using-boosted.html
Rapid object detection using a boosted cascade of simple features. "Rapid object detection using a boosted cascade of simple features," Paul Viola and Michael Jones, CVPR, 2001. This paper describes a machine learning approach for visual object detection that is capable of processing images extremely rapidly while achieving high detection rates. The work is distinguished by three key contributions: a new image representation, the "integral image", supporting three kinds of rectangle features; with it, the sum of pixels within any rectangle (e.g. area A) can be computed with a few array references.
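The integral image mentioned above can be sketched in a few lines of NumPy; this is a generic illustration of the representation, with names of my own choosing:

```python
import numpy as np

def integral_image(img):
    """Zero-padded cumulative sum so that ii[y, x] = sum of img[:y, :x]."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return ii

def rect_sum(ii, y, x, h, w):
    """Sum of img[y:y+h, x:x+w] using only four array references."""
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]
```

Because every rectangle sum costs four lookups regardless of its size, the rectangle features can be evaluated at any scale in constant time, which is what makes the detector fast.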
lmyammai.blogspot.com
aMMAI: Nonlinear dimensionality reduction by locally linear embedding
http://lmyammai.blogspot.com/2009/03/nonlinear-dimensionality-reduction-by.html
Nonlinear dimensionality reduction by locally linear embedding. "Nonlinear dimensionality reduction by locally linear embedding," Roweis and Saul, Science, 2000. This paper presents an unsupervised learning algorithm, locally linear embedding (LLE), which constructs a neighborhood-preserving mapping via the steps below: (1) assign K neighbors to each data point X_i; (2) reconstruct each X_i from its neighbors with linear weights W_ij by minimizing the cost function ε(W) = Σ_i |X_i − Σ_j W_ij X_j|²; (3) compute the low-dimensional embedding vectors Y_i that minimize Σ_i |Y_i − Σ_j W_ij Y_j|² with the weights W held fixed.
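The three steps above can be sketched as a minimal NumPy implementation. This is a simplified illustration (brute-force neighbor search, a fixed regularization constant), not the authors' code:

```python
import numpy as np

def lle(X, k, d):
    """Minimal LLE sketch: X is (N, D); returns an (N, d) embedding."""
    N = X.shape[0]
    W = np.zeros((N, N))
    for i in range(N):
        # Step 1: k nearest neighbors of X[i] (excluding itself)
        dists = np.linalg.norm(X - X[i], axis=1)
        nbrs = np.argsort(dists)[1:k + 1]
        # Step 2: weights minimizing |X_i - sum_j W_ij X_j|^2 with sum_j W_ij = 1
        Z = X[nbrs] - X[i]                                   # shifted neighbors
        C = Z @ Z.T + 1e-3 * np.trace(Z @ Z.T) * np.eye(k)   # regularized local Gram
        w = np.linalg.solve(C, np.ones(k))
        W[i, nbrs] = w / w.sum()
    # Step 3: bottom eigenvectors of (I - W)^T (I - W), skipping the constant one
    M = (np.eye(N) - W).T @ (np.eye(N) - W)
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, 1:d + 1]
```

The closed-form weight solve in step 2 works because the constrained least-squares problem reduces to a linear system over the local Gram matrix.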
lmyammai.blogspot.com
aMMAI: Support vector learning for ordinal regression
http://lmyammai.blogspot.com/2009/06/support-vector-learning-for-ordinal.html
Support vector learning for ordinal regression. "Support vector learning for ordinal regression," R. Herbrich, ICANN, 1999. This paper proposes a new learning task, ordinal regression, which is complementary to both classification and metric regression because its outcome space is discrete yet ordered. The task is formalized as a problem of binary classification by minimizing a pairwise 0-1 loss over pairs of examples: a rank function outputs a score for each example, and the ordering of the scores induces the predicted ranks.
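The pairwise reduction described above can be illustrated with a small helper that turns an ordinal dataset into the binary dataset of difference vectors on which a standard linear SVM could then be trained; this is my own sketch of the reduction, not the paper's formulation verbatim:

```python
import numpy as np

def pairwise_dataset(X, y):
    """Turn an ordinal set (X, y) into a binary set of difference vectors:
    for each pair with y[i] != y[j], emit (X[i] - X[j], sign(y[i] - y[j]))."""
    diffs, labels = [], []
    n = len(y)
    for i in range(n):
        for j in range(n):
            if y[i] != y[j]:
                diffs.append(X[i] - X[j])
                labels.append(1 if y[i] > y[j] else -1)
    return np.array(diffs), np.array(labels)
```

A linear classifier w learned on these pairs yields the rank function f(x) = w·x: classifying the difference x_i − x_j as positive is exactly asserting f(x_i) > f(x_j).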
lmyammai.blogspot.com
aMMAI: June 2009
http://lmyammai.blogspot.com/2009_06_01_archive.html
AdaRank: a boosting algorithm for information retrieval. "AdaRank: a boosting algorithm for information retrieval," Jun Xu and Hang Li, SIGIR 2007. This paper proposes a boosting algorithm, AdaRank, for the learning-to-rank problem in document retrieval. Following the essence of AdaBoost, AdaRank repeatedly constructs "weak rankers" on the basis of re-weighted data, and the final prediction function is a linear combination of those weak rankers.
lmyammai.blogspot.com
aMMAI: May 2009
http://lmyammai.blogspot.com/2009_05_01_archive.html
Rapid object detection using a boosted cascade of simple features. "Rapid object detection using a boosted cascade of simple features," Paul Viola and Michael Jones, CVPR, 2001. This paper describes a machine learning approach for visual object detection that is capable of processing images extremely rapidly while achieving high detection rates. The work is distinguished by three key contributions: a new image representation, the "integral image", supporting three kinds of rectangle features; with it, the sum of pixels within any rectangle (e.g. area A) can be computed with a few array references...
lmyammai.blogspot.com
aMMAI: [Note] 20090325
http://lmyammai.blogspot.com/2009/03/note-20090325.html
Feature significance (without label / subset). Feature significance (with label / subset): MI, χ², correlation (subset). Exploiting feature correlation (without label / transform): PCA, manifold methods (LLE, ISOMAP). Exploiting feature correlation (with label / transform): AdaBoost, maximum entropy. Exploiting hidden semantics (topics): PLSA, SVD, LSI, Information Bottleneck.
lmyammai.blogspot.com
aMMAI: AdaRank: a boosting algorithm for information retrieval
http://lmyammai.blogspot.com/2009/06/adarank-boosting-algorithm-for.html
AdaRank: a boosting algorithm for information retrieval. "AdaRank: a boosting algorithm for information retrieval," Jun Xu and Hang Li, SIGIR 2007. This paper proposes a boosting algorithm, AdaRank, for the learning-to-rank problem in document retrieval. Following the essence of AdaBoost, AdaRank repeatedly constructs "weak rankers" on the basis of re-weighted data, and the final prediction function is a linear combination of those weak rankers.
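The boosting loop described above can be sketched as follows. This is a heavily simplified illustration under my own assumptions: weak rankers are precomputed, `perf[k, q]` is the performance (scaled into (-1, 1)) of candidate ranker k on query q, and the re-weighting uses the combination's per-query performance as a toy proxy for the paper's IR measure:

```python
import numpy as np

def adarank(perf, T):
    """Simplified AdaRank sketch.
    perf: (K, Q) matrix of per-query performances in (-1, 1).
    Returns combination weights alpha over the K candidate weak rankers."""
    K, Q = perf.shape
    P = np.full(Q, 1.0 / Q)              # weight distribution over queries
    alpha = np.zeros(K)
    for _ in range(T):
        k = np.argmax(perf @ P)           # weak ranker best on weighted queries
        e = perf[k] @ P                   # its weighted performance
        a = 0.5 * np.log((1 + e) / (1 - e))
        alpha[k] += a
        # re-weight: emphasize queries the current combination handles poorly
        combined = alpha @ perf           # per-query performance of combination
        P = np.exp(-combined)
        P /= P.sum()
    return alpha
```

As in AdaBoost, the exponential re-weighting forces later rounds to focus on the hard queries, and the final ranker scores documents by the alpha-weighted sum of the weak rankers' scores.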
lmyammai.blogspot.com
aMMAI: February 2009
http://lmyammai.blogspot.com/2009_02_01_archive.html
How to give a good research talk. "How to give a good research talk," S. Peyton Jones et al. As a new graduate student, I have no experience presenting my own work on stage, since I do not yet have work of my own; I have only presented other people's work at meetings. I therefore find it hard to follow some of the paper's suggestions, such as not putting an outline at the start of the slides and not starting to prepare the slides early. I summarize the paper below. How to prepare slides? Use aids such as an overhead projector. Howev...