Journal ArticleDOI

Partial least squares regression and projection on latent structure regression (PLS Regression)

TL;DR: Partial least squares (PLS) regression is a recent technique that combines features from, and generalizes, principal component analysis (PCA) and multiple linear regression; it is used to predict a set of dependent variables from a set of independent variables or predictors.
Abstract
Partial least squares (PLS) regression (a.k.a. projection on latent structures) is a recent technique that combines features from and generalizes principal component analysis (PCA) and multiple linear regression. Its goal is to predict a set of dependent variables from a set of independent variables or predictors. This prediction is achieved by extracting from the predictors a set of orthogonal factors called latent variables which have the best predictive power. These latent variables can be used to create displays akin to PCA displays. The quality of the prediction obtained from a PLS regression model is evaluated with cross-validation techniques such as the bootstrap and jackknife. There are two main variants of PLS regression: the most common one separates the roles of dependent and independent variables; the second one, used mostly to analyze brain imaging data, gives the same roles to dependent and independent variables. Copyright © 2010 John Wiley & Sons, Inc.
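As a concrete illustration (not the article's own implementation), here is a minimal Python sketch of the first, most common variant on synthetic data, using scikit-learn's PLSRegression: predictors X and dependent variables Y play separate roles, and a small number of orthogonal latent variables carries the prediction.

```python
# A minimal PLS regression sketch on synthetic data (an illustration,
# not the article's implementation): predict dependent variables Y
# from predictors X via orthogonal latent variables.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))               # 100 observations, 10 predictors
B = rng.normal(size=(10, 3))
Y = X @ B + 0.1 * rng.normal(size=(100, 3))  # 3 dependent variables

pls = PLSRegression(n_components=2)          # extract 2 orthogonal latent variables
pls.fit(X, Y)
Y_hat = pls.predict(X)                       # predictions built from the latent variables
T = pls.transform(X)                         # latent-variable scores, usable for PCA-like displays
print(T.shape, Y_hat.shape)                  # (100, 2) (100, 3)
```

In practice the number of latent variables would be chosen by cross-validation, in line with the bootstrap and jackknife evaluation the abstract describes.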


Citations
Journal ArticleDOI

Principal component analysis

TL;DR: Principal component analysis (PCA) is a multivariate technique that analyzes a data table in which observations are described by several inter-correlated quantitative dependent variables. Its goal is to extract the important information from the table, represent it as a set of new orthogonal variables called principal components, and display the patterns of similarity among the observations and among the variables as points in maps.
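A minimal sketch of the idea on synthetic data (an illustration, not the cited paper's example): the component scores give the map coordinates for the observations, and the explained-variance ratios show how much information each component carries.

```python
# A minimal PCA sketch: extract orthogonal principal components from a
# data table of inter-correlated variables and use the scores as maps.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 6))               # 50 observations, 6 variables
X[:, 3:] += X[:, :3]                       # make the variables inter-correlated

pca = PCA(n_components=2)
scores = pca.fit_transform(X)              # coordinates on the first two components
print(pca.explained_variance_ratio_)       # variance captured by each component
```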
Journal ArticleDOI

A Comprehensive Survey on Transfer Learning

TL;DR: Transfer learning aims to improve the performance of target learners on target domains by transferring the knowledge contained in different but related source domains, thereby reducing the dependence on large amounts of target-domain data when constructing target learners.
Journal ArticleDOI

Partial Least Squares (PLS) methods for neuroimaging: a tutorial and review.

TL;DR: For both PLS methods, statistical inference is implemented with cross-validation techniques to identify significant patterns of voxel activation; the methods are presented with small numerical examples and typical applications in neuroimaging.
References
Book

An introduction to the bootstrap

TL;DR: This book presents bootstrap methods for estimation using simple arguments, together with Minitab macros for implementing the methods and examples of their use.
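A minimal bootstrap sketch in NumPy rather than Minitab (an illustration, not the book's macros): the standard error of a sample mean is estimated by resampling the observed data with replacement.

```python
# Bootstrap the standard error of a sample mean by resampling
# the observed data with replacement.
import numpy as np

rng = np.random.default_rng(2)
sample = rng.exponential(scale=2.0, size=40)        # observed data

boot_means = np.array([
    rng.choice(sample, size=sample.size, replace=True).mean()
    for _ in range(2000)                            # 2000 bootstrap replicates
])
print(sample.mean(), boot_means.std(ddof=1))        # estimate and its bootstrap SE
```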
Book

Applied Regression Analysis

TL;DR: This book uses the straight-line case to introduce fitting a line by least squares, and covers checks of the fitted line, including the Durbin-Watson test.
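As a small illustration of the straight-line case (not the book's own example), the sketch below fits a line by least squares with NumPy and computes the Durbin-Watson statistic from the residuals.

```python
# Fit y = b0 + b1*x by least squares and check the residuals
# with the Durbin-Watson statistic (serial correlation check).
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0, 10, 30)
y = 1.5 + 0.8 * x + rng.normal(scale=0.5, size=x.size)

A = np.column_stack([np.ones_like(x), x])           # design matrix [1, x]
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
b0, b1 = coef
residuals = y - (b0 + b1 * x)
dw = np.sum(np.diff(residuals) ** 2) / np.sum(residuals ** 2)
print(b0, b1, dw)                                   # DW near 2 suggests no serial correlation
```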
Book

Principal Component Analysis

TL;DR: This book presents the graphical representation of data using principal component analysis (PCA), including PCA for time series and other non-independent data, as well as generalizations and adaptations of the method.
Reference EntryDOI

Principal Component Analysis

TL;DR: Principal component analysis (PCA) replaces the p original variables by a smaller number, q, of derived variables, the principal components, which are linear combinations of the original variables.
Journal ArticleDOI

Ridge regression: biased estimation for nonorthogonal problems

TL;DR: This paper proposes an estimation procedure based on adding small positive quantities to the diagonal of X′X, along with the ridge trace, a method for showing in two dimensions the effects of nonorthogonality.
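A minimal sketch of the idea on nearly collinear synthetic data (an illustration, not the paper's procedure): adding a small positive constant k to the diagonal of X′X stabilizes the estimates at the cost of some bias.

```python
# Ridge regression on a nonorthogonal (nearly collinear) problem:
# add k to the diagonal of X'X before solving the normal equations.
import numpy as np

rng = np.random.default_rng(4)
x1 = rng.normal(size=60)
x2 = x1 + 0.01 * rng.normal(size=60)                # nearly collinear with x1
X = np.column_stack([x1, x2])
y = X @ np.array([1.0, 1.0]) + 0.1 * rng.normal(size=60)

k = 0.1                                             # ridge constant
p = X.shape[1]
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)                    # unstable when X'X is near-singular
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)  # biased but stable
print(beta_ols, beta_ridge)
```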