University of Kyrenia Grand Library
Opening Hours: Monday-Saturday, 08:00-20:00 | E-mail: grand.library@kyrenia.edu.tr
 

Linear regression / (Gross, Jürgen.)
Bibliographical information (record 270806)
Title:
Linear regression /
Author:
Gross, Jürgen.
Publisher:
Springer,
Date:
c2003.
ISBN:
3540401784
Classification:
QA278.2
Detailed notes
    - Includes bibliographical references (p. [381]-387) and index.
    - Contents:
        1 Fundamentals -- 1.1 Linear Models -- 1.1.1 Application of Linear Models -- 1.1.2 Types of Linear Models -- 1.1.3 Proceeding with Linear Models -- 1.1.4 A Preliminary Example -- 1.2 Decision Theory and Point Estimation -- 1.2.1 Decision Rule -- 1.2.2 Non-operational Decision Rule -- 1.2.3 Loss and Risk -- 1.2.4 Choosing a Decision Rule -- 1.2.5 Admissibility -- 1.2.6 Squared Error Loss -- 1.2.7 Matrix Valued Squared Error Loss -- 1.2.8 Alternative Loss Functions -- 1.3 Problems
        2 The Linear Regression Model -- 2.1 Assumptions -- 2.2 Ordinary Least Squares Estimation -- 2.2.1 The Principle of Least Squares -- 2.2.2 Coefficient of Determination R2 -- 2.2.3 Predictive Loss -- 2.2.4 Least Squares Variance Estimator -- 2.2.5 Properties of the Ordinary Least Squares Estimator -- 2.2.6 Properties Under Normality -- 2.3 Optimality of Least Squares Estimation -- 2.3.1 Linear Unbiased Estimation -- 2.3.2 Gauss-Markov Theorem -- 2.3.3 Normality Assumption -- 2.3.4 Admissibility -- 2.4 Unreliability of Least Squares Estimation -- 2.4.1 Estimation of the Covariance Matrix -- 2.4.2 Unbiased Versus Biased Estimation -- 2.4.3 Collinearity -- 2.4.4 Consistency -- 2.4.5 Biased Estimation -- 2.5 Inadmissibility of the Ordinary Least Squares Estimator -- 2.5.1 The Reparameterized Regression Model -- 2.5.2 Risk Comparison of Least Squares and Stein Estimator -- 2.5.3 An Example for Stein Estimation -- 2.5.4 Admissibility -- 2.6 Problems
        3 Alternative Estimators -- 3.1 Restricted Least Squares Estimation -- 3.1.1 The Principle of Restricted Least Squares -- 3.1.2 The Parameter Space -- 3.1.3 Properties of Restricted Least Squares Estimator -- 3.1.4 Risk Comparison of Restricted and Ordinary Least Squares Estimator -- 3.1.5 Pretest Estimation -- 3.2 Other Types of Restriction -- 3.2.1 Stochastic Linear Restrictions -- 3.2.2 Inequality Restrictions -- 3.2.3 Elliptical Restrictions -- 3.3 Principal Components Estimator -- 3.3.1 Preliminary Considerations -- 3.3.2 Properties of the Principal Components Estimator -- 3.3.3 Drawbacks of the Principal Components Estimator -- 3.3.4 The Marquardt Estimator -- 3.4 Ridge Estimator -- 3.4.1 Preliminary Considerations -- 3.4.2 Properties of the Linear Ridge Estimator -- 3.4.3 The Choice of the Ridge Parameter -- 3.4.4 Standardization -- 3.4.5 Ridge and Restricted Least Squares Estimator -- 3.4.6 Ridge and Principal Components Estimator -- 3.4.7 Jackknife Modified Ridge Estimator -- 3.4.8 Iteration Estimator -- 3.4.9 An Example for Ridge Estimation -- 3.5 Shrinkage Estimator -- 3.5.1 Preliminary Considerations -- 3.5.2 Risk Comparison to Ordinary Least Squares -- 3.5.3 The Choice of the Shrinkage Parameter -- 3.5.4 Direction Modified Shrinkage Estimators -- 3.6 General Ridge Estimator -- 3.6.1 A Class of Estimators -- 3.6.2 Risk Comparison of General Ridge and Ordinary Least Squares Estimator -- 3.7 Linear Minimax Estimator -- 3.7.1 Preliminary Considerations -- 3.7.2 Inequality Restrictions -- 3.7.3 Linear Minimax Solutions -- 3.7.4 Alternative Approaches -- 3.7.5 Admissibility -- 3.8 Linear Bayes Estimator -- 3.8.1 Preliminary Considerations -- 3.8.2 Characterization of Linear Bayes Estimators -- 3.8.3 Non-Operational Bayes Solutions -- 3.8.4 A-priori Assumptions -- 3.9 Robust Estimator -- 3.9.1 Preliminary Considerations -- 3.9.2 Weighted Least Squares Estimation -- 3.9.3 The l1 Estimator -- 3.9.4 M Estimator -- 3.9.5 Robust Ridge Estimator -- 3.10 Problems
        4 Linear Admissibility -- 4.1 Preliminary Considerations -- 4.2 Linear Admissibility in the Non-Restricted Model -- 4.2.1 Linear Admissibility in the Simple Mean Shift Model -- 4.2.2 Characterization of Linearly Admissible Estimators -- 4.2.3 Ordinary Least Squares and Linearly Admissible Estimator -- 4.2.4 Linear Transforms of Ordinary Least Squares Estimator -- 4.2.5 Linear Admissibility of Known Estimators -- 4.2.6 Shrinkage Property and Linear Admissibility -- 4.2.7 Convex Combination of Estimators -- 4.2.8 Linear Bayes Estimator -- 4.3 Linear Admissibility Under Linear Restrictions -- 4.3.1 The Assumption of a Full Rank Restriction Matrix -- 4.3.2 Restricted Estimator -- 4.3.3 Characterization of Linearly Admissible Estimators -- 4.4 Linear Admissibility Under Elliptical Restrictions -- 4.4.1 Characterization of Linearly Admissible Estimators -- 4.4.2 Linear Admissibility of Certain Linear Estimators -- 4.4.3 Admissible Improvements Over Ordinary Least Squares -- 4.5 Problems
        5 The Covariance Matrix of the Error Vector -- 5.1 Estimation of the Error Variance -- 5.1.1 The Sample Variance -- 5.1.2 Nonnegative Unbiased Estimation -- 5.1.3 Optimality of the Least Squares Variance Estimator -- 5.1.4 Non-Admissibility of the Least Squares Variance Estimator -- 5.2 Non-Scalar Covariance Matrix -- 5.2.1 Preliminary Considerations -- 5.2.2 The Transformed Model -- 5.2.3 Two-Stage Estimation -- 5.3 Occurrence of Non-Scalar Covariance Matrices -- 5.3.1 Seemingly Unrelated Regression -- 5.3.2 Heteroscedastic Errors -- 5.3.3 Equicorrelated Errors -- 5.3.4 Autocorrelated Errors -- 5.4 Singular Covariance Matrices -- 5.5 Equality of Ordinary and Generalized Least Squares -- 5.6 Problems
        6 Regression Diagnostics -- 6.1 Selecting Independent Variables -- 6.1.1 Mallows' Cp -- 6.1.2 Stepwise Regression -- 6.1.3 Alternative Criteria -- 6.2 Assessing Goodness of Fit -- 6.3 Diagnosing Collinearity -- 6.3.1 Variance Inflation Factors -- 6.3.2 Scaled Condition Indexes -- 6.4 Inspecting Residuals -- 6.4.1 Normal Quantile Plot -- 6.4.2 Residuals Versus Fitted Values Plot -- 6.4.3 Further Residual Plots -- 6.5 Finding Influential Observations -- 6.5.1 Leverage -- 6.5.2 Influential Observations -- 6.5.3 Collinearity-Influential Observations -- 6.6 Testing Model Assumptions -- 6.6.1 Preliminary Considerations -- 6.6.2 Testing for Heteroscedasticity -- 6.6.3 Testing for Autocorrelation -- 6.6.4 Testing for Non-Normality -- 6.6.5 Testing for Non-Linearity -- 6.6.6 Testing for Outliers -- 6.7 Problems
        A.1 Preliminaries -- A.1.1 Matrices and Vectors -- A.1.2 Elementary Operations -- A.1.3 Rank of a Matrix -- A.1.4 Subspaces and Matrices -- A.1.5 Partitioned Matrices -- A.1.6 Kronecker Product -- A.1.7 Moore-Penrose Inverse -- A.2 Common Pitfalls -- A.3 Square Matrices -- A.3.1 Specific Square Matrices -- A.3.2 Trace and Determinant -- A.3.3 Eigenvalue and Eigenvector -- A.3.4 Vector and Matrix Norm -- A.3.5 Definiteness -- A.4 Symmetric Matrix -- A.4.1 Eigenvalues -- A.4.2 Spectral Decomposition -- A.4.3 Rayleigh Ratio -- A.4.4 Definiteness -- A.5 Löwner Partial Ordering
        B.1 Expectation and Covariance -- B.2 Multivariate Normal Distribution -- B.3 χ2 Distribution -- B.4 F Distribution
        C.1 Problem and Goal -- C.2 The Data -- C.3 The Choice of Variables -- C.3.1 The Full Model -- C.3.2 Stepwise Regression -- C.3.3 Collinearity Diagnostics -- C.4 Further Diagnostics -- C.4.1 Residuals -- C.4.2 Influential Observations -- C.5 Prediction.
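The contents above center on ordinary least squares estimation (Section 2.2) and biased alternatives such as the ridge estimator (Section 3.4). As a minimal sketch, not taken from the book, the two estimators can be compared in NumPy on simulated near-collinear data (the variable names and the ridge parameter k = 1.0 are illustrative choices):

```python
import numpy as np

# Simulated data with a near-collinear third column (cf. Sec. 2.4.3).
rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))
X[:, 2] = X[:, 0] + 0.01 * rng.normal(size=n)  # nearly duplicates column 0
beta_true = np.array([1.0, 2.0, 0.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Ordinary least squares: solves (X'X) b = X'y, i.e. b = (X'X)^{-1} X'y.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge estimator with parameter k: b = (X'X + kI)^{-1} X'y.
k = 1.0
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

print("OLS:  ", beta_ols)
print("Ridge:", beta_ridge)
```

For any k > 0 the ridge estimate has a strictly smaller Euclidean norm than the OLS estimate, which is the shrinkage property discussed in Chapter 3.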
Items (2)
Barcode     Status          Library                                            Section
6513028086  Item available  NEU Grand Library, 2nd Floor (QA278.2 .G76 2003)   General Collection
6839541333  Item available  NEU Grand Library, 2nd Floor (QA278.2 .G76 2003)   General Collection

UNIVERSITY OF KYRENIA LIBRARY +90 (392) 680 20 28. Near East Boulevard, Kyrenia, TRNC
This software was developed by the NEU Library; it is based on Koha OSS and conforms to MARC21 library data transfer rules.