Bültmann & Gerriets
Regression Analysis and Linear Models
Concepts, Applications, and Implementation
by Richard B. Darlington and Andrew F. Hayes
Publisher: Guilford Publications
Series: Methodology in the Social Sciences
Hardcover
ISBN: 978-1-4625-2113-5
Published: September 27, 2016
Language: English
Dimensions: 261 mm (H) × 182 mm (W) × 38 mm (D)
Weight: 1328 grams
Length: 661 pages

Price: €98.00
No shipping costs (domestic)


Order now; expected available for in-store pickup from November 14.

Delivery within the city usually takes place the same day.
Shipping elsewhere via Post/DHL usually takes 1-2 days.

Blurb

Emphasizing conceptual understanding over mathematics, this user-friendly text introduces linear regression analysis to students and researchers across the social, behavioral, consumer, and health sciences.



Biographical Note

Richard B. Darlington, PhD, is Emeritus Professor of Psychology at Cornell University. He is a Fellow of the American Association for the Advancement of Science and has published extensively on regression and related methods, the cultural bias of mental tests, the long-term effects of preschool programs, and, most recently, the neuroscience of brain development and evolution.
Andrew F. Hayes, PhD, is Distinguished Research Professor in the Haskayne School of Business at the University of Calgary, Alberta, Canada. His research and writing on data analysis have been published widely. Dr. Hayes is the author of Introduction to Mediation, Moderation, and Conditional Process Analysis and Statistical Methods for Communication Science, as well as coauthor, with Richard B. Darlington, of Regression Analysis and Linear Models. He teaches data analysis, primarily at the graduate level, and frequently conducts workshops on statistical analysis throughout the world. His website is www.afhayes.com.



Table of Contents

List of Symbols and Abbreviations
1. Statistical Control and Linear Models
1.1 Statistical Control
1.1.1 The Need for Control
1.1.2 Five Methods of Control
1.1.3 Examples of Statistical Control
1.2 An Overview of Linear Models
1.2.1 What You Should Know Already
1.2.2 Statistical Software for Linear Modeling and Statistical Control
1.2.3 About Formulas
1.2.4 On Symbolic Representations
1.3 Chapter Summary
2. The Simple Regression Model
2.1 Scatterplots and Conditional Distributions
2.1.1 Scatterplots
2.1.2 A Line through Conditional Means
2.1.3 Errors of Estimate
2.2 The Simple Regression Model
2.2.1 The Regression Line
2.2.2 Variance, Covariance, and Correlation
2.2.3 Finding the Regression Line
2.2.4 Example Computations
2.2.5 Linear Regression Analysis by Computer
2.3 The Regression Coefficient versus the Correlation Coefficient
2.3.1 Properties of the Regression and Correlation Coefficients
2.3.2 Uses of the Regression and Correlation Coefficients
2.4 Residuals
2.4.1 The Three Components of Y
2.4.2 Algebraic Properties of Residuals
2.4.3 Residuals as Y Adjusted for Differences in X
2.4.4 Residual Analysis
2.5 Chapter Summary
3. Partial Relationship and the Multiple Regression Model
3.1 Regression Analysis with More Than One Predictor Variable
3.1.1 An Example
3.1.2 Regressors
3.1.3 Models
3.1.4 Representing a Model Geometrically
3.1.5 Model Errors
3.1.6 An Alternative View of the Model
3.2 The Best-Fitting Model
3.2.1 Model Estimation with Computer Software
3.2.2 Partial Regression Coefficients
3.2.3 The Regression Constant
3.2.4 Problems with Three or More Regressors
3.2.5 The Multiple Correlation R
3.3 Scale-Free Measures of Partial Association
3.3.1 Semipartial Correlation
3.3.2 Partial Correlation
3.3.3 The Standardized Regression Coefficient
3.4 Some Relations among Statistics
3.4.1 Relations among Simple, Multiple, Partial, and Semipartial Correlations
3.4.2 Venn Diagrams
3.4.3 Partial Relationships and Simple Relationships May Have Different Signs
3.4.4 How Covariates Affect Regression Coefficients
3.4.5 Formulas for b_j, pr_j, sr_j, and R
3.5 Chapter Summary
4. Statistical Inference in Regression
4.1 Concepts in Statistical Inference
4.1.1 Statistics and Parameters
4.1.2 Assumptions for Proper Inference
4.1.3 Expected Values and Unbiased Estimation
4.2 The ANOVA Summary Table
4.2.1 Data = Model + Error
4.2.2 Total and Regression Sums of Squares
4.2.3 Degrees of Freedom
4.2.4 Mean Squares
4.3 Inference about the Multiple Correlation
4.3.1 Biased and Less Biased Estimation of TR²
4.3.2 Testing a Hypothesis about TR
4.4 The Distribution of and Inference about a Partial Regression Coefficient
4.4.1 Testing a Null Hypothesis about Tb_j
4.4.2 Interval Estimates for Tb_j
4.4.3 Factors Affecting the Standard Error of b_j
4.4.4 Tolerance
4.5 Inferences about Partial Correlations
4.5.1 Testing a Null Hypothesis about Tpr_j and Tsr_j
4.5.2 Other Inferences about Partial Correlations
4.6 Inferences about Conditional Means
4.7 Miscellaneous Issues in Inference
4.7.1 How Great a Drawback Is Collinearity?
4.7.2 Contradicting Inferences
4.7.3 Sample Size and Nonsignificant Covariates
4.7.4 Inference in Simple Regression (When k = 1)
4.8 Chapter Summary
5. Extending Regression Analysis Principles
5.1 Dichotomous Regressors
5.1.1 Indicator or Dummy Variables
5.1.2 Y Is a Group Mean
5.1.3 The Regression Coefficient for an Indicator Is a Difference
5.1.4 A Graphic Representation
5.1.5 A Caution about Standardized Regression Coefficients for Dichotomous Regressors
5.1.6 Artificial Categorization of Numerical Variables
5.2 Regression to the Mean
5.2.1 How Regression Got Its Name
5.2.2 The Phenomenon
5.2.3 Versions of the Phenomenon
5.2.4 Misconceptions and Mistakes Fostered by Regression to the Mean
5.2.5 Accounting for Regression to the Mean Using Linear Models
5.3 Multidimensional Sets
5.3.1 The Partial and Semipartial Multiple Correlation
5.3.2 What It Means If PR = 0 or SR = 0
5.3.3 Inference Concerning Sets of Variables
5.4 A Glance at the Big Picture
5.4.1 Further Extensions of Regression
5.4.2 Some Difficulties and Limitations
5.5 Chapter Summary
6. Statistical versus Experimental Control
6.1 Why Random Assignment?
6.1.1 Limitations of Statistical Control
6.1.2 The Advantage of Random Assignment
6.1.3 The Meaning of Random Assignment
6.2 Limitations of Random Assignment
6.2.1 Limitations Common to Statistical Control and Random Assignment
6.2.2 Limitations Specific to Random Assignment
6.2.3 Correlation and Causation
6.3 Supplementing Random Assignment with Statistical Control
6.3.1 Increased Precision and Power
6.3.2 Invulnerability to Chance Differences between Groups
6.3.3 Quantifying and Assessing Indirect Effects
6.4 Chapter Summary
7. Regression for Prediction
7.1 Mechanical Prediction and Regression
7.1.1 The Advantages of Mechanical Prediction
7.1.2 Regression as a Mechanical Prediction Method
7.1.3 A Focus on R Rather Than the Regression Weights
7.2 Estimating True Validity
7.2.1 Shrunken versus Adjusted R
7.2.2 Estimating TR_S
7.2.3 Shrunken R Using Statistical Software
7.3 Selecting Predictor Variables
7.3.1 Stepwise Regression
7.3.2 All Subsets Regression
7.3.3 How Do Variable Selection Methods Perform?
7.4 Predictor Variable Configurations
7.4.1 Partial Redundancy (the Standard Configuration)
7.4.2 Complete Redundancy
7.4.3 Independence
7.4.4 Complementarity
7.4.5 Suppression
7.4.6 How These Configurations Relate to the Correlation between Predictors
7.4.7 Configurations of Three or More Predictors
7.5 Revisiting the Value of Human Judgment
7.6 Chapter Summary
8. Assessing the Importance of Regressors
8.1 What Does It Mean for a Variable to Be Important?
8.1.1 Variable Importance in Substantive or Applied Terms
8.1.2 Variable Importance in Statistical Terms
8.2 Should Correlations Be Squared?
8.2.1 Decision Theory
8.2.2 Small Squared Correlations Can Reflect Noteworthy Effects
8.2.3 Pearson's r as the Ratio of a Regression Coefficient to Its Maximum Possible Value
8.2.4 Proportional Reduction in Estimation Error
8.2.5 When the Standard Is Perfection
8.2.6 Summary
8.3 Determining the Relative Importance of Regressors in a Single Regression Model
8.3.1 The Limitations of the Standardized Regression Coefficient
8.3.2 The Advantage of the Semipartial Correlation
8.3.3 Some Equivalences among Measures
8.3.4 Cohen's f²
8.3.5 Comparing Two Regression Coefficients in the Same Model
8.4 Dominance Analysis
8.4.1 Complete and Partial Dominance
8.4.2 Example Computations
8.4.3 Dominance Analysis Using a Regression Program
8.5 Chapter Summary
9. Multicategorical Regressors
9.1 Multicategorical Variables as Sets
9.1.1 Indicator (Dummy) Coding
9.1.2 Constructing Indicator Variables
9.1.3 The Reference Category
9.1.4 Testing the Equality of Several Means
9.1.5 Parallels with Analysis of Variance
9.1.6 Interpreting Estimated Y and the Regression Coefficients
9.2 Multicategorical Regressors as or with Covariates

