The book is based on several years of experience of both authors in teaching linear models at various levels. It gives an up-to-date account of the theory and applications of linear models. The book can be used as a text for courses in statistics at the graduate level and as an accompanying text for courses in other areas. Some of the highlights in this book are as follows. A relatively extensive chapter on matrix theory (Appendix A) provides the necessary tools for proving theorems discussed in the text and offers a selection of classical and modern algebraic results that are useful in research work in econometrics, engineering, and optimization theory. The matrix theory of the last ten years has produced a series of fundamental results about the definiteness of matrices, especially for the differences of matrices, which enable superiority comparisons of two biased estimates to be made for the first time. We have attempted to provide a unified theory of inference from linear models with minimal assumptions. Besides the usual least-squares theory, alternative methods of estimation and testing based on convex loss functions and general estimating equations are discussed. Special emphasis is given to sensitivity analysis and model selection. A special chapter is devoted to the analysis of categorical data based on logit, loglinear, and logistic regression models. The material covered, theoretical discussion, and a variety of practical applications will be useful not only to students but also to researchers and consultants in statistics.