Multivariate Analysis: Methods and Applications

Structural Sensitivity in Econometric Models
Edwin Kuh, John W. Neese and Peter Hollinger
Provides a pathbreaking assessment of the worth of linear dynamic systems methods for probing the behavior of complex macroeconomic models. Representing a major improvement upon the standard "black box" approach to analyzing economic model structure, it introduces the powerful concept of parameter sensitivity analysis within a linear systems root/vector framework. The approach is illustrated with a good medium-size econometric model (the Michigan Quarterly Econometric Model of the United States). EISPACK, the Fortran code for computing characteristic roots and vectors, has been upgraded and augmented by a model linearization code and a broader algorithmic framework. Also features an interface between the algorithmic code and the interactive modeling system (TROLL), making an unusually wide range of linear systems methods accessible to economists, operations researchers, engineers and physical scientists.
1985 (0-471-81930-1) 324 pp.

Linear Statistical Models and Related Methods: With Applications to Social Research
John Fox
A comprehensive, modern treatment of linear models and their variants and extensions, combining statistical theory with applied data analysis. Considers important methodological principles underlying statistical methods. Designed for researchers and students who wish to apply these models to their own work in a flexible manner.
1984 (0-471-09913-9) 496 pp.

Statistical Methods for Forecasting
Bovas Abraham and Johannes Ledolter
This practical, user-oriented book treats the statistical methods and models used to produce short-term forecasts. Provides an intermediate-level discussion of a variety of statistical forecasting methods and models and explains their interconnections, linking theory and practice. Includes numerous time-series, autocorrelation, and partial autocorrelation plots.
1983 (0-471-86764-0) 445 pp.
From inside the book
Results 1-3 of 92
Page 111
... dimension. In the previous two examples the MDS solution was presented in a two-dimensional space. As we suggested earlier, this will not always be the case, and higher-dimensional spaces may be needed in order for the fitted ...
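The point of the excerpt — that a two-dimensional MDS configuration may not reproduce the observed proximities, so more dimensions are sometimes needed — can be illustrated with a minimal classical (Torgerson) MDS sketch in plain NumPy. The data, variable names, and the use of classical rather than nonmetric MDS are all illustrative assumptions, not the book's worked example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 10 objects whose true configuration is 3-dimensional,
# so a 2-dimensional MDS solution cannot reproduce the distances exactly.
X = rng.normal(size=(10, 3))

# Pairwise Euclidean distance matrix D.
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)

# Classical (Torgerson) MDS: double-center the squared distances.
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n    # centering matrix
B = -0.5 * J @ (D ** 2) @ J            # inner-product matrix

# Eigendecomposition; large positive eigenvalues mark the dimensions needed.
eigvals, eigvecs = np.linalg.eigh(B)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]   # descending order

def fitted_distances(k):
    """Reconstruct distances from the top-k-dimensional MDS configuration."""
    Y = eigvecs[:, :k] * np.sqrt(np.maximum(eigvals[:k], 0.0))
    return np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)

for k in (1, 2, 3):
    err = np.abs(fitted_distances(k) - D).max()
    print(f"{k}-dimensional solution: max distance error = {err:.4f}")
```

With this synthetic 3-dimensional configuration, the 1- and 2-dimensional fits leave visible distance error, while the 3-dimensional fit recovers the distances essentially exactly — the same diagnostic the text describes for deciding when a higher-dimensional space is required.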
Page 123
... dimensional solutions and 18 stimuli for three-dimensional solutions. Kruskal and Wish (1978) suggest 9 stimuli for two-dimensional solutions, 13 stimuli for three-dimensional solutions, and 17 stimuli for four-dimensional ...
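The Kruskal and Wish (1978) guideline quoted in the excerpt is just a lookup of minimum stimulus counts by dimensionality. A hypothetical helper (the function name and interface are this sketch's own, using only the counts given above) might encode it as:

```python
# Rule-of-thumb minimums quoted from Kruskal and Wish (1978) in the excerpt.
KRUSKAL_WISH_MIN_STIMULI = {2: 9, 3: 13, 4: 17}

def enough_stimuli(n_stimuli: int, n_dimensions: int) -> bool:
    """Return True if n_stimuli meets the suggested minimum for an MDS
    solution in n_dimensions (only 2-4 dimensions are tabulated)."""
    return n_stimuli >= KRUSKAL_WISH_MIN_STIMULI[n_dimensions]

print(enough_stimuli(10, 2))  # True: 10 >= 9
print(enough_stimuli(10, 3))  # False: 10 < 13
```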
Page 425
... dimensional plane (since μ is p-dimensional), (11.5-12) is a statement that the points P1, P2, ... lie on an (r = p - s)-dimensional plane. To test this hypothesis, assume that samples n1, n2, ..., nk are respectively ...
Contents
SELECTED ASPECTS OF MULTIVARIATE ANALYSIS (p. 1)
PRINCIPAL COMPONENTS ANALYSIS (p. 26)
FACTOR ANALYSIS (p. 56)
Copyright
12 other sections not shown
Common terms and phrases
algorithm approach associated assumptions B₁ B₂ canonical correlation analysis canonical variate causal Chapter cluster column common factors computed correlation matrix corresponding covariance matrix criterion data matrix defined deletion denoted derived space dimension dimensional discriminant analysis discriminant function discussed distance distribution effects eigenvalues endogenous variables equation error Euclidean distance example F-value factor analysis Figure given independent variables indicated individual KSI 1 KSI LAMBDA latent class model LISREL maximum likelihood mean measures methods multiple multiple discriminant analysis n₁ n₂ null hypothesis objects obtained orthogonal parameter estimates posterior probability predictor variables principal components analysis probability problem procedure regression analysis regression coefficients regression model relationship residuals restrictions rotation sample scores shown solution squared standard statistically significant stimulus space structure sums-of-squares Table techniques test statistic variance variance-covariance matrix vector X₁ Y₁ Y₂ zero