Other; Open Access

References

2003; Wiley; Language: English

DOI

10.1002/9780471722199.refs

ISSN

1940-6347

Authors

George A. F. Seber, Alan J. Lee

Topic(s)

Statistical and numerical algorithms

Abstract

Book Author(s): George A. F. Seber, Alan J. Lee. First published: 21 January 2003. Book Series: Wiley Series in Probability and Statistics. DOI: https://doi.org/10.1002/9780471722199.refs

References

Aitkin, M. (1987). Modeling variance heterogeneity in normal regression using GLIM. Appl. Stat., 36, 332–339.
Akaike, H. (1973). Information theory as an extension of the maximum likelihood principle. In B. N. Petrov and F. Csaki (Eds.), Proceedings, 2nd International Symposium on Information Theory. Budapest: Akademiai Kiado, pp. 267–281.
Albert, A. (1972). Regression and the Moore-Penrose Pseudoinverse. New York: Academic Press.
Allen, D. M. (1971). Mean squared error of prediction as a criterion for selecting variables. Technometrics, 13, 469–475.
Anda, A. A. and Park, H. (1994). Fast plane rotations with dynamic scaling. SIAM J. Matrix Anal. Appl., 15, 162–174.
Anderson, T. W. (1971). The Statistical Analysis of Time Series. New York: Wiley.
Andrews, D. F. and Pregibon, D. (1978). Finding the outliers that matter. J. R. Stat. Soc. B, 40, 85–93.
Atiqullah, M. (1962). The estimation of residual variance in quadratically balanced least squares problems and the robustness of the F-test. Biometrika, 49, 83–91.
Atkinson, A. C. (1978). Posterior probabilities for choosing a regression model. Biometrika, 65, 39–48.
Atkinson, A. C. (1985). Plots, Transformations and Regression. Oxford: Clarendon Press.
Atkinson, A. C. (1986). Masking unmasked. Biometrika, 73, 533–541.
Atkinson, A. C. and Weisberg, S. (1991). Simulated annealing for the detection of multiple outliers using least squares and least median of squares fitting. In W. Stahel and S. Weisberg (Eds.), Directions in Robust Statistics and Diagnostics. New York: Springer-Verlag, pp. 7–20.
Azzalini, A. (1996). Statistical Inference Based on the Likelihood. New York: Chapman & Hall.
Barrodale, I. and Roberts, F. D. K. (1974). Algorithm 478: Solution of an overdetermined system of equations in the L1 norm. Commun. ACM, 17, 319–320.
Bartels, R. H., Conn, A. R. and Sinclair, J. W. (1978). Minimization techniques for piecewise differentiable functions: The L1 solution to an overdetermined system. SIAM J. Numer. Anal., 15, 224–241.
Becker, R. A., Cleveland, W. S. and Weil, G. (1988). The use of brushing and rotation for data analysis. In Dynamic Graphics for Statistics. Pacific Grove, CA: Wadsworth.
Belsley, D. A. (1984). Demeaning conditioning diagnostics through centering. Am. Stat., 38, 73–77.
Belsley, D. A. (1991). Conditioning Diagnostics: Collinearity and Weak Data in Regression. New York: Wiley.
Belsley, D. A., Kuh, E. and Welsch, R. E. (1980). Regression Diagnostics: Identifying Influential Data and Sources of Collinearity. New York: Wiley.
Bendel, R. B. and Afifi, A. A. (1977). Comparison of stopping rules in forward "stepwise" regression. J. Am. Stat. Assoc., 72, 46–53.
Berger, J. O. and Pericchi, L. R. (1996). The intrinsic Bayes factor for model selection and prediction. J. Am. Stat. Assoc., 91, 109–122.
Berk, K. N. (1978). Comparing subset regression procedures. Technometrics, 20, 1–6.
Berk, K. N. and Booth, D. E. (1995). Seeing a curve in multiple regression. Technometrics, 37, 385–398.
Bickel, P. J. and Doksum, K. A. (1981). An analysis of transformations revisited. J. Am. Stat. Assoc., 76, 296–311.
Birkes, D. and Dodge, Y. (1993). Alternative Methods of Regression. New York: Wiley.
Björck, Å. (1996). Numerical Methods for Least Squares Problems. Philadelphia: SIAM.
Björck, Å. and Paige, C. C. (1992). Loss and recapture of orthogonality in the modified Gram-Schmidt algorithm. SIAM J. Matrix Anal. Appl., 13, 176–190.
Björck, Å. and Paige, C. C. (1994). Solution of augmented linear systems using orthogonal factorizations. BIT, 34, 1–26.
Bloomfield, P. and Steiger, W. L. (1980). Least absolute deviations curve-fitting. SIAM J. Sci. Stat. Comput., 1, 290–301.
Bloomfield, P. and Steiger, W. L. (1983). Least Absolute Deviations: Theory, Applications and Algorithms. Boston: Birkhäuser.
Bohrer, R. (1973). An optimality property of Scheffé bounds. Ann. Stat., 1, 766–772.
Bohrer, R. and Francis, G. K. (1972). Sharp one-sided confidence bounds for linear regression over intervals. Biometrika, 59, 99–107.
Bowden, D. C. (1970). Simultaneous confidence bands for linear regression models. J. Am. Stat. Assoc., 65, 413–421.
Bowden, D. C. and Graybill, F. A. (1966). Confidence bands of uniform and proportional width for linear models. J. Am. Stat. Assoc., 61, 182–198.
Box, G. E. P. (1966). Use and abuse of regression. Technometrics, 8, 625–629.
Box, G. E. P. and Cox, D. R. (1964). An analysis of transformations. J. R. Stat. Soc. B, 26, 211–252.
Box, G. E. P. and Cox, D. R. (1982). An analysis of transformations revisited, rebutted. J. Am. Stat. Assoc., 77, 209–210.
Box, G. E. P. and Tiao, G. C. (1973). Bayesian Inference in Statistical Analysis. Reading, MA: Addison-Wesley.
Box, G. E. P. and Watson, G. S. (1962). Robustness to non-normality of regression tests. Biometrika, 49, 93–106.
Breiman, L. (1992). The little bootstrap and other methods for dimensionality selection in regression: X-fixed prediction error. J. Am. Stat. Assoc., 87, 738–754.
Breiman, L. (1995). Better subset selection using the nonnegative garrote. Technometrics, 37, 373–384.
Breiman, L. (1996a). Stacked regressions. Machine Learning, 24, 49–64.
Breiman, L. (1996b). Heuristics of instability and stabilization in model selection. Ann. Stat., 24, 2350–2383.
Broersen, P. M. T. (1986). Subset regression with stepwise directed search. Appl. Stat., 35, 168–177.
Brown, P. J. (1977). Centering and scaling in ridge regression. Technometrics, 19, 35–36.
Brown, P. J. (1993). Measurement, Regression and Calibration. Oxford: Clarendon Press.
Brown, M. B. and Forsythe, A. B. (1974). Robust tests for equality of variance. J. Am. Stat. Assoc., 69, 364–367.
Brunk, H. D. (1965). An Introduction to Mathematical Statistics, 2nd ed. Waltham, MA: Blaisdell.
Buckland, S. T., Burnham, K. P. and Augustin, N. H. (1997). Model selection: An integral part of inference. Biometrics, 53, 603–618.
Burnham, K. P. and Anderson, D. R. (1998). Model Selection and Inference: A Practical Information-Theoretic Approach. New York: Springer-Verlag.
Canner, P. L. (1969). Some curious results using minimum variance linear unbiased estimators. Am. Stat., 23(5), 39–40.
Carlin, B. P. and Chib, S. (1995). Bayesian model choice via Markov chain Monte Carlo methods. J. R. Stat. Soc. B, 57, 473–484.
Carlstein, E. (1986). Simultaneous confidence intervals for predictions. Am. Stat., 40, 277–279.
Carroll, R. J. (1980). A robust method for testing transformations to achieve approximate normality. J. R. Stat. Soc. B, 42, 71–78.
Carroll, R. J. (1982a). Adapting for heteroscedasticity in linear models. Ann. Stat., 10, 1224–1233.
Carroll, R. J. (1982b). Two examples of transformations where there are possible outliers. Appl. Stat., 31, 149–152.
Carroll, R. J. and Cline, D. B. H. (1988). An asymptotic theory for weighted least-squares with weights estimated by replication. Biometrika, 75, 35–43.
Carroll, R. J. and Davidian, M. (1987). Variance function estimation. J. Am. Stat. Assoc., 82, 1079–1091.
Carroll, R. J. and Ruppert, D. (1981). On prediction and the power transformation family. Biometrika, 68, 609–615.
Carroll, R. J. and Ruppert, D. (1982). Robust estimation in heteroscedastic linear models. Ann. Stat., 10, 429–441.
Carroll, R. J. and Ruppert, D. (1984). Power transformations when fitting theoretical models to data. J. Am. Stat. Assoc., 79, 321–328.
Carroll, R. J. and Ruppert, D. (1985). Transformations in regression: A robust analysis. Technometrics, 27, 1–12.
Carroll, R. J. and Ruppert, D. (1988). Transformations and Weighting in Regression. New York: Chapman & Hall.
Chambers, J. M., Cleveland, W. S., Kleiner, B. and Tukey, P. A. (1983). Graphical Methods for Data Analysis. Boston: Duxbury Press.
Chan, T. F., Golub, G. H. and LeVeque, R. J. (1983). Algorithms for computing the sample variance: Analysis and recommendations. Am. Stat., 37, 242–247.
Chang, W. H., McKean, J. W., Naranjo, J. D. and Sheather, S. J. (1999). High-breakdown rank regression. J. Am. Stat. Assoc., 94, 205–219.
Chatfield, C. (1998). Durbin-Watson test. In P. Armitage and T. Colton (Eds.), Encyclopedia of Biostatistics, Vol. 2. New York: Wiley, pp. 1252–1253.
Chatterjee, S. and Hadi, A. S. (1988). Sensitivity Analysis in Linear Regression. New York: Wiley.
Cheney, E. W. (1966). Introduction to Approximation Theory. New York: McGraw-Hill.
Clenshaw, C. W. (1955). A note on the summation of Chebyshev series. Math. Tables Aids Comput., 9, 118.
Clenshaw, C. W. (1960). Curve fitting with a digital computer. Comput. J., 2, 170.
Clenshaw, C. W. and Hayes, J. G. (1965). Curve and surface fitting. J. Inst. Math. Appl., 1, 164–183.
Cleveland, W. S. (1979). Robust locally-weighted regression and smoothing scatterplots. J. Am. Stat. Assoc., 74, 829–836.
Cleveland, W. S. and Devlin, S. J. (1988). Locally weighted regression: An approach to regression analysis by local fitting. J. Am. Stat. Assoc., 83, 596–610.
Coakley, C. and Hettmansperger, T. P. (1993). A bounded-influence, high breakdown, efficient regression estimator. J. Am. Stat. Assoc., 88, 872–880.
Cochran, W. G. (1938). The omission or addition of an independent variate in multiple linear regression. J. R. Stat. Soc. Suppl., 5, 171–176.
Conover, W. J., Johnson, M. E. and Johnson, M. M. (1981). A comparative study of tests for homogeneity of variances, with applications to the outer continental shelf bidding data. Technometrics, 23, 351–361.
Cook, R. D. (1977). Detection of influential observations in linear regression. Technometrics, 19, 15–18.
Cook, R. D. (1993). Exploring partial residual plots. Technometrics, 35, 351–362.
Cook, R. D. (1994). On the interpretation of regression plots. J. Am. Stat. Assoc., 89, 177–189.
Cook, R. D. (1998). Regression Graphics: Ideas for Studying Regressions through Graphics. New York: Wiley.
Cook, R. D. and Wang, P. C. (1983). Transformations and influential cases in regression. Technometrics, 25, 337–343.
Cook, R. D. and Weisberg, S. (1982). Residuals and Influence in Regression. New York: Chapman & Hall.
Cook, R. D. and Weisberg, S. (1983). Diagnostics for heteroscedasticity in regression. Biometrika, 70, 1–10.
Cook, R. D. and Weisberg, S. (1994). An Introduction to Regression Graphics. New York: Wiley.
Cook, R. D. and Weisberg, S. (1999). Applied Regression Including Computing and Graphics. New York: Wiley.
Cook, R. D., Hawkins, D. M. and Weisberg, S. (1992). Comparison of model misspecification diagnostics using residuals from least mean of squares and least median of squares fits. J. Am. Stat. Assoc., 87, 419–424.
Cooper, B. E. (1968). The use of orthogonal polynomials: Algorithm AS 10. Appl. Stat., 17, 283–287.
Cooper, B. E. (1971a). The use of orthogonal polynomials with equal x-values: Algorithm AS 42. Appl. Stat., 20, 208–213.
Cooper, B. E. (1971b). A remark on algorithm AS 10. Appl. Stat., 20, 216.
Cox, C. P. (1971). Interval estimating for X-predictions from linear Y-on-X regression lines through the origin. J. Am. Stat. Assoc., 66, 749–751.
Cox, D. R. and Hinkley, D. V. (1968). A note on the efficiency of least squares estimates. J. R. Stat. Soc. B, 30, 284–289.
Cox, D. R. and Hinkley, D. V. (1974). Theoretical Statistics. London: Chapman & Hall.
Craven, P. and Wahba, G. (1979). Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation. Numer. Math., 31, 377–403.
Croux, C., Rousseeuw, P. J. and Hössjer, O. (1994). Generalized S-estimators. J. Am. Stat. Assoc., 89, 1271–1281.
David, H. A. (1981). Order Statistics, 2nd ed. New York: Wiley.
Davies, R. B. and Hutton, B. (1975). The effects of errors in the independent variables in linear regression. Biometrika, 62, 383–391. Correction: 64, 655.
Davis, P. (1975). Interpolation and Approximation. New York: Dover.
Davison, A. C. and Hinkley, D. V. (1997). Bootstrap Methods and Their Application. Cambridge: Cambridge University Press.
De Boor, C. (1978). A Practical Guide to Splines. Berlin: Springer-Verlag.
Dempster, A. P. and Gasko-Green, M. (1981). New tools for residual analysis. Ann. Stat., 9, 945–959.
Dempster, A. P., Schatzoff, M. and Wermuth, N. (1977). A simulation study of alternatives to ordinary least squares. J. Am. Stat. Assoc., 72, 77–106.
Diehr, G. and Hoflin, D. R. (1974). Approximating the distribution of the sample R2 in best subset regressions. Technometrics, 16, 317–320.
Dierckx, P. (1993). Curve and Surface Fitting with Splines. Oxford: Clarendon Press.
Dodge, Y. (Ed.) (1987). Statistical Data Analysis Based on the L1 Norm and Related Methods. Amsterdam: North-Holland.
Draper, D. (1995). Assessment and propagation of model uncertainty (with discussion). J. R. Stat. Soc. B, 57, 45–98.
Draper, N. R. and Cox, D. R. (1969). On distributions and their transformation to normality. J. R. Stat. Soc. B, 31, 472–476.
Draper, N. R. and Smith, H. (1998). Applied Regression Analysis, 3rd ed. New York: Wiley.
Draper, N. R. and Van Nostrand, R. C. (1979). Ridge regression and James-Stein estimation: Review and comments. Technometrics, 21, 451–466.
Draper, N. R., Guttman, I. and Kanemasu, H. (1971). The distribution of certain regression statistics. Biometrika, 58, 295–298.
Dunn, O. J. (1959). Confidence intervals for the means of dependent, normally distributed variables. J. Am. Stat. Assoc., 54, 613–621.
Dunn, O. J. (1961). Multiple comparisons among means. J. Am. Stat. Assoc., 56, 52–64.
Dunn, O. J. (1968). A note on confidence bands for a regression line over a finite range. J. Am. Stat. Assoc., 63, 1029–1033.
Durbin, J. and Watson, G. S. (1950). Testing for serial correlation in least squares regression. I. Biometrika, 37, 409–428.
Durbin, J. and Watson, G. S. (1951). Testing for serial correlation in least squares regression. II. Biometrika, 38, 159–178.
Durbin, J. and Watson, G. S. (1971). Testing for serial correlation in least squares regression. III. Biometrika, 58, 1–19.
Efron, B. and Morris, C. (1973). Stein's estimation rule and its competitors—an empirical Bayes approach. J. Am. Stat. Assoc., 68, 117–130.
Efroymson, M. A. (1960). Multiple regression analysis. In A. Ralston and H. S. Wilf (Eds.), Mathematical Methods for Digital Computers, 1, 191–203.
Eicker, F. (1963). Asymptotic normality and consistency of the least squares estimators for families of linear regressions. Ann. Math. Stat., 34, 447–456.
Eilers, P. H. C. and Marx, B. D. (1996). Flexible smoothing with B-splines and penalties. Stat. Sci., 11, 89–121.
Eubank, R. L. (1984). Approximate regression models and splines. Commun. Stat. A, 13, 485–511.
Eubank, R. L. (1999). Nonparametric Regression and Spline Smoothing, 2nd ed. New York: Marcel Dekker.
Evans, M. and Swartz, T. (1995). Methods for approximating integrals in statistics with special emphasis on Bayesian integration problems. Stat. Sci., 10, 254–272.
Ezekiel, M. (1924). A method of handling curvilinear correlation for any number of variables. J. Am. Stat. Assoc., 19, 431–453.
Ezekiel, M. and Fox, K. A. (1959). Methods of Correlation and Regression Analysis, 3rd ed. New York: Wiley.
Fan, J. and Gijbels, I. (1996). Local Polynomial Modelling and Its Applications. London: Chapman & Hall.
Farebrother, R. W. (1990). Algorithm AS 256: The distribution of a quadratic form in normal variables. Appl. Stat., 23, 470–476.
Farley, J. U. and Hinich, M. J. (1970). A test for a shifting slope coefficient in a linear model. J. Am. Stat. Assoc., 65, 1320–1329.
Feller, W. (1968). An Introduction to Probability Theory and Its Applications, 3rd ed. New York: Wiley.
Fieller, E. C. (1940). The biological standardization of insulin. J. R. Stat. Soc. Suppl., 7, 1–64.
Fisher, R. A. and Yates, F. (1957). Statistical Tables for Biological, Agricultural, and Medical Research, 5th ed. London: Oliver and Boyd.
Fletcher, R. (1987). Practical Methods of Optimization, 2nd ed. New York: Wiley.
Forsythe, G. E. (1957). Generation and use of orthogonal polynomials for data-fitting with a digital computer. J. Soc. Ind. Appl. Math., 5, 74–87.
Frank, I. E. and Friedman, J. H. (1993). A statistical view of some chemometrics regression tools. Technometrics, 35, 109–148.
Freedman, D. A. (1983). A note on screening regression equations. Am. Stat., 37, 152–155.
Fuller, W. A. (1987). Measurement Error Models. New York: Wiley.
Fuller, W. A. and Rao, J. N. K. (1978). Estimation for a linear regression model with unknown diagonal covariance matrix. Ann. Stat., 6, 1149–1158.
Furnival, G. M. (1971). All possible regressions with less computation. Technometrics, 13, 403–408.
Furnival, G. M. and Wilson, R. W. (1974). Regression by leaps and bounds. Technometrics, 16, 499–511.
Gafarian, A. V. (1964). Confidence bands in straight line regression. J. Am. Stat. Assoc., 59, 182–213.
Garside, M. J. (1965). The best subset in multiple regression analysis. Appl. Stat., 14, 196–200.
Garthwaite, P. H. and Dickey, J. M. (1992). Elicitation of prior distributions for variable selection problems in regression. Ann. Stat., 20, 1697–1719.
Gelfand, A. E. and Dey, D. K. (1994). Bayesian model choice: Asymptotics and exact calculations. J. R. Stat. Soc. B, 56, 501–514.
Gelman, A., Carlin, J. B., Stern, H. S. and Rubin, D. B. (1995). Bayesian Data Analysis. London: Chapman & Hall.
Gentleman, W. M. (1973). Least squares computations by Givens transformations without square roots. J. Inst. Math. Appl., 10, 195–197.
George, E. I. (2000). The variable selection problem. J. Am. Stat. Assoc., 95, 1304–1308.
George, E. I. and McCulloch, R. E. (1993). Variable selection via Gibbs sampling. J. Am. Stat. Assoc., 88, 881–889.
Ghosh, M. N. and Sharma, D. (1963). Power of Tukey's tests for non-additivity. J. R. Stat. Soc. B, 25, 213–219.
Glaser, R. E. (1983). Levene's robust test of homogeneity of variances. In N. L. Johnson and C. B. Read (Eds.), Encyclopedia of Statistical Sciences, Vol. 4. New York: Wiley, pp. 608–610.
Golub, G. H. and Styan, G. P. H. (1974). Some aspects of numerical computations for linear models. In Proceedings, 7th Annual Symposium on the Interface, Iowa State University, Ames, IA, pp. 189–192.
Golub, G. H. and Van Loan, C. F. (1996). Matrix Computations, 3rd ed. Baltimore: Johns Hopkins University Press.
Golub, G. H., Heath, M. and Wahba, G. (1979). Generalized cross-validation as a method for choosing a good ridge parameter. Technometrics, 21, 215–223.
Good, I. J. (1969). Conditions for a quadratic form to have a chi-squared distribution. Biometrika, 56, 215–216.
Good, I. J. (1970). Correction to "Conditions for a quadratic form to have a chi-squared distribution." Biometrika, 57, 225.
Goodnight, J. (1979). A tutorial on the SWEEP operator. Am. Stat., 33, 149–158.
Graybill, F. A. (1961). An Introduction to Linear Statistical Models. New York: McGraw-Hill.
Graybill, F. A. and Bowden, D. C. (1967). Linear segment confidence bands for simple linear models. J. Am. Stat. Assoc., 62, 403–408.
Green, P. J. and Silverman, B. W. (1994). Nonparametric Regression and Generalized Linear Models. London: Chapman & Hall.
Grier, D. A. (1992). An extended sweep operator for the cross-validation of variable selection in linear regression. J. Stat. Comput. Simul., 43, 117–126.
Grossman, S. I. and Styan, G. P. H. (1972). Optimal properties of Theil's BLUS residuals. J. Am. Stat. Assoc., 67, 672–673.
Gujarati, D. (1970). Use of dummy variables in testing for equality between sets of coefficients in linear regressions: A generalization. Am. Stat., 24, 18–22.
Gunst, R. F. and Mason, R. L. (1977). Biased estimation in regression: An evaluation using mean squared error. J. Am. Stat. Assoc., 72, 616–628.
Gunst, R. F. and Mason, R. L. (1985). Outlier-induced collinearities. Technometrics, 27, 401–407.
Hadi, A. S. and Ling, R. F. (1998). Some cautionary notes on the use of principal components regression. Am. Stat., 52, 15–19.
Hadi, A. S. and Simonoff, J. S. (1993). Procedures for the identification of multiple outliers in linear models. J. Am. Stat. Assoc., 88, 1264–1272.
Hahn, G. J. (1972). Simultaneous prediction intervals for a regression model. Technometrics, 14, 203–214.
Hahn, G. J. and Hendrickson, R. W. (1971). A table of percentage points of the distribution of the largest absolute value of k Student t variates and its applications. Biometrika, 58, 323–332.
Halperin, M. and Gurian, J. (1968). Confidence bands in linear …
