Abstract:
One of the most important methods in statistics for estimating parameters is the resampling method. Estimating unknown parameters with resampling methods, however, demands a great deal of computation, which long made them difficult to use; with the widespread availability of computers, these methods now appear in many applications. Tukey and Quenouille first introduced jackknife methods to estimate the bias and the variance when estimating unknown parameters. Later, Efron used jackknife methods to estimate the variance of least squares estimators in linear regression models. In many cases, jackknife estimators performed successfully, but in some cases their results were far from the true values of the parameters. This happened because the jackknife estimators depend on various conditions that the model must fulfill: conditions related to the values of the independent variables, the properties of the matrix of independent variables, and the variances of the observation errors. We show in this paper that ordinary jackknife estimates of the variances of the least squares estimators of the unknown coefficients in linear regression models are not unbiased. Their accuracy depends on the variances of the linear regression model errors and on the nature of the matrix of observations of the independent variables. We find conditions under which the jackknife estimators are robust estimators (not influenced by the distribution of the sample elements) of the variances of the least squares estimators of the linear regression model coefficients, both when the model errors are homoscedastic (the errors have equal variances) and when they are heteroscedastic (the errors have unequal variances). We analyze the relationship between these conditions and show how they are connected to the eigenvalues of the matrix X^T X, where X is the matrix of the observed independent variables.
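To make the setting concrete, the following is a minimal sketch (not the paper's own method) of the delete-one jackknife variance estimate for least squares coefficients, compared against the classical estimate sigma^2 diag((X^T X)^{-1}). All data here are simulated for illustration, with an assumed homoscedastic model y = b0 + b1*x + e.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated homoscedastic linear model (illustrative assumption, not the paper's data)
n = 50
x = rng.uniform(0.0, 10.0, n)
X = np.column_stack([np.ones(n), x])   # design matrix with intercept
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(0.0, 1.0, n)

def ols(X, y):
    """Least squares coefficient estimates."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

beta_hat = ols(X, y)

# Delete-one jackknife: refit the model with each observation removed,
# then estimate the variance of the coefficients from the leave-one-out fits.
betas = np.array([ols(np.delete(X, i, axis=0), np.delete(y, i))
                  for i in range(n)])
jack_var = (n - 1) / n * np.sum((betas - betas.mean(axis=0)) ** 2, axis=0)

# Classical OLS variance estimate: sigma^2 * diag((X^T X)^{-1})
resid = y - X @ beta_hat
sigma2 = resid @ resid / (n - 2)
classic_var = sigma2 * np.diag(np.linalg.inv(X.T @ X))

print("jackknife variance:", jack_var)
print("classical variance:", classic_var)
```

Under homoscedastic errors and a well-conditioned X^T X the two estimates tend to agree closely; the paper's point is that this agreement can break down depending on the error variances and the structure of X^T X.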