Abstract The simple linear regression model with measurement error has been the subject of much research. In this work we focus on this model when the error in the explanatory variable is correlated with the error in the regression equation. Specifically, we are interested in comparing the ordinary errors-in-variables estimator of the regression coefficient β with the estimator that takes account of the correlation between the errors. Based on large-sample approximations, we compare the estimators and find that the estimator that accounts for the correlation should be preferred in most situations. We also compare the estimators in small-sample situations by stochastic simulation. The results show that the two estimators behave quite similarly in most of the simulated situations, but that the ordinary errors-in-variables estimator performs considerably worse than the correlation-adjusted estimator for certain parameter combinations. In addition, we look briefly into the bias introduced by ignoring the correlated errors when computing sample correlations and when making predictions.
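The comparison described above can be illustrated with a small simulation sketch. This is not the paper's own study design; all parameter values, the normality assumptions, and the moment-based form of the two estimators (the ordinary errors-in-variables estimator, which subtracts only the measurement-error variance, versus the adjusted estimator, which also subtracts the error covariance) are illustrative assumptions, with both error variances and their covariance treated as known.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameter values, chosen for illustration only
beta = 2.0       # true regression coefficient
sigma_x = 1.0    # sd of the true (unobserved) explanatory variable
sigma_d = 0.5    # sd of the measurement error delta
sigma_e = 0.5    # sd of the equation error epsilon
rho = 0.6        # correlation between delta and epsilon
n = 200          # sample size per replication
reps = 2000      # number of simulation replications

cov_de = rho * sigma_d * sigma_e  # covariance of the two errors

b_eiv, b_corr = [], []
for _ in range(reps):
    x = rng.normal(0.0, sigma_x, n)  # true explanatory variable
    # Draw correlated measurement and equation errors jointly
    errs = rng.multivariate_normal(
        [0.0, 0.0],
        [[sigma_d**2, cov_de], [cov_de, sigma_e**2]],
        n,
    )
    X = x + errs[:, 0]               # observed, error-prone covariate
    y = beta * x + errs[:, 1]        # response

    sXX = np.var(X, ddof=1)
    sXY = np.cov(X, y, ddof=1)[0, 1]
    # Ordinary errors-in-variables estimator: ignores the correlation
    b_eiv.append(sXY / (sXX - sigma_d**2))
    # Adjusted estimator: also removes the error covariance from sXY
    b_corr.append((sXY - cov_de) / (sXX - sigma_d**2))

print(f"ordinary EIV estimator: mean {np.mean(b_eiv):.3f}, sd {np.std(b_eiv):.3f}")
print(f"adjusted estimator:     mean {np.mean(b_corr):.3f}, sd {np.std(b_corr):.3f}")
```

Under this setup, Cov(X, y) = β·σ_x² + Cov(δ, ε), so dividing the uncorrected sample covariance by the corrected variance leaves a bias of Cov(δ, ε)/σ_x² in the ordinary estimator, which the adjusted estimator removes.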