Abstract

This paper examines the effects of sampling and analytical error on time trends derived from routine monitoring. Our analysis is based on actual concentration differences observed among three long sulfate series recorded by collocated and independent measurements at Shenandoah National Park. Five-year sulfate trends at this location are shown to include a one-sigma uncertainty of about 1% yr⁻¹ from measurement error alone. This is significantly more than would be estimated under naïve statistical assumptions from the demonstrated precision of the measurements. The excess uncertainty arises from subtle trends in the errors themselves.
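The abstract's closing claim, that slow trends in the measurement errors inflate trend uncertainty well beyond what i.i.d. error statistics would predict, can be illustrated with a small Monte Carlo sketch. All noise magnitudes below are hypothetical illustrations, not the paper's data: we compare the spread of fitted five-year slopes when the measurement error is purely independent noise versus when the same-magnitude noise also carries a small random drift.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: weekly sampling over a five-year record.
n_years = 5
t = np.arange(n_years * 52) / 52.0  # time in years

def fitted_slope(y, t):
    """Ordinary least-squares slope of y against t."""
    A = np.vstack([t, np.ones_like(t)]).T
    return np.linalg.lstsq(A, y, rcond=None)[0][0]

sigma = 0.05        # hypothetical 5% one-sigma random measurement error
drift_sigma = 0.01  # hypothetical 1%/yr one-sigma drift in the error itself
n_sim = 2000

# Case 1: purely independent (i.i.d.) measurement error.
slopes_iid = [
    fitted_slope(sigma * rng.standard_normal(t.size), t)
    for _ in range(n_sim)
]

# Case 2: the same random error plus a slow linear drift in the error,
# drawn fresh for each realization (a "subtle trend in the errors").
slopes_drift = []
for _ in range(n_sim):
    noise = sigma * rng.standard_normal(t.size)
    drift = rng.normal(0.0, drift_sigma) * (t - t.mean())
    slopes_drift.append(fitted_slope(noise + drift, t))

print("slope std, i.i.d. error:   ", np.std(slopes_iid))
print("slope std, drifting error: ", np.std(slopes_drift))
```

Under these illustrative numbers the drifting-error case yields a slope spread several times larger than the i.i.d. case, because a coherent drift in the error projects directly onto the fitted trend, whereas independent noise averages down with record length.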