In this paper, a thorough analysis, with full mathematical derivations, of the matched filter in a voltmeter used in electrical impedance tomography (EIT) systems is presented. The effect of random noise generated by components preceding the matched filter is considered. The presented equations allow system and circuit designers to determine the maximum tolerable noise ahead of the matched filter that still achieves the target signal-to-noise ratio (SNR) of the voltmeter, without over-designing internal components. A practical model was developed that is expected to fall within 2 dB and 5 dB of the median measured SNR of the signal amplitude and phase, respectively. To validate these claims, simulations and experimental measurements were performed with an analog-to-digital converter (ADC) followed by a digital matched filter, with the noise of the entire system modeled as input-referred noise at the ADC input. The input signal was contaminated with a known level of additive white Gaussian noise (AWGN), swept from 3% to 75% of the least significant bit (LSB) of the ADC. Differences between experimental and both simulated and analytical SNR values were less than 0.59 dB and 0.35 dB for RMS noise values ≥ 20% of an LSB, and less than 1.45 dB and 2.58 dB for RMS noise values < 20% of an LSB, for the amplitude and phase, respectively. Overall, this study provides a practical model for circuit designers in EIT and a more accurate error analysis previously missing from the EIT literature.
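The validation setup described above — a sinusoid digitized by an ADC, corrupted by input-referred AWGN expressed as a fraction of one LSB, then correlated against a digital matched filter — can be sketched as a Monte Carlo simulation. The following is a minimal, hypothetical sketch in NumPy, not the authors' implementation: the quantizer is assumed ideal, the sampling coherent, and the signal amplitude, record length, and trial count are illustration values chosen here, not taken from the paper.

```python
import numpy as np

def matched_filter_snr(noise_rms_lsb, n_samples=1024, n_trials=200,
                       amp_lsb=100.0, cycles=8, seed=0):
    """Estimate the amplitude SNR at a matched-filter output for a sinusoid
    digitized by an ideal quantizer, with AWGN given in fractions of one LSB.

    All parameter values are hypothetical illustration choices, not the
    paper's experimental settings.
    """
    rng = np.random.default_rng(seed)
    n = np.arange(n_samples)
    # In-phase and quadrature references for the digital matched filter.
    ref_i = np.cos(2 * np.pi * cycles * n / n_samples)
    ref_q = np.sin(2 * np.pi * cycles * n / n_samples)

    amps = []
    for _ in range(n_trials):
        sig = amp_lsb * np.cos(2 * np.pi * cycles * n / n_samples)
        # Input-referred AWGN, RMS specified as a fraction of one LSB.
        noisy = sig + noise_rms_lsb * rng.standard_normal(n_samples)
        code = np.round(noisy)  # ideal mid-tread quantizer, 1 LSB steps
        # Correlate against the references (coherent demodulation).
        i = 2.0 / n_samples * np.dot(code, ref_i)
        q = 2.0 / n_samples * np.dot(code, ref_q)
        amps.append(np.hypot(i, q))

    amps = np.asarray(amps)
    # Amplitude SNR in dB: mean recovered amplitude over its spread.
    return 20.0 * np.log10(np.mean(amps) / np.std(amps))
```

Sweeping `noise_rms_lsb` over, e.g., 0.03 to 0.75 reproduces the kind of experiment the abstract describes: the matched-filter output SNR degrades as the input-referred noise grows toward a full LSB.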