Input noise, defined as the root mean square of the fluctuations in the input, typically limits the performance of any system in engineering or biology. We show that three different performance measures scale identically as a function of the noise in a simple model of neuronal spiking that has both a voltage and a current threshold. These performance measures are: the probability of correctly detecting a constant input in a limited time, the signal-to-noise ratio in response to sinusoidal input, and the mutual information between an arbitrarily varying input and the output spike train of the model neuron. Of these, the detection of a constant signal is the simplest and most fundamental. For subthreshold signals, the model exhibits stochastic resonance: a non-zero noise amplitude optimally enhances signal detection. In this case, noise paradoxically does not limit but instead improves performance. This resonance arises from the interplay of two competing mechanisms: the noise-induced linearization (‘dithering’) of the model's firing rate and the noise-induced increase in the variability of the output spike count. Even though the noise amplitude dwarfs the signal, stochastic resonance makes it possible to detect a weak constant signal even when that signal elicits, on average, only one additional spike. Stochastic resonance could thus play a role in neurobiological sensory systems, where speed is of the utmost importance and averaging over many individual spikes is not possible.
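The essence of the subthreshold resonance described above can be illustrated with a minimal sketch. The code below is not the paper's voltage-and-current-threshold model; it assumes a memoryless threshold detector in which, at each time step, the constant signal plus Gaussian noise either crosses a fixed threshold (a spike) or not. The threshold `theta`, signal amplitude `signal`, and observation window `T` are illustrative choices. Detection performance is summarized by a discriminability index d′ between the spike counts with and without the signal, computed from the binomial mean and variance; d′ vanishes both for very small noise (the subthreshold signal never fires the unit) and for very large noise (spike-count variability swamps the signal-induced rate change), so it peaks at a non-zero noise amplitude:

```python
import math

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def spike_prob(signal, theta, sigma):
    """Per-timestep probability that signal + Gaussian noise (RMS sigma)
    exceeds the threshold theta."""
    return phi((signal - theta) / sigma)

def dprime(signal, theta, sigma, T):
    """Discriminability of 'signal present' vs 'signal absent' spike counts
    over T independent time steps, using the binomial mean and variance."""
    p1 = spike_prob(signal, theta, sigma)   # firing probability with signal
    p0 = spike_prob(0.0, theta, sigma)      # firing probability without signal
    mean_diff = T * (p1 - p0)
    var_avg = 0.5 * T * (p1 * (1 - p1) + p0 * (1 - p0))
    return mean_diff / math.sqrt(var_avg) if var_avg > 0 else 0.0

theta, signal, T = 1.0, 0.5, 100            # subthreshold: signal < theta
sigmas = [0.05 * k for k in range(1, 101)]  # noise RMS swept from 0.05 to 5.0
curve = [(s, dprime(signal, theta, s, T)) for s in sigmas]
best_sigma, best_d = max(curve, key=lambda c: c[1])
print(f"optimal noise RMS = {best_sigma:.2f}, peak d' = {best_d:.2f}")
```

Sweeping the noise amplitude and taking the argmax reproduces the resonance qualitatively: performance is near chance at the smallest noise, rises to a maximum at an intermediate noise amplitude, and degrades again as the noise grows large.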