We review and extend a Bayesian model selection technique for the detection of gravitational waves from neutron star ring-downs associated with pulsar glitches. The algorithm operates on power spectral densities constructed from overlapping time segments of gravitational-wave data; as a consequence, the original approach risked falsely identifying multiple signals where only one was present. We introduce an extension to the algorithm that uses posterior information on the frequency content of detected signals to cluster events together. The requirement of one detection per signal is now met, with the added benefit that confidence in the presence of a signal is strengthened by incorporating information from adjacent time segments.
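The clustering idea described above can be illustrated with a minimal sketch. Everything here is an assumption for illustration rather than the paper's implementation: each candidate detection is taken to carry a segment index, a posterior credible interval on signal frequency, and a per-segment detection statistic, and candidates in adjacent overlapping segments whose frequency intervals overlap are greedily merged into a single event. Summing the per-segment log odds is a stand-in for the boosted belief obtained by combining adjacent segments, not the actual evidence combination.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    segment: int      # index of the overlapping time segment
    f_lo: float       # lower bound of frequency posterior credible interval (Hz)
    f_hi: float       # upper bound (Hz)
    log_odds: float   # detection statistic for this segment (illustrative)

def cluster(candidates):
    """Greedily merge candidates from adjacent segments whose frequency
    posteriors overlap, so each signal yields a single event."""
    events = []
    for c in sorted(candidates, key=lambda c: c.segment):
        for ev in events:
            last = ev[-1]
            adjacent = c.segment - last.segment <= 1
            overlaps = c.f_lo <= last.f_hi and last.f_lo <= c.f_hi
            if adjacent and overlaps:
                ev.append(c)
                break
        else:
            events.append([c])
    # Combine segments: pooled statistic plus the merged frequency range.
    return [(sum(c.log_odds for c in ev),
             min(c.f_lo for c in ev),
             max(c.f_hi for c in ev)) for ev in events]

cands = [Candidate(0, 1498.0, 1502.0, 3.5),
         Candidate(1, 1499.0, 1503.0, 2.5),   # same signal, adjacent segment
         Candidate(5, 2210.0, 2216.0, 4.0)]   # distinct signal
print(cluster(cands))
# → [(6.0, 1498.0, 1503.0), (4.0, 2210.0, 2216.0)]
```

The first two candidates, which would have been reported as two detections in the original scheme, are merged into one event with a larger pooled statistic, while the isolated candidate remains a separate event.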