We examine the decelerating shock instability (DSI) with linear perturbation theory, and we investigate the effect of self-gravity on the instability of a decelerating shocked layer. We find that the characteristics of the DSI can be obtained with a thin-layer approximation, and that the DSI is driven by the combination of the column-density fluctuation induced by the rippling of the shocked layer and the deceleration of the layer as a whole. There is a characteristic wavenumber $k_S \equiv 2\beta^{1/2}(1-\beta)^{-1/2}\,|\dot{V}_{S0}|/c_S^2$ in the dispersion relation of the DSI, where $\beta$ is the ratio of the pressure on the trailing side of the shocked layer to that on the leading side, $\dot{V}_{S0}$ is the deceleration of the shock wave, and $c_S$ is the sound speed. The thermal pressure in the layer has a stabilizing effect, especially at $k \gtrsim k_{\max}$. When self-gravity plays an important role, the dispersion relation is found to be characterized by the parameter $\alpha \equiv k_S/k_G$, where $k_G \equiv 2\pi G\sigma_0/c_S^2$, $G$ being the gravitational constant and $\sigma_0$ the column density of the shocked layer. In the case $\alpha > 1$, the layer is unstable only to the DSI. If $\alpha \ll 1$, there are two types of instability: for wavenumbers $k \ll (k_G - k_S)/g(\alpha,\beta)$, where $g(\alpha,\beta)$ is a factor of order unity, the layer is gravitationally unstable, while for $(k_G - k_S)/g(\alpha,\beta) \ll k \ll (k_G + k_S)/g(\alpha,\beta)$ the layer is overstable, and the instability is the DSI mode. These dispersion relations may be related to the hierarchical structure of clumps and cores observed in star-forming regions around H II regions.
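As a minimal numerical sketch (not part of the paper), the two characteristic wavenumbers defined above, $k_S = 2\beta^{1/2}(1-\beta)^{-1/2}|\dot{V}_{S0}|/c_S^2$ and $k_G = 2\pi G\sigma_0/c_S^2$, together with the regime parameter $\alpha = k_S/k_G$, can be evaluated directly; the function names and the dimensionless units used in the example are illustrative assumptions, not from the source.

```python
import math

def k_S(beta, Vdot_S0, c_S):
    """DSI wavenumber: k_S = 2 beta^(1/2) (1 - beta)^(-1/2) |Vdot_S0| / c_S^2.

    beta is the trailing-to-leading pressure ratio (0 < beta < 1),
    Vdot_S0 the deceleration of the shock, c_S the sound speed.
    """
    return 2.0 * math.sqrt(beta / (1.0 - beta)) * abs(Vdot_S0) / c_S**2

def k_G(G, sigma0, c_S):
    """Gravitational wavenumber: k_G = 2 pi G sigma0 / c_S^2."""
    return 2.0 * math.pi * G * sigma0 / c_S**2

def alpha(beta, Vdot_S0, G, sigma0, c_S):
    """Regime parameter alpha = k_S / k_G.

    alpha > 1: the layer is unstable only to the DSI;
    alpha << 1: a gravitationally unstable band at small k and a
    DSI (overstable) band at intermediate k, per the abstract.
    """
    return k_S(beta, Vdot_S0, c_S) / k_G(G, sigma0, c_S)

# Example in dimensionless units (G = c_S = sigma0 = |Vdot_S0| = 1, beta = 0.5):
print(k_S(0.5, 1.0, 1.0))                 # -> 2.0
print(k_G(1.0, 1.0, 1.0))                 # -> 2*pi ~ 6.283
print(alpha(0.5, 1.0, 1.0, 1.0, 1.0))    # ~ 0.318, i.e. the alpha << 1 regime
```

With $\beta = 1/2$ the factor $\beta^{1/2}(1-\beta)^{-1/2}$ equals unity, so $k_S$ reduces to $2|\dot{V}_{S0}|/c_S^2$, which makes the example easy to check by hand.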