Step detection is critical for many applications, including health monitoring and indoor navigation. However, achieving robust step detection across all types of human gait and sensor locations on the user's body remains challenging. The challenge increases for blind people, whose gait differs from that of sighted people and is affected by the use of navigation aids. In this study, we propose and evaluate a new machine-learning-based step detection method: Smartstep. The advantages of this method are that it relies neither on sensor-position, step-mode, or hand-motion-mode pre-classification, nor on any threshold calibration. The method has already shown promising performance, with 99% recall and precision, when applied to young adults' gait under challenging conditions. In this study, we examine the method's ability to generalize to blind gait. Performance is assessed on two datasets of blind people walking under various challenging conditions (different walking speeds, smartphone placements, hand motion modes, sensor types, and navigation aids). Smartstep achieves 99% precision (a 1% overcount rate) and 90% recall (a 10% undercount rate). This study demonstrates the robustness of the method and encourages its use for other applications and populations.