Ubiquitous Robotics is a novel paradigm aimed at addressing the coordinated behaviour of robots in environments that are themselves intelligent. To this end, suitable methods for enforcing cooperative activities must be devised and assessed. This article introduces a formalism for encoding spatio-temporal situations whose occurrences must be detected by a context-aware system. The Situation Definition Language is a tool for specifying relationships among classes of sensory data in distributed systems (such as those adhering to the Ubiquitous Robotics paradigm), without making any assumptions about how the data themselves are acquired. The capabilities offered by the language are discussed with respect to a real-world scenario, in which a team of mobile robots cooperates with an intelligent environment to perform service tasks. Specifically, the article focuses on the system's ability to combine, in a centralized representation, information originating from distributed sources, either mobile (i.e., the robots) or fixed (i.e., the intelligent devices in the network).