Wearable devices embed inertial sensors whose records are processed for navigation instructions, health assessment, sports training or monitoring changes in mobility behaviour. These applications process inertial or telecommunication signals sensed in our clothes, shoes and glasses. Increasingly complex methods, often based on artificial intelligence, are developed to process these data, but their designers sometimes forget that human behaviour defies the methods they develop.
Defining the minimum performance requirements for a targeted application, calibrating embedded sensors and accounting for the hardware constraints of the wearables are classical R&D steps. The influence that humans can have on the quality of the measurements (signal attenuation by the human body, changes of behaviour, ageing, etc.) is, however, often overlooked.
In this presentation, we will analyse the observability of gait parameters and navigation data from signals sensed by devices worn on different body parts (upper and lower body). We will also examine how the gait of a single person varies across kinematic contexts (e.g., visually impaired people guided by a cane or by a dog). The analysis will be supported by a theoretical and experimental approach using inertial signals and GNSS carrier-phase and pseudo-range data collected by pedestrians.
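As a minimal illustration of the kind of gait-parameter extraction discussed above, the sketch below counts steps by detecting upward threshold crossings of the accelerometer-magnitude signal. The threshold, debouncing gap, and synthetic 2 Hz cadence are illustrative assumptions, not values from the presentation; real signals would first need the sensor calibration and body-placement considerations the abstract emphasises.

```python
import math

def count_steps(acc_norm, threshold=10.5, min_gap=20):
    """Count steps as upward crossings of a threshold on the
    accelerometer-magnitude signal (in m/s^2), with a minimum
    sample gap between consecutive steps as a simple debounce.
    Threshold and gap are illustrative, not calibrated values."""
    steps = 0
    last = -min_gap
    for i in range(1, len(acc_norm)):
        if acc_norm[i - 1] < threshold <= acc_norm[i] and i - last >= min_gap:
            steps += 1
            last = i
    return steps

# Synthetic example: a 2 Hz gait cadence sampled at 100 Hz for 5 s,
# modelled as gravity plus a sinusoidal vertical oscillation.
fs, f_step, dur = 100, 2.0, 5.0
signal = [9.81 + 2.0 * math.sin(2 * math.pi * f_step * t / fs)
          for t in range(int(fs * dur))]
print(count_steps(signal))  # → 10 (2 steps/s over 5 s)
```

On real wearable data, the fixed threshold would have to adapt to sensor placement (wrist versus foot) and to the per-person gait variability that the presentation highlights.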