Abstract
Advances in sensor technologies have substantially reshaped human-machine interaction (HMI), enabling more intuitive, robust, and efficient interaction across application domains such as healthcare, robotics, and industrial automation. This review systematically covers next-generation sensing approaches essential to future HMI systems, addressing radar-based sensing, vision-based techniques, haptic feedback and teleoperation, as well as wearable sensors and wireless sensor networks (WSNs). Radar sensors are discussed for their reliability in challenging conditions and their suitability for gesture recognition and vital-sign monitoring. Vision-based sensors, spanning monocular, stereo, near-infrared, and thermal imaging, provide the detailed spatial and semantic information required for gesture tracking and remote physiological monitoring. Haptic sensors and AI-driven teleoperation techniques are examined for their pivotal roles in enabling safe, precise remote interaction in medical and industrial contexts. The paper further explores IoT-enabled wearable sensors and WSNs, highlighting their application in continuous health monitoring and personalized healthcare. Sensor fusion strategies are discussed, emphasizing their potential to integrate complementary data streams effectively. Cross-cutting challenges such as computational efficiency, sensor integration, ethical considerations, and sustainability are identified and analyzed. By consolidating recent progress and discussing practical implementation issues, this review offers insight into current research trends and outlines key directions for future advances in sensor-driven HMI systems.
Keywords
Haptic sensors
Human-machine interaction (HMI)
Radar sensing
Remote healthcare
Vision-based sensing
Wearable sensors
Wireless sensor networks (WSNs)