Qi, Xinyu (2024) Securing teleoperated robot: Classifying human operator identity and emotion through motion-controlled robotic behaviors. PhD thesis, University of Glasgow.
Full text available as: PDF (6MB)
Abstract
Teleoperated robotic systems allow human operators to control robots from a distance, mitigating the constraints of physical distance between operators and robots and offering invaluable real-world applications. However, the security of these systems is a critical concern. System attacks and the potential impact of operators’ inappropriate emotions can result in misbehavior of the remote robots, posing risks to the remote environment. These concerns become particularly serious when performing mission-critical tasks, such as nuclear clean-up. This thesis explored innovative security methods for teleoperated robotic systems. Common security methods applicable to teleoperated robots include encryption, robot misbehavior detection and user authentication; however, each has limitations for teleoperated robot systems. Encryption adds communication overhead to the system. Robot misbehavior detection can only detect unusual signals on robot devices. User authentication secures the system primarily at the access point. To address this, we built motion-controlled robot platforms that allow for robot teleoperation and proposed methods of performing user classification directly on remote-controlled robotic behavioral data, enhancing security integrity throughout the operation. In Chapter 3 we discussed this approach and conducted four experiments. Experiments 1 and 2 demonstrated the effectiveness of our approach, achieving user classification accuracies of 95% and 93% on the two platforms respectively, using motion-controlled robotic end-effector trajectories. The results of Experiment 3 further indicated that control system performance directly impacts user classification efficacy. Additionally, in Experiment 4 we deployed an AI agent to protect user biometric identities, ensuring the robot’s actions do not compromise user privacy in the remote environment. This chapter provided the methodological and experimental foundation for the subsequent work.

Additionally, operators’ emotions can pose a security threat to the robot system. A remote robot operator’s emotions can significantly affect the resulting robot motions, leading to unexpected consequences even when the user follows protocol and performs only permitted tasks. The recognition of an operator’s emotions in remote robot control scenarios is, however, under-explored. Emotion signals mainly comprise physiological signals, semantic information, facial expressions and bodily movements. However, most physiological signals are electrical signals that are vulnerable to motion artifacts, so accurate signals cannot be acquired, making them unsuitable for teleoperated robot systems. Semantic information and facial expressions are sometimes inaccessible, raise significant privacy concerns, and require additional sensors in the teleoperated system. In Chapter 4 we proposed methods of emotion recognition through motion-controlled robotic behaviors. This work demonstrated for the first time that a motion-controlled robotic arm can inherit its human operator’s emotions and that those emotions can be classified from robotic end-effector trajectories, achieving 83.3% accuracy. We developed two emotion recognition algorithms using Dynamic Time Warping (DTW) and a Convolutional Neural Network (CNN), deriving unique emotional features from the avatar’s end-effector motions and joint spatial-temporal characteristics.
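To illustrate the DTW-based idea described above, the following is a minimal sketch, not the thesis's actual implementation, of assigning an emotion label to a recorded end-effector trajectory by nearest-neighbour DTW distance against per-emotion template trajectories. All function names, data shapes and the synthetic trajectories are illustrative assumptions.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two trajectories.

    a, b: arrays of shape (T, D) -- T timesteps, D spatial dimensions
    (e.g. x, y, z of the robot end-effector). Shapes are assumptions.
    """
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # pointwise Euclidean cost
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def classify_emotion(query, templates):
    """Return the label of the template trajectory nearest to the query by DTW."""
    return min(templates, key=lambda t: dtw_distance(query, t[1]))[0]

# Illustrative usage with synthetic end-effector trajectories (not thesis data).
rng = np.random.default_rng(0)
calm  = np.cumsum(rng.normal(0, 0.01, (100, 3)), axis=0)   # smooth, low-variance motion
tense = np.cumsum(rng.normal(0, 0.05, (120, 3)), axis=0)   # jerkier, high-variance motion
templates = [("calm", calm), ("tense", tense)]
query = np.cumsum(rng.normal(0, 0.05, (110, 3)), axis=0)
print(classify_emotion(query, templates))
```

In practice one would build templates (or a CNN training set) from labelled operator sessions; the nearest-neighbour rule here simply stands in for whichever classifier consumes the DTW distances.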
Additionally, we demonstrated through direct comparison that our approach is more appropriate for motion-based telerobotic applications than traditional ECG-based methods. Furthermore, we discussed the implications of this system for prominent current and future remote robot operations and emotional robotic contexts. By integrating user classification and emotion recognition into teleoperated robotic systems, this thesis lays the groundwork for a new security paradigm that enhances the safety of remote operations. Recognizing users and their emotions allows for more contextually appropriate robot responses, potentially preventing harm and improving the overall quality of teleoperated interactions. These advancements contribute significantly to the development of more adaptive, intuitive, and human-centered human-robot interaction (HRI) applications, setting a precedent for future research in the field.
Item Type: Thesis (PhD)
Qualification Level: Doctoral
Subjects: T Technology > T Technology (General); T Technology > TK Electrical engineering. Electronics. Nuclear engineering
Colleges/Schools: College of Science and Engineering > School of Engineering
Supervisor's Name: Imran, Professor Muhammad
Date of Award: 2024
Depositing User: Theses Team
Unique ID: glathesis:2024-84335
Copyright: Copyright of this thesis is held by the author.
Date Deposited: 23 May 2024 15:37
Last Modified: 23 May 2024 15:37
Thesis DOI: 10.5525/gla.thesis.84335
URI: https://theses.gla.ac.uk/id/eprint/84335