Real-Time Activity Recognition with Instantaneous Characteristic Features of Thigh Kinematics.

Abstract

Current supervised learning or deep learning-based activity recognition classifiers can achieve high accuracy in recognizing locomotion activities. Most available techniques use a high-dimensional feature space, e.g., combinations of EMG, kinematics, and kinetics, and transformations over those signals. The associated classification rules are therefore complex; the machine tries to understand the human, but the human does not understand the machine. This paper presents an activity recognition system that uses signals from a thigh-mounted IMU and a force-sensitive resistor to classify transitions between sitting, walking, stair ascending, and stair descending. The system uses the thigh's orientation and velocity, together with foot contact information at specific moments within a given activity, as the features to classify transitions to other activities. We call these Instantaneous Characteristic Features (ICFs). Because these ICFs are biomechanically intuitive, they are easy for the user to understand and thus to use in controlling the activity transitions of wearable robots. We assessed our classification algorithm offline using an existing dataset with 10 able-bodied subjects and online with another 10 able-bodied subjects wearing a real-time system. The offline study analyzed the effect of subject dependency and ramp inclinations. The real-time classification accuracy was evaluated before and after training the subjects on the ICFs. The real-time system achieved overall pre-subject-training and post-subject-training error rates of 0.59% ± 0.24% and 0.56% ± 0.20%, respectively. We also evaluated the feasibility of our ICFs for amputee ambulation by analyzing a public dataset collected with the open-source bionic leg. The simplicity of these classification rules demonstrates a new paradigm for activity recognition where the human can understand the machine and vice versa.
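To make the idea concrete, here is a minimal, hypothetical sketch of how such a rule-based transition classifier might look: thigh orientation and angular velocity are sampled at one characteristic moment (foot contact, as reported by the force-sensitive resistor) and compared against simple thresholds. All threshold values, sign conventions, and the `classify_transition` function itself are illustrative assumptions, not the paper's actual rules.

```python
# Hypothetical sketch of rule-based activity-transition classification
# using Instantaneous Characteristic Features (ICFs): thigh orientation
# and angular velocity at foot contact. Thresholds are made up for
# illustration and do NOT come from the paper.

def classify_transition(thigh_angle_deg, thigh_velocity_dps, foot_contact):
    """Predict the upcoming activity from ICFs at the moment of foot contact.

    thigh_angle_deg:    thigh flexion angle in degrees (positive = flexed forward)
    thigh_velocity_dps: thigh angular velocity in degrees per second
    foot_contact:       True when the force-sensitive resistor detects ground contact
    """
    if not foot_contact:
        # Only classify at the characteristic moment; otherwise keep current state.
        return "no_decision"
    if thigh_angle_deg > 70:
        return "sitting"        # deeply flexed thigh at contact
    if thigh_angle_deg > 35 and thigh_velocity_dps < 0:
        return "stair_ascent"   # high flexion while the thigh is extending
    if thigh_angle_deg < 10 and thigh_velocity_dps < -20:
        return "stair_descent"  # low flexion with fast extension
    return "walking"            # default: level-ground gait


if __name__ == "__main__":
    print(classify_transition(80, 5, True))    # sitting
    print(classify_transition(40, -10, True))  # stair_ascent
    print(classify_transition(20, 15, True))   # walking
```

Each rule reads as a plain biomechanical statement ("a deeply flexed thigh at foot contact means the user is sitting down"), which is what makes the ICF approach transparent to the wearer compared with a high-dimensional learned classifier.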
