
Data-Driven Haptic Texture Modeling and Rendering Based on Deep Spatio-Temporal Networks.


Abstract

Data-driven approaches are commonly used to model and render haptic textures for rigid stylus-based interaction. Current state-of-the-art data-driven methods synthesize acceleration signals by interpolating samples with different input parameters, using either neural networks or parametric spectral estimation. In this paper, we explore the potential of emerging deep learning methods in this area. To this end, we designed a complete end-to-end data-driven framework that synthesizes acceleration profiles with the proposed deep spatio-temporal network. The network is trained on contact acceleration data collected with our manual scanning stylus together with the interaction parameters, i.e., scanning velocity, direction, and force. It combines attention-aware 1D CNNs with an attention-aware encoder-decoder network to capture both the local spatial features and the temporal dynamics of the acceleration signals; the attention mechanisms assign weights to features according to their contributions. For rendering, the trained network generates synthesized signals in real time in accordance with the user's input parameters. The whole framework was numerically compared with existing state-of-the-art approaches, demonstrating its effectiveness, and a pilot user study was conducted to assess subjective similarity.
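The abstract does not spell out the network's exact layers, but the core idea it names can be illustrated compactly: 1D convolutions extract local features from an acceleration signal, and an attention mechanism softmax-weights those features by their contribution. The sketch below is purely illustrative NumPy, not the paper's architecture; every function name, kernel size, and shape is an assumption made for the example.

```python
import numpy as np

def conv1d(signal, kernels):
    # Valid 1D convolution: slide each kernel over the signal
    # to produce one feature map per kernel. (Illustrative only.)
    L, K = len(signal), kernels.shape[1]
    return np.stack([
        [np.dot(signal[t:t + K], k) for t in range(L - K + 1)]
        for k in kernels
    ])  # shape: (num_kernels, L - K + 1)

def attention_pool(features, w):
    # Score each time step with a learned vector w, normalize the
    # scores with a softmax, and return the attention-weighted
    # context vector plus the weights themselves.
    scores = w @ features                  # (T,)
    a = np.exp(scores - scores.max())      # numerically stable softmax
    a /= a.sum()
    return features @ a, a                 # context (num_kernels,), weights (T,)

rng = np.random.default_rng(0)
# Toy "acceleration signal": a sinusoid plus noise, standing in for
# the contact accelerations recorded by the scanning stylus.
signal = np.sin(np.linspace(0, 8 * np.pi, 64)) + 0.1 * rng.standard_normal(64)
kernels = rng.standard_normal((4, 5))      # 4 hypothetical conv filters, width 5
feats = conv1d(signal, kernels)            # (4, 60) local spatial features
context, attn = attention_pool(feats, rng.standard_normal(4))
```

In the paper's framework these pieces would be trained jointly and conditioned on velocity, direction, and force; here the weights are random, so the example only demonstrates the data flow, not a learned model.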
