
Vibrational Spectroscopy Can Be Vulnerable to Adversarial Attacks.


Abstract

Nondestructive detection methods based on vibrational spectroscopy are widely used in critical applications across many fields, including the chemical industry, pharmaceuticals, national defense, and security. Because these applications rely on machine learning models for data analysis, it is important to study the threats that adversarial examples pose to vibrational spectroscopy and the defenses against them. In this paper, we propose a novel adversarial method for attacking vibrational spectroscopy, named SynPat, in which synthetic peaks produced by a physical model are placed at key locations to form adversarial perturbations. Our attack generates perturbations that successfully deceive machine learning models for Raman and infrared spectrum analysis while blending into the spectra far better than existing state-of-the-art adversarial attacks developed for other modalities such as images and audio, making them unnoticeable to human operators. We verified the superiority of the proposed SynPat through an imperceptibility test conducted by human experts and through defense experiments with an AI-based detector. To the best of our knowledge, this is the first thorough study of the robustness of vibrational spectroscopic techniques against adversarial samples and of corresponding defense mechanisms. Our extensive experiments show that machine learning models for vibrational spectroscopy, including conventional and deep models for Raman or infrared classification and regression, are all vulnerable to adversarial perturbations and may therefore pose serious security threats to society.
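The core idea of the attack, adding physically plausible synthetic peaks to a spectrum at chosen locations, can be illustrated with a short sketch. Note this is not the paper's actual SynPat implementation: the peak model (a Lorentzian line shape, a common model for vibrational bands), the function names, and the example peak parameters are all illustrative assumptions; the paper's method additionally selects the key locations to fool a target model.

```python
import numpy as np

def lorentzian(x, center, amplitude, width):
    """Lorentzian line shape -- a common physical model for a vibrational band."""
    return amplitude * width**2 / ((x - center)**2 + width**2)

def add_synthetic_peaks(spectrum, wavenumbers, peaks):
    """Add synthetic peaks to a spectrum (illustrative stand-in for SynPat).

    peaks: list of (center, amplitude, width) tuples; in the actual attack,
    the centers would be key locations chosen to shift a model's prediction.
    """
    perturbed = spectrum.copy()
    for center, amplitude, width in peaks:
        perturbed = perturbed + lorentzian(wavenumbers, center, amplitude, width)
    return perturbed

# Toy example: one genuine band at 1000 cm^-1, plus one small adversarial
# peak at a hypothetical key location (1450 cm^-1).
wavenumbers = np.linspace(400, 1800, 1401)          # cm^-1 grid
clean = lorentzian(wavenumbers, 1000.0, 1.0, 10.0)  # "real" spectrum
adv = add_synthetic_peaks(clean, wavenumbers, [(1450.0, 0.15, 8.0)])
```

Because the perturbation is itself a realistic band shape rather than unconstrained noise, it blends into the spectrum, which is the property the imperceptibility test with human experts evaluates.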
