
A novel method based on machine vision system and deep learning to detect fraud in turmeric powder.


Abstract

Assessing the quality of food and spices is particularly important in ensuring proper human nutrition. Computer vision, as a non-destructive technique for measuring the quality of food and spices, has long attracted researchers' attention. Given the high nutritional value of turmeric among spices, as well as the economic motives for fraud in the sale of this product, its quality assessment is very important. The poor marketability of grade 3 chickpeas (small and broken chickpeas) and their very low price make them an attractive adulterant to mix into turmeric in powder form and sell on the market. In this study, an improved convolutional neural network (CNN) was used to classify turmeric powder images to detect fraud. The CNN was improved through the use of gated pooling functions, and we show that a combined approach integrating average pooling and max pooling increases the accuracy and performance of the proposed CNN. In this study, 6240 image samples were prepared in 13 categories (pure turmeric powder, chickpea powder, chickpea powder mixed with food coloring, and 10, 20, 30, 40, and 50% fraud in turmeric). In the preprocessing step, unwanted parts of the images were removed, and data augmentation (DA) was used to reduce overfitting in the CNN. In this research, MLP, Fuzzy, SVM, GBT, and EDT algorithms were also used to compare the proposed CNN against other classifiers. The results showed that, by preventing overfitting through gated pooling, the proposed CNN graded the images of turmeric powder with 99.36% accuracy, outperforming the other classifiers. The results of this study also showed that computer vision, especially when combined with deep learning (DL), can be a valuable method for evaluating quality and detecting fraud in turmeric powder.

Copyright © 2021 Elsevier Ltd. All rights reserved.
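The abstract describes gated pooling as a learned blend of max pooling and average pooling. A minimal NumPy sketch of that idea is below; the gate parameters `w` and `b` and the 2x2 window size are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_pool_2x2(x, w, b=0.0):
    """Gated pooling over non-overlapping 2x2 windows of a 2D array.

    For each window, a learned gate g = sigmoid(w . window + b)
    blends the two classical pooling operators:
        out = g * max(window) + (1 - g) * mean(window)
    `w` (length-4) and `b` are hypothetical per-layer gate parameters
    that a training procedure would learn.
    """
    h, w_dim = x.shape
    out = np.empty((h // 2, w_dim // 2))
    for i in range(0, h - 1, 2):
        for j in range(0, w_dim - 1, 2):
            win = x[i:i + 2, j:j + 2].ravel()
            g = sigmoid(win @ w + b)  # gate in (0, 1)
            out[i // 2, j // 2] = g * win.max() + (1 - g) * win.mean()
    return out
```

With the gate saturated toward 1 the operator reduces to max pooling, and toward 0 to average pooling; intermediate gate values interpolate between the two, which is the combined behavior the abstract credits for the accuracy gain.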
