On a Scalable Entropic Breaching of the Overfitting Barrier for Small Data Problems in Machine Learning.

Abstract

Overfitting and treatment of small data are among the most challenging problems in machine learning (ML), when a relatively small data statistics size T is not enough to provide a robust ML fit for a relatively large data feature dimension D. Deploying a massively parallel ML analysis of generic classification problems for different D and T, we demonstrate the existence of statistically significant linear overfitting barriers for common ML methods. The results reveal that for a robust classification of bioinformatics-motivated generic problems with the long short-term memory deep learning classifier (LSTM), one needs in the best case a statistics size T that is at least 13.8 times larger than the feature dimension D. We show that this overfitting barrier can be breached at a 10⁻¹² fraction of the computational cost by means of the entropy-optimal scalable probabilistic approximations algorithm (eSPA), performing a joint solution of the entropy-optimal Bayesian network inference and feature space segmentation problems. Application of eSPA to experimental single-cell RNA sequencing data exhibits a 30-fold classification performance boost when compared to standard bioinformatics tools and a 7-fold boost when compared to the deep learning LSTM classifier.
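To make the notion of an overfitting barrier concrete, the following is a minimal illustrative sketch, not the paper's actual experiment: it scans the ratio T/D on synthetic two-class Gaussian data with a plain logistic regression classifier (the paper instead uses bioinformatics-motivated generic problems, an LSTM classifier, and eSPA). The choice of data model, classifier, noise level, and T/D grid here are all assumptions made for illustration only.

```python
# Illustrative sketch (assumptions, not the paper's setup): probe how test
# performance of a simple classifier depends on the ratio of statistics
# size T to feature dimension D on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
D = 100                          # fixed feature dimension (hypothetical value)
ratios = [1, 2, 5, 10, 15, 20]   # candidate T/D ratios to scan (hypothetical)

for r in ratios:
    T = r * D                    # statistics size for this run
    # Generic two-class problem: labels from a noisy linear rule in D dimensions
    w = rng.normal(size=D)
    X = rng.normal(size=(T, D))
    y = (X @ w + rng.normal(scale=5.0, size=T) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print(f"T/D = {r:>3}  ->  test AUC = {auc:.3f}")
```

In such a scan, test performance typically degrades sharply once T falls below some multiple of D; the paper quantifies this effect for common ML methods and reports that the LSTM classifier requires T to be at least 13.8 times D in the best case, a barrier that eSPA is shown to breach at a small fraction of the computational cost.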
