Unsupervised Out-of-distribution Detection Using Few In-distribution Samples
Published in ICASSP 2023
This paper tackles the out-of-distribution (OOD) detection problem for natural language classifiers. While previous OOD detection methods require large-scale in-distribution (ID) training data, we address the problem from a few-shot perspective in an unsupervised manner, where training relies on only a few samples of ID data. First, we develop several baselines for Few-shot OOD (FSOOD) detection in text classification based on three well-known few-shot learning approaches, well explored in the vision domain: meta-learning, metric learning, and data augmentation (DA). Then, we introduce the concept of demonstration-based data augmentation combined with the meta- and metric-learning approaches to reap the benefits of both. In all developed methods, a pre-trained transformer is fine-tuned on the few available ID samples. In tandem with this fine-tuning, an OOD detector is fitted over the ID training samples to reject data from unknown classes using two distance metrics, namely Mahalanobis distance and cosine similarity. Finally, we present an extensive evaluation on three ID datasets and three OOD datasets, along with an ablation study analyzing the impact of the various components of our method.
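As a rough illustration of the distance-based detector described above, the sketch below fits class means and a shared covariance over ID embeddings and scores a query by its Mahalanobis distance to the nearest class mean. It is a minimal sketch only: it assumes sentence embeddings from the fine-tuned transformer are already available as NumPy arrays, and the function names and the ridge term are illustrative choices, not taken from the paper.

```python
import numpy as np

def fit_mahalanobis_detector(id_features, id_labels):
    """Fit per-class means and a shared (tied) covariance over ID training features."""
    classes = np.unique(id_labels)
    means = {c: id_features[id_labels == c].mean(axis=0) for c in classes}
    # Center each class by its own mean, then estimate one covariance for all classes;
    # a small ridge keeps the matrix invertible with few samples.
    centered = np.vstack([id_features[id_labels == c] - means[c] for c in classes])
    cov = np.cov(centered, rowvar=False) + 1e-6 * np.eye(id_features.shape[1])
    precision = np.linalg.inv(cov)
    return means, precision

def mahalanobis_score(x, means, precision):
    """OOD score: squared Mahalanobis distance to the closest class mean (larger = more OOD)."""
    dists = [float((x - mu) @ precision @ (x - mu)) for mu in means.values()]
    return min(dists)

# Usage sketch: id_features is an (n_id, d) array of embeddings from the fine-tuned
# transformer, id_labels an (n_id,) array of class ids. A test sample is rejected as
# OOD when its score exceeds a threshold chosen on held-out ID data.
```

The cosine-similarity variant mentioned in the abstract follows the same pattern, replacing the Mahalanobis distance with one minus the maximum cosine similarity to the class means.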