In this work, we analyse the problem of out-of-distribution (OoD) detection, which
comprises inlier classification and outlier detection, and the use of two methods for
it, Mixup and metric learning, as well as their combination. Inspired by the use of
Mixup between inliers and outliers in MixOE (Zhang et al., 2023), our first objective
is to identify the key ingredients of that method and to combine it with seven other
Mixup techniques. We also vary the size and diversity of the auxiliary outlier dataset
used for training. We find that in the fine-grained OoD setting, where outliers come
from the same domain as the inliers, mixing only the inlier label with a uniform label,
without mixing the inlier and outlier training samples, performs similarly to mixing
both the training samples and the labels. At the same time, in the coarse-grained
setting, where outliers come from a completely different domain than the inliers,
detection performance improves as more outliers are used for mixing. Our second
objective is to investigate metric learning in the form of a triplet loss in the same
setup, with different types of triplet combinations, some of which are created using
Mixup. In the coarse-grained setting, triplets built only from inlier classes perform
better than combinations that include outliers, mixed or not. Finally, we combine
MixOE and our metric-learning approach and show that, for some datasets, the detection
performance in the coarse-grained setting is considerably better than the previous
best result, MixOE.
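To make the mixing scheme described in the abstract concrete, the following is a
minimal sketch of MixOE-style batch construction: the inlier image is interpolated
with an auxiliary outlier image, and the one-hot inlier label is interpolated with a
uniform label. The function names, the Beta(alpha, alpha) sampling, and the batch
handling are illustrative assumptions, not the exact implementation of Zhang et al.
(2023).

```python
import torch
import torch.nn.functional as F

def mixoe_batch(x_in, y_in, x_out, num_classes, alpha=1.0):
    """Sketch of MixOE-style mixing: interpolate an inlier batch with an
    auxiliary outlier batch, and interpolate the one-hot inlier label with
    a uniform label (the outlier belongs to no inlier class)."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    x_mix = lam * x_in + (1.0 - lam) * x_out
    y_onehot = F.one_hot(y_in, num_classes).float()
    uniform = torch.full_like(y_onehot, 1.0 / num_classes)
    y_mix = lam * y_onehot + (1.0 - lam) * uniform
    return x_mix, y_mix

def soft_cross_entropy(logits, y_soft):
    """Cross-entropy against the mixed (soft) label."""
    return -(y_soft * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
```

The label-only variant that performed similarly in the fine-grained setting would
return `x_in` unchanged and keep only the mixed label `y_mix`.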
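The metric-learning objective is a standard triplet margin loss; a minimal sketch
follows, where the margin value and the Euclidean distance are assumptions. The
triplet combinations studied in the thesis vary whether the anchor, positive, and
negative embeddings come from inliers, outliers, or Mixup-created samples.

```python
import torch.nn.functional as F

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Pull the anchor embedding toward the positive and push it away
    from the negative by at least the margin."""
    d_ap = F.pairwise_distance(anchor, positive)
    d_an = F.pairwise_distance(anchor, negative)
    return F.relu(d_ap - d_an + margin).mean()
```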