Abstract
Few-Shot Learning (FSL) algorithms have made substantial progress in learning novel concepts from just a handful of labelled samples. To classify query instances from novel classes encountered at test time, they only require a support set composed of a few labelled samples. FSL benchmarks commonly assume that those queries come from the same distribution as instances in the support set. However, in realistic settings, the data distribution is subject to change, a situation referred to as Distribution Shift (DS). The present work addresses the new and challenging problem of Few-Shot Learning under Support/Query Shift (FSQS), i.e., when support and query instances are sampled from related but different distributions. Our contributions are as follows. First, we release a testbed for FSQS, including datasets, relevant baselines and a protocol for rigorous and reproducible evaluation. Second, we observe that well-established FSL algorithms unsurprisingly suffer a considerable drop in accuracy when facing FSQS, stressing the significance of our study. Finally, we show that transductive algorithms can limit the adverse effect of DS. In particular, we study both the role of Batch-Normalization and Optimal Transport (OT) in aligning distributions, bridging Unsupervised Domain Adaptation with FSL. This results in a new method that efficiently combines OT with the celebrated Prototypical Networks. We present compelling experiments demonstrating the advantage of our method. Our work opens an exciting line of research by providing a testbed and strong baselines. Our code is available at https://github.com/ebennequin/meta-domain-shift.
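To make the idea concrete, below is a minimal sketch, in PyTorch, of how entropic Optimal Transport (Sinkhorn iterations, following Cuturi's formulation) can align support features with query features before prototype-based classification. The function names and the barycentric-mapping combination are illustrative assumptions, simplified relative to the full method evaluated in the paper.

```python
import torch


def sinkhorn_plan(cost, eps=0.1, n_iters=100):
    """Entropic-regularized OT plan between uniform marginals (Cuturi, 2013)."""
    n, m = cost.shape
    mu = cost.new_full((n,), 1.0 / n)   # uniform weights on support points
    nu = cost.new_full((m,), 1.0 / m)   # uniform weights on query points
    K = torch.exp(-cost / eps)          # Gibbs kernel
    u = torch.ones_like(mu)
    for _ in range(n_iters):            # Sinkhorn fixed-point iterations
        v = nu / (K.t() @ u)
        u = mu / (K @ v)
    return u[:, None] * K * v[None, :]  # (n, m) transport plan


def transported_prototype_logits(z_support, y_support, z_query, n_way):
    """Classify query embeddings after OT-aligning the support embeddings."""
    cost = torch.cdist(z_support, z_query) ** 2  # squared Euclidean cost
    cost = cost / cost.max()                     # rescale for numerical stability
    plan = sinkhorn_plan(cost)
    # Barycentric mapping: each support point becomes a weighted average of
    # query points, i.e. the support set is expressed in the query domain.
    z_aligned = (plan @ z_query) / plan.sum(dim=1, keepdim=True)
    prototypes = torch.stack(
        [z_aligned[y_support == c].mean(dim=0) for c in range(n_way)]
    )
    return -torch.cdist(z_query, prototypes)     # logits = negative distances
```

Here, each support embedding is mapped onto the query feature cloud through the transport plan, so the class prototypes are computed in the query distribution; classification then proceeds exactly as in Prototypical Networks, with logits given by negative distances to the prototypes.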
E. Bennequin and V. Bouvier—Equal contribution.
Notes
- 2. These normalizations are implemented in FewShiftBed for future work.
Acknowledgements
Etienne Bennequin is funded by Sicara and ANRT (France), and Victor Bouvier is funded by Sidetrade and ANRT (France), both through a CIFRE collaboration with CentraleSupélec. This work was performed using HPC resources from the “Mésocentre” computing center of CentraleSupélec and École Normale Supérieure Paris-Saclay supported by CNRS and Région Île-de-France (http://mesocentre.centralesupelec.fr/).
Copyright information
© 2021 Springer Nature Switzerland AG
About this paper
Cite this paper
Bennequin, E., Bouvier, V., Tami, M., Toubhans, A., Hudelot, C. (2021). Bridging Few-Shot Learning and Adaptation: New Challenges of Support-Query Shift. In: Oliver, N., Pérez-Cruz, F., Kramer, S., Read, J., Lozano, J.A. (eds) Machine Learning and Knowledge Discovery in Databases. Research Track. ECML PKDD 2021. Lecture Notes in Computer Science, vol. 12975. Springer, Cham. https://doi.org/10.1007/978-3-030-86486-6_34
DOI: https://doi.org/10.1007/978-3-030-86486-6_34
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-86485-9
Online ISBN: 978-3-030-86486-6
eBook Packages: Computer Science, Computer Science (R0)