Date recorded: 2021-02-01
Paper: Tsendsuren Munkhdalai, Hong Yu, Meta Networks. Proceedings of the 34th International Conference on Machine Learning, PMLR 70:2554-2563, 2017.
Abstract
Neural networks have been successfully applied in applications with a large amount of labeled data. However, the task of rapid generalization on new concepts with small training data, while preserving performance on previously learned ones, still presents a significant challenge to neural network models. In this work, we introduce a novel meta-learning method, Meta Networks (MetaNet), that learns meta-level knowledge across tasks and shifts its inductive biases via fast parameterization for rapid generalization. When evaluated on the Omniglot and Mini-ImageNet benchmarks, our MetaNet models achieve near human-level performance and outperform the baseline approaches by up to 6% accuracy. We demonstrate several appealing properties of MetaNet relating to generalization and continual learning.
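To make the "fast parameterization" idea from the abstract concrete, here is a minimal NumPy sketch under simplifying assumptions: a hypothetical linear meta-learner `G` maps the gradient of a support-set loss into "fast weights" that augment task-general slow weights when predicting on a query example. The shapes, the squared-error loss, and the linear meta map are illustrative choices of mine, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out = 4, 3
# Slow weights: shared, task-general parameters.
W_slow = rng.normal(scale=0.5, size=(d_out, d_in))
# Hypothetical meta-learner: a linear map from gradients to fast weights.
G = rng.normal(scale=0.1, size=(d_out * d_in, d_out * d_in))

def forward(W, x):
    return W @ x

# One support example supplies the task-specific signal.
x_s = rng.normal(size=d_in)
y_s = rng.normal(size=d_out)

# Gradient of 0.5 * ||W x_s - y_s||^2 with respect to W.
err = forward(W_slow, x_s) - y_s
grad = np.outer(err, x_s)

# Fast parameterization: the meta-learner turns the gradient
# into example-conditioned fast weights.
W_fast = (G @ grad.ravel()).reshape(d_out, d_in)

# A query prediction combines slow and fast weights.
x_q = rng.normal(size=d_in)
y_pred = forward(W_slow + W_fast, x_q)
print(y_pred.shape)
```

The point of the sketch is the information flow: fast weights are produced in a single forward pass of the meta-learner rather than by iterative fine-tuning, which is what enables rapid adaptation from small support sets.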
Paper: http://proceedings.mlr.press/v70/munkhdalai17a.html