
Title: Effectively learn how to learn: a novel few-shot learning with meta-gradient memory

Authors: Lin Hui; Yi-Cheng Chen

Addresses: Department of Computer Science and Information Engineering, Tamkang University, Taiwan; Department of Information Management, National Central University, Taiwan

Abstract: Recently, the importance of few-shot learning has grown tremendously due to its widespread applicability. With few-shot learning, users can train models on only a few examples while maintaining high generalisation ability. Meta-learning and continual learning models have demonstrated strong performance in model development. However, unstable performance and catastrophic forgetting remain two critical issues when a model must retain knowledge of previous tasks while adapting to new ones. In this paper, a novel method, enhanced model-agnostic meta-learning (EN-MAML), is proposed that blends the flexible adaptation of meta-learning with the stable performance of continual learning to tackle these problems. With the proposed learning method, users can efficiently and effectively train a model in a stable manner with limited data. Experiments following the N-way K-shot protocol show that EN-MAML achieves higher accuracy, more stable performance and faster convergence than other state-of-the-art models on several real datasets.
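For readers unfamiliar with the model-agnostic meta-learning (MAML) family that EN-MAML builds on, the sketch below outlines a generic first-order MAML training step in PyTorch: an inner loop adapts a copy of the model to each task's support set, and the outer loop updates the shared initialisation from the query-set loss. This is only an illustrative sketch of standard MAML, not the authors' EN-MAML (the abstract does not give implementation details); names such as `sample_task`, the learning rates and the meta-batch size are hypothetical.

    # Minimal first-order MAML-style sketch (illustrative only; NOT EN-MAML).
    # Assumes a PyTorch classifier `model` and a task sampler `sample_task`
    # that yields (support, query) batches for an N-way K-shot episode.
    import copy
    import torch
    import torch.nn.functional as F

    def maml_step(model, meta_opt, sample_task,
                  inner_lr=0.01, inner_steps=1, meta_batch=4):
        meta_opt.zero_grad()
        for _ in range(meta_batch):
            (xs, ys), (xq, yq) = sample_task()           # support / query sets
            fast = copy.deepcopy(model)                   # task-specific copy
            inner_opt = torch.optim.SGD(fast.parameters(), lr=inner_lr)
            for _ in range(inner_steps):                  # inner-loop adaptation
                loss = F.cross_entropy(fast(xs), ys)
                inner_opt.zero_grad()
                loss.backward()
                inner_opt.step()
            query_loss = F.cross_entropy(fast(xq), yq)    # evaluate adapted copy
            grads = torch.autograd.grad(query_loss, fast.parameters())
            # First-order approximation: accumulate query gradients on the meta-model
            for p, g in zip(model.parameters(), grads):
                p.grad = g if p.grad is None else p.grad + g
        meta_opt.step()                                   # outer (meta) update

In a full MAML implementation the inner update would be differentiated through (second-order gradients); EN-MAML additionally addresses catastrophic forgetting across tasks, which this generic sketch does not attempt to show.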

Keywords: machine learning; deep learning; meta-learning; continual learning.

DOI: 10.1504/IJWGS.2024.137549

International Journal of Web and Grid Services, 2024 Vol.20 No.1, pp.3 - 24

Received: 13 Apr 2023
Accepted: 23 Sep 2023

Published online: 25 Mar 2024
