Adaptive Meta-Learning via data-dependent PAC-Bayes bounds

Lior Friedman, Ron Meir

Research output: Contribution to journal › Conference article › peer-review


Meta-learning aims to extract common knowledge from similar training tasks in order to facilitate efficient and effective learning on future tasks. Several recent works have extended PAC-Bayes generalization error bounds to the meta-learning setting. By doing so, prior knowledge can be incorporated in the form of a distribution over hypotheses that is expected to lead to low error on new tasks similar to those previously observed. In this work, we develop novel bounds for the generalization error on test tasks based on recent data-dependent bounds and provide a new algorithm for adapting prior knowledge to downstream tasks in a potentially more effective manner. We demonstrate the effectiveness of our algorithm numerically for few-shot image classification tasks with deep neural networks and show a significant reduction in generalization error without any additional adaptation data.
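For context, the meta-learning extensions referenced in the abstract build on classical single-task PAC-Bayes bounds; the sketch below is a standard McAllester-style bound, not the paper's novel data-dependent result. For a prior $P$ over hypotheses fixed before seeing the $n$ training samples, with probability at least $1-\delta$ over the sample, simultaneously for all posteriors $Q$:

```latex
\mathbb{E}_{h \sim Q}\!\left[ L(h) \right]
\;\le\;
\mathbb{E}_{h \sim Q}\!\left[ \hat{L}(h) \right]
\;+\;
\sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln \frac{2\sqrt{n}}{\delta}}{2n}}
```

Here $L(h)$ and $\hat{L}(h)$ denote the true and empirical risks of hypothesis $h$, and $\mathrm{KL}(Q\|P)$ is the Kullback–Leibler divergence between posterior and prior. Meta-learning variants of such bounds learn the prior $P$ from previously observed tasks, while data-dependent variants of the kind the abstract builds on allow $P$ itself to depend on (part of) the data, which can tighten the KL term.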

Original language: English
Pages (from-to): 796-810
Number of pages: 15
Journal: Proceedings of Machine Learning Research
State: Published - 2023
Event: 2nd Conference on Lifelong Learning Agents, CoLLA 2023 - Montreal, Canada
Duration: 22 Aug 2023 – 25 Aug 2023

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability


