| OpenReview | arXiv | Poster |
We also explain our paper in detail on Medium.
To avoid conflicts with your current setup, please create and activate a virtual environment and install the required packages. For example:

```shell
conda create --name noisyMAML python=3.7
conda activate noisyMAML
pip install -r requirements.txt
```
For experiments on the mini-ImageNet dataset, please manually download the dataset here to the ./data/miniimagenet folder and unzip it (ref1 and ref2):

```shell
cd ./data/miniimagenet
gdown https://drive.google.com/u/0/uc?id=1HkgrkAwukzEZA0TpO7010PkAOREb2Nuk
unzip mini-imagenet.zip
```
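Before launching the experiments, it can help to confirm that the archive was extracted where the training scripts expect it. This is a minimal sketch, not part of the repository; it only checks that the directory exists and is non-empty, without assuming anything about the dataset's internal file layout.

```python
from pathlib import Path

def dataset_ready(root: str) -> bool:
    """Return True if `root` exists, is a directory, and is non-empty."""
    p = Path(root)
    return p.is_dir() and any(p.iterdir())

if __name__ == "__main__":
    # Expected location after running the commands above
    print("mini-ImageNet ready:", dataset_ready("./data/miniimagenet"))
```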
For experiments on the Omniglot dataset, the dataset will be downloaded automatically.
To visualize the contrastiveness of the MAML algorithm, please go to ./cos_sim_analysis and run ./contrastivemess_visualization.py to train the models and compute the cosine similarities. You can then use the IPython notebook to visualize the results directly.
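The core quantity behind this analysis is the cosine similarity between feature (or gradient) vectors. The following is a minimal NumPy sketch of that computation, included for illustration only; the repository's script operates on the vectors produced during training, not on these toy arrays.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: a.b / (|a| |b|)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy example: vectors pointing in the same direction have similarity 1
a = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, 4.0, 6.0])
print(round(cosine_similarity(a, b), 4))  # → 1.0
```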
The ./omniglot and ./miniimagenet folders contain the code that reproduces our results. To obtain the main results, please run the commands in script.txt.
To explore how the zeroing trick mitigates the memorization problem, please run the commands in script_memorization.txt.
For reproducibility, we also provide our experimental results and visualization code in ./figure_reproduction.
The code is adapted from this repository.
If you find this work useful, please consider citing:

```bibtex
@InProceedings{kao2022maml,
  title     = {MAML Is a Noisy Contrastive Learner in Classification},
  author    = {Kao, Chia-Hsiang and Chiu, Wei-Chen and Chen, Pin-Yu},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  year      = {2022}
}
```