
MAML Is a Noisy Contrastive Learner in Classification (ICLR 2022)

| OpenReview | arXiv | Poster |

We also explain the paper in detail in a Medium post.

1. Specification of dependencies

1.1 Setup

To avoid conflict with your current setup, please create and activate a virtual environment and install the required packages. For example:

conda create --name noisyMAML python=3.7
conda activate noisyMAML
pip install -r requirements.txt

2. Building the datasets

2.1 mini-ImageNet

For experiments on the mini-ImageNet dataset, please manually download mini-ImageNet here to the ./data/miniimagenet folder and unzip it (see ref1 and ref2):

cd ./data/miniimagenet
gdown https://drive.google.com/u/0/uc?id=1HkgrkAwukzEZA0TpO7010PkAOREb2Nuk
unzip mini-imagenet.zip

2.2 Omniglot

For experiments on the Omniglot dataset, the data will be downloaded automatically.
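
If you prefer to fetch Omniglot yourself, a minimal sketch using torchvision is shown below. This is only an illustration under the assumption that torchvision is available; it is not the repository's own data pipeline, and the root path is a placeholder.

import torchvision

# Hypothetical example: download both Omniglot splits via torchvision.
# The repository's own loader may store and preprocess the data differently.
background = torchvision.datasets.Omniglot(
    root="./data/omniglot", background=True, download=True)   # split commonly used for meta-training
evaluation = torchvision.datasets.Omniglot(
    root="./data/omniglot", background=False, download=True)  # held-out split for meta-testing
print(len(background), len(evaluation))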

3. Experiments

3.1 Cosine similarity analysis

To visualize the contrastiveness of the MAML algorithm, please go to ./cos_sim_analysis and run ./contrastivemess_visualization.py to train the models and compute the cosine similarities. You can also use the accompanying IPython notebook to visualize the results directly.
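
Conceptually, the analysis tracks how aligned the encoder features of query and support samples become during meta-training. Below is a minimal sketch of that measurement in PyTorch; the encoder and batch names are placeholders for illustration, not the repository's API.

import torch
import torch.nn.functional as F

def class_cosine_stats(encoder, support_x, support_y, query_x, query_y):
    """Mean cosine similarity between query and support features,
    split into same-class and different-class pairs."""
    with torch.no_grad():
        zs = F.normalize(encoder(support_x), dim=-1)   # (N_support, d)
        zq = F.normalize(encoder(query_x), dim=-1)     # (N_query, d)
    sims = zq @ zs.t()                                 # pairwise cosine similarities
    same = query_y.unsqueeze(1) == support_y.unsqueeze(0)
    return sims[same].mean().item(), sims[~same].mean().item()

Tracking these two quantities over meta-training iterations is one way to visualize the contrastive behaviour discussed in the paper.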

3.2 Training code

The ./omniglot and ./miniimagenet folders contain the code to reproduce the results.

To obtain the main results, please run the commands listed in script.txt.

To explore how the zeroing trick mitigates the memorization problem, please run the commands listed in script_memorization.txt (see the sketch below).
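
The zeroing trick referenced above zeroes the final linear classification layer before each task's inner-loop adaptation, which suppresses the interference term that makes vanilla MAML a noisy contrastive learner. Below is a minimal sketch of where such a reset could sit in a generic MAML outer loop; the model and task names are placeholders, not the repository's code.

import torch
import torch.nn as nn

def zero_head(head: nn.Linear):
    """Zeroing trick: reset the linear classification head to zero
    before each task's inner-loop adaptation."""
    with torch.no_grad():
        head.weight.zero_()
        if head.bias is not None:
            head.bias.zero_()

# Sketch of a generic MAML outer loop (placeholder names):
# for task in meta_batch:
#     zero_head(model.classifier)                      # apply the zeroing trick
#     fast_weights = inner_loop_adapt(model, task.support)
#     meta_loss = meta_loss + query_loss(model, fast_weights, task.query)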

3.3 Experimental results

For reproducibility, we also provide our experimental results and our visualization code in ./figure_reproduction.

Acknowledgement

The code is adapted from this repository.

Citation

@InProceedings{kao2022maml,
  title     = {MAML Is a Noisy Contrastive Learner in Classification},
  author    = {Kao, Chia-Hsiang and Chiu, Wei-Chen and Chen, Pin-Yu},
  booktitle = {International Conference on Learning Representations},
  year      = {2022}
}
