How to use pytorch-lightning for meta learning #678
-
could you post a toy example?
-
@wheatdog will reopen if we need to add to examples
-
Hi @wheatdog, I am working on meta-learning. In brief, I matched the meta-training steps to the PL hooks; more info is available here.
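For readers who want a concrete starting point, here is a minimal sketch (not the code linked above) of how a MAML-style inner/outer loop can be laid over LightningModule hooks with higher. The task-batch layout, hyperparameters, and class names are assumptions for illustration only.

```python
import torch
import torch.nn.functional as F
import pytorch_lightning as pl
import higher


class MAMLModule(pl.LightningModule):
    """Illustrative MAML-style module: inner loop in training_step, outer step handled by Lightning."""

    def __init__(self, model, inner_lr=0.01, inner_steps=5, outer_lr=1e-3):
        super().__init__()
        self.model = model
        self.inner_lr = inner_lr
        self.inner_steps = inner_steps
        self.outer_lr = outer_lr

    def training_step(self, batch, batch_idx):
        # Assumes one task per batch: (support_x, support_y, query_x, query_y).
        support_x, support_y, query_x, query_y = batch

        # A plain torch.optim optimizer for the inner loop, created here on purpose
        # so that higher never sees Lightning's optimizer wrapper.
        inner_opt = torch.optim.SGD(self.model.parameters(), lr=self.inner_lr)

        with higher.innerloop_ctx(
            self.model, inner_opt, copy_initial_weights=False
        ) as (fmodel, diffopt):
            # Inner loop: adapt the functional copy to the support set.
            for _ in range(self.inner_steps):
                diffopt.step(F.cross_entropy(fmodel(support_x), support_y))

            # Outer objective: the adapted model evaluated on the query set.
            # Gradients flow back to the original parameters through the inner steps.
            query_loss = F.cross_entropy(fmodel(query_x), query_y)

        self.log("query_loss", query_loss)
        return query_loss  # Lightning's automatic optimization performs the meta-update

    def configure_optimizers(self):
        # Outer (meta) optimizer over the original parameters.
        return torch.optim.Adam(self.model.parameters(), lr=self.outer_lr)
```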
-
Hi @pietrolesci, I checked your code and found that higher does not support pytorch-lightning optimizers. How do you deal with this issue?
-
Hi @wakaizen, thanks for your reply. Maybe I don't get your question: aren't you able to run the code? Also, can you please clarify what you mean by "pytorch-lightning optimizers"? The configure_optimizers hook returns a plain PyTorch optimizer.
-
@pietrolesci I ran your code and got the following error:
File "/home/hanw/miniconda3/envs/meta_learning/lib/python3.7/site-packages/higher/optim.py", line 786, in get_diff_optim
    "Optimizer type {} not supported by higher yet.".format(type(opt))
ValueError: Optimizer type <class 'pytorch_lightning.core.optimizer.LightningSGD'> not supported by higher yet.
It seems that during training the PyTorch SGD optimizer is converted to a Lightning optimizer, and higher does not support such an optimizer since it is not differentiable.
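If the error comes from higher receiving Lightning's wrapper class, one possible workaround is a small unwrap step before handing the optimizer to higher, or simply building a fresh torch.optim optimizer for the inner loop as in the sketch above. The snippet below is a sketch under the assumption that LightningOptimizer exposes the wrapped optimizer via its .optimizer attribute.

```python
from pytorch_lightning.core.optimizer import LightningOptimizer


def unwrap_optimizer(opt):
    """Return the underlying torch.optim optimizer if given Lightning's wrapper."""
    # Assumption: LightningOptimizer stores the wrapped optimizer in `.optimizer`.
    return opt.optimizer if isinstance(opt, LightningOptimizer) else opt


# Inside a LightningModule.training_step one could then write, for example:
#
#     inner_opt = unwrap_optimizer(self.optimizers())
#     with higher.innerloop_ctx(self.model, inner_opt) as (fmodel, diffopt):
#         ...
#
# so that higher only ever sees a plain torch.optim optimizer.
```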
-
Anyway, it seems there is a long way to go before combining the two. For now, higher only supports a single GPU; it cannot handle DP or DDP...
-
Is there a way to integrate learn2learn with pytorch lightning? I tried to convert this simple MAML script into lightning, and this is what I tried so far. However, I do get some errors about gradients...
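For what it's worth, below is a hedged sketch (not a fix of the gist above) of one way to drive learn2learn's MAML wrapper from a LightningModule using manual optimization, so that Lightning never interposes its optimizer wrapper in the inner loop. The task-batch layout and hyperparameters are assumptions.

```python
import torch
import torch.nn.functional as F
import pytorch_lightning as pl
import learn2learn as l2l


class LitMAML(pl.LightningModule):
    def __init__(self, model, adapt_lr=0.5, meta_lr=1e-3, adapt_steps=1):
        super().__init__()
        self.automatic_optimization = False  # take control of backward/step ourselves
        self.maml = l2l.algorithms.MAML(model, lr=adapt_lr, first_order=False)
        self.meta_lr = meta_lr
        self.adapt_steps = adapt_steps

    def training_step(self, batch, batch_idx):
        # Assumes one task per batch: (support_x, support_y, query_x, query_y).
        support_x, support_y, query_x, query_y = batch
        opt = self.optimizers()
        opt.zero_grad()

        learner = self.maml.clone()  # differentiable copy of the meta-parameters
        for _ in range(self.adapt_steps):
            learner.adapt(F.cross_entropy(learner(support_x), support_y))

        # Evaluate the adapted learner on the query set; gradients flow back
        # to self.maml's parameters through the adaptation steps.
        query_loss = F.cross_entropy(learner(query_x), query_y)
        self.manual_backward(query_loss)
        opt.step()
        self.log("query_loss", query_loss)

    def configure_optimizers(self):
        return torch.optim.Adam(self.maml.parameters(), lr=self.meta_lr)
```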
-
From what I know, Lightning still does not support higher-order differentiation because of its optimizers.
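To make the limitation concrete: the "higher-order differentiation" in question is differentiating through a gradient step, which requires create_graph=True on the inner gradient call. A minimal plain-PyTorch illustration (no Lightning involved, toy values chosen arbitrarily):

```python
import torch

w = torch.tensor(2.0, requires_grad=True)
x = torch.tensor(3.0)

# Inner step: compute a gradient while keeping it in the autograd graph.
inner_loss = (w * x) ** 2
(g,) = torch.autograd.grad(inner_loss, w, create_graph=True)
w_adapted = w - 0.1 * g  # one differentiable "inner" update

# Outer step: backpropagating this loss differentiates through the inner update.
outer_loss = (w_adapted * x - 1.0) ** 2
outer_loss.backward()
print(w.grad)  # gradient w.r.t. the original parameter, including second-order terms
```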
-
I plan to use this: https://github.com/learnables/learn2learn/blob/master/examples/vision/distributed_maml.py
-
Thanks for making this great library! I appreciate your work very much.
I wonder if anyone has experience using pytorch-lightning for meta-learning, maybe combined with a library like Torchmeta or learn2learn.
I would like to hear from others before getting my hands dirty. Thanks!