Commit: Add differentiable optimization module (Meta-Descent, KFO, Meta-Curvature) (#151)

* Ported hypergrad example.
* Add meta-curvature example with GBML wrapper.
* GBML support for nograd, unused, first_order and tests.
* Add ANIL+KFO low-level example.
* Add misc nn layers.
* Update maml_update.
* Change download path for mini-imagenet tests.
* Add docs for differentiable sgd.
* Update docs, incl. for MetaWorld.
* KroneckerTransform docs.
* Docs for meta-curvature.
* Add docs for l2l.nn.misc.
* Add docs for kroneckers.
* Fix lint, add more docs.
* Add docs for GBML.
* Completes GBML docs.
* Rename meta_update -> update_module, and write docs.
* Fix lint, add docs for ParameterUpdate.
* Add docs for LearnableOptimizer.
* Update changelog.
* Update to readme, part 1.
* Update README, part 2.
* Fix readme links.
* Version bump.
Showing 36 changed files with 2,079 additions and 111 deletions.
# Meta-Optimization

This directory contains examples of using learn2learn for meta-optimization, also known as meta-descent.

## Hypergradient

The script `hypergrad_mnist.py` demonstrates how to implement a slightly modified version of "[Online Learning Rate Adaptation with Hypergradient Descent](https://arxiv.org/abs/1703.04782)".
The implementation departs from the algorithm presented in the paper in two ways.

1. We forgo the analytical formulation of the learning rate's gradient to demonstrate the capability of the `LearnableOptimizer` class.
2. We adapt per-parameter learning rates instead of updating a single learning rate shared by all parameters.
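The per-parameter variant above can be sketched in a few lines of plain Python. This is an illustrative toy, not the learn2learn `LearnableOptimizer` API: each parameter keeps its own learning rate, which is nudged along the product of the current and previous gradients (the hypergradient signal from the paper). The quadratic objective, the function names, and all hyperparameter values below are made up for the example.

```python
def grad(theta):
    # Gradient of the toy objective f(theta) = sum_i c_i * theta_i^2,
    # with per-coordinate curvatures c = [1.0, 10.0] (arbitrary choice).
    coeffs = [1.0, 10.0]
    return [2.0 * c * t for c, t in zip(coeffs, theta)]

def hypergrad_sgd(theta, lr=0.05, hyper_lr=0.001, steps=100):
    lrs = [lr] * len(theta)        # one learning rate per parameter
    prev_g = [0.0] * len(theta)    # previous step's gradient
    for _ in range(steps):
        g = grad(theta)
        # Hypergradient step: grow lr_i when successive gradients agree
        # in sign (g_i * prev_g_i > 0), shrink it when they disagree.
        lrs = [a + hyper_lr * gi * pgi
               for a, gi, pgi in zip(lrs, g, prev_g)]
        # Ordinary SGD step with the freshly adapted learning rates.
        theta = [t - a * gi for t, a, gi in zip(theta, lrs, g)]
        prev_g = g
    return theta, lrs

theta, lrs = hypergrad_sgd([1.0, 1.0])
```

In the learn2learn version, the analytical lr update above is replaced by backpropagating through the optimizer itself, which is what the `LearnableOptimizer` wrapper automates.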
**Usage**

!!! warning
    The parameters for this script were not carefully tuned.

Manually edit the script and run:

~~~shell
python examples/optimization/hypergrad_mnist.py
~~~