14 changes: 14 additions & 0 deletions tensorflow_addons/optimizers/BUILD
@@ -8,6 +8,7 @@ py_library(
"__init__.py",
"lazy_adam.py",
"moving_average.py",
"weight_decay_optimizers.py",
],
srcs_version = "PY2AND3",
deps = [
@@ -40,3 +41,16 @@ py_test(
":optimizers",
],
)

py_test(
name = "weight_decay_optimizers_test",
size = "small",
srcs = [
"weight_decay_optimizers_test.py",
],
main = "weight_decay_optimizers_test.py",
srcs_version = "PY2AND3",
deps = [
":optimizers",
],
)
7 changes: 5 additions & 2 deletions tensorflow_addons/optimizers/README.md
@@ -5,20 +5,23 @@
|:---------- |:------------- |:--------------|
| lazy_adam | SIG-Addons | [email protected] |
| moving_average | Dheeraj R. Reddy | [email protected] |
| weight_decay_optimizers | Phil Jund | [email protected] |


## Components
| Submodule | Optimizer | Reference |
|:----------------------- |:---------------------- |:---------|
|:--------- |:---------- |:---------|
| lazy_adam | LazyAdam | https://arxiv.org/abs/1412.6980 |
| moving_average | MovingAverage | |
| weight_decay_optimizers | SGDW, AdamW, extend_with_decoupled_weight_decay | https://arxiv.org/pdf/1711.05101.pdf |

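To make concrete what "decoupled weight decay" (Loshchilov & Hutter, linked above) changes relative to plain SGD, here is a minimal pure-Python sketch; the function names and values are illustrative only and are not the addon's API.

```python
# Decoupled weight decay (SGDW-style) vs. plain SGD, in pure Python.
# Illustrative only -- the real optimizers live in
# tensorflow_addons.optimizers.weight_decay_optimizers.

def sgd_step(w, grad, lr):
    # Plain SGD: follow the negative gradient.
    return w - lr * grad

def sgdw_step(w, grad, lr, weight_decay):
    # SGDW: take the same gradient step, then shrink the weight
    # directly, decoupled from the gradient and any L2 term in the loss.
    return w - lr * grad - weight_decay * w

w, grad, lr, wd = 1.0, 0.5, 0.1, 0.01
print(sgd_step(w, grad, lr))        # w - lr*grad
print(sgdw_step(w, grad, lr, wd))   # w - lr*grad - wd*w
```

The point of the decoupling is that the decay strength `wd` no longer interacts with the gradient magnitude or an adaptive learning-rate schedule, which is what AdamW exploits over Adam-with-L2.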

## Contribution Guidelines
#### Standard API
To conform to the current API standard, all optimizers must:
* Inherit from either `keras.optimizer_v2.OptimizerV2` or its subclasses.
* [Register as a keras global object](https://github.com/tensorflow/addons/blob/master/tensorflow_addons/utils/python/keras_utils.py)
* [Register as a keras global object](https://github.com/tensorflow/addons/blob/master/tensorflow_addons/utils/keras_utils.py)
so it can be serialized properly.
* Add the addon to the `py_library` in this sub-package's BUILD file.
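The `extend_with_decoupled_weight_decay` entry in the table above is a class factory rather than an optimizer. The sketch below illustrates that pattern in pure Python under hypothetical names (`ToySGD`, `extend_with_toy_weight_decay`); it is not the addon's real implementation, which operates on `OptimizerV2` subclasses.

```python
# Hypothetical sketch of the class-factory pattern behind
# extend_with_decoupled_weight_decay: wrap any optimizer class so its
# update step is followed by a decoupled weight-decay step.

class ToySGD:
    def __init__(self, lr):
        self.lr = lr

    def apply_gradient(self, w, grad):
        return w - self.lr * grad

def extend_with_toy_weight_decay(base_cls):
    """Return a new class that decays weights after base_cls's step."""
    class DecayExtension(base_cls):
        def __init__(self, weight_decay, **kwargs):
            super().__init__(**kwargs)
            self.weight_decay = weight_decay

        def apply_gradient(self, w, grad):
            # First the base optimizer's step, then the decay,
            # decoupled from the gradient computation.
            w = super().apply_gradient(w, grad)
            return w - self.weight_decay * w
    return DecayExtension

ToySGDW = extend_with_toy_weight_decay(ToySGD)
opt = ToySGDW(weight_decay=0.01, lr=0.1)
print(opt.apply_gradient(1.0, 0.5))
```

Applied to real classes, the same idea is what lets one factory call produce both `SGDW` and `AdamW` from their base optimizers.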

4 changes: 4 additions & 0 deletions tensorflow_addons/optimizers/__init__.py
@@ -19,4 +19,8 @@
from __future__ import print_function

from tensorflow_addons.optimizers.lazy_adam import LazyAdam
from tensorflow_addons.optimizers.weight_decay_optimizers import AdamW
from tensorflow_addons.optimizers.weight_decay_optimizers import SGDW
from tensorflow_addons.optimizers.weight_decay_optimizers import (
extend_with_decoupled_weight_decay)
from tensorflow_addons.optimizers.moving_average import MovingAverage