
tfa.nn sub-module to keep the implementations of tensor operations #426

@WindQAQ

Description

System information

  • TensorFlow version (you are using): nightly
  • TensorFlow Addons version: source
  • Is it in the tf.contrib (if so, where): no
  • Are you willing to contribute it (yes/no): yes
  • Are you willing to maintain it going forward? (yes/no): yes

Describe the feature and the current behavior/state.

Does anyone have thoughts on a tfa.nn sub-module? IMO, it's somewhat weird to expose tensor operations like sparsemax or gelu only in tfa.activations.

In core TF, the real implementations of tensor operations such as relu and selu live in tf.nn.* (most of them are implemented in C++ and exposed to Python through tensorflow.python.ops.nn_ops.py). These tf.nn.* ops are widely used by tf.keras.activations and tf.keras.layers (in fact, those go through tf.keras.backend.*, which is essentially a Keras wrapper around tf.nn.*). So tf.keras.activations acts more like a hub of activation functions than the place where the real implementations live.
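For illustration, a minimal sketch of that pattern in core TF (the Keras activation is just a thin wrapper, so both calls return the same values; exact internals may differ between versions):

```python
import tensorflow as tf

x = tf.constant([-1.0, 0.0, 2.0])

# The tensor op itself lives under tf.nn.
y_nn = tf.nn.relu(x)

# tf.keras.activations.relu ultimately dispatches to the same op.
y_keras = tf.keras.activations.relu(x)

tf.debugging.assert_near(y_nn, y_keras)
```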

Currently, the implementations of sparsemax and gelu (WIP) are in tfa.activations, and their Layer subclasses call tfa.activations.* in their call methods. If we had a tfa.nn sub-module, both tfa.activations and tfa.layers could use tfa.nn directly, which would be much closer to the core TF style.
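A rough sketch of how the proposed layout could look (the file paths and the exact gelu formula below are illustrative assumptions, not the final implementation):

```python
# tfa/nn/gelu_op.py -- hypothetical home of the raw tensor op
import math
import tensorflow as tf


def gelu(x, approximate=True):
    """Gaussian Error Linear Unit as a plain tensor op."""
    x = tf.convert_to_tensor(x)
    if approximate:
        # tanh-based approximation of GELU
        coeff = tf.cast(0.044715, x.dtype)
        return 0.5 * x * (1.0 + tf.tanh(
            tf.cast(math.sqrt(2.0 / math.pi), x.dtype)
            * (x + coeff * tf.pow(x, 3))))
    # exact form via the Gaussian error function
    return 0.5 * x * (1.0 + tf.math.erf(
        x / tf.cast(math.sqrt(2.0), x.dtype)))


# tfa/activations/gelu.py -- would simply re-export / thinly wrap tfa.nn.gelu
# from tensorflow_addons.nn import gelu


# tfa/layers/gelu.py -- the Layer also calls the tfa.nn op directly
class GELU(tf.keras.layers.Layer):
    def __init__(self, approximate=True, **kwargs):
        super().__init__(**kwargs)
        self.approximate = approximate

    def call(self, inputs):
        return gelu(inputs, approximate=self.approximate)
```

This way the Python-level op has a single home, and both the functional activation and the Layer stay thin wrappers over it.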

https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/ops/nn_ops.py
https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/keras/layers/advanced_activations.py#L313
https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/keras/activations.py
