
How to remove Weight_Normalization? #763

@MachineJeff


Question

Hi!

I have successfully applied the weight normalization wrapper in my model, and the model runs without any bugs.

However, I want to know how to remove the weight normalization at inference time.

Just like layer normalization: as we all know, the layer normalization API has a parameter called is_training, so the normalization can be removed automatically.

In a PyTorch project called melgan

I read their PyTorch code and found that they remove weight norm at inference time. What's more, I have opened an issue in that project.
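
For context, this is the pattern I see in their inference code (a minimal sketch of what I understand melgan to be doing; the layer sizes here are made up):

import torch.nn as nn

# During training, the convolution is wrapped in weight normalization.
conv = nn.utils.weight_norm(nn.Conv1d(in_channels=80, out_channels=256, kernel_size=7))

# At inference time, the reparameterization g * v / ||v|| is folded back into
# a single .weight tensor and the weight-norm hooks are removed.
nn.utils.remove_weight_norm(conv)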

Key code in my model

import tensorflow as tf
# WeightNorm is the wrapper from the addons repository; group_Conv1D is my own grouped Conv1D layer.

def wn_conv1d(x, kernel_size, channels, scope, stride=1, pad='same', dilation=1, groups=1):
  with tf.variable_scope(scope):
    # Wrap the grouped convolution in weight normalization and apply it to the input.
    output = WeightNorm(group_Conv1D(filters=channels,
                                     kernel_size=kernel_size,
                                     strides=stride,
                                     padding=pad,
                                     dilation_rate=dilation,
                                     groups=groups))(x)
    return output
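
Since weight normalization only reparameterizes the kernel as w = g * v / ||v||, my rough idea for inference is to compute that product once and load it into a plain convolution instead of keeping the wrapper. A minimal sketch in TF2 eager mode (the helper name, layer sizes, and shapes here are my own assumptions, not the addons API):

import numpy as np
import tensorflow as tf

def fold_weight_norm(v, g):
  # Fold the weight-norm parameters into a single kernel: w = g * v / ||v||.
  # v: unnormalized kernel, shape (kernel_size, in_channels, out_channels)
  # g: per-output-channel scale, shape (out_channels,)
  norm = np.sqrt(np.sum(np.square(v), axis=(0, 1), keepdims=True))
  return g * v / norm

# Build a plain Conv1D and load the folded kernel, so inference runs without
# the extra normalization ops. v and g here stand in for the trained values.
plain_conv = tf.keras.layers.Conv1D(filters=256, kernel_size=3, padding='same')
plain_conv.build(input_shape=(None, None, 80))
v = plain_conv.kernel.numpy()
g = np.ones(256, dtype=v.dtype)
plain_conv.kernel.assign(fold_weight_norm(v, g))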

More

The WeightNorm wrapper is from this repository.

Haha, I found that the contributor is @seanpmorgan. You, sir, are a hero!

In my opinion, addons should be merged into TensorFlow rather than left as a standalone repository...
