Closed
Labels: Question
Hi!
I have successfully applied the weight normalization wrapper in my model, and the model runs without any bugs. However, I want to know how to remove the weight normalization at inference time. Layer normalization, for comparison, has an `is_training` parameter in its API, so the training-time behavior can be switched off automatically.
The PyTorch project melgan does this: I read its code and found that they remove the weight norm at inference time. I have also opened an issue in that project.
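For context, "removing" weight norm just means folding the reparameterization w = g · v / ‖v‖ into a single fixed kernel once, so inference no longer recomputes the norm on every forward pass. A minimal NumPy sketch of why the plain folded kernel gives identical outputs (the names `v`, `g` and all shapes here are illustrative, not taken from the model below):

```python
import numpy as np

rng = np.random.default_rng(0)

# Weight norm reparameterizes a kernel as w = g * v / ||v||.
v = rng.standard_normal(5)     # direction parameter
g = 2.5                        # learned magnitude
x = rng.standard_normal(32)    # input signal

def conv_with_weight_norm(x, v, g):
    # What a weight-normalized layer does on every forward pass.
    w = g * v / np.linalg.norm(v)
    return np.convolve(x, w, mode="valid")

# "Removing" weight norm: compute w once, keep only the plain kernel,
# and discard g and v afterwards.
w_folded = g * v / np.linalg.norm(v)

y_wrapped = conv_with_weight_norm(x, v, g)
y_plain = np.convolve(x, w_folded, mode="valid")
assert np.allclose(y_wrapped, y_plain)
```

After folding, the norm of the stored kernel equals g by construction, and the per-step normalization cost disappears.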
Key code in my model
```python
def wn_conv1d(x, kernel_size, channels, scope, stride=1, pad='same', dilation=1, groups=1):
    with tf.variable_scope(scope):
        output = WeightNorm(group_Conv1D(
            filters=channels,
            kernel_size=kernel_size,
            strides=stride,
            padding=pad,
            dilation_rate=dilation,
            groups=groups))(x)
    return output
```
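If the wrapper does not expose a removal helper, the fold can be done by hand on the trained variables. For a Conv1D kernel in TF's `(kernel_size, in_channels, out_channels)` layout, weight norm normalizes each output filter over the other two axes, so the folded kernel for a plain Conv1D can be computed as below (shapes and random values are placeholders; the real `v` and `g` would come from the trained wrapped layer's variables):

```python
import numpy as np

rng = np.random.default_rng(1)

# TF-style Conv1D kernel layout: (kernel_size, in_channels, out_channels).
kernel_size, in_ch, out_ch = 3, 4, 8
v = rng.standard_normal((kernel_size, in_ch, out_ch))  # direction
g = rng.uniform(0.5, 2.0, size=out_ch)                 # per-filter magnitude

# Normalize each output filter over the kernel and input-channel axes.
norms = np.sqrt((v ** 2).sum(axis=(0, 1), keepdims=True))
kernel = g * v / norms  # plain kernel for an ordinary Conv1D at inference

# Sanity check: after folding, each filter's norm is exactly its g.
filter_norms = np.linalg.norm(kernel.reshape(-1, out_ch), axis=0)
assert np.allclose(filter_norms, g)
```

The resulting `kernel` can then be loaded into an unwrapped conv layer (e.g. via `set_weights`), which is the same effect melgan's PyTorch code gets from `torch.nn.utils.remove_weight_norm`.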
The `WeightNorm` wrapper is from this repository.
Haha, I found that the contributor is @seanpmorgan; you sir are a hero!
In my opinion, addons should be merged into TensorFlow itself rather than maintained as a separate repository...