Conversation

Contributor

@sljlp sljlp commented Nov 11, 2021

Add FAQ for Optimizer

Contributor

@wangxicoding wangxicoding left a comment

LGTM


----------

##### Question: How can you define different optimization strategies for different parameters within the same optimizer, e.g., weight_decay of 0.0 for bias parameters and 0.01 for non-bias parameters?
Collaborator

@Ligoml Ligoml Nov 12, 2021

Please keep the punctuation consistent:
[screenshot]

Also, the numbering format is a bit off; the Markdown needs to be fixed.

Contributor Author

Fixed.

##### Question: How can you define different optimization strategies for different parameters within the same optimizer, e.g., weight_decay of 0.0 for bias parameters and 0.01 for non-bias parameters?

+ Answer:
- 1. The `apply_decay_param_fun` argument of `AdamW` can be used to select which parameters the weight_decay strategy applies to. See the AdamW optimizer's [API documentation](https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/optimizer/AdamW_cn.html#adamw).
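A minimal sketch of the pattern (the helper name and parameter names below are illustrative, not from the PR): `apply_decay_param_fun` receives a parameter's name and should return True when weight decay applies, so bias parameters can be excluded by name.

```python
def build_decay_filter(named_params):
    """Build an apply_decay_param_fun that decays only non-bias parameters.

    named_params: iterable of (attribute_name, param_name) pairs, e.g.
    [(n, p.name) for n, p in model.named_parameters()] in Paddle.
    """
    # Collect the internal names of all parameters whose attribute name
    # does not contain "bias"; only these will receive weight decay.
    decay_names = {pname for attr, pname in named_params if "bias" not in attr}
    return lambda pname: pname in decay_names

# Plain-Python demonstration with made-up parameter names:
params = [("fc.weight", "linear_0.w_0"), ("fc.bias", "linear_0.b_0")]
apply_decay = build_decay_filter(params)
print(apply_decay("linear_0.w_0"))  # True  -> weight_decay = 0.01 applies
print(apply_decay("linear_0.b_0"))  # False -> effectively weight_decay = 0.0

# With Paddle, this would be passed to AdamW roughly like so (sketch):
# opt = paddle.optimizer.AdamW(
#     learning_rate=0.001,
#     parameters=model.parameters(),
#     weight_decay=0.01,
#     apply_decay_param_fun=apply_decay,
# )
```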
Collaborator

API documentation links are usually added as inline hyperlinks at the point where the term appears; please make this consistent:
The `apply_decay_param_fun` argument of `AdamW` can be used to select which parameters the weight_decay strategy applies to.


----------

#### Question: How do you define a custom optimizer in paddle fluid, with custom rules for updating model parameters?
Collaborator

As above, the format needs to be unified:
[screenshot]

Contributor Author

[screenshot]
The numbering displays correctly on my side.

Collaborator

@Ligoml Ligoml Nov 12, 2021

The official site's rendering may differ from local Markdown; let's go by the preview on the official site.

Contributor Author

[screenshot]
Fixed.

#### Question: How do you define a custom optimizer in paddle fluid, with custom rules for updating model parameters?
+ Answer:
- 1. To define an entirely new optimizer with custom parameter-update rules, subclass fluid.Optimizer and override the _append_optimize_op method. The implementation details differ between optimizers, but the general flow is: obtain the learning_rate, the gradients, the trainable parameters, and any parameters specific to the optimizer itself; implement the code that updates the parameters; and finally return the updated parameters. For sample code, see the implementations of AdamOptimizer and the other optimizers in the [Paddle source](https://github.com/PaddlePaddle/Paddle/blob/develop/python/paddle/fluid/optimizer.py).
- 2. With an existing common optimizer, you can control a parameter's attributes by setting `ParamAttr` when creating the `Param`; simple update rules can be configured through settings such as `regularizer` and `learning_rate`. See the [ParamAttr documentation](https://www.paddlepaddle.org.cn/documentation/docs/zh/api/paddle/ParamAttr_cn.html#paramattr).
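The update math that an overridden `_append_optimize_op` expresses with Paddle ops can be illustrated in plain Python. A minimal sketch of a vanilla SGD rule (the function name and list-of-floats representation are illustrative only; a real subclass would build the equivalent graph ops and return the resulting op):

```python
def sgd_update(params, grads, learning_rate):
    """The parameter-update rule a custom _append_optimize_op would encode:

        param_new = param - learning_rate * grad

    Here params and grads are plain lists of floats for illustration;
    in a real optimizer these would be tensors / graph variables.
    """
    return [p - learning_rate * g for p, g in zip(params, grads)]

params = [1.0, 2.0, 3.0]
grads = [0.5, 0.5, 0.5]
print(sgd_update(params, grads, learning_rate=0.1))  # approximately [0.95, 1.95, 2.95]
```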
Collaborator

Content-wise, I suggest explaining how to customize an optimizer in both dynamic-graph and static-graph modes, so this FAQ can help more users~

Contributor Author

@sljlp sljlp Nov 12, 2021

For user-level development, the dynamic-graph and static-graph code can be the same.
The exception is rewriting the C++ code yourself to develop low-level operators and recompile Paddle, which would be quite cumbersome for users.

Contributor Author

Added a note on the caveats of using native operators.

@Ligoml Ligoml merged commit a72b88d into PaddlePaddle:develop Nov 12, 2021
sljlp added a commit to sljlp/docs that referenced this pull request Nov 12, 2021
@sljlp sljlp deleted the add_faq_for_opt branch November 12, 2021 10:48
Ligoml added a commit that referenced this pull request Nov 16, 2021