From a1f9eaad64a2ad7ae8827d6c2e987affdffec71b Mon Sep 17 00:00:00 2001
From: tink2123
Date: Tue, 18 Sep 2018 18:44:17 +0800
Subject: [PATCH] update paddle changes and add api_guides folder

---
 .../low_level/optimizer/optimizer_all.rst | 106 ++++++++++++++++++
 doc/fluid/dev/versioning_en.md            |  66 +++++++++++
 2 files changed, 172 insertions(+)
 create mode 100644 doc/fluid/api_guides/low_level/optimizer/optimizer_all.rst
 create mode 100644 doc/fluid/dev/versioning_en.md

diff --git a/doc/fluid/api_guides/low_level/optimizer/optimizer_all.rst b/doc/fluid/api_guides/low_level/optimizer/optimizer_all.rst
new file mode 100644
index 0000000000000..d5803d8652ead
--- /dev/null
+++ b/doc/fluid/api_guides/low_level/optimizer/optimizer_all.rst
@@ -0,0 +1,106 @@
.. _api_guide_optimizer:


Optimizer
#########

Training a neural network is ultimately an optimization problem: after the
forward computation and back-propagation have run, the :code:`Optimizer`
uses the back-propagated gradients to optimize the parameters of the
network.

1. SGD/SGDOptimizer
-------------------

:code:`SGD` is an :code:`Optimizer` subclass that implements stochastic
gradient descent, one method in the larger family of gradient descent
algorithms. When there are many training samples, :code:`SGD` is usually
chosen because it makes the loss function converge more quickly.

API Reference: see api_fluid_optimizer_SGDOptimizer_

.. _api_fluid_optimizer_SGDOptimizer: http://www.paddlepaddle.org/docs/0.14.0/api/fluid/en/optimizer.html#permalink-8-sgdoptimizer
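All of the optimizers in this guide, :code:`SGD` included, are used in the
same way: build a loss variable, construct the optimizer, and call its
:code:`minimize` method to append the backward pass and the parameter
updates to the program. Below is a minimal sketch against the Fluid API;
the one-layer regression network is only an illustrative placeholder.

.. code-block:: python

    import paddle.fluid as fluid

    # A toy regression network: one fully-connected layer.
    x = fluid.layers.data(name='x', shape=[13], dtype='float32')
    y = fluid.layers.data(name='y', shape=[1], dtype='float32')
    y_predict = fluid.layers.fc(input=x, size=1)
    cost = fluid.layers.square_error_cost(input=y_predict, label=y)
    avg_cost = fluid.layers.mean(cost)

    # Construct the optimizer and let it append the backward pass and
    # the parameter-update operators for avg_cost.
    sgd_optimizer = fluid.optimizer.SGD(learning_rate=0.001)
    sgd_optimizer.minimize(avg_cost)

Switching to any other optimizer in this guide only changes the
construction line, for example
:code:`fluid.optimizer.MomentumOptimizer(learning_rate=0.001, momentum=0.9)`.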
2. Momentum/MomentumOptimizer
-----------------------------

The :code:`Momentum` optimizer adds momentum on top of :code:`SGD`, which
reduces the noise inherent in stochastic gradient descent. Setting the
:code:`use_nesterov` parameter to False or True selects the classic
Momentum algorithm (Section 4.1 of the paper) or the Nesterov accelerated
gradient algorithm (Section 4.2 of the paper), respectively.

API Reference: see api_fluid_optimizer_MomentumOptimizer_

.. _api_fluid_optimizer_MomentumOptimizer: http://www.paddlepaddle.org/docs/0.14.0/api/fluid/en/optimizer.html#permalink-9-momentumoptimizer

3. Adagrad/AdagradOptimizer
---------------------------

The Adagrad optimizer adaptively assigns each parameter its own learning
rate, addressing the problem of different parameters receiving unevenly
many samples.

API Reference: see api_fluid_optimizer_AdagradOptimizer_

.. _api_fluid_optimizer_AdagradOptimizer: http://www.paddlepaddle.org/docs/0.14.0/api/fluid/en/optimizer.html#permalink-10-adagradoptimizer

4. RMSPropOptimizer
-------------------

The RMSProp optimizer adapts the learning rate during training; it mainly
addresses the sharp decay of the learning rate in the middle and late
stages of training that occurs with Adagrad.

API Reference: see api_fluid_optimizer_RMSPropOptimizer_

.. _api_fluid_optimizer_RMSPropOptimizer: http://www.paddlepaddle.org/docs/0.14.0/api/fluid/en/optimizer.html#permalink-14-rmspropoptimizer

5. Adam/AdamOptimizer
---------------------

The Adam optimizer also adapts the learning rate and suits most non-convex
objectives, large datasets, and high-dimensional spaces. In practice,
:code:`Adam` is the most commonly used optimizer.

API Reference: see api_fluid_optimizer_AdamOptimizer_

.. _api_fluid_optimizer_AdamOptimizer: http://www.paddlepaddle.org/docs/0.14.0/api/fluid/en/optimizer.html#permalink-11-adamoptimizer


6. Adamax/AdamaxOptimizer
-------------------------

Adamax is a variant of the :code:`Adam` algorithm that places a simpler
bound on the learning rate.

API Reference: see api_fluid_optimizer_AdamaxOptimizer_

.. _api_fluid_optimizer_AdamaxOptimizer: http://www.paddlepaddle.org/docs/0.14.0/api/fluid/en/optimizer.html#permalink-12-adamaxoptimizer


7. DecayedAdagrad/DecayedAdagradOptimizer
-----------------------------------------

The DecayedAdagrad optimizer can be seen as the :code:`Adagrad` algorithm
with a decay rate added; it addresses the sharp drop of the learning rate
in the middle and late stages of training under Adagrad.

API Reference: see api_fluid_optimizer_DecayedAdagrad_

.. _api_fluid_optimizer_DecayedAdagrad: http://www.paddlepaddle.org/docs/0.14.0/api/fluid/en/optimizer.html#permalink-13-decayedadagradoptimizer


8. Ftrl/FtrlOptimizer
---------------------

The FtrlOptimizer combines the high accuracy of the FOBOS algorithm with
the sparsity of the RDA algorithm, and is currently one of the most
effective Online Learning algorithms.

API Reference: see api_fluid_optimizer_FtrlOptimizer_

.. _api_fluid_optimizer_FtrlOptimizer: http://www.paddlepaddle.org/docs/0.14.0/api/fluid/en/optimizer.html#permalink-15-ftrloptimizer

9. ModelAverage
---------------

The :code:`ModelAverage` optimizer accumulates historical parameters over a
sliding window during training, and replaces the parameters with their
averaged values at inference time to improve overall prediction accuracy.

API Reference: see api_fluid_optimizer_ModelAverage_

.. _api_fluid_optimizer_ModelAverage: http://www.paddlepaddle.org/docs/0.14.0/api/fluid/en/optimizer.html#permalink-17-modelaverage


10. Optimizer
-------------

:code:`Optimizer` is the base class of all optimizers in :code:`Fluid`.
Its role is to define the common optimizer interface through which the
classic algorithms above are invoked.

API Reference: see api_fluid_optimizer_

.. _api_fluid_optimizer: http://www.paddlepaddle.org/docs/0.14.0/api/fluid/en/optimizer.html#permalink-18-optimizer
diff --git a/doc/fluid/dev/versioning_en.md b/doc/fluid/dev/versioning_en.md
new file mode 100644
index 0000000000000..f15fd029dc92e
--- /dev/null
+++ b/doc/fluid/dev/versioning_en.md
@@ -0,0 +1,66 @@
# Versioning (Work In Progress)


The PaddlePaddle framework follows Semantic Versioning 2.0 (semver).
Each release has a version of the form MAJOR.MINOR.PATCH
(e.g. 1.2.0). Some key points:


 * A major version change can introduce backward-incompatible changes. Code that works with an old version does not necessarily work with the new one. In addition, data generated by the previous major version, such as Program models and checkpointed parameters, might not work with the new version. We will attempt to build tools to help with release migration.

 * A minor version change always maintains backward compatibility. It normally contains compatible improvements and bug fixes.

 * A patch version change is for bug fixes only.

 * Violations of this policy are considered bugs and should be fixed.

## What is Covered

* All public, documented Python APIs, excluding those that live in the contrib namespace.

## What is Not Covered

* If an API's implementation has bugs, we reserve the right to fix the bugs and change the behavior.

* The Python APIs in the contrib namespace.

* Python functions and classes whose names start with `_`.

* The offline tools.

* Data generated by the framework, such as serialized Program model files and checkpointed variables, which follow the separate versioning scheme described below.

* C++ Inference APIs. (To be covered.)


## Data


Data refers to the artifacts generated by the framework. Here, we specifically mean the Program model file and the checkpointed variables.


* Backward compatibility: a user generates Data with PaddlePaddle version 1.1 and expects it to be consumed by PaddlePaddle version 1.2.
  This can happen when a new online system wants to serve an old model trained earlier.

* Forward compatibility: a user generates Data with PaddlePaddle version 1.2 and expects it to be consumed by PaddlePaddle version 1.1.
  This can happen when a successful new research model needs to be served by an old online system that is not frequently upgraded.


### Versioning

Data is assigned an integer version number, which is increased whenever an incompatible change is introduced.

The PaddlePaddle framework supports an interval of Data versions. Releases within the same major version (semver) cannot drop support for lower Data versions. Hence, a minor version change cannot drop support for a Data version.

For example, suppose PaddlePaddle version 1.1 supports Program versions 3 to 5. Later, the Program version is increased from 5 to 6 because an attribute is added, so PaddlePaddle version 1.1 cannot consume version-6 Programs. PaddlePaddle 1.2 should then support Program versions 3 to 6, and support for Program version 3 can only be dropped at PaddlePaddle version 2.0.
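The supported-interval rule in this example can be made concrete with a small sketch. The following is purely illustrative; the table and the function name are invented for this document and are not part of PaddlePaddle:

```python
# Purely illustrative: each release supports an interval of Program versions.
SUPPORTED_PROGRAM_VERSIONS = {
    "1.1": range(3, 6),  # Program versions 3..5
    "1.2": range(3, 7),  # Program versions 3..6
}


def can_consume(paddle_version, program_version):
    """True if the given PaddlePaddle release can load this Program version."""
    return program_version in SUPPORTED_PROGRAM_VERSIONS[paddle_version]


assert can_consume("1.2", 6)
assert not can_consume("1.1", 6)  # 1.1 predates Program version 6
```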
### Known Issues

Currently, forward compatibility with new Data versions is best-effort.
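As a hypothetical illustration of what best-effort means here, an older loader may simply skip attributes it does not recognize, with no guarantee that the model's semantics survive. The names below are invented for this sketch; PaddlePaddle's real deserializer operates on protobuf-encoded Programs, not plain dictionaries:

```python
# Hypothetical sketch only; not PaddlePaddle's actual implementation.
KNOWN_ATTRIBUTES = {"shape", "dtype"}


def load_op_attrs(serialized_attrs):
    """Keep the attributes this release understands; drop unknown ones."""
    attrs = {}
    for name, value in serialized_attrs.items():
        if name in KNOWN_ATTRIBUTES:
            attrs[name] = value
        # An unknown attribute written by a newer Data version is skipped:
        # best-effort loading, with no semantic guarantee.
    return attrs


# An op serialized by a newer release with an extra attribute:
print(load_op_attrs({"shape": [3, 4], "dtype": "float32", "use_mkldnn": True}))
# -> {'shape': [3, 4], 'dtype': 'float32'}
```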