Fixing typos, grammatical and spelling mistakes in Refactorization Overview document #4479
Conversation
dzhwinter
left a comment
LGTM!
Great job enhancing our refactoring document!
```diff
 1. A graph is composed of *variables* and *operators*.
-1. The description of graphs must be able to be serialized/deserialized, so it
+1. The description of graphs must be capable of being serialized/deserialized, so that
```
```diff
 1. The Python program does the following steps
-1. *compilation*: runs a Python program to generate a protobuf message representation of the graph and send it to
+1. *compilation*: run a Python program to generate a protobuf message representation of the graph and send it to
```
```diff
 1. It can be sent to clients for mobile or enterprise deployment.
-1. The Python program do
+1. The Python program does the following steps
```
The Python program does two things:
```diff
 1. the master process of a distributed training job for training, or
 1. the server process of a Kubernetes serving job for distributed serving.
-1. *execution*: according to the protobuf message, constructs instances of class `Variable` and `OperatorBase`, and run them.
+1. *execution*: execute the graph by constructing instances of class [`Variable`](https://github.com/PaddlePaddle/Paddle/blob/develop/paddle/framework/variable.h#L24) and [`OperatorBase`](https://github.com/PaddlePaddle/Paddle/blob/develop/paddle/framework/operator.h#L70), according to the protobuf message.
```
*Execution* executes the graph
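The compile-then-execute split this hunk describes can be sketched in miniature. Everything below is a hypothetical stand-in, not Paddle's real API: JSON stands in for the protobuf message, a dict stands in for the scope of variables, and the tiny registry stands in for `OperatorBase` construction.

```python
import json

# -- "compilation": build a serializable description of the graph --
graph_desc = {
    "vars": ["x", "y"],
    "ops": [{"type": "scale", "input": "x", "output": "y",
             "attrs": {"factor": 2.0}}],
}
message = json.dumps(graph_desc)  # stands in for the protobuf message

# hypothetical registry mapping op type -> op implementation
OP_REGISTRY = {
    "scale": lambda scope, op: scope[op["input"]] * op["attrs"]["factor"],
}

def execute(message):
    """'Execution': reconstruct operators from the message and run them."""
    desc = json.loads(message)
    scope = {"x": 3.0}  # input variable
    for op in desc["ops"]:
        scope[op["output"]] = OP_REGISTRY[op["type"]](scope, op)
    return scope

scope = execute(message)
print(scope["y"])  # 6.0
```

The point of the split is that `message` is plain data: it can be built on one machine and executed on another, which is what makes the cloud and mobile deployment scenarios above possible.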
```diff
 ## Description and Realization of Computation Graph
-At compile time, the Python program generates protobuf message representation of the graph, or the description of the graph.
+At compile time, the Python program generates a protobuf message representation of the graph, or the description of the graph.
```
a description of the graph.
```diff
 1. Write an Op class and its gradient Op class, if required.
 2. Write an Op maker class. In the constructor of this class, describe the inputs, outputs and attributes of the operator.
 3. Invoke the macro `REGISTER_OP`. This macro will
    1. Call maker class to complete the `proto` and the `checker`
    2. Using the completed `proto` and `checker`, it will add a new key-value pair to the `OpInfoMap`
```
Add a new key-value pair to OpInfoMap using the completed proto and checker.
```diff
    2. Using the completed `proto` and `checker`, it will add a new key-value pair to the `OpInfoMap`
-4. Invoke `USE` macro in where the Op is used to make sure it is linked.
+4. Invoke the `USE` macro in which the Op is used, to make sure that it is linked.
```
Invoke the USE macro where an Op is used to make sure that it is linked.
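The registration mechanism quoted above can be sketched roughly as follows. The real `REGISTER_OP`, maker classes, and `OpInfoMap` are C++; this is only the shape of the mechanism, with hypothetical Python names standing in.

```python
op_info_map = {}  # stands in for the OpInfoMap registry

class AddOpMaker:
    """Maker class: describes the inputs, outputs and attributes of the op."""
    def make(self):
        proto = {"type": "add", "inputs": ["X", "Y"], "outputs": ["Out"]}
        checker = lambda attrs: True  # no attributes to validate here
        return proto, checker

def register_op(op_type, maker_cls):
    """Stands in for the REGISTER_OP macro."""
    # 1. Call the maker class to complete the `proto` and the `checker`.
    proto, checker = maker_cls().make()
    # 2. Using the completed `proto` and `checker`, add a new
    #    key-value pair to the registry.
    op_info_map[op_type] = {"proto": proto, "checker": checker}

register_op("add", AddOpMaker)
print("add" in op_info_map)  # True
```

The `USE` macro in step 4 has no Python analogue: it exists because C++ linkers may drop object files whose registration code is never referenced, so a use site must name the op explicitly to force the link.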
```diff
 - shared variable => insert `Add` operator
 - no gradient => insert `fill_zero_grad` operator
 - recursive netOp => call `Backward` recursively
 - **Input**: graph of forwarding operators
```
```diff
 * `Scope` has a hierarchical structure. The local scope can get variable from its parent scope.
 * All operations on `Tensor` are written in `Operator` or global functions.
 * Variable length Tensor design [LoDTensor](https://github.com/PaddlePaddle/Paddle/blob/develop/paddle/framework/lod_tensor.md)
 * `Variable` instances are the inputs and the outputs of an operator. Not just `Tensor`.
```
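The hierarchical scope lookup in the first bullet might look roughly like this (a hypothetical sketch, not Paddle's real C++ `Scope` class):

```python
class Scope:
    """A scope holds variables and may delegate lookups to a parent scope."""
    def __init__(self, parent=None):
        self.parent = parent
        self.vars = {}

    def new_var(self, name, value):
        self.vars[name] = value

    def find_var(self, name):
        if name in self.vars:
            return self.vars[name]
        if self.parent is not None:
            # local scope gets the variable from its parent scope
            return self.parent.find_var(name)
        return None

global_scope = Scope()
global_scope.new_var("w", 1.0)
local_scope = Scope(parent=global_scope)
local_scope.new_var("tmp", 2.0)

print(local_scope.find_var("w"))     # 1.0, found in the parent scope
print(global_scope.find_var("tmp"))  # None, parents cannot see child scopes
```

This one-way visibility is what lets an RNN give each time step its own `step_scope` for temporaries while still sharing the parameters that live in an outer scope.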
pengli09
left a comment
One critical error and a few potential typos.
```diff
-1. could to be sent to the cloud for distributed execution, and
+1. It can to be sent to the cloud for distributed execution, and
 1. be sent to clients for mobile or enterprise deployment.
```
```diff
 1. Infer the type and the shape of variables,
 1. Plan memory-reuse for variables,
 1. Generate the backward graph
 1. Optimize the computation graph.
```
Here "optimization" refers to the optimization operators such as Adam, which are also part of the graph. Therefore, "Optimize the computation graph." is incorrect.
```diff
 * Use `Run` to compute `input variables` to `output variables`.
 * `Operator` is the fundamental building block of the user interface.
 * Operator stores input/output variable names, and attributes.
 * The `InferShape` interface is used to infer the shape of the output variable shapes based on the shapes of the input variables.
```
s/the output variable shapes/the output variables/g
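What `InferShape` does conceptually can be shown with a toy shape-inference function for a matrix-multiplication operator. This is a hypothetical illustration, not the real `InferShape` interface:

```python
def infer_matmul_shape(x_shape, y_shape):
    """Infer the output shape of (m, k) x (k, n) -> (m, n)."""
    m, k1 = x_shape
    k2, n = y_shape
    # the inner dimensions of the two inputs must agree
    assert k1 == k2, "inner dimensions must match"
    return (m, n)

print(infer_matmul_shape((2, 3), (3, 4)))  # (2, 4)
```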
```diff
 * Variable length Tensor design [LoDTensor](https://github.com/PaddlePaddle/Paddle/blob/develop/paddle/framework/lod_tensor.md)
 * `Variable` instances are the inputs and the outputs of an operator. Not just `Tensor`.
 * `step_scopes` in RNN is a variable and not a tensor.
 * `Scope` is where variables are stores.
 * map<string `variable_name`, Variable>
```
I think `/* var name */` is better.
```diff
 - take the documentation quality into account when doing PR
 - preview the documentations, read and improve them from users' perspective
 - Compare the performance of migrated models with old ones.
 - Follow the google C++ style
```
s/style/style./g
Note the period.