
Adding interface for the adagrad optimizer #4977

Merged

kexinzhao merged 2 commits into PaddlePaddle:develop from kexinzhao:python_adagrad on Oct 21, 2017

Adding interface for the adagrad optimizer#4977
kexinzhao merged 2 commits intoPaddlePaddle:developfrom
kexinzhao:python_adagrad

Conversation

@kexinzhao (Contributor):

fixes #4916

"""
_moment_acc_str = "moment"

def __init__(self, learning_rate, epsilon):
Contributor:

You can give epsilon a default value.

Contributor:

Also, if the original paper mentions a default learning rate for the Adagrad optimizer, you can add that too.

Contributor (Author):

Done. The original paper does not specify a default learning rate, so I only added a default value for epsilon.
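The constructor discussed above ends up with a required learning rate and a defaulted epsilon. As a rough illustration of why those two parameters are all Adagrad needs, here is a minimal standalone sketch of the update rule in plain Python. This is not the Paddle operator code from the PR; the class name, the `update` helper, and the epsilon default of 1.0e-6 are assumptions for illustration:

```python
import math

class AdagradSketch:
    """Hypothetical standalone sketch of the Adagrad update rule.

    Mirrors the interface discussed in the review: learning_rate is
    required (the paper gives no default), epsilon gets a default.
    """

    def __init__(self, learning_rate, epsilon=1.0e-6):
        self._learning_rate = learning_rate
        self._epsilon = epsilon

    def update(self, params, grads, moments):
        # moment <- moment + grad^2        (accumulated squared gradients)
        # param  <- param - lr * grad / (sqrt(moment) + epsilon)
        for i, g in enumerate(grads):
            moments[i] += g * g
            params[i] -= self._learning_rate * g / (
                math.sqrt(moments[i]) + self._epsilon
            )
        return params, moments
```

A single step on one parameter shows the accumulator at work: with `learning_rate=0.1`, a gradient of 0.5 grows the moment to 0.25, and the effective step is scaled down by `sqrt(0.25) + epsilon`. The `moment` accumulator corresponds to the `_moment_acc_str = "moment"` accumulator named in the snippet under review.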

@abhinavarora (Contributor) left a comment:

LGTM. Thank you. The PR looks great.

@kexinzhao kexinzhao merged commit 5fd4bee into PaddlePaddle:develop Oct 21, 2017
@kexinzhao kexinzhao deleted the python_adagrad branch October 21, 2017 00:17

Labels: none yet · 2 participants