
Feature/lstm#2

Merged
dzhwinter merged 4 commits into master from feature/lstm
Dec 5, 2017
Conversation

dzhwinter (Owner) commented Dec 1, 2017

Add static RNN benchmark scripts (book chapter 6).
Fixes PaddlePaddle/Paddle#6156

    '--use_cprof', action='store_true', help='If set, use cProfile.')
parser.add_argument(
    '--use_nvprof',
    action='store_false',
Collaborator

store_false -> store_true. Disable use_nvprof by default.

Owner Author

done.
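The agreed-upon behavior (profiling disabled by default, enabled only when the flag is passed) comes from argparse's `store_true` action. A minimal sketch of the two flags discussed above; the flag names match the snippet, the rest (description, help text for `--use_nvprof`) is illustrative:

```python
import argparse

# With action='store_true', each flag defaults to False and becomes
# True only when explicitly passed on the command line.
parser = argparse.ArgumentParser(description='LSTM benchmark')
parser.add_argument(
    '--use_cprof', action='store_true', help='If set, use cProfile.')
parser.add_argument(
    '--use_nvprof', action='store_true',
    help='If set, profile CUDA kernels with nvprof.')

args = parser.parse_args([])  # no flags given: both profilers disabled
print(args.use_cprof, args.use_nvprof)  # False False
```

With the original `store_false`, `use_nvprof` would have defaulted to True, silently turning nvprof on for every run, which is what the review caught.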


c_pre_init = fluid.layers.fill_constant(
    dtype=emb.dtype, shape=[batch_size, emb_dim], value=0.0)
layer_1_out = fluid.layers.lstm(
Collaborator

Maybe we can add an argument that allows stacking multiple LSTM layers? Here:

lstm_var = emb
for i in range(arg.lstm_num):
    lstm_var = fluid.layers.lstm(lstm_var)
Owner Author

fixed.


for i in range(stacked_num):
    layer_1_out = fluid.layers.lstm(
        layer_1_out, c_pre_init=c_pre_init, hidden_dim=emb_dim)
    layer_1_out = fluid.layers.transpose(x=layer_1_out, axis=[1, 0, 2])
Collaborator

The fluid.layers.transpose should be outside of the for loop.

Owner Author

fixed.
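The corrected control flow (LSTM layers chained in a loop, transpose applied once after the whole stack) can be illustrated with shape-only stand-ins. `lstm_stub` and `transpose_stub` below are hypothetical placeholders for `fluid.layers.lstm` and `fluid.layers.transpose`, tracking only tensor shapes, not real ops:

```python
def lstm_stub(shape, hidden_dim):
    # Placeholder for fluid.layers.lstm: (seq, batch, in) -> (seq, batch, hidden).
    seq_len, batch, _ = shape
    return (seq_len, batch, hidden_dim)

def transpose_stub(shape, axis):
    # Placeholder for fluid.layers.transpose: permutes the shape by `axis`.
    return tuple(shape[a] for a in axis)

stacked_num = 3
emb_shape = (10, 4, 8)  # (seq_len, batch_size, emb_dim)

out = emb_shape
for _ in range(stacked_num):
    # Each layer consumes the previous layer's output.
    out = lstm_stub(out, hidden_dim=8)

# Transpose once, after the whole stack, as the review requires.
out = transpose_stub(out, axis=(1, 0, 2))  # -> (batch, seq, hidden)
print(out)  # (4, 10, 8)
```

Keeping the transpose inside the loop would flip the layout to batch-major after layer one, so every later LSTM layer would receive a tensor in the wrong layout; hoisting it out applies the permutation exactly once.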

dzhwinter merged commit 0f93e59 into master Dec 5, 2017