"add tensorflow lstm"#4

Merged
dzhwinter merged 5 commits into master from feature/tensorflow_lstm on Dec 5, 2017
Conversation

@dzhwinter (Owner)
cost = tf.nn.softmax_cross_entropy_with_logits(prediction, label)
avg_cost = tf.reduce_mean(cost)
adam_optimizer = tf.train.AdamOptimizer(learning_rate=0.002)
train_op = adam_optimizer.minimizer(avg_cost)
Collaborator
minimizer => minimize?

Owner Author

Done.
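For reference: in TensorFlow 1.x, `Optimizer.minimize` (the corrected name) is shorthand for computing gradients and applying an update step. A pure-Python sketch of that step, with illustrative names and no TensorFlow dependency:

```python
# Pure-Python sketch of what Optimizer.minimize does conceptually:
# compute the gradient of the loss, then apply one update step.
# All names here are illustrative, not TensorFlow API.

def minimize_step(loss_grad, params, learning_rate=0.002):
    """One gradient-descent update: params <- params - lr * grad."""
    grads = loss_grad(params)
    return [p - learning_rate * g for p, g in zip(params, grads)]

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
grad = lambda params: [2.0 * (params[0] - 3.0)]

x = [0.0]
for _ in range(5000):
    x = minimize_step(grad, x, learning_rate=0.01)

print(round(x[0], 3))  # converges toward 3.0
```

The real `minimize` additionally increments `global_step` and lets you restrict `var_list`; this sketch only shows the update itself.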


initial_state = lstm_cell.zero_state(batch_size, dtype=tf.float32)
outputs, states = rnn.static_rnn(
    lstm_cell, lstm_input, dtype=tf.float32)
Collaborator

This looks like only one LSTM layer. Can it be set up as a stacked, multi-layer LSTM?

Owner Author

Done.
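In TF 1.x, stacking is typically done by wrapping cells in `tf.nn.rnn_cell.MultiRNNCell`. As a pure-Python sketch of what stacking means (scalar states and made-up weights, not the PR's actual code), each layer's hidden output is fed as the next layer's input:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h, c, w):
    """One LSTM cell step for scalar input/state (w holds per-gate weights)."""
    i = sigmoid(w["wi"] * x + w["ui"] * h + w["bi"])    # input gate
    f = sigmoid(w["wf"] * x + w["uf"] * h + w["bf"])    # forget gate
    o = sigmoid(w["wo"] * x + w["uo"] * h + w["bo"])    # output gate
    g = math.tanh(w["wg"] * x + w["ug"] * h + w["bg"])  # candidate cell
    c_new = f * c + i * g
    h_new = o * math.tanh(c_new)
    return h_new, c_new

def stacked_lstm(inputs, num_layers, w):
    """Run a stack: layer k's hidden output is layer k+1's input."""
    states = [(0.0, 0.0)] * num_layers  # (h, c) per layer, zero-initialized
    outputs = []
    for x in inputs:
        for k in range(num_layers):
            h, c = lstm_step(x, *states[k], w)
            states[k] = (h, c)
            x = h  # feed this layer's output into the next layer
        outputs.append(x)
    return outputs, states

# Illustrative weights shared across layers for brevity.
w = dict(wi=0.5, ui=0.1, bi=0.0, wf=0.5, uf=0.1, bf=1.0,
         wo=0.5, uo=0.1, bo=0.0, wg=0.5, ug=0.1, bg=0.0)
outputs, states = stacked_lstm([1.0, -1.0, 0.5], num_layers=2, w=w)
print(len(outputs), len(states))  # 3 2
```

A real implementation uses vector states and per-layer weights; the point here is only the composition pattern that `MultiRNNCell` automates.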

train_reader = paddle.batch(
    paddle.reader.shuffle(
        paddle.dataset.imdb.train(word_dict),
        buf_size=FLAGS.batch_size * 10),
@qingqing01 (Collaborator) Dec 5, 2017
The IMDB training set has 25,000 samples in total, so set buf_size to 25000. Increasing buf_size should speed up the reader.

Copy link
Owner Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

I don't think it makes any difference, and the factor introduced by the buffer should be removed in our validation scripts anyway.
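For context, `paddle.reader.shuffle` shuffles within a bounded buffer, so `buf_size` controls both mixing quality and memory use. A simplified pure-Python sketch of that windowed shuffle (illustrative, not Paddle's actual implementation, which wraps the reader in a decorator):

```python
import random

def shuffle_reader(reader, buf_size, rng=random.Random(0)):
    """Windowed shuffle: fill a buffer of buf_size samples, shuffle it,
    yield it, refill. A sample can only move within its own window,
    so buf_size bounds how well the stream is mixed."""
    buf = []
    for sample in reader:
        buf.append(sample)
        if len(buf) >= buf_size:
            rng.shuffle(buf)
            yield from buf
            buf = []
    if buf:  # flush the final partial buffer
        rng.shuffle(buf)
        yield from buf

data = list(range(10))
out = list(shuffle_reader(iter(data), buf_size=4))
print(sorted(out) == data)  # True: same samples, window-shuffled order
```

Under this scheme a larger buf_size only changes how far samples can move, not which samples are read, which is consistent with expecting no accuracy difference from the setting.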

@dzhwinter dzhwinter merged commit 81461bc into master Dec 5, 2017
