RNN & LSTM

RNN

Understanding the Internals of an RNN

  • We can reason about an RNN much as we do about feed-forward neural networks (FFNNs)

  • Backpropagation through time (BPTT)

    For n layers, you can check here
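
To make the recurrence concrete, here is a minimal sketch of a vanilla (Elman-style) RNN cell unrolled over time. Because the same weights are reused at every step, calling backward() propagates gradients through every time step, which is exactly what BPTT does. The sizes and weight names are illustrative assumptions, not values from this post.

    import torch

    # Vanilla RNN recurrence: h_t = tanh(W_x x_t + W_h h_{t-1} + b)
    # (sizes are illustrative)
    input_dim, hidden_dim, seq_length = 4, 3, 5

    W_x = torch.randn(hidden_dim, input_dim, requires_grad=True)
    W_h = torch.randn(hidden_dim, hidden_dim, requires_grad=True)
    b = torch.zeros(hidden_dim, requires_grad=True)

    x = torch.randn(seq_length, input_dim)  # one sequence, no batch dim
    h = torch.zeros(hidden_dim)             # initial hidden state

    for t in range(seq_length):
        # the same weights are reused at every time step
        h = torch.tanh(W_x @ x[t] + W_h @ h + b)

    # gradients flow back through the whole unrolled graph (BPTT)
    h.sum().backward()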

LSTM

Word Embeddings

Basics

  • Reduce the dimensionality of text data.
  • Learn some interesting traits about words in a vocabulary.

Embedding Weight Matrix / Lookup Table

  • An embedding can greatly improve a network's ability to learn from text data by representing that data as lower-dimensional vectors.


  • Embedding lookup
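
The lookup itself is just row selection from the weight matrix. Here is a minimal sketch using PyTorch's nn.Embedding; the vocabulary size and embedding dimension are arbitrary placeholders.

    import torch
    import torch.nn as nn

    vocab_size, embedding_dim = 10, 4
    embedding = nn.Embedding(vocab_size, embedding_dim)  # the lookup table

    token_ids = torch.tensor([2, 5, 2])  # integer-encoded words
    vectors = embedding(token_ids)       # shape: (3, embedding_dim)

    # the lookup is equivalent to indexing rows of the weight matrix
    assert torch.equal(vectors, embedding.weight[token_ids])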

Word2Vec

The Word2Vec algorithm learns much more efficient representations by finding vectors that represent words. These vectors also carry semantic information about the words.

  • Two architectures for implementing Word2Vec: CBOW (Continuous Bag-of-Words) and Skip-gram; a Skip-gram sketch follows below.
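
As a rough illustration of the second architecture, here is a minimal Skip-gram sketch: embed a center word and predict a context word over the vocabulary. The class name, sizes, and plain softmax loss are assumptions for clarity; real Word2Vec implementations use tricks such as negative sampling instead.

    import torch
    import torch.nn as nn

    class SkipGram(nn.Module):
        def __init__(self, vocab_size, embedding_dim):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embedding_dim)
            self.output = nn.Linear(embedding_dim, vocab_size)

        def forward(self, center_ids):
            x = self.embed(center_ids)  # (batch, embedding_dim)
            return self.output(x)       # scores over the vocabulary

    model = SkipGram(vocab_size=100, embedding_dim=16)
    center = torch.randint(0, 100, (8,))   # batch of center-word ids
    context = torch.randint(0, 100, (8,))  # matching context-word ids
    loss = nn.functional.cross_entropy(model(center), context)
    loss.backward()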

CODING

  1. Before feeding data into the FC layer, you generally need to "flatten" it to a lower-dimensional shape
    # reshape the output to (batch_size*seq_length, hidden_dim)
    r_out = r_out.view(-1, self.hidden_dim)
    # sometimes you need contiguous() first, because view() requires
    # a contiguous memory layout
    r_out = r_out.contiguous().view(-1, self.hidden_dim)
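
For context, here is a sketch of where this reshape sits in a full forward pass; the module name, layer sizes, and use of nn.RNN are assumptions, not code from this post.

    import torch
    import torch.nn as nn

    class CharRNN(nn.Module):
        def __init__(self, input_dim, hidden_dim, output_dim):
            super().__init__()
            self.hidden_dim = hidden_dim
            self.rnn = nn.RNN(input_dim, hidden_dim, batch_first=True)
            self.fc = nn.Linear(hidden_dim, output_dim)

        def forward(self, x, hidden=None):
            r_out, hidden = self.rnn(x, hidden)  # (batch, seq_len, hidden_dim)
            # flatten to 2-D before the FC layer
            r_out = r_out.contiguous().view(-1, self.hidden_dim)
            return self.fc(r_out), hidden

    model = CharRNN(input_dim=5, hidden_dim=8, output_dim=5)
    out, _ = model(torch.randn(2, 7, 5))  # batch=2, seq_length=7
    print(out.shape)                      # torch.Size([14, 5])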