How is long-term dependency maintained while building a language model?

    • Language models can be built using the following popular methods –
      1. Using an n-gram language model
        • An n-gram model conditions each word only on the previous n-1 words, so it captures short-range dependency directly; anything farther back than the window is lost. A minimal sketch follows this list.
      2. Using a hidden Markov model (HMM)
        • An HMM maintains long-term dependency through its hidden states. Given the current hidden state, the current observation becomes independent of previous observations; in other words, the hidden states carry the dependency on previous words forward. A sketch of generating text with an HMM follows this list.
      3. Using Long Short-Term Memory (LSTM)
        • An LSTM is a neural language model and a type of RNN. Through the combination of its three gates (the forget, input, and output gates) and a cell state, it maintains dependency on words over arbitrary intervals of time. A sketch follows this list.
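
For concreteness, here is a minimal sketch of a trigram (n = 3) language model. The corpus is a tiny hypothetical one; a real model would need far more data plus smoothing (e.g. Kneser-Ney). Note how generation only ever looks at the previous two words:

```python
import random
from collections import Counter, defaultdict

# Toy corpus (hypothetical); a real model needs far more data plus smoothing.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count trigrams: estimate P(w3 | w1, w2) from counts of (w1, w2) -> w3.
trigram_counts = defaultdict(Counter)
for w1, w2, w3 in zip(corpus, corpus[1:], corpus[2:]):
    trigram_counts[(w1, w2)][w3] += 1

# Generate from a seed bigram: the model is blind to anything earlier than
# the previous n-1 = 2 words, which is why long-range dependency is lost.
w1, w2 = "the", "cat"
generated = [w1, w2]
for _ in range(8):
    counts = trigram_counts[(w1, w2)]
    if not counts:  # unseen context; a real model would back off or smooth
        break
    words, weights = zip(*counts.items())
    w3 = random.choices(words, weights=weights)[0]
    generated.append(w3)
    w1, w2 = w2, w3

print(" ".join(generated))
```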
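
Next, a minimal sketch of generating text from an HMM. The states, transition probabilities, and emission probabilities are hand-set hypothetical values; in practice they would be learned (e.g. with the Baum-Welch algorithm). The hidden-state chain is the only thing carried forward, which is exactly how the dependency on earlier words is maintained:

```python
import random

# Hypothetical hidden states and hand-set probabilities for illustration.
states = ["NOUN_PHRASE", "VERB_PHRASE"]

# P(next_state | current_state): the hidden-state chain carries the
# dependency forward; words depend on history only through this state.
transitions = {
    "NOUN_PHRASE": {"NOUN_PHRASE": 0.3, "VERB_PHRASE": 0.7},
    "VERB_PHRASE": {"NOUN_PHRASE": 0.8, "VERB_PHRASE": 0.2},
}

# P(word | state): given the current state, the emitted word is
# independent of all previously emitted words.
emissions = {
    "NOUN_PHRASE": {"the cat": 0.5, "a dog": 0.5},
    "VERB_PHRASE": {"sleeps": 0.4, "runs": 0.6},
}

def sample(dist):
    """Draw one key from a {outcome: probability} dict."""
    outcomes, probs = zip(*dist.items())
    return random.choices(outcomes, weights=probs)[0]

state = "NOUN_PHRASE"
words = []
for _ in range(6):
    words.append(sample(emissions[state]))  # emit a word from current state
    state = sample(transitions[state])      # step to the next hidden state

print(" ".join(words))
```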
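
Finally, a minimal sketch of an LSTM language model. PyTorch is an assumed framework choice here, and all names and dimensions are illustrative. The (hidden, cell) state pair that the LSTM returns is the memory its gates read from and write to, letting it carry information across long spans:

```python
import torch
import torch.nn as nn

class LSTMLanguageModel(nn.Module):
    """Illustrative next-word model: embed tokens, run an LSTM, project
    each hidden state to a distribution over the vocabulary."""

    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids, state=None):
        x = self.embed(token_ids)        # (batch, seq_len, embed_dim)
        # `state` is the (hidden, cell) pair; the cell state is the
        # long-term memory the gates write to, erase from, and read out of.
        h, state = self.lstm(x, state)
        return self.out(h), state        # next-word logits per position

# Toy usage with random token ids, just to show the shapes involved.
model = LSTMLanguageModel(vocab_size=100)
tokens = torch.randint(0, 100, (1, 10))  # batch of 1, sequence of 10
logits, _ = model(tokens)
print(logits.shape)                      # torch.Size([1, 10, 100])
```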
