How is long term dependency maintained while building a language model?

Language models can be built using the following popular methods:
- Using an n-gram language model
- An n-gram language model fixes a value of n and assumes each word depends only on the previous n-1 words; the larger the value of n, the longer the dependency it can capture. One can refer to "What is the significance of n-grams in a language model?" for further reading; a minimal counting sketch appears after this list.
- Using a hidden Markov model (HMM)
- An HMM maintains long-term dependency using hidden states: given the current hidden state, the current observation becomes independent of all previous observations. In other words, the hidden states carry the dependency on previous words. This answer explains how one can generate text using an HMM; a generation sketch appears after this list.
- Using Long Short-Term Memory (LSTM)
- An LSTM is a neural language model and a type of RNN that, through the combination of its three gates (the forget, input, and output gates) and a cell state, maintains dependency on words over arbitrary intervals of time; a model sketch appears after this list.
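
To make the n-gram idea concrete, here is a minimal sketch of a bigram (n = 2) model, assuming a toy in-memory corpus; the corpus string and the bigram_prob helper are illustrative, not from the original post. Conditional probabilities P(w | prev) come from simple maximum-likelihood counting.

```python
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count, for each context word, which words follow it and how often.
bigram_counts = defaultdict(Counter)
for prev, curr in zip(corpus, corpus[1:]):
    bigram_counts[prev][curr] += 1

def bigram_prob(prev, curr):
    """P(curr | prev) by maximum likelihood; 0.0 for an unseen context."""
    total = sum(bigram_counts[prev].values())
    return bigram_counts[prev][curr] / total if total else 0.0

print(bigram_prob("the", "cat"))  # 0.5: "the" is followed by "cat" 2 of 4 times
```

Note how the model only ever looks one word back; capturing longer dependencies would require raising n, at the cost of exponentially sparser counts.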
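Next, a minimal sketch of text generation with an HMM. The transition and emission tables are hand-set assumptions for illustration (in practice they are learned, e.g. with the Baum-Welch algorithm), and the DET/NOUN/VERB states are hypothetical stand-ins for coarse word classes.

```python
import random

transitions = {  # P(next hidden state | hidden state)
    "DET":  {"NOUN": 1.0},
    "NOUN": {"VERB": 0.8, "NOUN": 0.2},
    "VERB": {"DET": 0.7, "NOUN": 0.3},
}
emissions = {    # P(observed word | hidden state)
    "DET":  {"the": 0.7, "a": 0.3},
    "NOUN": {"cat": 0.5, "dog": 0.5},
    "VERB": {"sat": 0.6, "ran": 0.4},
}

def sample(dist):
    """Draw one key from a {outcome: probability} dict."""
    return random.choices(list(dist), weights=dist.values())[0]

def generate(start_state="DET", length=8):
    state, words = start_state, []
    for _ in range(length):
        words.append(sample(emissions[state]))  # emit a word from the current state
        state = sample(transitions[state])      # the hidden state carries the dependency forward
    return " ".join(words)

print(generate())  # e.g. "the cat sat a dog ran the cat"
```

The generated words depend on each other only through the hidden-state chain, which is exactly the conditional-independence property described above.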
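Finally, a minimal sketch of an LSTM language model. PyTorch is assumed here (any deep-learning framework would do), and the class name LSTMLanguageModel and the layer sizes are arbitrary choices for illustration: embeddings feed an LSTM whose cell state carries long-range context, and a linear head predicts the next token at each position.

```python
import torch
import torch.nn as nn

class LSTMLanguageModel(nn.Module):
    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):            # tokens: (batch, seq_len) of token ids
        x = self.embed(tokens)            # (batch, seq_len, embed_dim)
        out, _ = self.lstm(x)             # cell state maintains long-term dependency
        return self.head(out)             # next-token logits at every position

model = LSTMLanguageModel(vocab_size=100)
tokens = torch.randint(0, 100, (2, 10))  # toy batch of token ids
logits = model(tokens)
print(logits.shape)                       # torch.Size([2, 10, 100])
```

Unlike the n-gram model, nothing here fixes a window size: the gates decide what to keep in or drop from the cell state, so dependencies can span arbitrary intervals.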