How do you generate text using a Hidden Markov Model (HMM)?

The HMM is a latent variable model where the observed sequence of variables x is assumed to be generated from a set of temporally connected latent variables y.

The joint distribution of the observed variables or data x and the latent variables y can be written as:

    \[p(x, y)=p(x|y)\,p(y)=\prod_{t=1}^{T}p(x_{t}|y_{t})\,p(y_{t}|y_{t-1})\]

where p(y_{1}|y_{0}) is understood to be the initial state distribution p(y_{1}).
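
For example, for a sequence of length T = 3 this product expands to

    \[p(x,y)=p(y_{1})p(x_{1}|y_{1})\,p(y_{2}|y_{1})p(x_{2}|y_{2})\,p(y_{3}|y_{2})p(x_{3}|y_{3})\]

which mirrors the generative procedure below: sample a tag, emit a word from it, move to the next tag, and so on.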

One possible interpretation of the latent variables in an HMM is that they are part-of-speech (POS) tags. We will go with this interpretation for simplicity, though the latent states can represent other things as well.

To generate text from an HMM, we need to know the transition matrix (the probability of going from one tag to another) and the emission/output matrix (the probability of generating a token given the tag). Given these, the procedure is as follows (a code sketch follows the list):

  • First, generate the initial state (tag) y_{1} from the initial state distribution p(y_{1}).
  • We then generate each subsequent tag y_{t} from the transition distribution

        \[p(y_{t}|y_{t-1})\]

    conditioned on the previously sampled tag y_{t-1}.

  • Then, from each tag, generate a word (one at each position t) using the emission distribution

        \[p(x_{t}|y_{t})\]

    Note that this is possible because, given the current tag y_{t}, the observed variable x_{t} does not depend on x_{t-1} or y_{t-1}.
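
Below is a minimal Python sketch of this ancestral-sampling procedure, assuming NumPy. The tag set, vocabulary, and the initial/transition/emission probabilities are toy values chosen purely for illustration, not from any trained model.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy tag set and vocabulary (illustrative values only).
    tags = ["DET", "NOUN", "VERB"]
    vocab = ["the", "a", "dog", "cat", "runs", "sleeps"]

    # Initial state distribution p(y_1), one entry per tag.
    pi = np.array([0.8, 0.1, 0.1])

    # Transition matrix: row i gives p(y_t = j | y_{t-1} = i).
    A = np.array([
        [0.0, 0.9, 0.1],   # DET  -> mostly NOUN
        [0.1, 0.1, 0.8],   # NOUN -> mostly VERB
        [0.6, 0.3, 0.1],   # VERB -> mostly DET
    ])

    # Emission matrix: row i gives p(x_t = w | y_t = i) over the vocabulary.
    B = np.array([
        [0.5, 0.5, 0.0, 0.0, 0.0, 0.0],   # DET  emits "the" / "a"
        [0.0, 0.0, 0.5, 0.5, 0.0, 0.0],   # NOUN emits "dog" / "cat"
        [0.0, 0.0, 0.0, 0.0, 0.5, 0.5],   # VERB emits "runs" / "sleeps"
    ])

    def generate(T):
        """Ancestral sampling: draw tags from p(y_t | y_{t-1}), then words from p(x_t | y_t)."""
        words, states = [], []
        y = rng.choice(len(tags), p=pi)            # sample y_1 from p(y_1)
        for _ in range(T):
            x = rng.choice(len(vocab), p=B[y])     # emit a word given the current tag
            states.append(tags[y])
            words.append(vocab[x])
            y = rng.choice(len(tags), p=A[y])      # move to the next tag via p(y_t | y_{t-1})
        return words, states

    words, states = generate(T=6)
    print(list(zip(words, states)))

In practice the matrices would be estimated from a tagged corpus (or with the Baum–Welch/EM algorithm for unlabeled text); here they are hand-specified purely to show the sampling loop.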
