Language models conditioned on dialog state.

2001 
We consider various techniques for using the state of the dialog in language modeling. The language models we built were for use in an automated airline travel reservation system. The techniques that we explored include (1) linear interpolation with state-specific models and (2) incorporating state information using maximum entropy techniques. We also consider using the system prompt as part of the language model history. We show that using state results in a relative gain in perplexity and a relative gain in word error rate over a system using a language model with no information about the state.
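Technique (1), linear interpolation with state-specific models, can be illustrated with a minimal sketch. The probabilities, state names, and interpolation weight below are toy values for illustration, not drawn from the paper; a real system would train a model per dialog state and tune the weight per state on held-out data.

```python
# Toy sketch of linear interpolation of a general language model with
# a dialog-state-specific model (hypothetical numbers and state names).
GENERAL_LM = {"flight": 0.10, "to": 0.08, "boston": 0.02}
STATE_LMS = {
    "ask_destination": {"boston": 0.12, "to": 0.05, "flight": 0.03},
}

def interpolated_prob(word, state, lam=0.6):
    """P(w | state) = lam * P_state(w) + (1 - lam) * P_general(w).

    `lam` would normally be optimized per state on held-out data;
    0.6 here is arbitrary. Unseen words get probability 0.0 in this
    toy example, whereas a real model would smooth.
    """
    p_state = STATE_LMS.get(state, {}).get(word, 0.0)
    p_general = GENERAL_LM.get(word, 0.0)
    return lam * p_state + (1 - lam) * p_general

print(round(interpolated_prob("boston", "ask_destination"), 4))  # 0.08
```

In the "ask_destination" state, the interpolated model boosts the probability of "boston" relative to the general model, which is the intuition behind conditioning the language model on dialog state.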