Variational Infinite Hidden Conditional Random Fields


K. Bousmalis, S. Zafeiriou, L.-P. Morency, M. Pantic and Z. Ghahramani, "Variational Infinite Hidden Conditional Random Fields," IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), vol. 37, no. 9, pp. 1917-1929, September 2015.


Typical techniques for sequence modeling rely on well-segmented sequences that have been edited to remove noisy or irrelevant parts. Such methods therefore cannot easily be applied to the noisy sequences expected in real-world applications.

In one of our projects, we study sequence modeling through the combination of RNNs, which capture the temporal dependencies, and an attention mechanism, which localizes the salient observations relevant to the final decision and ignores the irrelevant (noisy) parts of the input sequence.
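The combination described above can be sketched as follows. This is a minimal, hypothetical illustration (not the project's actual model): a simple tanh RNN encodes the sequence, and additive attention pools the hidden states so that irrelevant time steps receive low weight; all weight matrices and names here are invented for the example.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attentive_rnn(x, Wxh, Whh, w_attn):
    """Encode x (time, input_dim) with a simple tanh RNN, then pool the
    hidden states with attention: each step gets a relevance score, and
    the final representation is the attention-weighted sum, so noisy
    steps with low scores contribute little to the decision."""
    T = x.shape[0]
    H = Whh.shape[0]
    h = np.zeros(H)
    hs = np.zeros((T, H))
    for t in range(T):                 # recurrent encoding
        h = np.tanh(x[t] @ Wxh + h @ Whh)
        hs[t] = h
    scores = hs @ w_attn               # one relevance score per time step
    alpha = softmax(scores)            # attention weights, sum to 1
    context = alpha @ hs               # weighted summary of the sequence
    return context, alpha

# Usage with random toy weights: 20 time steps, 8-dim inputs, 16 hidden units
rng = np.random.default_rng(0)
x = rng.standard_normal((20, 8))
context, alpha = attentive_rnn(
    x,
    Wxh=rng.standard_normal((8, 16)) * 0.1,
    Whh=rng.standard_normal((16, 16)) * 0.1,
    w_attn=rng.standard_normal(16),
)
```

A classifier would then operate on `context` alone, so the gradient signal encourages the attention scores to pick out the decision-relevant frames.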

More recent work uses more powerful neural network models such as Transformers to process longer sequences. One of our more recent projects uses a hierarchical architecture to model multiple temporal resolutions of sequences, allowing representations of data with different degrees of granularity and more easily capturing long-range dependencies. This work is being applied to music generation, where modeling hierarchical structure is especially important.
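One way to picture the multiple-temporal-resolution idea is to build progressively coarser views of the same sequence, each summarizing longer spans of time. The sketch below is an assumption-laden illustration of that preprocessing step only (average-pooling at several strides); the actual hierarchical Transformer architecture is not reproduced here.

```python
import numpy as np

def multi_resolution_views(x, strides=(1, 4, 16)):
    """Given x of shape (time, dim), return one view per stride, where
    each view average-pools non-overlapping windows of that length.
    Fine strides keep local detail; coarse strides expose long-range
    structure with far fewer positions for a model to attend over."""
    views = []
    for s in strides:
        T = (x.shape[0] // s) * s           # drop the ragged tail
        pooled = x[:T].reshape(-1, s, x.shape[1]).mean(axis=1)
        views.append(pooled)
    return views

# Usage: a 64-step, 8-dim sequence yields views of 64, 16, and 4 positions
x = np.arange(64 * 8, dtype=float).reshape(64, 8)
views = multi_resolution_views(x)
```

A hierarchical model can then process each view with its own attention layers and exchange information between levels, which is what makes long-range dependencies (e.g. musical form spanning whole phrases or sections) easier to capture.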