Sparse Gaussian conditional random fields on top of recurrent neural networks

Conference Paper


Abstract


  • Time-series prediction is widely used across disciplines. We propose CoR, Sparse Gaussian Conditional Random Fields (SGCRF) on top of Recurrent Neural Networks (RNN), for problems of this kind. CoR gains advantages from both RNN and SGCRF: it can not only effectively represent the temporal correlations in observed data, but can also learn the structured information of the output. CoR is challenging to train because it is a hybrid of deep neural networks and densely-connected graphical models. Alternating training is a tractable way to train CoR; furthermore, an end-to-end training method is proposed to train CoR more efficiently. CoR is evaluated on both synthetic and real-world data, and it shows a significant improvement in performance over state-of-the-art methods.
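
  As context for the abstract, the sketch below illustrates what an SGCRF layer on top of an RNN can look like. It is not the authors' implementation: the class name, dimensions, the GRU encoder, the L1 weight lam, and the parameterisation of the output precision matrix as Lambda = R R^T + eps*I are all illustrative assumptions. It uses the standard SGCRF density p(y|h) proportional to exp(-0.5 y^T Lambda y - h^T Theta y), whose mean -Lambda^{-1} Theta^T h serves as the point prediction.

    # A minimal sketch of the CoR idea (not the authors' code): an RNN
    # encodes the observed sequence, and a sparse Gaussian CRF output
    # layer models correlations among the outputs.
    import torch
    import torch.nn as nn

    class CoRSketch(nn.Module):
        def __init__(self, in_dim, hidden_dim, out_dim, lam=1e-3):
            super().__init__()
            self.rnn = nn.GRU(in_dim, hidden_dim, batch_first=True)
            # SGCRF parameters: Theta couples RNN features to outputs; R
            # parameterises the output precision matrix so it stays
            # positive definite during end-to-end gradient training.
            self.Theta = nn.Parameter(0.01 * torch.randn(hidden_dim, out_dim))
            self.R = nn.Parameter(torch.eye(out_dim))
            self.lam = lam  # L1 weight; penalising R and Theta is a
                            # simple proxy for sparsity in Lambda and Theta

        def precision(self):
            d = self.R.shape[0]
            return self.R @ self.R.T + 1e-4 * torch.eye(d)

        def forward(self, x):
            _, h = self.rnn(x)            # h: (1, batch, hidden_dim)
            h = h.squeeze(0)
            Lambda = self.precision()
            # Mean of p(y|h), y* = -Lambda^{-1} Theta^T h, is the prediction.
            eta = -(h @ self.Theta)       # (batch, out_dim)
            y_star = torch.linalg.solve(Lambda, eta.T).T
            return y_star, Lambda

        def nll(self, x, y):
            # Gaussian negative log-likelihood plus L1 sparsity penalty.
            y_star, Lambda = self.forward(x)
            diff = y - y_star
            quad = torch.einsum('bi,ij,bj->b', diff, Lambda, diff)
            nll = 0.5 * (quad - torch.logdet(Lambda)).mean()
            l1 = self.lam * (self.Theta.abs().sum() + self.R.abs().sum())
            return nll + l1

  Training this sketch end-to-end would simply minimise nll with any gradient optimiser; the alternating scheme mentioned in the abstract would instead update the RNN and the CRF parameters in separate phases.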

Publication Date


  • 2018

Citation


  • Wang, X., Zhang, M. & Ren, F. (2018). Sparse Gaussian conditional random fields on top of recurrent neural networks. 32nd AAAI Conference on Artificial Intelligence, AAAI 2018 (pp. 4219-4226). Palo Alto, United States: Association for the Advancement of Artificial Intelligence.

Scopus EID


  • 2-s2.0-85060466900

Start Page


  • 4219

End Page


  • 4226

Place of Publication


  • Palo Alto, United States
