
Hierarchical Deep Gaussian Processes Latent Variable Model via Expectation Propagation

  • Conference paper
Artificial Neural Networks and Machine Learning – ICANN 2021 (ICANN 2021)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 12893)


Abstract

Gaussian Processes (GPs) and related unsupervised learning techniques, such as Gaussian Process Latent Variable Models (GP-LVMs), have been very successful at accurately modeling high-dimensional data from limited amounts of training data. However, these techniques typically suffer from high computational complexity. This makes the associated learning problems difficult to solve for complex hierarchical models and large data sets, since the required computations, unlike those of neural networks, are not node-local. Combining sparse approximation techniques for GPs with Power Expectation Propagation, we present a framework for the computationally efficient implementation of hierarchical deep Gaussian process (latent variable) models. We provide implementations of this approach on both the GPU and the CPU, and we benchmark their efficiency across different optimization algorithms. We present the first implementation of such deep hierarchical GP-LVMs and demonstrate the computational efficiency of our GPU implementation.
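
For orientation, the sketch below illustrates the kind of sparse approximation the abstract refers to: FITC-style GP regression with M inducing inputs, which reduces the O(N³) cost of exact GP inference to O(NM²). It is a minimal NumPy illustration under assumed choices (an RBF kernel, fixed hyperparameters, invented function names), not the authors' hierarchical Power EP implementation.

```python
# Minimal sketch (illustrative assumptions, not the paper's code) of
# FITC-style sparse GP prediction with M inducing inputs Z.
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def fitc_predict(X, y, Z, Xs, noise=0.1):
    """Predictive mean/variance at Xs; cost O(N M^2) instead of O(N^3)."""
    M = len(Z)
    Kuu = rbf(Z, Z) + 1e-6 * np.eye(M)            # jitter for stability
    Kuf, Kus = rbf(Z, X), rbf(Z, Xs)
    # FITC diagonal correction: keeps exact marginal variances at the data.
    Qff = np.sum(Kuf * np.linalg.solve(Kuu, Kuf), axis=0)
    Lam = 1.0 - Qff + noise**2                    # k(x,x) = variance = 1 here
    A = Kuu + (Kuf / Lam) @ Kuf.T                 # M x M system
    mean = Kus.T @ np.linalg.solve(A, (Kuf / Lam) @ y)
    Qss = np.sum(Kus * np.linalg.solve(Kuu, Kus), axis=0)
    var = 1.0 - Qss + np.sum(Kus * np.linalg.solve(A, Kus), axis=0)
    return mean, var                              # latent f; add noise**2 for y

# Toy usage: 200 noisy samples of sin(x), summarized by 15 inducing points.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, (200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
Z = np.linspace(-3.0, 3.0, 15)[:, None]
mu, var = fitc_predict(X, y, Z, np.linspace(-3.0, 3.0, 50)[:, None])
```

In the Power Expectation Propagation framework the paper builds on, a power parameter α interpolates between approximations of this FITC/EP flavor (α = 1) and the variational free-energy approximation (α → 0).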



Acknowledgments

The research leading to these results has received funding from HFSP RGP0036/2016; NVIDIA Corp.; BMBF FKZ 01GQ1704, KONSENS-NHE BW Stiftung NEU007/1; DFG GZ: KA 1258/15-1; ERC 2019-SyG-RELEVANCE-856495; SSTeP-KiZ BMG: ZMWI1-2520DAT700.

Author information


Corresponding author

Correspondence to Nick Taubert.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Taubert, N., Giese, M.A. (2021). Hierarchical Deep Gaussian Processes Latent Variable Model via Expectation Propagation. In: Farkaš, I., Masulli, P., Otte, S., Wermter, S. (eds) Artificial Neural Networks and Machine Learning – ICANN 2021. ICANN 2021. Lecture Notes in Computer Science, vol 12893. Springer, Cham. https://doi.org/10.1007/978-3-030-86365-4_26

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-86365-4_26

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-86364-7

  • Online ISBN: 978-3-030-86365-4

  • eBook Packages: Computer Science, Computer Science (R0)
