High-Quality Consistent Illumination in Mobile Augmented Reality by Radiance Convolution on the GPU

Conference paper in Advances in Visual Computing (ISVC 2015)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 9474)

Abstract

Consistent illumination of virtual and real objects in augmented reality (AR) is essential to achieve visual coherence. This paper presents a practical two-step method for rendering with consistent illumination in AR. In the first step, the user scans the surrounding environment by rotating the mobile device, and the real illumination is captured in high dynamic range (HDR) to preserve its high contrast. In the second step, the captured environment map is used to precalculate a set of reflection maps on the mobile GPU, which are then used for real-time rendering with consistent illumination. Our method produces high-quality reflection maps because the convolution of the environment map with the BRDF is calculated accurately for each pixel of the output map. Moreover, we utilize multiple render targets to calculate reflection maps for multiple materials simultaneously. The presented method increases visual coherence between virtual and real objects and, as it requires only a commodity mobile device, is highly practical for mobile AR.
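
To make the second step concrete, the sketch below reproduces the radiance convolution on the CPU: each pixel of a glossy reflection map is obtained by convolving the captured environment map with a normalized Phong lobe around that pixel's reflection direction, i.e. by summing BRDF-weighted radiance over all environment-map texels. This is only an illustrative reference in Python/NumPy, not the paper's GPU shader; the equirectangular map layout, function names, resolutions, and the choice of a Phong lobe are assumptions made for the example.

import numpy as np

def direction_grid(width, height):
    """Unit directions and per-texel solid angles for an equirectangular map."""
    theta = (np.arange(height) + 0.5) / height * np.pi          # polar angle from +Y
    phi = (np.arange(width) + 0.5) / width * 2.0 * np.pi        # azimuth
    phi, theta = np.meshgrid(phi, theta)                        # shape (height, width)
    dirs = np.stack([np.sin(theta) * np.cos(phi),
                     np.cos(theta),
                     np.sin(theta) * np.sin(phi)], axis=-1)
    solid_angle = (2.0 * np.pi / width) * (np.pi / height) * np.sin(theta)
    return dirs, solid_angle

def prefilter_phong(env, out_height, out_width, shininess):
    """Convolve an HDR environment map with a normalized Phong lobe,
    evaluated exactly for every pixel of the output reflection map."""
    in_h, in_w, _ = env.shape
    in_dirs, d_omega = direction_grid(in_w, in_h)
    out_dirs, _ = direction_grid(out_width, out_height)
    norm = (shininess + 2.0) / (2.0 * np.pi)                    # modified-Phong lobe normalization
    result = np.zeros((out_height, out_width, 3), dtype=np.float32)
    for y in range(out_height):
        for x in range(out_width):
            r = out_dirs[y, x]                                  # reflection direction for this texel
            cos_lobe = np.clip(in_dirs @ r, 0.0, None) ** shininess
            weight = norm * cos_lobe * d_omega                  # BRDF lobe times solid angle
            result[y, x] = np.tensordot(weight, env, axes=([0, 1], [0, 1]))
    return result

# Example: one reflection map per material shininess; a placeholder array stands in
# for the captured HDR environment map.
env_map = np.abs(np.random.randn(64, 128, 3)).astype(np.float32)
glossy_map = prefilter_phong(env_map, out_height=32, out_width=64, shininess=50.0)

On the device, the paper evaluates the same per-pixel sum in a fragment shader, and multiple render targets allow reflection maps for several materials to be prefiltered in a single pass, as stated in the abstract.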



Acknowledgements

The dragon model is courtesy of the Stanford Computer Graphics Laboratory. The teapot model is courtesy of Martin Newell. This research was funded by the Austrian project FFG-BRIDGE 843484.

Author information

Corresponding author

Correspondence to Peter Kán.


Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Kán, P., Unterguggenberger, J., Kaufmann, H. (2015). High-Quality Consistent Illumination in Mobile Augmented Reality by Radiance Convolution on the GPU. In: Bebis, G., et al. Advances in Visual Computing. ISVC 2015. Lecture Notes in Computer Science, vol 9474. Springer, Cham. https://doi.org/10.1007/978-3-319-27857-5_52

  • DOI: https://doi.org/10.1007/978-3-319-27857-5_52

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-27856-8

  • Online ISBN: 978-3-319-27857-5

  • eBook Packages: Computer Science, Computer Science (R0)
