DOI: 10.1145/1806596.1806618

Research article

Evaluating the accuracy of Java profilers

Published: 05 June 2010

ABSTRACT

Performance analysts profile their programs to find methods that are worth optimizing: the "hot" methods. This paper shows that four commonly-used Java profilers (xprof, hprof, jprofiler, and yourkit) often disagree on the identity of the hot methods. If two profilers disagree, at least one must be incorrect. Thus, there is a good chance that a profiler will mislead a performance analyst into wasting time optimizing a cold method with little or no performance improvement.

This paper uses causality analysis to evaluate profilers and to gain insight into the source of their incorrectness. It shows that these profilers all violate a fundamental requirement for sampling-based profilers: to be correct, a sampling-based profiler must collect samples randomly.

We show that a proof-of-concept profiler, which collects samples randomly, does not suffer from the above problems. Specifically, we show, using a number of case studies, that our profiler correctly identifies methods that are important to optimize; in some cases other profilers report that these methods are cold and thus not worth optimizing.
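The random-sampling requirement can be illustrated with a small sketch. This is not the paper's proof-of-concept profiler; the class name and parameters below are invented for illustration. The idea is that drawing inter-sample delays from an exponential distribution decorrelates sample times from any periodic behavior in the profiled program, whereas a fixed-period timer can systematically land on (or miss) the same code.

```java
import java.util.Random;

// Illustrative sketch only: a sampler that draws inter-sample delays
// from an exponential distribution instead of using a fixed tick.
public class RandomizedSampler {
    private final Random rng = new Random(42); // fixed seed for reproducibility
    private final double meanIntervalMs;

    public RandomizedSampler(double meanIntervalMs) {
        this.meanIntervalMs = meanIntervalMs;
    }

    // Next delay in milliseconds, exponentially distributed with the
    // configured mean (inverse-CDF sampling from a uniform variate).
    public double nextDelayMs() {
        return -meanIntervalMs * Math.log(1.0 - rng.nextDouble());
    }

    public static void main(String[] args) {
        RandomizedSampler s = new RandomizedSampler(10.0);
        double sum = 0.0;
        int n = 100_000;
        for (int i = 0; i < n; i++) {
            sum += s.nextDelayMs();
        }
        // The empirical mean should be close to the configured 10 ms.
        System.out.printf("mean delay ~= %.2f ms%n", sum / n);
    }
}
```

A fixed-period sampler, by contrast, takes samples at times 0, T, 2T, ...; if the program's own behavior has period T (or a divisor of it), the samples are correlated with program state and the resulting profile is biased.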


Published in

PLDI '10: Proceedings of the 31st ACM SIGPLAN Conference on Programming Language Design and Implementation, June 2010, 514 pages. ISBN: 9781450300193. DOI: 10.1145/1806596.

Also appears in ACM SIGPLAN Notices, Volume 45, Issue 6 (PLDI '10), June 2010, 496 pages. ISSN: 0362-1340. EISSN: 1558-1160. DOI: 10.1145/1809028.

        Copyright © 2010 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall acceptance rate: 406 of 2,067 submissions, 20%.
