DOI: 10.1145/1124772.1124890

The Sandbox for analysis: concepts and methods

Published: 22 April 2006

ABSTRACT

The Sandbox is a flexible and expressive thinking environment that supports both ad hoc and more formal analytical tasks. It is the evidence marshalling and sense-making component of the analytical software environment called nSpace. This paper presents innovative Sandbox human information interaction capabilities and the rationale underlying them, drawing on direct observations of analysis work as well as structured interviews. Key capabilities of the Sandbox include "put-this-there" cognition, automatic process model templates, gestures for the fluid expression of thought, assertions with evidence, and scalability mechanisms to support larger analysis tasks. The Sandbox integrates advanced computational linguistic functions using a Web Services interface and protocol. An independent third-party evaluation experiment with the Sandbox has been completed. The experiment showed that analyst subjects using the Sandbox produced higher quality analysis in less time than with standard tools. Usability test results indicated that the analysts became proficient with the Sandbox after three hours of training.
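The abstract notes that the Sandbox calls computational-linguistic functions through a Web Services interface and protocol, but does not specify that protocol here. The sketch below is purely illustrative: a hypothetical JSON request/response exchange with an entity-extraction service, with a local stub standing in for the remote endpoint (all function names and the payload shape are assumptions, not the paper's actual API).

```python
import json

def extract_entities(text):
    """Stub linguistic service: treat capitalized tokens as entities.
    A real service would run proper named-entity recognition."""
    return [tok.strip(".,") for tok in text.split() if tok[:1].isupper()]

def build_request(doc_id, text):
    """Package a document as a JSON request body for the hypothetical service."""
    return json.dumps({"doc": doc_id, "op": "extract_entities", "text": text})

def handle_request(body):
    """Simulate the service endpoint: parse the request, run extraction,
    and return a structured response the client tool could display."""
    req = json.loads(body)
    return {"doc": req["doc"], "entities": extract_entities(req["text"])}

resp = handle_request(build_request("memo-1", "Analysts in Ottawa reviewed the Sandbox."))
print(resp["entities"])  # ['Analysts', 'Ottawa', 'Sandbox']
```

Exchanging self-describing JSON payloads like this keeps the analysis client decoupled from the linguistic back end, which is one common way such a Web Services split is arranged.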


Published in

CHI '06: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2006, 1353 pages
ISBN: 1595933727
DOI: 10.1145/1124772

        Copyright © 2006 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States



Acceptance Rates

Overall CHI acceptance rate: 6,199 of 26,314 submissions, 24%

