DOI: 10.1145/2908131.2908210
Extended abstract

Automated question generation for quality control in human computation tasks

Published: 22 May 2016

ABSTRACT

When running large human computation tasks in the real world, honeypots (questions with known answers used to check worker output) play an important role in assessing the overall quality of the work produced. Generating such honeypots can be a significant burden on the task owner: they require specific characteristics in their design and implementation, and continuous maintenance when operating data pipelines that include a human computation component. In this extended abstract we outline a novel approach for creating honeypots from questions automatically generated over a reference knowledge base, with the ability to control parameters such as topic and difficulty.
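The abstract does not give implementation details, but the core idea (turning knowledge-base facts into honeypot questions whose topic and difficulty can be tuned) can be sketched as follows. This is an illustrative reconstruction, not the authors' method: the toy triple store, the `popularity` annotation used as a difficulty proxy, and the `generate_honeypot` function are all assumptions made for the example.

```python
import random

# Toy reference knowledge base of (subject, predicate, object) facts,
# annotated with a topic and a popularity score (higher = better known,
# so questions about it are easier). All entries are illustrative.
KB = [
    {"s": "Paris", "p": "is the capital of", "o": "France",
     "topic": "geography", "popularity": 0.9},
    {"s": "Canberra", "p": "is the capital of", "o": "Australia",
     "topic": "geography", "popularity": 0.4},
    {"s": "Ada Lovelace", "p": "wrote a program for", "o": "the Analytical Engine",
     "topic": "computing", "popularity": 0.6},
]

def generate_honeypot(kb, topic, difficulty, n_distractors=3, seed=None):
    """Turn a KB fact into a multiple-choice honeypot question.

    `difficulty` in [0, 1]: harder questions are drawn from less
    well-known facts (popularity <= 1 - difficulty).
    """
    rng = random.Random(seed)
    candidates = [t for t in kb
                  if t["topic"] == topic and t["popularity"] <= 1 - difficulty]
    if not candidates:
        raise ValueError("no fact matches the requested topic/difficulty")
    fact = rng.choice(candidates)
    # Distractors: objects of other facts sharing the same predicate,
    # so wrong options are plausible answers to the question template.
    distractors = [t["o"] for t in kb
                   if t["p"] == fact["p"] and t["o"] != fact["o"]][:n_distractors]
    options = distractors + [fact["o"]]
    rng.shuffle(options)
    return {"question": f"{fact['s']} {fact['p']} ...?",
            "options": options,
            "answer": fact["o"]}
```

Raising `difficulty` restricts selection to obscure facts, while the topic filter scopes questions to the domain of the surrounding task; a real pipeline would draw from a large knowledge base rather than a hand-written list.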


Published in
WebSci '16: Proceedings of the 8th ACM Conference on Web Science
May 2016, 392 pages
ISBN: 9781450342087
DOI: 10.1145/2908131

      Copyright © 2016 Owner/Author

      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher
Association for Computing Machinery, New York, NY, United States


Acceptance Rates
WebSci '16 paper acceptance rate: 13 of 70 submissions, 19%
Overall acceptance rate: 218 of 875 submissions, 25%

