
ECIR 23 Tutorial: Neuro-Symbolic Approaches for Information Retrieval

  • Conference paper
  • In: Advances in Information Retrieval (ECIR 2023)

Abstract

This tutorial will provide an overview of recent advances in neuro-symbolic approaches to information retrieval. A decade ago, knowledge graph and semantic annotation technology led to active research on how to best leverage symbolic knowledge. At the same time, neural methods have proven to be versatile and highly effective.

From a neural network perspective, the same representation approach can serve both document ranking and knowledge graph reasoning. End-to-end training makes it possible to optimize complex methods for downstream tasks.
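As a toy illustration of this shared-representation idea (a sketch under our own assumptions, not code from the tutorial; the encoder and all names here are hypothetical stand-ins for a trained model), a single embedding space can score both query-document relevance and knowledge-graph triples:

```python
import hashlib

import numpy as np

DIM = 8

def token_vec(token):
    """Map a token to a deterministic pseudo-random vector."""
    seed = int.from_bytes(hashlib.md5(token.encode()).digest()[:4], "big")
    return np.random.default_rng(seed).standard_normal(DIM)

def embed(tokens):
    """Toy encoder: average the token vectors (stand-in for a real model)."""
    return np.mean([token_vec(t) for t in tokens], axis=0)

def rank_score(query, doc):
    """Document ranking: dot product in the shared embedding space."""
    return float(embed(query.split()) @ embed(doc.split()))

def triple_score(head, relation, tail):
    """Knowledge-graph reasoning, TransE-style: head + relation should
    land near tail; higher (less negative) means more plausible."""
    diff = embed([head]) + embed([relation]) - embed([tail])
    return -float(np.linalg.norm(diff))
```

In a real system, `embed` would be a learned encoder and both scoring functions would share its parameters, so gradients from either task update the same representation end to end.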

We are at the point where advances in both symbolic and neural research are coalescing into neuro-symbolic approaches. The underlying research questions are how to best combine symbolic and neural approaches, which symbolic/neural approaches are most suitable for which use case, and how to best integrate both ideas to advance the state of the art in information retrieval.
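One simple instance of such a combination (a minimal sketch under our own assumptions, not a method endorsed by the tutorial) interpolates a neural relevance score with a symbolic signal, here the overlap between entities linked in the query and in the document:

```python
def entity_overlap(query_entities, doc_entities):
    """Symbolic evidence: fraction of query entities also linked in the doc."""
    q, d = set(query_entities), set(doc_entities)
    return len(q & d) / len(q) if q else 0.0

def combined_score(neural_score, query_entities, doc_entities, alpha=0.7):
    """Linear interpolation: alpha weights the neural evidence,
    (1 - alpha) weights the symbolic entity-overlap evidence."""
    symbolic = entity_overlap(query_entities, doc_entities)
    return alpha * neural_score + (1 - alpha) * symbolic
```

The interpolation weight `alpha` is a hypothetical hyperparameter; in practice such weights are tuned on held-out data, and richer integrations (joint training, entity-aware transformers) replace this late fusion.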



Author information


Corresponding author

Correspondence to Laura Dietz.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Dietz, L., Bast, H., Chatterjee, S., Dalton, J., Meij, E., de Vries, A. (2023). ECIR 23 Tutorial: Neuro-Symbolic Approaches for Information Retrieval. In: Kamps, J., et al. Advances in Information Retrieval. ECIR 2023. Lecture Notes in Computer Science, vol 13982. Springer, Cham. https://doi.org/10.1007/978-3-031-28241-6_33


  • DOI: https://doi.org/10.1007/978-3-031-28241-6_33


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-28240-9

  • Online ISBN: 978-3-031-28241-6

  • eBook Packages: Computer Science, Computer Science (R0)
