
Contrastive Learning Augmented Graph Auto-Encoder

  • Conference paper
Neural Information Processing (ICONIP 2023)

Part of the book series: Communications in Computer and Information Science ((CCIS,volume 1965))


Abstract

Graph embedding aims to encode the information of graph data into a low-dimensional representation space. Prior methods generally suffer from an imbalance between preserving structural information and node features due to their pre-defined inductive biases, leading to unsatisfactory generalization performance. To preserve maximal information, graph contrastive learning (GCL) has become a prominent technique for learning discriminative embeddings. However, in contrast with graph-level embeddings, existing GCL methods generally learn less discriminative node embeddings in a self-supervised way. In this paper, we attribute this problem to two challenges: 1) graph data augmentations, which are designed for generating contrastive representations, damage the original semantic information of nodes; 2) nodes within the same cluster are selected as negative samples. To alleviate these challenges, we propose the Contrastive Variational Graph Auto-Encoder (CVGAE). Specifically, we first propose a distribution-dependent regularization that guides the parallel encoders to generate contrastive representations following similar distributions. We then utilize a truncated triplet loss, which selects only the top-k nodes as negative samples, to avoid over-separating nodes affiliated with the same cluster. Experiments on several real-world datasets show that CVGAE achieves advanced performance over all baselines on link prediction and node clustering tasks.
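The abstract only sketches the truncated triplet loss, so the following is a minimal illustrative reading of it rather than the authors' implementation. It assumes dot-product similarity for ranking candidates, a Euclidean triplet hinge, and that "top-k" means the k candidates most similar to the anchor (the hardest negatives); all names below are hypothetical.

```python
import numpy as np

def truncated_triplet_loss(z, anchor, positive, k=2, margin=1.0):
    """Triplet margin loss whose negatives are truncated to the top-k
    candidates most similar to the anchor, instead of all other nodes."""
    sims = z @ z[anchor]                # dot-product similarity to the anchor
    sims[[anchor, positive]] = -np.inf  # exclude the anchor and its positive
    negatives = np.argsort(sims)[-k:]   # top-k most similar (hardest) negatives
    d_pos = np.linalg.norm(z[anchor] - z[positive])
    d_neg = np.linalg.norm(z[anchor] - z[negatives], axis=1)
    # standard triplet hinge, averaged over the truncated negative set only
    return float(np.mean(np.maximum(d_pos - d_neg + margin, 0.0)))

# Toy embeddings: nodes 0 and 1 are close (same cluster), the rest are far.
z = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
loss = truncated_triplet_loss(z, anchor=0, positive=1, k=2)
```

Because the negative set is truncated, near-duplicate nodes that happen to share the anchor's cluster but fall outside the top-k ranking contribute nothing to the loss, which is one way to avoid over-separating same-cluster nodes.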



Acknowledgments

This research was partially supported by grants from the National Natural Science Foundation of China (No. 61877051). We acknowledge all the developers and researchers for developing useful tools that enable our experiments.

Author information


Corresponding author

Correspondence to Li Li.



Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Zu, S., Wang, C., Liu, Y., Shen, J., Li, L. (2024). Contrastive Learning Augmented Graph Auto-Encoder. In: Luo, B., Cheng, L., Wu, ZG., Li, H., Li, C. (eds) Neural Information Processing. ICONIP 2023. Communications in Computer and Information Science, vol 1965. Springer, Singapore. https://doi.org/10.1007/978-981-99-8145-8_22


  • DOI: https://doi.org/10.1007/978-981-99-8145-8_22

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-8144-1

  • Online ISBN: 978-981-99-8145-8

  • eBook Packages: Computer Science (R0)
