DOI: 10.1145/3637528.3672047

DPHGNN: A Dual Perspective Hypergraph Neural Networks

Published: 24 August 2024

Abstract

Message passing on hypergraphs has become a standard framework for learning higher-order correlations between hypernodes. Recently proposed hypergraph neural networks (HGNNs) can be categorized into spatial and spectral methods based on their design choices. In this work, we analyze the impact of changes in hypergraph topology on the suboptimal performance of HGNNs and propose DPHGNN, a novel dual-perspective HGNN that introduces equivariant operator learning to capture lower-order semantics by inducing topology-aware spatial and spectral inductive biases. DPHGNN employs a unified framework to dynamically fuse lower-order explicit feature representations from the underlying graph into the superimposed hypergraph structure. We evaluate DPHGNN on eight benchmark hypergraph datasets for the semi-supervised hypernode classification task and obtain superior performance compared to seven state-of-the-art baselines. We also provide a theoretical framework and a synthetic hypergraph isomorphism test to characterize the expressive power of spatial HGNNs and to quantify the expressivity of DPHGNN beyond the Generalized Weisfeiler-Leman (1-GWL) test. Finally, DPHGNN was deployed by our partner e-commerce company, Meesho, for the Return-to-Origin (RTO) prediction task, where it achieves ~7% higher macro F1-score than the best baseline.
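
The abstract describes DPHGNN only at a high level. As a rough illustration of the dual-perspective idea, and not the authors' implementation, the PyTorch-style sketch below combines a spatial pass over the incidence matrix (node -> hyperedge -> node aggregation), an HGNN-style spectral pass using the normalized hypergraph operator, and one propagation step over an assumed underlying graph, then fuses the three views. All class, parameter, and variable names here are hypothetical.

# Illustrative sketch only (not the paper's code): a dual-perspective hypergraph layer.
import torch
import torch.nn as nn

class DualPerspectiveHypergraphLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.spatial_lin = nn.Linear(in_dim, out_dim)
        self.spectral_lin = nn.Linear(in_dim, out_dim)
        self.graph_lin = nn.Linear(in_dim, out_dim)
        self.fuse = nn.Linear(3 * out_dim, out_dim)

    def forward(self, X, H, A):
        # X: (n, d) hypernode features, H: (n, m) incidence matrix, A: (n, n) underlying-graph adjacency
        Dv = H.sum(dim=1).clamp(min=1.0)   # hypernode degrees
        De = H.sum(dim=0).clamp(min=1.0)   # hyperedge degrees

        # Spatial perspective: mean-aggregate nodes into hyperedges, then back to nodes.
        edge_feat = (H.t() @ X) / De.unsqueeze(1)
        spatial = self.spatial_lin((H @ edge_feat) / Dv.unsqueeze(1))

        # Spectral perspective: D_v^{-1/2} H D_e^{-1} H^T D_v^{-1/2} X (unit hyperedge weights for simplicity).
        Dv_inv_sqrt = Dv.pow(-0.5)
        op = Dv_inv_sqrt.unsqueeze(1) * (H @ torch.diag(1.0 / De) @ H.t()) * Dv_inv_sqrt.unsqueeze(0)
        spectral = self.spectral_lin(op @ X)

        # Lower-order perspective: one mean-propagation step on the underlying graph.
        deg = A.sum(dim=1).clamp(min=1.0)
        graph = self.graph_lin((A @ X) / deg.unsqueeze(1))

        # Fuse the three views into a single hypernode representation.
        return torch.relu(self.fuse(torch.cat([spatial, spectral, graph], dim=-1)))

# Toy usage: 4 hypernodes, 2 hyperedges, and a clique-expansion adjacency as the underlying graph.
X = torch.randn(4, 8)
H = torch.tensor([[1., 0.], [1., 0.], [1., 1.], [0., 1.]])
A = (H @ H.t() > 0).float() - torch.eye(4)
layer = DualPerspectiveHypergraphLayer(8, 16)
print(layer(X, H, A).shape)  # torch.Size([4, 16])

The fusion step here is a simple concatenation followed by a linear map; the paper's dynamic fusion and equivariant operator learning are more involved, and this sketch is meant only to show how spatial, spectral, and lower-order graph signals can be combined in one layer.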

Published In

KDD '24: Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
August 2024
6901 pages
ISBN: 9798400704901
DOI: 10.1145/3637528
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. RTO prediction
  2. graph neural network
  3. hypergraph neural networks
  4. topological deep learning

Qualifiers

  • Research-article

Conference

KDD '24

Acceptance Rates

Overall Acceptance Rate 1,133 of 8,635 submissions, 13%
