View Thesis Information

Chinese Title:

 Research on Graph Representation Learning Based on Deep Learning

Name:

 王煜文

Student ID:

 20208223033

Confidentiality Level:

 Public

Language:

 Chinese

Discipline Code:

 085400

Discipline Name:

 Engineering - Electronic Information

Student Type:

 Master's

Degree Level:

 Master of Engineering

Degree Year:

 2023

Institution:

 Xi'an University of Science and Technology

School/Department:

 College of Computer Science and Technology

Major:

 Software Engineering

Research Direction:

 Graph Representation Learning

First Supervisor:

 冯健

First Supervisor's Institution:

 Xi'an University of Science and Technology

Submission Date:

 2023-06-25

Defense Date:

 2023-06-04

Foreign-language Title:

 Graph Representation Learning Research Based on Deep Learning

Chinese Keywords:

 graph representation learning ; attention mechanism ; graph convolutional network ; static ; dynamic

Foreign-language Keywords:

 graph representation learning ; attention mechanism ; graph convolutional network ; static ; dynamic

Chinese Abstract:

In real life, much data exists in the form of graphs, such as social networks, transaction networks, and citation networks. Effectively analyzing this graph data is crucial. Graph representation learning is an important method for learning from graph data; its goal is to learn a mapping function that maps all nodes in a graph to low-dimensional vector representations while preserving as much of the graph's rich information as possible. Depending on whether the topology and the attribute information of nodes/edges evolve over time, graph data can be divided into static graphs and dynamic graphs; accordingly, this thesis studies graph representation learning models for static graphs and dynamic graphs respectively.

To address the limited ability of existing static graph representation learning methods to distinguish node importance, as well as their information loss, a static graph representation learning model based on graph convolutional networks is proposed. First, a dense Graph Convolutional Network (GCN) is used to alleviate information loss during feature learning; then, a curvature generation module aggregates curvature into the GCN to strengthen the model's ability to distinguish the importance of nodes within the graph topology; finally, the node representations produced by the dense GCN and the curvature GCN are fused into the final node representations. The model's performance is verified on node classification tasks, and the experimental results demonstrate its effectiveness.

To address the difficulty that existing dynamic graph representation learning methods have in effectively capturing nodes' high-order neighborhood relations and the temporal information of dynamic graphs, a dynamic graph representation learning model based on graph attention networks is proposed. First, stacked graph attention networks are used to capture nodes' high-order neighborhood relations, while the residual connection mechanism of ResNet is introduced to alleviate the over-smoothing caused by stacking graph attention networks; second, one-hot encoding of the relative temporal positions of snapshots handles the temporal information of dynamic graphs; then, causal convolution is introduced as a preprocessing step for temporal learning to avoid leaking future information; finally, a temporal attention mechanism is introduced to further learn how the dynamic network's topology evolves over time. The experimental results demonstrate the model's effectiveness.

Foreign-language Abstract:

In reality, much data exists in the form of graphs, such as social networks, transaction networks, and citation networks. The effective analysis of such graph data is crucial. Graph representation learning is an important approach for studying graph data; it aims to learn a mapping function that maps all the nodes in a graph to low-dimensional vector representations while retaining the rich information present in the graph. Graph data can be categorized into two types, static graphs and dynamic graphs, based on whether the topology and the attributes of nodes/edges evolve over time. Accordingly, this thesis studies graph representation learning models designed for static and dynamic graphs, respectively.

In order to address the limitations of existing static graph representation learning methods in terms of distinguishing node importance and information loss, a novel static graph representation learning model is proposed. Firstly, to mitigate the issue of information loss during feature learning, a dense Graph Convolutional Network (GCN) is employed. Then, a curvature generation module is used to aggregate curvature information into the GCN, enhancing the model's ability to differentiate the importance of nodes within the graph’s topological structure. Finally, the node representations obtained from the dense GCN and the curvature GCN are fused to obtain the final node representation. The model’s performance is evaluated on a node classification task, and experimental results demonstrate its effectiveness.
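The two-branch design described above (a dense GCN-style branch plus a curvature-weighted branch, fused into the final node representation) can be sketched in minimal pure Python. The toy graph, the hand-picked curvature weights, and the mean-fusion step are all illustrative assumptions, not the thesis's actual implementation:

```python
def normalize_rows(mat):
    # Row-normalize edge weights so each node averages over its neighbours.
    out = []
    for row in mat:
        s = sum(row)
        out.append([v / s if s else 0.0 for v in row])
    return out

def propagate(edge_weights, feats):
    # One message-passing step: a node's new feature vector is the
    # weighted mean of its neighbours' (and its own) feature vectors.
    n, d = len(feats), len(feats[0])
    norm = normalize_rows(edge_weights)
    return [[sum(norm[i][j] * feats[j][k] for j in range(n)) for k in range(d)]
            for i in range(n)]

def fuse(branch_a, branch_b):
    # Fuse the two branches by element-wise averaging (one simple choice;
    # concatenation or a learned gate would also fit the description).
    return [[(a + b) / 2 for a, b in zip(ra, rb)]
            for ra, rb in zip(branch_a, branch_b)]

# Toy 3-node graph with self-loops: edges 0-1 and 1-2.
adj = [[1.0, 1.0, 0.0],
       [1.0, 1.0, 1.0],
       [0.0, 1.0, 1.0]]
# Hypothetical curvature-derived weights re-weighting the same edges.
curv = [[1.0, 0.5, 0.0],
        [0.5, 1.0, 2.0],
        [0.0, 2.0, 1.0]]
feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]

h_dense = propagate(adj, feats)   # plain GCN-style branch
h_curv = propagate(curv, feats)   # curvature-weighted branch
h_final = fuse(h_dense, h_curv)   # fused node representations
```

The point of the sketch is only the data flow: both branches aggregate over the same topology, but the curvature branch re-weights edges before aggregation, so structurally important edges contribute differently to the fused result.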

In response to the challenges of effectively capturing the high-order neighboring relationships of nodes and the temporal information of dynamic graphs in existing dynamic graph representation learning methods, a dynamic graph representation learning model is proposed. Firstly, a stacked graph attention network is employed to capture the high-order neighboring relationships of nodes, and the residual connection mechanism from ResNet is introduced to alleviate the over-smoothing issue caused by the stacked graph attention network. Secondly, the relative temporal positional information of snapshots is encoded using one-hot encoding to handle the temporal information of dynamic graphs. Next, causal convolution is introduced as a preprocessing step for temporal learning to prevent future information leakage. Finally, a temporal attention mechanism is incorporated to further learn the evolving patterns of the dynamic network's topology over time. Experimental results validate the effectiveness of the model.
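The temporal side of the description above — one-hot relative-position codes plus attention restricted to non-future snapshots — can be illustrated with a small pure-Python sketch over one node's snapshot embeddings. The dot-product scoring and the way the position code is appended are illustrative assumptions (and the causal-convolution preprocessing is omitted); the sketch only shows how a causal mask prevents future information leakage:

```python
import math

def one_hot(i, n):
    # One-hot code for the relative temporal position of snapshot i of n.
    v = [0.0] * n
    v[i] = 1.0
    return v

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def causal_temporal_attention(snapshots):
    # snapshots: one embedding vector per time step for a single node.
    # Each step attends only to itself and earlier steps (causal mask),
    # after a one-hot relative-position code is appended.
    t = len(snapshots)
    coded = [snap + one_hot(i, t) for i, snap in enumerate(snapshots)]
    d = len(coded[0])
    outputs = []
    for i in range(t):
        # Scaled dot-product scores against all non-future steps only.
        scores = [sum(a * b for a, b in zip(coded[i], coded[j])) / math.sqrt(d)
                  for j in range(i + 1)]
        weights = softmax(scores)
        outputs.append([sum(w * coded[j][k] for j, w in enumerate(weights))
                        for k in range(d)])
    return outputs

out = causal_temporal_attention([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
```

Because the score list for step i stops at index i, the first snapshot's output depends only on itself, and no step ever mixes in information from a later snapshot.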

References:

[1]Yi H C, You Z H, Huang D S, et al. Graph representation learning in bioinformatics: Trends, methods and applications[J]. Briefings in Bioinformatics, 2022, 23(1): bbab340.

[2]蔡晓东, 曾志杨. AFGSRec:一种自适应融合全局协同特征的社交推荐模型[J]. 华南理工大学学报(自然科学版), 2022, 50(12): 71-79.

[3]韩忠明, 王宇航, 毛雅俊, 等. 基于图神经网络的比特币交易预测[J]. 计算机应用研究, 2022, 39(12): 3562-3567.

[4]朱丹浩, 黄肖宇. 基于异构特征融合的论文引用预测方法[J]. 数据采集与处理, 2022, 37(05): 1134-1144.

[5]Li X, Wei W, Feng X, et al. Representation learning of graphs using graph convolutional multilayer networks based on motifs[J]. Neurocomputing, 2021, 464: 218-226.

[6]袁立宁,李欣,王晓冬, 等. 图嵌入模型综述[J]. 计算机科学与探索, 2022, 16(01): 59-87.

[7]Zhou J, Liu L, Wei W, et al. Network representation learning: From preprocessing, feature extraction to node embedding[J]. ACM Computing Surveys, 2022, 55(2): 1-35.

[8]陈东洋, 郭进利. 基于图注意力的高阶网络节点分类方法[J]. 计算机应用研究, 2023, 40(4): 1095-1100, 1136.

[9]Yang L, Jiang X, Ji Y, et al. Gated graph convolutional network based on spatio-temporal semi-variogram for link prediction in dynamic complex network[J]. Neurocomputing, 2022, 505: 289-303.

[10]郑裕龙, 陈启买, 贺超波, 等. 图卷积网络增强的非负矩阵分解社区发现方法[J]. 计算机工程与应用, 2022, 58(11): 73-83.

[11]Xue G, Zhong M, Li J, et al. Dynamic network embedding survey[J]. Neurocomputing, 2022, 472: 212-223.

[12]Leng Y, Yu L, Niu X. Dynamically aggregating individuals social influence and interest evolution for group recommendations[J]. Information Sciences, 2022, 614: 223-239.

[13]Wu S, Sun F, Zhang W, et al. Graph neural networks in recommender systems: A survey[J]. ACM Computing Surveys, 2022, 55(5): 1-37.

[14]Peng H, Du B, Liu M, et al. Dynamic graph convolutional network for long-term traffic flow prediction with reinforcement learning[J]. Information Sciences, 2021, 578: 401-416.

[15]Zhang X, Xie K, Wang S, et al. Learning based proximity matrix factorization for node embedding[C]//Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining. 2021: 2243-2253.

[16]Zhu H, Koniusz P. REFINE: Random range finder for network embedding[C]//Proceedings of the 30th ACM International Conference on Information & Knowledge Management. 2021: 3682-3686.

[17]Belkin M, Niyogi P. Laplacian eigenmaps for dimensionality reduction and data representation[J]. Neural computation, 2003, 15(6): 1373-1396.

[18]Cao S, Lu W, Xu Q, et al. GraRep: Learning graph representations with global structural information[C]//Proceedings of the 24th ACM international on conference on information and knowledge management. New York: ACM, 2015: 891-900.

[19]Mikolov T, Chen K, Corrado G, et al. Efficient estimation of word representations in vector space[J]. arXiv preprint arXiv:1301.3781, 2013.

[20]Perozzi B, Al-Rfou R, Skiena S. Deepwalk: Online learning of social representations[C]//Proceedings of the 20th ACM SIGKDD international conference on Knowledge discovery and data mining. New York: ACM, 2014: 701-710.

[21]Grover A, Leskovec J. node2vec: Scalable feature learning for networks[C]//Proceedings of the 22nd ACM SIGKDD international conference on Knowledge discovery and data mining. New York: ACM, 2016: 855-864.

[22]Wang D, Cui P, Zhu W. Structural deep network embedding[C]//Proceedings of the 22nd ACM SIGKDD international conference on Knowledge discovery and data mining. New York: ACM, 2016: 1225-1234.

[23]Cao S, Lu W, Xu Q. Deep neural networks for learning graph representations[C]//Proceedings of the AAAI conference on artificial intelligence. AAAI Press, 2016: 1145–1152.

[24]Kipf T N, Welling M. Variational graph auto-encoders[J]. arXiv preprint arXiv:1611.07308, 2016.

[25]Kim P. Convolutional neural network[J]. MATLAB deep learning: with machine learning, neural networks and artificial intelligence, 2017: 121-147.

[26]Kipf T N, Welling M. Semi-supervised classification with graph convolutional networks[C]//Proceedings of the 5th International Conference on Learning Representations. USA: DBLP, 2017: 1-8.

[27]Hamilton W, Ying Z, Leskovec J. Inductive representation learning on large graphs[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. New York: Curran Associates Inc, 2017: 1025-1035.

[28]Veličković P, Cucurull G, Casanova A, et al. Graph attention networks[C]//Proceedings of the International Conference on Learning Representations. USA: DBLP, 2018.

[29]Topping J, Di Giovanni F, Chamberlain B P, et al. Understanding over-squashing and bottlenecks on graphs via curvature[C]// Proceedings of the International Conference on Learning Representations. USA: DBLP, 2022.

[30]Wu W, Hu G, Yu F. Ricci curvature-based semi-supervised learning on an attributed network[J]. Entropy, 2021, 23(3): 292-292.

[31]Li H, Cao J, Zhu J, et al. Curvature graph neural network[J]. Information Sciences, 2022, 592: 50-66.

[32]Bober J, Monod A, Saucan E, et al. Rewiring networks for graph neural network training using discrete geometry[J]. arXiv preprint arXiv:2207.08026, 2022.

[33]Chami I, Ying Z, Ré C, et al. Hyperbolic graph convolutional neural networks [C]//Proceedings of the 30th Conference on Neural Information Processing Systems. USA: MIT Press, 2019: 4869–4880.

[34]Ma J, Zhang Q, Lou J, et al. Temporal network embedding via tensor factorization[C]//Proceedings of the 30th ACM International Conference on Information & Knowledge Management. New York: ACM, 2021: 3313-3317.

[35]Li J, Dani H, Hu X, et al. Attributed network embedding for learning in a dynamic environment[C]//Proceedings of the 26th ACM International Conference on Information and Knowledge Management. New York: ACM, 2017: 387-396.

[36]Chen C, Tong H. Fast eigen-functions tracking on dynamic graphs[C]//Proceedings of the 2015 SIAM International Conference on Data Mining. 2015: 559-567.

[37]Zhang Z, Cui P, Pei J, et al. TIMERS: error-bounded SVD restart on dynamic networks[C]//Proceedings of the AAAI Conference on Artificial Intelligence. USA: AAAI Press, 2018: 224-231.

[38]Zhu D, Cui P, Zhang Z, et al. High-order proximity preserved embedding for dynamic networks[J]. IEEE Transactions on Knowledge and Data Engineering, 2018, 30(11): 2134-2144.

[39]Mahdavi S, Khoshraftar S, An A. dynnode2vec: Scalable dynamic network embedding[C]//Proceedings of the IEEE International Conference on Big Data (Big Data). IEEE, 2018: 3762-3765.

[40]Zuo Y, Liu G, Lin H, et al. Embedding temporal network via neighborhood formation[C]//Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. New York: ACM, 2018: 2857-2866.

[41]Goyal P, Kamra N, He X, et al. DynGEM: Deep embedding method for dynamic graphs[J]. arXiv preprint arXiv:1805.11273, 2018.

[42]Goyal P, Chhetri S R, Canedo A. dyngraph2vec: Capturing network dynamics using dynamic graph representation learning[J]. Knowledge-Based Systems, 2020, 187: 104816.

[43]Zhou L, Yang Y, Ren X, et al. Dynamic network embedding by modeling triadic closure process[C]//Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence and Thirtieth Innovative Applications of Artificial Intelligence Conference and Eighth AAAI Symposium on Educational Advances in Artificial Intelligence. USA: AAAI Press, 2018: 571–578.

[44]Fu D, He J. SDG: A simplified and dynamic graph neural network[C]//Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval. New York: ACM, 2021: 2273-2277.

[45]Gao C, Zhu J, Zhang F, et al. A novel representation learning for dynamic graphs based on graph convolutional networks[J]. IEEE Transactions on Cybernetics, 2022, 1-14.

[46]Zhong F, Liu Y, Liu L, et al. DEDGCN: Dual evolving dynamic graph convolutional network[J]. Security and Communication Networks, 2022, 2022: 1-11.

[47]Zhang C Y, Yao Z L, Yao H Y, et al. Dynamic representation learning via recurrent graph neural networks[J]. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 2023, 53(2): 1284-1297.

[48]Li X, Li Y, Shang Y, et al. DDNE: Discriminative distance metric learning for network embedding[C]//Proceedings of the International Conference on Computational Science. USA: Springer, 2020: 568-581.

[49]Manessi F, Rozza A, Manzo M. Dynamic graph convolutional networks[J]. Pattern Recognition, 2020, 97: 107000.

[50]Pareja A, Domeniconi G, Chen J, et al. EvolveGCN: Evolving graph convolutional networks for dynamic graphs[C]//Proceedings of the AAAI Conference on Artificial Intelligence. USA: AAAI Press, 2020: 5363-5370.

[51]Trivedi R, Farajtabar M, Biswal P, et al. DyRep: Learning representations over dynamic graphs[C]//Proceedings of the International Conference on Learning Representations. USA: DBLP, 2019: 1-25.

[52]Sun L, Zhang Z, Zhang J, et al. Hyperbolic variational graph neural network for modeling dynamic graphs[C]//Proceedings of the AAAI Conference on Artificial Intelligence. USA: AAAI Press, 2021: 4375-4383.

[53]Fathy A, Li K. TemporalGAT: Attention-based dynamic graph representation learning[C]//Proceedings of the Advances in Knowledge Discovery and Data Mining. Singapore: Springer, 2020: 413-423.

[54]Lei K, Qin M, Bai B, et al. GCN-GAN: A non-linear temporal link prediction model for weighted dynamic networks[C]//Proceedings of the IEEE Conference on Computer Communications. USA: IEEE, 2019: 388-396.

[55]Goodfellow I, Pouget-Abadie J, Mirza M, et al. Generative adversarial networks[J]. Communications of the ACM, 2020, 63(11): 139-144.

[56]He K, Zhang X, Ren S, et al. Deep residual learning for image recognition[C]//Proceedings of Conference on Computer Vision and Pattern Recognition. USA: IEEE, 2016: 770-778.

[57]Ollivier Y. Ricci curvature of markov chains on metric spaces[J]. Journal of Functional Analysis, 2009, 256(3): 810-864.

[58]Wu W, Hu G, Yu F. Graph classification method based on wasserstein distance[J]. Journal of Physics: Conference Series, 2021, 1952(2): 022018.

[59]Tian Y, Lubberts Z, Weber M. Mixed-membership community detection via line graph curvature[C]//Proceedings of the NeurIPS Workshop on Symmetry and Geometry in Neural Representations. 2023: 219-233.

[60]Li J, Fu X, Sun Q, et al. Curvature graph generative adversarial Networks[C]//Proceedings of the ACM Web Conference 2022. New York: ACM, 2022: 1528-1537.

[61]Ye Z, Liu K S, Ma T, et al. Curvature graph network[C]//Proceedings of the International Conference on Learning Representations. USA: DBLP, 2019.

[62]温雯, 黄家明, 蔡瑞初, 等. 一种融合节点先验信息的图表示学习方法[J].软件学报, 2018, 29(03): 786-798.

[63]蒋林浦,陈可佳. 基于对比预测的自监督动态图表示学习方法[J/OL].计算机科学, 2023, 1-9.

[64]李青,王一晨,杜承烈. 图表示学习方法研究综述[J/OL].计算机应用研究, 2023, 40(6): 1-16.

[65]刘杰, 尚学群, 宋凌云, 等. 图神经网络在复杂图挖掘上的研究进展[J]. 软件学报, 2022, 33(10): 3582-3618.

[66]Zhou J, Cui G, Hu S, et al. Graph neural networks: A review of methods and applications[J]. AI open, 2020, 1: 57-81.

[67]Miao X, Gürel N M, Zhang W, et al. Degnn: Improving graph neural networks with graph decomposition[C]//Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining. 2021: 1223-1233.

[68]Zeng H, Zhang M, Xia Y, et al. Decoupling the depth and scope of graph neural networks[J]. Advances in Neural Information Processing Systems, 2021, 34: 19665-19679.

[69]Oono K, Suzuki T. Graph neural networks exponentially lose expressive power for node classification[C]//Proceedings of the International Conference on Learning Representations. USA: DBLP, 2020.

[70]Li H, Cao J, Zhu J, et al. Graph information vanishing phenomenon in implicit graph neural networks[J]. arXiv preprint arXiv:2103.01770, 2021.

[71]Huang G, Liu Z, Van Der Maaten L, et al. Densely connected convolutional networks[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. USA: IEEE, 2017: 4700-4708.

[72]韩旭, 闵超, 张靖雯. 基于多维特征的引文扩散模式预测研究[J]. 图书情报工作, 2022, 66(09): 82-92.

[73]曹燕, 董一鸿, 邬少清, 等. 动态网络表示学习研究进展[J]. 电子学报, 2020, 48(10): 2047-2059.

[74]Hajiramezanali E, Hasanzadeh A, Narayanan K, et al. Variational graph recurrent neural networks[C]//Proceedings of the 33rd International Conference on Neural Information Processing Systems. 2019: 10701–10711.

CLC Number:

 391.9

Release Date:

 2023-06-26

