Graph Learning for Dynamic Recommendation Scenarios (动态推荐场景下的图学习)
Qingyun Sun (孙庆赟), School of Computer Science and Engineering, Beihang University
Homepage: https://sunqysunqy.github.io/   Email: sunqy@buaa.edu

Outline
- Background: Deep Graph Learning for Recommendation
- Dynamic Graph OOD: Environment-aware Dynamic Graph Learning
- Topology-Imbalance: Position-aware Graph Structure Learning
- Dataset Distillation: Structure-broadcasting Graph Condensation
- Privacy-Preserving Recommendation: Differential Privacy for HGNN
- Conclusion

The era of the connected world
- Graph learning has been widely applied in online recommendation: e-commerce, content sharing, social networking, forums.
- Graphs capture user-user, user-item, and item-item connections.
- Typical tasks include node classification, link prediction, and subgraph classification, supporting ad & product recommendation, friend recommendation, and POI & post recommendation.

Challenges for graph learning
- Dynamic & open: distribution shifts naturally exist in graphs and can be spatio-temporal (graph data from multiple domains, dynamic graph data).
- Imbalance: graph-specific topology imbalance leads to decision boundary shift (imbalanced topology distribution).
- Large scale: how to construct smaller-scale recommendation datasets for efficient training?
- Privacy: leakage of sensitive user information.
Dynamic Graph OOD: Environment-aware Dynamic Graph Learning

Motivation
- Tasks on real-world graphs (e.g., traffic networks, transaction networks) are challenging: distribution shifts naturally exist in graph data and can be spatio-temporal.
- Out-of-distribution (OOD) generalized GNNs are critically needed.

Problem formulation
- OOD generalization, and OOD generalization on dynamic graphs.

Main idea
- Investigate environments carefully, find spatio-temporal invariant patterns, and apply causal inference to decorrelate them from variant patterns via interventions.

Reference: Haonan Yuan, Qingyun Sun*, et al. Environment-Aware Dynamic Graph Learning for Out-of-Distribution Generalization. NeurIPS 2023.
Step 1: Environments Modeling
- Goal: capture the latent environments around each node.
- Environment-Aware DGNN (EA-DGNN) with EAConv: multi-channel convolutions with spatial aggregation, followed by holistic temporal aggregation, form the overall architecture.
- Result: environments are modeled by environment-aware node representations (the design easily extends to other sequential convolution models).
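The multi-channel spatial aggregation plus temporal aggregation described in Step 1 can be pictured with a short PyTorch sketch. It is a minimal illustration under my own assumptions (module name `EAConvSketch`, a GRU for the temporal part, dense row-normalized adjacencies), not the authors' EA-DGNN implementation.

```python
import torch
import torch.nn as nn

class EAConvSketch(nn.Module):
    """Illustrative environment-aware convolution: K environment channels,
    each with its own spatial aggregation, followed by temporal pooling."""
    def __init__(self, in_dim, hid_dim, num_envs):
        super().__init__()
        self.channels = nn.ModuleList(
            [nn.Linear(in_dim, hid_dim) for _ in range(num_envs)]
        )
        self.temporal = nn.GRU(hid_dim * num_envs, hid_dim, batch_first=True)

    def forward(self, feats, adjs):
        # feats: [T, N, in_dim]; adjs: [T, N, N] row-normalized adjacency per snapshot
        per_step = []
        for x, a in zip(feats, adjs):
            # multi-channel spatial aggregation: one channel per latent environment
            chans = [torch.relu(a @ lin(x)) for lin in self.channels]
            per_step.append(torch.cat(chans, dim=-1))          # [N, hid * K]
        seq = torch.stack(per_step, dim=1)                      # [N, T, hid * K]
        out, _ = self.temporal(seq)                             # holistic temporal aggregation
        return out[:, -1]                                       # environment-aware node states

# usage with random data
T, N, D = 4, 10, 8
model = EAConvSketch(D, 16, num_envs=3)
z = model(torch.randn(T, N, D), torch.softmax(torch.randn(T, N, N), dim=-1))
print(z.shape)  # torch.Size([10, 16])
```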
Step 2: Environments Inferring
- Goal: infer the distribution of latent environments and instantiate samples with given labels (sampling & generating with their respective multi-labels).
- An ECVAE performs the inference: denote the observed sample library, then maximize the evidence lower bound (equivalently, minimize its negative) with an environment recognition network (encoder), a prior network over the observed environments, and an environment sample generation network (decoder).
- Result: the distributions of environments are inferred and joint sample libraries are established.
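A conditional VAE of the kind Step 2 describes can be sketched as follows; `ECVAESketch`, the layer sizes, and the MSE reconstruction term are illustrative assumptions rather than the paper's ECVAE.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ECVAESketch(nn.Module):
    """Illustrative conditional VAE: encode (representation, label) into a latent
    environment variable, then decode samples conditioned on the label."""
    def __init__(self, x_dim, y_dim, z_dim=16):
        super().__init__()
        self.enc = nn.Linear(x_dim + y_dim, 2 * z_dim)   # environment recognition network
        self.dec = nn.Linear(z_dim + y_dim, x_dim)       # environment sample generator

    def forward(self, x, y):
        mu, logvar = self.enc(torch.cat([x, y], dim=-1)).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterization
        x_hat = self.dec(torch.cat([z, y], dim=-1))
        recon = F.mse_loss(x_hat, x)
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return x_hat, recon + kl   # negative ELBO to minimize

x = torch.randn(32, 64)                                   # environment-aware node representations
y = F.one_hot(torch.randint(0, 5, (32,)), 5).float()      # given labels
samples, loss = ECVAESketch(64, 5)(x, y)
```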
(Schematic: invariance supports OOD generalization, while variance carries environment/domain information.)

Step 3: Environments Discriminating
- Goal: discriminate spatio-temporal invariant/variant patterns for generalized prediction.
- Assumptions: (a) the invariance property and (b) a sufficient condition, leading to the proposition used for pattern recognition.
- Result: spatio-temporal invariant/variant patterns are discriminated node-wise over time.
Step 4: Environments Generalizing
- Goal: apply causal inference to decorrelate the invariant and variant parts through interventions on the variant parts (cf. the Ladder of Causation, Judea Pearl).
- The objectives, the intervention mechanism, and the overall loss combine the task risk with an intervention penalty.
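One way to picture the intervention idea in Step 4 is an invariance penalty over environment interventions: evaluate the risk with the variant part swapped for sampled environment instances and keep that risk low and stable. This is a hedged sketch of the general principle, not EAGLE's actual objective; `intervention_penalty` and the toy prediction head are made up for illustration.

```python
import torch
import torch.nn.functional as F

def intervention_penalty(loss_fn, invariant, variant_samples, labels):
    """Illustrative decorrelation penalty: evaluate the task risk under several
    interventions on the variant part and penalize both its mean and its variance,
    pushing the predictor to rely only on the spatio-temporal invariant patterns."""
    risks = torch.stack([loss_fn(invariant, v, labels) for v in variant_samples])
    return risks.mean() + risks.var()

# toy usage: a linear head reads the invariant part plus an intervened variant part
head = torch.nn.Linear(8, 3)
loss_fn = lambda inv, var, y: F.cross_entropy(head(inv + 0.1 * var), y)
inv = torch.randn(16, 8)
variants = [torch.randn(16, 8) for _ in range(4)]   # sampled environment interventions
labels = torch.randint(0, 3, (16,))
print(intervention_penalty(loss_fn, inv, variants, labels))
```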
Datasets

Main results (future link prediction)
Ablation study
- EAGLE (w/o EI): remove the Environment Instantiation mechanism.
- EAGLE (w/o IPR): remove the Invariant Pattern Recognition mechanism.
- EAGLE (w/o Interv): remove the spatio-temporal causal Intervention mechanism.

Analysis of the Invariant Pattern Recognition mechanism
Topology-Imbalance: Position-aware Graph Structure Learning

Imbalance problem in machine learning
- Data imbalance leads to decision boundary shift (bias induced by imbalance).
  (Acknowledgement: https://arxiv.org/pdf/2111.12791.pdf)

Solutions for learning from imbalanced data
- Information redistribution: re-weighting at the algorithm level and re-sampling at the data level.
  (Acknowledgement: https://ai-scholar.tech/zh/articles/deep-learning/DIR, https://arxiv.org/pdf/2111.12791.pdf)

Imbalance issue in graphs
- For graph data, a significant challenge is that the topological properties of nodes (e.g., locations, roles) are imbalanced (topology imbalance).
- Quantity imbalance: the number of labeled nodes per class is uneven.
- Topology imbalance: the distribution of labeled nodes over topological positions is uneven.

Graph imbalance needs new solutions
- Re-weighting and re-sampling cannot effectively reduce the imbalance bias in non-IID graph data, which differs from IID data in information density, strength of supervision signal, and the way information propagates.
  (Acknowledgement: https://iq.opengenus.org/graph-neural-networks/)

Reference: Qingyun Sun, Jianxin Li, Haonan Yuan, et al. Position-aware Structure Learning for Graph Topology-imbalance by Relieving Under-reaching and Over-squashing. CIKM 2022, Best Paper Honorable Mention Award.
Understanding position-imbalance
Q1: Why does position-imbalance affect graph representation learning?
- Under-reaching [1]: the influence of labeled nodes decays with topological distance, so nodes far away from labeled nodes lack supervision information.
- Over-squashing [2]: the receptive field of GNNs grows exponentially and all information is compressed into fixed-length vectors; supervision information is squashed when passing across narrow paths together with other, useless information.
- Reaching Coefficient (RC): a higher RC means better reachability (more shortcuts/paths).
- Squashing Coefficient (SC): a higher SC means less squashing (more ring structures).

[1] Buchnik E, Cohen E. Bootstrapped graph diffusions: Exposing the power of nonlinearity. ACM Int. Conf. on Measurement and Modeling of Computer Systems, 2018.
[2] Alon U, Yahav E. On the bottleneck of graph neural networks and its practical implications. ICLR 2021.
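The exact definitions of RC and SC are given in the paper; as a rough intuition, reachability can be probed with a simple shortest-path proxy like the one below, a hedged stand-in rather than the paper's Reaching Coefficient.

```python
import networkx as nx

def reaching_proxy(G, labeled_nodes):
    """Hedged proxy (not the paper's exact Reaching Coefficient): average inverse
    distance from each node to its nearest labeled node. Larger values mean
    supervision signals reach the rest of the graph more easily."""
    scores = []
    for v in G.nodes:
        d = min(
            (nx.shortest_path_length(G, v, u) for u in labeled_nodes if nx.has_path(G, v, u)),
            default=float("inf"),
        )
        scores.append(1.0 / (1.0 + d))
    return sum(scores) / G.number_of_nodes()

G = nx.karate_club_graph()
print(reaching_proxy(G, labeled_nodes=[0, 33]))
```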
Understanding topology-imbalance
Q2: What kinds of graphs are susceptible to topology-imbalance? (SBM study)
- Across SBM variants with the same structure but different labeled nodes, and with different structures but the same labeled nodes, accuracy varies widely: 65.31%, 62.86%, 49.89%, 51.24%, 34.88%, 24.98%.
- Conclusion: graphs with poor reachability (smaller RC) and stronger squashing (smaller SC) suffer more.
Position-Aware STructurE Learning framework (PASTEL)
- Task: semi-supervised node classification.
- Position-aware structure learning: an anchor-based position encoding method.
- Class-wise conflict measure: guides which nodes should be more closely connected.
- Learning with the optimized structure: original + position + feature views with structural constraints.
Position-aware structure learning
- Anchor-based position encoding: separate the labeled nodes by class into anchor sets; for an unlabeled node (e.g., v6), measure its position relations to the anchor sets, form its position-aware encoding, and transform it into a learnable vector.
- Position-aware metric learning: consider both feature information and position-based similarity when deciding whether to form an edge.
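A hedged sketch of anchor-based position encoding, assuming the anchor sets are the labeled nodes of each class and proximity is measured by shortest-path distance; the exact encoding and the learnable transform in PASTEL may differ.

```python
import networkx as nx
import numpy as np
import torch

def anchor_position_encoding(G, labeled_by_class):
    """Hedged sketch of anchor-based position encoding: for every node, record its
    proximity (1 / (1 + shortest-path distance)) to the nearest labeled node of each
    class; the labeled nodes of one class act as one anchor set."""
    enc = np.zeros((G.number_of_nodes(), len(labeled_by_class)))
    for c, anchors in enumerate(labeled_by_class):
        # distance from every node to its nearest anchor of class c
        dist = {v: float("inf") for v in G.nodes}
        for a in anchors:
            for v, d in nx.single_source_shortest_path_length(G, a).items():
                dist[v] = min(dist[v], d)
        for v in G.nodes:
            enc[v, c] = 1.0 / (1.0 + dist[v])
    return torch.tensor(enc, dtype=torch.float32)

G = nx.karate_club_graph()
pos = anchor_position_encoding(G, labeled_by_class=[[0, 1], [33, 32]])
pos = torch.nn.Linear(pos.size(1), 8)(pos)   # transform to a learnable position vector
```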
Class-wise conflict measure
- Group PageRank (GPR): extends traditional PageRank to be label-aware, measuring how much supervision information each node receives from the labeled nodes of every class (the c-th dimension of node i's GPR vector reflects the influence of class c's labeled nodes on node i).
- Expectation: the GPR vector of a node should form a "sharp" distribution focused on its ground-truth label.
- The GPR vectors of two endpoints measure their conflict, which controls the connection strength (conflict-aware edge weight) of the edge between them.
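Group PageRank can be approximated with personalized PageRank whose teleport mass sits on each class's labeled nodes; the sketch below is a hedged, NetworkX-based illustration of that idea, not PASTEL's implementation.

```python
import networkx as nx
import numpy as np

def group_pagerank(G, labeled_by_class, alpha=0.85):
    """Hedged sketch of Group PageRank: one personalized PageRank vector per class,
    with the teleport probability spread over that class's labeled nodes. Row i of
    the result estimates how much supervision node i receives from each class."""
    n, C = G.number_of_nodes(), len(labeled_by_class)
    gpr = np.zeros((n, C))
    for c, anchors in enumerate(labeled_by_class):
        personalization = {v: (1.0 / len(anchors) if v in anchors else 0.0) for v in G.nodes}
        scores = nx.pagerank(G, alpha=alpha, personalization=personalization)
        gpr[:, c] = [scores[v] for v in G.nodes]
    return gpr

G = nx.karate_club_graph()
gpr = group_pagerank(G, labeled_by_class=[[0, 1], [33, 32]])
# A "sharp" row (mass concentrated on the true class) indicates low class-wise conflict;
# a conflict-aware edge weight can then be derived from how dissimilar two endpoints' rows are.
```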
Learning with the optimized structure
- Graph structure mixing and optimization: mix the position-aware adjacency and the node-feature-view adjacency with the original graph.
- Learning objectives: structure quality control (smoothness, connectivity, sparsity) plus the classification loss form the overall loss.
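A hedged sketch of the structure mixing and the smoothness/connectivity/sparsity regularizers; the weighting scheme and the exact terms are my assumptions, not PASTEL's released code.

```python
import torch
import torch.nn.functional as F

def mix_adjacency(a_orig, z_pos, h_feat, lambdas=(0.5, 0.25, 0.25)):
    """Hedged sketch: build a position-view and a feature-view adjacency from
    cosine similarities, then mix them with the original graph."""
    a_pos = F.relu(F.cosine_similarity(z_pos[:, None], z_pos[None, :], dim=-1))
    a_fea = F.relu(F.cosine_similarity(h_feat[:, None], h_feat[None, :], dim=-1))
    l0, l1, l2 = lambdas
    return l0 * a_orig + l1 * a_pos + l2 * a_fea

def structure_regularizer(a, x, betas=(1.0, 0.1, 0.1)):
    """Smoothness + connectivity + sparsity terms on the learned structure."""
    lap = torch.diag(a.sum(1)) - a
    smooth = torch.trace(x.t() @ lap @ x) / x.numel()   # feature smoothness over edges
    connect = -torch.log(a.sum(1) + 1e-8).mean()        # discourage isolated nodes
    sparse = a.abs().mean()                             # keep the structure sparse
    b0, b1, b2 = betas
    return b0 * smooth + b1 * connect + b2 * sparse

a = torch.eye(10)
z, x = torch.randn(10, 8), torch.randn(10, 16)
a_new = mix_adjacency(a, z, x)
reg = structure_regularizer(a_new, x)   # added to the classification loss
```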
Experimental setup
- Datasets: real-world (Cora, Citeseer, Photo, Actor, Chameleon, Squirrel) and synthetic (Stochastic Block Model, SBM).
- Baselines: GNN backbones (GCN, GAT, APPNP, GraphSAGE); a topology-imbalance-specific baseline (ReNode); graph structure learning baselines (DropEdge, AddEdge, SDRF, NeuralSparse, IDGL).
- Classification setting: 20 labeled nodes per class.
- Metrics: Weighted-F1 (W-F1) and Macro-F1 (M-F1), each reported with its standard deviation.
Node classification on real-world graphs
- ReNode relies on the homophily assumption; PASTEL shows superiority on all datasets.

Node classification on Cora with different imbalance levels
- Cora-L, Cora-M, Cora-H: low (L) / medium (M) / high (H) topology-imbalance levels; GNN backbone: GCN.
- PASTEL performs best at every imbalance level and achieves up to 4.4% improvement on the highly topology-imbalanced dataset.
Node classification on synthetic graphs
- Stochastic Block Model (SBM) with N = 3000 and C = 6; GNN backbone: GCN; topology-imbalance levels ranging from high to low.
- PASTEL increases Weighted-F1 by 5.38%-21.35% on SBM graphs with different community structures, showing superior effectiveness.
Analysis of the learned structure
- All structure learning methods learn structures with larger RC and SC, which drives the improvement in node classification: (a) original graph, (b) ReNode, (c) SDRF, (d) IDGL, (e) PASTEL (ours).
- PASTEL obtains a graph structure with clearer class boundaries.

Analysis of the learned structure: change of the GPR vectors
- Randomly choose 10 nodes per class in Cora (Vi denotes the 10 nodes of class Ci, the i-th class) and visualize their GPR vectors on the original graph and the learned graph.
- The class-wise conflict measure plays an important role in guiding the structure toward greater class-connectivity orthogonality.
Label imbalance of hierarchical structure
- Hierarchy-imbalance is caused by the uneven distribution of labeled nodes over implicit topological properties.
- For example, in the organizational structure of a business (CEO, department managers, employees; departments A, B, C), we sometimes prefer to organize by department rather than by rank.

Reference: Xingcheng Fu, Yuecen Wei, Qingyun Sun, et al. Hyperbolic Geometric Graph Representation Learning for Hierarchy-imbalance Node Classification. WWW 2023 Spotlight.
55、ceIn a Hyperbolic Geometry Perspective Hierarchical properties can be better preserved in hyperbolic space39Xingcheng Fu,Yuecen Wei,Qingyun Sun,et al.Hyperbolic Geometric Graph Representation Learning for Hierarchy-imbalance Node Classification.WWW 2023 SpotlightTree structure and hierarchical struc
56、turein hyperbolic spaceStructural Geometric PrioriBackgroundDynamic Graph OODTopologyImbalanceIn a Hyperbolic Geometry Perspective 40Xingcheng Fu,Yuecen Wei,Qingyun Sun,et al.Hyperbolic Geometric Graph Representation Learning for Hierarchy-imbalance Node Classification.WWW 2023 SpotlightTopology Spa
57、ceQuantityImbalancePosition ImbalanceHierarchy ImbalanceEmbeddingSpaceBackgroundDynamic Graph OODTopologyImbalanceHyperIMBA Architecture Hierarchy-aware Margin(HAM):reducing the decision boundary bias by hierarchy-imbalance labeled nodes.Hierarchy-aware Message-passing(HMPNN):alleviating the over-sq
58、uashing caused by cross-hierarchy connectivity.41Xingcheng Fu,Yuecen Wei,Qingyun Sun,et al.Hyperbolic Geometric Graph Representation Learning for Hierarchy-imbalance Node Classification.WWW 2023 SpotlightBackgroundDynamic Graph OODTopologyImbalanceEvaluation on Synthetic GraphTo verify the hierarchy
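The reason hyperbolic space suits hierarchies is that distances grow roughly exponentially toward the boundary of the ball, mirroring the exponential growth of trees. The standard Poincaré-ball distance used by such embeddings is sketched below; this is general background, not the full HyperIMBA pipeline.

```python
import torch

def poincare_distance(u, v, eps=1e-5):
    """Distance in the Poincare ball:
    d(u, v) = arcosh(1 + 2*|u - v|^2 / ((1 - |u|^2)(1 - |v|^2))).
    Points near the origin behave like the top of a hierarchy; points near the
    boundary behave like leaves, so tree-like structure embeds with low distortion."""
    sq = ((u - v) ** 2).sum(-1)
    denom = (1 - (u ** 2).sum(-1)).clamp_min(eps) * (1 - (v ** 2).sum(-1)).clamp_min(eps)
    return torch.acosh(1 + 2 * sq / denom)

root = torch.tensor([0.0, 0.0])   # high in the hierarchy
leaf = torch.tensor([0.8, 0.3])   # deep in the hierarchy
print(poincare_distance(root, leaf))
```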
Evaluation on synthetic graphs
- To verify the hierarchy-capturing ability, the method is evaluated on the Hierarchical Network Model (HNM), a synthetic graph with hierarchical organization.
- Three HNM communities share the same hierarchical structure and are evenly distributed in three directions of the graph: top level (1st/2nd/3rd-order fractals), middle level (4th-order fractals), bottom level (5th-order fractals).
Evaluation on real-world graphs
- Results cover graphs with high homophily, high heterophily, weak hierarchy, and poor connectivity.

Dataset Distillation: Structure-broadcasting Graph Condensation
Graph dataset distillation for recommendation [1]
- Question: how to construct smaller-scale recommendation datasets for efficient training?
- Flaws of existing methods: sampling-based approaches suffer from the long-tailed distribution problem; synthesizing-based approaches struggle with the discreteness of interactions.
- Potential of graph dataset distillation: performance preserving (a novel gradient matching strategy), balanced sample ratio preserving (balanced initial label generation), and structure preserving (keeping user-item interaction patterns).
[1] Jiahao Wu et al. Dataset Condensation for Recommendation. arXiv 2023.

Structure-broadcasting graph condensation
Reference: Beining Yang, Kai Wang, Qingyun Sun, et al. Does Graph Distillation See Like Vision Dataset Counterpart? NeurIPS 2023.

"Compression is intelligence"
- The goal of AGI foundation models is maximally lossless compression of useful information; OpenAI describes GPT training as lossless compression of the data, and LLM emergence can be seen as extracting rules through generalization, like compressing a 1000-page dictionary into a 100-page grammar book.
- Image dataset distillation can keep 99.9% of the performance with 0.01% of the data (e.g., a full dataset of 50k samples distilled to 10 samples). Data compression should produce a maximally generalizable representation of the real-world information carried by the training data. For graph data, how do we preserve the structure information during compression?

Measuring structure information: Laplacian Energy Distribution (LED)
- From a spectral perspective, structure information can be measured by the Laplacian Energy Distribution (LED).
- During condensation, the GNN acts as a band-pass filter on graph signals (Balcilar et al., Analyzing the Expressive Power of Graph Neural Networks in a Spectral Perspective, ICLR 2021), so the generated graph may lose information in specific frequency bands (an LED shift). Because the pass band of the downstream GNN is unknown, results vary across frameworks.
- Observations: the condensed graph's LED differs markedly from the original's (different peak shapes); the more pronounced the LED shift, the lower the average result across frameworks; the lower bound of the LED-shift gap is tied to the band-pass characteristics of the GNN condenser (e.g., a GCN condenser discards high-frequency signals).
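The LED itself is easy to compute from the spectrum of the normalized Laplacian; the NumPy/NetworkX sketch below (with my own banding into ten frequency bins) shows the quantity whose shift the method aims to control.

```python
import numpy as np
import networkx as nx

def laplacian_energy_distribution(G, x, bins=10):
    """Hedged sketch: spectral energy of a signal x over eigenvalue bands of the
    normalized Laplacian. A condensed graph whose LED drifts away from the
    original's (an LED shift) has lost information in those frequency bands."""
    L = nx.normalized_laplacian_matrix(G).toarray()
    evals, evecs = np.linalg.eigh(L)
    coeffs = (evecs.T @ x) ** 2                 # energy per eigen-component
    energy = coeffs / coeffs.sum()
    # aggregate the energy into frequency bands over [0, 2]
    hist, _ = np.histogram(evals, bins=bins, range=(0.0, 2.0), weights=energy)
    return hist

G = nx.karate_club_graph()
x = np.random.randn(G.number_of_nodes())
print(laplacian_energy_distribution(G, x))
```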
Condensation method
- Goal: condense a graph with N nodes into M nodes (M < N) by distilling the original graph's knowledge through gradient matching and structure learning.
- Objective: optimize an optimal transport distance to reduce the LED shift between the condensed structure A' and the original structure A.
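A toy, hedged sketch of the gradient-matching part of this recipe: the synthetic features, adjacency, and balanced labels are learnable, and they are optimized so that a small GNN's gradients on the synthetic graph align with its gradients on the real graph. The structure-preserving optimal transport term on the LED is omitted for brevity, and `TinyGCN` plus all data here are illustrative stand-ins, not the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyGCN(nn.Module):
    """Toy GCN: one normalized propagation step followed by a linear classifier."""
    def __init__(self, d, c):
        super().__init__()
        self.lin = nn.Linear(d, c)
    def forward(self, x, a):
        a_hat = a / a.sum(1, keepdim=True).clamp_min(1e-8)
        return self.lin(a_hat @ x)

n_real, n_syn, d, c = 100, 10, 16, 4
x_real = torch.randn(n_real, d)
a_real = (torch.rand(n_real, n_real) > 0.9).float()
y_real = torch.randint(0, c, (n_real,))

# Learnable condensed graph: features, adjacency logits, balanced labels.
x_syn = nn.Parameter(torch.randn(n_syn, d))
a_logits = nn.Parameter(torch.zeros(n_syn, n_syn))
y_syn = torch.arange(n_syn) % c
opt = torch.optim.Adam([x_syn, a_logits], lr=0.01)

model = TinyGCN(d, c)
for step in range(50):
    a_syn = torch.sigmoid(a_logits)
    g_real = torch.autograd.grad(F.cross_entropy(model(x_real, a_real), y_real),
                                 model.parameters())
    g_syn = torch.autograd.grad(F.cross_entropy(model(x_syn, a_syn), y_syn),
                                model.parameters(), create_graph=True)
    # gradient matching: align real and synthetic training signals
    loss = sum(1 - F.cosine_similarity(gr.flatten(), gs.flatten(), dim=0)
               for gr, gs in zip(g_real, g_syn))
    opt.zero_grad(); loss.backward(); opt.step()
```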
Results
- Nine datasets (social, shopping, citation, etc.) are condensed at different ratios, and three typical tasks are verified: node classification, link prediction, and anomaly detection.
- Anomaly detection focuses more on high-frequency node features; on it the method exceeds the baselines by 5%-10%.
- Training on data condensed by the method reaches 99% of the original accuracy with only 0.1% of the data.
- Cross-architecture generalization (1 vs. N and N vs. N) shows the method does not depend on a specific GNN condenser and generalizes well; visualizations show it better preserves structural properties.
- Training on the condensed graph gives a 23-51.6x speedup and 3.9-14.5x memory savings.
Privacy-Preserving Recommendation: Differential Privacy for HGNN

From homogeneous to heterogeneous graph privacy
- Homogeneous-graph privacy protection perturbs only a single type of sensitive node (single semantics), e.g., perturbation protects the sensitive relation between same-type nodes A and B.
- Heterogeneous-graph privacy protection must handle complex semantics: data associated with multiple node types is hard to protect, and node-type information strengthens an attacker's inference ability.

Methodology: HeteDP
- For the privacy issues arising from graph heterogeneity, a semantics-aware differentially private heterogeneous graph neural network, HeteDP, is proposed. It addresses three problems: different node types need different levels of privacy protection; topological heterogeneity strengthens relation-inference attacks; and multiple noise sources hurt model performance.

Reference: Yuecen Wei, Xingcheng Fu, Qingyun Sun, et al. Heterogeneous Graph Neural Network for Privacy-Preserving Recommendation. ICDM 2022 (Best-rank paper).
Feature attention perturbation mechanism
- Meta-paths adapt to node heterogeneity, and an attention mechanism enables personalized node-level privacy protection.
- Noise injection follows the Gaussian mechanism: with L2 sensitivity Δ₂ = max ‖f(D) − f(D′)‖₂ over adjacent inputs, the perturbed feature is x̃ = x + N(0, σ²Δ₂²I).
- Attention coefficients are computed over each node's meta-path-based neighbors, followed by semantic-level attention that aggregates across meta-paths.
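The generic tool behind this node-level perturbation is the Gaussian mechanism, sketched below; the sensitivity value used here and the way HeteDP couples the noise with meta-path attention are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def gaussian_mechanism(x, sensitivity, epsilon, delta):
    """Generic Gaussian mechanism: add N(0, sigma^2) noise with
    sigma >= sqrt(2 * ln(1.25 / delta)) * sensitivity / epsilon
    to release x with (epsilon, delta)-differential privacy (for epsilon < 1)."""
    sigma = np.sqrt(2 * np.log(1.25 / delta)) * sensitivity / epsilon
    return x + np.random.normal(0.0, sigma, size=x.shape)

features = np.random.rand(5, 8)       # sensitive node features in [0, 1]
sensitivity = np.sqrt(8)              # assumed worst-case L2 change of one feature row
noisy = gaussian_mechanism(features, sensitivity, epsilon=0.5, delta=1e-5)
```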
Topology gradient perturbation mechanism
- A heterogeneous convolution layer aggregates messages over multiple relation types (HETEGCN) to adapt to topological heterogeneity; its output feeds an embedding encoder / link predictor.
- Topology-level privacy is achieved by adding Gaussian noise to the training gradients: each per-sample gradient is clipped, g̃ᵢ = gᵢ / max(1, ‖gᵢ‖₂ / C), and the noisy gradient used for the update is ĝ = (1/|B|) (Σᵢ g̃ᵢ + N(0, σ²C²I)).
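The reconstructed formula above is the standard per-sample clipping plus Gaussian noise step; a minimal sketch:

```python
import torch

def dp_gradient(per_sample_grads, clip_norm, sigma):
    """Clip each per-sample gradient to norm C, sum, add Gaussian noise
    N(0, sigma^2 C^2 I), and average: the noisy-gradient step behind the formula above."""
    clipped = [g / max(1.0, g.norm().item() / clip_norm) for g in per_sample_grads]
    noisy_sum = torch.stack(clipped).sum(0) + torch.randn_like(per_sample_grads[0]) * sigma * clip_norm
    return noisy_sum / len(per_sample_grads)

grads = [torch.randn(32) for _ in range(8)]   # gradients of 8 samples w.r.t. one parameter
print(dp_gradient(grads, clip_norm=1.0, sigma=1.0).shape)
```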
Perturbation-utility balanced bi-level optimization
- The node-level and topology-level privacy budget allocations are optimized separately, improving model utility while preserving privacy.

Results
- HeteSDG achieves high performance with clearer classification boundaries, showing that semantic awareness adapts better.
- HeteSDG offers stronger privacy protection, and the topology perturbation has the larger impact on the model. Ablations: w/o TopoGDP approximates a model using only feature perturbation; w/o FeatADP approximates a model using only gradient perturbation.
- Privacy-budget sensitivity: as the privacy budget parameter increases, model performance rises.
- Bi-level optimization experiment: compared with a two-stage equal split of the privacy budget, bi-level optimization better balances privacy protection and model performance.
Conclusion
- Dynamic Graph OOD: Environment-Aware Dynamic Graph Learning for Out-of-Distribution Generalization. NeurIPS 2023.
- Position Imbalance: Position-aware Structure Learning for Graph Topology-imbalance by Relieving Under-reaching and Over-squashing. CIKM 2022, Best Paper Honorable Mention Award.
- Hierarchy Imbalance: Hyperbolic Geometric Graph Representation Learning for Hierarchy-imbalance Node Classification. WWW 2023 Spotlight.
- Graph Condensation: Does Graph Distillation See Like Vision Dataset Counterpart? NeurIPS 2023.
- DP for Recommendation: Heterogeneous Graph Neural Network for Privacy-Preserving Recommendation. ICDM 2022, Best-rank paper.

Thank you!
Qingyun Sun (孙庆赟), School of Computer Science and Engineering, Beihang University
Email: sunqy@buaa.edu   Homepage: https://sunqysunqy.github.io/