KDD 2023 - Graph self-distillation and completion to mitigate degree-related biases
Published 1 year ago • 112 plays • Length 2:01
Similar videos
- KDD 2023 - Variance reduction using in-experiment data (2:04)
- KDD 2023 - Narrow the input mismatch in deep graph neural network distillation (2:00)
- KDD 2023 - Kernel ridge regression-based graph dataset distillation (1:55)
- KDD 2023 - A counterfactual graph learning framework for legal case retrieval (1:46)
- KDD 2023 - A look into causal effects under entangled treatment in graphs (2:06)
- KDD 2023 - Reconsidering learning objectives in unbiased recommendation (2:20)
- KDD 2023 - Learning strong graph neural networks with weak information (1:36)
- KDD 2023 - Meta graph learning for long-tail recommendation (2:01)
- Generalizing dataset distillation via deep generative prior | CVPR 2023 (5:06)
- Class similarity weighted knowledge distillation for continual semantic segmentation - CVPR 2022 (5:00)
- KDD 2023 - Bridging the utility gap (38:22)
- KDD 2023 - Hyperbolic graph neural networks: a tutorial on methods and applications (2:22)
- KDD 2023 - Kernel ridge regression-based dataset distillation (1:56)
- KDD 2023 - Degree-corrected social graph refinement for fake news detection (1:59)
- KDD 2023 - Empower post-hoc graph explanations with information bottleneck (1:56)
- KDD 2023 - Synthesizing harder samples for class-imbalanced node classification (1:44)
- KDD 2023 - Incremental causal graph learning for online root cause analysis (2:13)
- KDD 2023 - Rethinking homophily in graph contrastive learning (1:51)
- KDD 2023 - Fusing balanced and biased sampling for graph contrastive learning (1:57)
- KDD 2023 - Enhancing graph representations learning with decorrelated propagation (2:16)
- KDD 2023 - Clenshaw graph neural networks (1:59)
- KDD 2023 - Universal and generalizable structure learning for graph neural networks (1:43)