Disentangled Multiplex Graph Representation Learning


DMG: Disentangled Multiplex Graph Representation Learning

This repository contains the reference code for the manuscript "Disentangled Multiplex Graph Representation Learning".

Installation

Preparation.

  • pip install -r requirements.txt
  • Download the datasets
  • Download the trained models

Important args:

  • --use_pretrain: test from the downloaded checkpoints
  • --dataset: one of acm, imdb, dblp, freebase
  • --custom_key: Node for node classification

To evaluate a pretrained model, run python main.py with --use_pretrain set to 'True'.

Disentangled Contrastive Learning on Graphs

Part of Advances in Neural Information Processing Systems 34 (NeurIPS 2021)

Haoyang Li, Xin Wang, Ziwei Zhang, Zehuan Yuan, Hang Li, Wenwu Zhu

Recently, self-supervised learning for graph neural networks (GNNs) has attracted considerable attention because of its notable successes in learning the representation of graph-structured data. However, the formation of a real-world graph typically arises from the highly complex interaction of many latent factors. The existing self-supervised learning methods for GNNs are inherently holistic and neglect the entanglement of the latent factors, making the learned representations suboptimal for downstream tasks and difficult to interpret. Learning disentangled graph representations with self-supervised learning poses great challenges and remains largely ignored by the existing literature. In this paper, we introduce the Disentangled Graph Contrastive Learning (DGCL) method, which is able to learn disentangled graph-level representations with self-supervision. In particular, we first identify the latent factors of the input graph and derive its factorized representations. Each of the factorized representations describes a latent and disentangled aspect pertinent to a specific latent factor of the graph. Then we propose a novel factor-wise discrimination objective in a contrastive learning manner, which forces the factorized representations to independently reflect the expressive information from different latent factors. Extensive experiments on both synthetic and real-world datasets demonstrate the superiority of our method against several state-of-the-art baselines.
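The factor-wise discrimination objective can be made concrete with a small numerical sketch: an InfoNCE-style contrastive loss computed independently for each latent factor and averaged. This is an illustrative simplification under assumed shapes and temperature, not the authors' implementation:

```python
import numpy as np

def factorwise_nce_loss(z1, z2, tau=0.5):
    """Factor-wise contrastive (InfoNCE-style) loss over factorized embeddings.

    z1, z2: arrays of shape (N, K, d) -- N graphs, K latent factors,
    d dims per factor, from two augmented views. The positive pair for
    graph i under factor k is (z1[i, k], z2[i, k]); other graphs in the
    batch serve as negatives within the same factor.
    """
    N, K, _ = z1.shape
    total = 0.0
    for k in range(K):
        # cosine similarity logits within factor k
        a = z1[:, k, :] / np.linalg.norm(z1[:, k, :], axis=1, keepdims=True)
        b = z2[:, k, :] / np.linalg.norm(z2[:, k, :], axis=1, keepdims=True)
        sim = a @ b.T / tau
        # numerically stable log-softmax over each row; positives on the diagonal
        logits = sim - sim.max(axis=1, keepdims=True)
        log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
        total += -log_prob[np.arange(N), np.arange(N)].mean()
    return total / K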


arXiv:2310.18152 [cs.CL]

Title: Disentangled Representation Learning with Large Language Models for Text-Attributed Graphs

Abstract: Text-attributed graphs (TAGs) are prevalent on the web, and research over TAGs such as citation networks, e-commerce networks, and social networks has attracted considerable attention in the web community. Recently, large language models (LLMs) have demonstrated exceptional capabilities across a wide range of tasks. However, existing works harness the potential of LLMs by relying solely on prompts to convey graph structure information, and thus suffer from an insufficient understanding of the complex structural relationships within TAGs. To address this problem, in this paper we present the Disentangled Graph-Text Learner (DGTL) model, which is able to enhance the reasoning and predicting capabilities of LLMs for TAGs. Our proposed DGTL model incorporates graph structure information through tailored disentangled graph neural network (GNN) layers, enabling LLMs to capture the intricate relationships hidden in text-attributed graphs from multiple structural factors. Furthermore, DGTL operates with frozen pre-trained LLMs, reducing computational costs and allowing much more flexibility in combining with different LLM models. Experimental evaluations demonstrate the effectiveness of the proposed DGTL model in achieving superior or comparable performance over state-of-the-art baselines. Additionally, we demonstrate that our DGTL model can offer natural language explanations for predictions, thereby significantly enhancing model interpretability.
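The abstract describes "tailored disentangled GNN layers" only at a high level. A common way to build such a layer is neighbor routing in the style of DisenGCN: each node's features are split into K channels, and every neighbor is softly assigned to the channel it best matches before aggregation. The NumPy sketch below illustrates that general idea only; the function name, shapes, and iteration scheme are assumptions, not the DGTL implementation:

```python
import numpy as np

def disentangled_propagate(x, adj, K, iters=3):
    """One disentangled propagation layer (neighbor-routing sketch).

    x:   (N, K*d) node features, interpreted as K channels of width d.
    adj: (N, N) 0/1 adjacency matrix.
    For each node, every neighbor is softly routed across the K channels
    by similarity, then aggregated per channel.
    """
    N = x.shape[0]
    d = x.shape[1] // K
    z = x.reshape(N, K, d)
    z = z / (np.linalg.norm(z, axis=2, keepdims=True) + 1e-12)
    c = z.copy()  # per-channel "centers", refined over routing iterations
    for _ in range(iters):
        # p[i, j, k]: similarity of neighbor j's channel k to node i's center k
        p = np.einsum('jkd,ikd->ijk', z, c)
        p = np.exp(p) * adj[:, :, None]                   # mask non-neighbors
        p = p / (p.sum(axis=2, keepdims=True) + 1e-12)    # route over channels
        agg = np.einsum('ijk,jkd->ikd', p, z)             # per-channel aggregation
        c = z + agg
        c = c / (np.linalg.norm(c, axis=2, keepdims=True) + 1e-12)
    return c.reshape(N, K * d)
```

In a DGTL-like pipeline, the K per-channel outputs would then be projected into the frozen LLM's embedding space as structure-aware tokens; that projection is omitted here.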



COMMENTS

  1. Disentangled Multiplex Graph Representation Learning

    We investigate a new unsupervised framework, i.e., Disentangled Multiplex Graph representation learning (DMG for brevity), to conduct effective and robust UMGRL, as shown in Figure 1. To do this, we first decouple the common and private representations by designing a new disentangled representation learning for the multiplex graph to extract complete ...

  2. Disentangled Multiplex Graph Representation Learning

    A paper that proposes a method to extract common and private information from multiplex graphs using disentangled representation learning and contrastive constraint. The method is theoretically analyzed and experimentally verified to benefit downstream tasks.

  3. Disentangled multiplex graph representation learning

    Unsupervised multiplex graph representation learning (UMGRL) has received increasing interest, but few works simultaneously focused on the common and private information extraction. ... To achieve this, we first investigate disentangled representation learning for the multiplex graph to capture complete and clean common information, as well as ...

  4. Disentangled Multiplex Graph Representation Learning

    Disentangled Multiplex Graph Representation Learning. This paper argues that it is essential for conducting effective and robust UMGRL to extract complete and clean common information, as well as more-complementarity and less-noise private information, and investigates disentangled representation learning for the multiplex graph to capture ...

  5. GitHub

    Contribute to YujieMo/DMG development by creating an account on GitHub. @InProceedings{Mo_ICML_2023, title={Disentangled Multiplex Graph Representation Learning}, booktitle={Proceedings of the 40th International Conference on Machine Learning}, author={Mo, Yujie and Lei, Yajie and Shen, Jialie and Shi, Xiaoshuang and Shen, Heng Tao and Zhu, Xiaofeng}, volume={202}, pages={24983--25005}, year ...

  6. Disentangled Multiplex Graph Representation Learning

    Disentangled Multiplex Graph Representation Learning @inproceedings{Mo2023DisentangledMG, title={Disentangled Multiplex Graph Representation Learning}, author={Yujie Mo and Yajie Lei and Jialie Shen and Xiaoshuang Shi and Heng Tao Shen and Xiaofeng Zhu}, booktitle={International Conference on Machine Learning}, year={2023}, url={https://api ...

  7. {"payload":{"allShortcutsEnabled":false,"fileTree":{"":{"items":[{"name

    DMG: Disentangled Multiplex Graph Representation Learning \n This repository contains the reference code for the manuscript ``Disentangled Multiplex Graph Representation Learning\"


  9. Simple Self-supervised Multiplex Graph Representation Learning

    Self-supervised multiplex graph representation learning (SMGRL) aims to capture the information from the multiplex graph, and generates discriminative embedding without labels. ... Yazhou Ren, Huayi Tang, Xiaorong Pu, Xiaofeng Zhu, Ming Zeng, and Lifang He. 2021a. Multi-VAE: Learning Disentangled View-Common and View-Peculiar Visual ...

  10. [2211.11695] Disentangled Representation Learning

    This paper comprehensively reviews the concept, methods and applications of disentangled representation learning (DRL), a learning strategy that aims to identify and separate the underlying factors of variation in data. It covers DRL based on intuitive definition and group theory definition, and four categories of methodologies, such as statistical, variational, generative and hierarchical.

  11. Multiplex Graph Representation Learning via Common and Private

    Multiplex graph representation learning (MGRL) is a powerful approach to extracting multiple relationships among nodes in the graph data, and has recently attracted much attention in real applications (Chu et al. 2019; Zhang and Kou 2022; Peng et al. 2022). The multiplex graph can be regarded as a combination of multiple graphs.

  12. PDF disentangled representation learning

    Disentangled representations. Disentangled representation learning [8, 33] aims at recovering the factors of variation underlying a given data distribution. [56] proved that, without any form of supervision (whether direct or indirect) on the Factors of Variation (FOV), it is not possible to recover them.

  13. Learning Graph-based Disentangled Representations for Next POI

    Lightgcn: Simplifying and powering graph convolution network for recommendation. In SIGIR. 639--648. Google Scholar Digital Library; Vineet John, Lili Mou, Hareesh Bahuleyan, and Olga Vechtomova. 2018. Disentangled representation learning for non-parallel text style transfer. arXiv preprint arXiv:1808.04339 (2018). Google Scholar

  14. PDF Disentangled Contrastive Learning on Graphs

    disentangled graph representation via factor-wise contrastive learning. To the best of our knowledge, we are the first to study disentangled self-supervised graph representation learning. We propose a disentangled graph encoder to capture multiple aspects of graphs through learning disentangled latent factors on graphs.

  15. Disentangled Contrastive Learning on Graphs

    In this paper, we introduce the Disentangled Graph Contrastive Learning (DGCL) method, which is able to learn disentangled graph-level representations with self-supervision. In particular, we first identify the latent factors of the input graph and derive its factorized representations. Each of the factorized representations describes a latent ...

  16. Disentangle-based Continual Graph Representation Learning

    Graph embedding (GE) methods embed nodes (and/or edges) in graph into a low-dimensional semantic space, and have shown its effectiveness in modeling multi-relational data. However, existing GE models are not practical in real-world applications since it overlooked the streaming nature of incoming data. To address this issue, we study the problem of continual graph representation learning which ...

  17. DyTed: Disentangled Representation Learning for Discrete-time Dynamic Graph

    To solve the above problems, in this paper, we propose a novel disenTangled representation learning framework for discrete-time Dynamic graphs, namely DyTed. We specially design a temporal-clips contrastive learning task together with a structure contrastive learning to effectively identify the time-invariant and time-varying representations ...

  18. PDF JOURNAL OF LA Disentangled Representation Learning

    Disentangled Representation Learning. Xin Wang, Member, IEEE, Hong Chen, Si'ao Tang, Zihao Wu, and Wenwu Zhu, Fellow, IEEE. Abstract: Disentangled Representation Learning (DRL) aims to learn a model capable of identifying and disentangling the underlying factors hidden in the observable data in representation form.

  19. PDF Knowledge Router: Learning Disentangled Representations for Knowledge

    KnowledgeRouter: Learning Disentangled Representations for Knowledge Graphs. Shuai Zhang, Xi Rao, Yi Tay, and Ce Zhang (ETH Zurich, Switzerland; Google Research, USA). Abstract: The design of expressive representations of entities and relations in a knowledge graph is an important endeavor.

  20. [2310.18152] Disentangled Representation Learning with Large Language

    Title: Disentangled Representation Learning with Large Language Models for Text-Attributed Graphs. Authors: Yijian Qin, Xin Wang, Ziwei Zhang, Wenwu Zhu ... To address this problem, in this paper we present the Disentangled Graph-Text Learner (DGTL) model, which is able to enhance the reasoning and predicting capabilities of LLMs for TAGs. ...

  21. Learning Network Representations with Disentangled Graph Auto-Encoder

    The (variational) graph auto-encoder is extensively employed for learning representations of graph-structured data. However, the formation of real-world graphs is a complex and heterogeneous process influenced by latent factors. Existing encoders are fundamentally holistic, neglecting the entanglement of latent factors. This not only makes graph analysis tasks less effective but also makes it ...

  22. Disentangled Representation Learning with Large Language Models for

    Text-attributed graphs (TAGs) are prevalent on the web and research over TAGs such as citation networks, e-commerce networks and social networks has attracted considerable attention in the web community. Recently, large language models (LLMs) have demonstrated exceptional capabilities across a wide range of tasks. However, the existing works focus on harnessing the potential of LLMs solely ...