Description

Graph-structured data, ranging from social networks and financial transaction networks to citation networks and gene regulatory networks, are widely used to model a myriad of real-world systems. As the prevailing model architecture for graph-structured data, graph neural networks (GNNs) have drawn much attention in both academic and industrial communities over the past decade. Despite their success on different graph learning tasks, existing methods usually rely on learning from "big" data, requiring a large amount of labeled data for model training. However, real-world graphs are commonly associated with only "small" labeled data, as data annotation and labeling on graphs is time- and resource-consuming. Therefore, it is imperative to investigate graph machine learning (Graph ML) with low-cost human supervision for low-resource settings where limited or even no labeled data is available. This dissertation investigates a new research field, Data-Efficient Graph Learning, which aims to push forward the performance boundary of Graph ML models with different kinds of low-cost supervision signals. To achieve this goal, a series of studies is conducted to solve different data-efficient graph learning problems, including graph few-shot learning, graph weakly-supervised learning, and graph self-supervised learning.

    Details

    Title
    • Data-Efficient Graph Learning
    Date Created
    • 2023
    Resource Type
    • Text
    Note
    • Partial requirement for: Ph.D., Arizona State University, 2023
    • Field of study: Computer Science
