Congratulations to Tian Xie for passing his defense on May 4, 2022! His Ph.D. thesis is entitled “Efficient Graph Learning: Theory and Performance Evaluation”. Here we invite Tian to give a brief introduction to his thesis and share some words at the end of his Ph.D. journey.
1) Abstract of Thesis
Graphs are generic data representations that effectively describe the geometric structure of data domains in various applications. Graph learning, which extracts knowledge from such graph-structured data, is an important machine learning application on graphs. In this dissertation, we focus on developing efficient solutions to graph learning problems. In particular, we first present an advanced graph neural network (GNN) method designed for bipartite graphs that is scalable and requires no label supervision. We then investigate and propose new graph learning techniques from the perspectives of graph signal processing and regularization frameworks, identifying a new path toward solving graph learning problems through an efficient and effective co-design.
From the GNN perspective, we extend the general GNN to the node representation learning problem in bipartite graphs. We propose a layerwise-trained bipartite graph neural network (L-BGNN) to address the challenges posed by bipartite graphs. Specifically, L-BGNN adopts a unique message-passing scheme with adversarial training between the two embedding spaces. In addition, a layerwise training mechanism is proposed for efficiency on large-scale graphs.
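To make the inter-domain message passing concrete, here is a minimal sketch of one aggregation round on a bipartite graph. The function name, weight matrices, and normalization choices are illustrative assumptions, not the exact L-BGNN formulation; the adversarial training between the two embedding spaces is omitted.

```python
import numpy as np

def bipartite_message_pass(B, X_u, X_v, W_u, W_v):
    """One round of inter-domain aggregation on a bipartite graph (sketch).

    B   : (|U| x |V|) biadjacency matrix linking the two node sets
    X_u : feature matrix of the U-side nodes
    X_v : feature matrix of the V-side nodes
    W_u, W_v : projection matrices (hypothetical learnable parameters)
    """
    # Degree-normalize so each node averages its cross-domain neighbors.
    D_u = B.sum(axis=1, keepdims=True) + 1e-9     # degrees of U nodes
    D_v = B.sum(axis=0, keepdims=True).T + 1e-9   # degrees of V nodes
    H_u = np.tanh((B / D_u) @ X_v @ W_v)     # U nodes aggregate from V side
    H_v = np.tanh((B.T / D_v) @ X_u @ W_u)   # V nodes aggregate from U side
    return H_u, H_v
```

Stacking such rounds layer by layer, and training each layer before moving to the next, is what makes a layerwise scheme attractive on large graphs: no end-to-end backpropagation through the full depth is needed.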
From the graph signal perspective, we propose a novel two-stage training algorithm named GraphHop for the semi-supervised node classification task. Specifically, two distinct low-pass filters are designed for the attribute and label signals, respectively, and combined with regression classifiers. The two-stage training framework makes GraphHop scalable to large-scale graphs, and the effective low-pass filtering yields superior performance at extremely small label rates.
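The low-pass filtering idea can be sketched as repeated neighborhood averaging: a row-normalized adjacency matrix attenuates high-frequency components of a node signal, smoothing attributes (or soft labels) over the graph. This is a generic sketch of low-pass graph filtering, not GraphHop's exact filter design.

```python
import numpy as np

def low_pass_filter(A, X, hops=2):
    """Smooth node signals by repeated neighborhood averaging (sketch).

    A : (n x n) adjacency matrix
    X : (n x d) node signal (attributes or soft label distributions)

    The row-normalized adjacency acts as a low-pass graph filter:
    each hop replaces a node's signal with the mean of its own and
    its neighbors' signals.
    """
    A_hat = A + np.eye(A.shape[0])                  # add self-loops
    A_norm = A_hat / A_hat.sum(axis=1, keepdims=True)
    H = X
    for _ in range(hops):
        H = A_norm @ H
    return H
```

In a two-stage scheme, such filtered signals would first be fed to lightweight regression classifiers, whose soft predictions are then filtered and refit in turn; only the filtering primitive is shown here.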
From the regularization framework perspective, we develop a variational interpretation and theoretical understanding of the proposed GraphHop method. In particular, we show that the iterative process in GraphHop can be explained as an alternating optimization of a regularization problem defined on graphs under probability constraints. Based on this interpretation, we further propose an enhanced version named GraphHop++. Experiments show that GraphHop++ achieves superior performance on all benchmark datasets as well as on an object recognition problem.
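As a rough illustration of what such a graph regularization problem with probability constraints can look like, consider the classical label-propagation objective below. This is a textbook-style sketch, not necessarily the thesis's exact formulation: $F$ collects the soft label distributions, $Y$ the observed labels, $L$ the graph Laplacian, and $\lambda$ a smoothness weight.

```latex
\min_{F}\; \|F - Y\|_F^2 + \lambda\,\mathrm{tr}\!\left(F^{\top} L F\right)
\quad \text{s.t.} \quad F \mathbf{1} = \mathbf{1},\; F \ge 0 .
```

Alternating between minimizing the fitting term and the smoothness term, while projecting each row of $F$ back onto the probability simplex, gives an iteration of the kind that such a variational interpretation explains.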
2) Ph.D. experience
I would like to express my great thanks to Prof. Kuo. Without Prof. Kuo, I would not have made it this far in my Ph.D. journey. Over the past years, I have learned a lot from Prof. Kuo's wisdom in doing research and his persistence in fully understanding the reasons behind a problem. Prof. Kuo is a lifelong role model for me. Besides, I would also like to thank all MCL members. MCL is a big family in which I have received tremendous support from everyone. I am really fortunate to have joined this family, and I wish all the lab members the best.