Ziwen Zhao (Newiz)

| Email | Gmail | GitHub | Google Scholar | DBLP | LinkedIn | Zhihu |

I am an M.S. student at the Intelligent and Distributed Computing Laboratory (IDC-Lab), Huazhong University of Science and Technology (HUST), Wuhan, majoring in Computer Science and advised by Prof. Yuhua Li. My research interests lie in deep learning, graph mining, and weakly- and self-supervised pre-training, with a particular focus on the task-generalization ability of graph knowledge.

News

Publications

Masked Graph Autoencoder with Non-discrete Bandwidths (WWW/TheWebConf’24 research track)

Ziwen Zhao, Yuhua Li, Yixiong Zou, Jiliang Tang, Ruixuan Li [Code] [Blog]

We explore non-discrete edge masking and prediction as a self-supervised GNN pre-training strategy.

The discrete edge masking and binary link reconstruction strategy of existing topological masked graph autoencoders (TopoRecs) is insufficient for learning topologically informative representations, suffering from blocked message flows, vulnerability to over-smoothing, and suboptimal neighborhood discriminability. We propose a novel model coined Bandana, which utilizes non-discrete edge masks - “bandwidths” - sampled from a continuous Boltzmann-Gibbs probability distribution. Bandana’s bandwidth masking and prediction strategy is theoretically connected to regularized denoising autoencoders and energy-based models (EBMs). Bandana outperforms traditional discrete TopoRecs on node classification, link prediction, and graph manifold learning.
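The core idea - replacing 0/1 edge masks with continuous “bandwidths” drawn from a Boltzmann-Gibbs-style distribution - can be loosely sketched as follows. This is an illustrative toy, not Bandana’s actual implementation; the placeholder edge energies and the normalization scheme are assumptions made for the example.

```python
import math
import random

def sample_bandwidths(num_edges, temperature=1.0, seed=0):
    """Sample a continuous, non-discrete 'bandwidth' per edge from a
    Boltzmann-Gibbs-style distribution, instead of dropping edges with
    a hard 0/1 mask. Illustrative sketch only, not the paper's code."""
    rng = random.Random(seed)
    energies = [rng.random() for _ in range(num_edges)]       # placeholder edge energies
    weights = [math.exp(-e / temperature) for e in energies]  # Boltzmann factor exp(-E/T)
    z = sum(weights)                                          # partition-function-style normalizer
    return [num_edges * w / z for w in weights]               # rescale so the mean bandwidth is 1

bandwidths = sample_bandwidths(6)  # every edge keeps a soft, positive weight
```

Unlike binary masking, every edge retains a positive weight, so no message flow is fully blocked during pre-training.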

CSGCL: Community-Strength-Enhanced Graph Contrastive Learning (IJCAI’23 main track)

Han Chen*, Ziwen Zhao*, Yuhua Li, Yixiong Zou, Ruixuan Li, Rui Zhang [Code] [Blog]

A graph contrastive learning framework aiming to preserve community strength throughout the learning process.

Firstly, we present two novel graph augmentation methods, Communal Attribute Voting (CAV) and Communal Edge Dropping (CED), where the perturbations of node attributes and edges are guided by community strength. Secondly, we propose a dynamic “Team-up” contrastive learning scheme, where community strength is used to progressively fine-tune the contrastive objective. CSGCL performs well on three different kinds of downstream graph tasks, indicating the task generalizability of community knowledge for graph models.
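The flavor of community-strength-guided augmentation can be sketched with a toy version of edge dropping: edges backed by strong community ties are perturbed less. The function name, the per-edge strength dictionary, and the linear drop schedule are hypothetical, chosen for the example rather than taken from CSGCL.

```python
import random

def communal_edge_dropping(edges, strength, base_p=0.3, seed=0):
    """Drop each edge with a probability scaled by the weakness of its
    community tie (strength in [0, 1]): weak inter-community edges are
    dropped more often, strong intra-community edges are preserved.
    Hypothetical sketch, not the CSGCL implementation."""
    rng = random.Random(seed)
    kept = []
    for e in edges:
        p_drop = base_p * (1.0 - strength[e])  # strong ties -> low drop probability
        if rng.random() >= p_drop:
            kept.append(e)
    return kept

edges = [(0, 1), (1, 2), (2, 3)]
strength = {(0, 1): 1.0, (1, 2): 0.0, (2, 3): 0.5}
view = communal_edge_dropping(edges, strength)  # an augmented graph view
```

Running this twice with different seeds yields two views that agree on community-critical structure, which is the property the contrastive objective then exploits.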

A Survey on Self-Supervised Graph Foundation Models: Knowledge-Based Perspective (‘24)

Ziwen Zhao*, Yixin Su*, Yuhua Li, Yixiong Zou, Ruixuan Li, Rui Zhang [Paper list]

We present a new graph knowledge-based taxonomy (9 types, 25+ pretexts) to categorize self-supervised GFMs, including GNNs, graph Transformers, graph language models (GLMs), and more.

Early Open-source Projects

Counting is All You Need: Weakly-Supervised Immunohistochemical Cell Segmentation and Localization by Numbers [Code]

A multi-stage auto-immunoquantitative analytical model based on Multiple Instance Learning for immune cell counting, localization, and segmentation.

Taking immunohistochemistry-stained digital cell images as input, the model is supervised only by positive cell counting labels, and transforms whole-image (bag-level) counting results into superpixel (instance-level) classification results via a specifically designed adaptive top-k instance selection strategy.
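The selection step above can be sketched in a few lines: pick the top-k highest-scoring superpixel instances as positives, with k set adaptively from the bag-level counting label. The function name and the simple `k = min(count, n)` rule are assumptions for illustration, not the project’s actual code.

```python
def adaptive_topk_select(instance_scores, bag_count):
    """Given per-superpixel positivity scores and a bag-level counting
    label, select the indices of the top-k instances as pseudo-positive
    labels, where k adapts to the count. Hypothetical sketch."""
    k = min(bag_count, len(instance_scores))  # never select more instances than exist
    order = sorted(range(len(instance_scores)),
                   key=lambda i: instance_scores[i], reverse=True)
    return sorted(order[:k])                  # indices of the k highest-scoring instances

selected = adaptive_topk_select([0.1, 0.9, 0.3, 0.8], bag_count=2)  # -> [1, 3]
```

These instance-level pseudo-labels are what turn the weak bag-level count into dense supervision for localization and segmentation.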

This work achieved 4th place in the Lymphocyte Assessment Hackathon (LYSTO) challenge. Leaderboard


This page uses Jekyll theme Jekyll Gitbook by sighingnow.


