Thor: Towards a General Directed Circuit Graph Encoder with Sample-Efficient Graph Contrastive Learning

Wencheng Zou1, Yiran Xia2, Haoyu Wang3, Pan Li3, Nan Wu1
1George Washington University, 2Hong Kong University of Science and Technology, 3Georgia Institute of Technology


Abstract

Circuit netlists are naturally represented as directed graphs, making them ideal candidates for directed graph representation learning (DGRL) to analyze and predict circuit properties. DGRL-based encoders offer fast, cost-effective alternatives to traditional simulation and enhance downstream EDA workflows. However, reliable prediction on directed circuit graphs remains challenging: task-specific heuristics limit generalization across diverse design problems, while naïve directed message-passing neural networks (MPNNs) struggle to capture absolute and relative node positions or long-range dependencies. To address these challenges, we first introduce general circuit graph encoder architectures with enhanced expressiveness for long-range directional and logical dependencies. Our models leverage graph isomorphism networks (GINs) and graph transformers as backbones, incorporating bidirected message passing and stable positional encodings. Second, to jointly capture the inductive biases of circuit structure and functionality, we adopt a pretraining–finetuning pipeline: encoders are pretrained with a novel sample-efficient graph contrastive learning framework on unlabeled circuit data, augmented with hard negatives generated through functional and topological perturbations, and then finetuned with lightweight task-specific heads. This synergy of more expressive graph encoders and sample-efficient graph contrastive learning substantially enhances representational capacity, yielding general-purpose directed circuit graph encoders applicable across a broad range of design tasks. Evaluation on symbolic reasoning and quality-of-results (QoR) prediction tasks shows consistent improvements over task-specific baselines.
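The abstract names two technical ingredients, bidirected message passing and contrastive pretraining with hard negatives, whose details are developed later in the paper. Purely as an illustration of these ideas (not the authors' implementation), the sketch below shows one plausible PyTorch realization of both; the names `BidirectedGINLayer` and `info_nce_with_hard_negatives`, the sum-based neighborhood aggregation, and the in-batch-plus-hard-negative loss layout are all assumptions on our part.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class BidirectedGINLayer(nn.Module):
    """Hypothetical sketch of a GIN-style layer that aggregates along
    both edge directions, so the fan-in (signal flow) and fan-out
    structure of a netlist are both visible to each node."""

    def __init__(self, dim: int):
        super().__init__()
        self.mlp_fwd = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.mlp_bwd = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        self.eps = nn.Parameter(torch.zeros(1))  # learnable GIN self-weight

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        src, dst = edge_index  # each edge points src -> dst
        agg_fwd = torch.zeros_like(x).index_add_(0, dst, x[src])  # along edge direction
        agg_bwd = torch.zeros_like(x).index_add_(0, src, x[dst])  # against edge direction
        return self.mlp_fwd((1 + self.eps) * x + agg_fwd) + self.mlp_bwd(agg_bwd)


def info_nce_with_hard_negatives(anchor, positive, hard_neg, temperature=0.2):
    """InfoNCE-style objective: each anchor graph embedding is pulled
    toward its positive view and pushed away from in-batch negatives
    plus one explicit hard negative (e.g. a functionally perturbed
    variant of the same circuit)."""
    a = F.normalize(anchor, dim=-1)
    p = F.normalize(positive, dim=-1)
    n = F.normalize(hard_neg, dim=-1)
    in_batch = a @ p.t() / temperature                       # (B, B); diagonal = positives
    hard = (a * n).sum(dim=-1, keepdim=True) / temperature   # (B, 1); hard negatives
    logits = torch.cat([in_batch, hard], dim=1)
    labels = torch.arange(a.size(0), device=a.device)        # positive sits on the diagonal
    return F.cross_entropy(logits, labels)


if __name__ == "__main__":
    # Toy check on a 5-node directed chain and a batch of 4 embeddings.
    x = torch.randn(5, 16)
    edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 4]])
    layer = BidirectedGINLayer(16)
    print(layer(x, edge_index).shape)  # torch.Size([5, 16])
    z_a, z_p, z_n = torch.randn(4, 16), torch.randn(4, 16), torch.randn(4, 16)
    print(info_nce_with_hard_negatives(z_a, z_p, z_n))
```

In a full pipeline such a loss would operate on pooled graph-level embeddings produced by stacking several bidirected layers (plus the positional encodings the abstract mentions); the toy check above only verifies shapes and that the loss is computable.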