PolyU develops novel AI graph neural network models to unravel interdisciplinary complexities in image recognition and neuroscience (IMAGE)
Caption
HL-HGAT architecture. (a) Schematic of the Hodge-Laplacian heterogeneous graph attention network (HL-HGAT) architecture, showcasing three key innovations: HL-filters, multi-simplicial interaction (MSI), and simplicial attention pooling (SAP). In each processing block, we first apply HL-filters to the k₁- and k₂-simplex signals from the preceding block. An MSI layer then captures signal interactions between the k₁- and k₂-simplices. Next, an SAP layer updates the boundary operator and consolidates features based on simplex attention. Finally, an output layer produces the prediction.

(b) Flow chart of the proposed simplex downsampling algorithm (Section III-D). We employ the Graclus clustering algorithm [49] to derive the node assignment matrix, followed by an iterative three-step process (depicted in Fig. 1(c)): 1) initialize the updated boundary operator for the (k+1)-th iteration; 2) remove (k+1)-simplices that no longer exist because their nodes fall within the same node cluster, along with duplicated (k+1)-simplices; 3) compute the (k+1)-simplex assignment matrix from the updated boundary operator.

(c) Schematic diagram of the simplicial attention pooling (SAP) layer. Within this framework, we compute self-attention and cross-attention for each simplex; the simplex signals are then modulated by these attention weights and pooled according to the assignment matrices.
Credit
PolyU
Usage Restrictions
nil
License
Original content
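
Note on the HL-filters referenced in the caption: they filter signals defined on k-simplices through the k-th Hodge Laplacian, L_k = B_k^T B_k + B_{k+1} B_{k+1}^T, where B_k is the boundary operator mapping k-simplices to (k-1)-simplices. The sketch below is a minimal illustration of that idea, not the authors' implementation: it builds L_0 (for node signals) and L_1 (for edge signals) on a single filled triangle and applies a fixed polynomial filter. The polynomial form, the coefficients, and the name hl_filter are illustrative assumptions.

import numpy as np

# Toy simplicial complex: 3 nodes, 3 oriented edges, 1 triangle.
# Node-edge boundary operator B1.
# Rows: nodes 0..2; columns: edges (0,1), (0,2), (1,2).
B1 = np.array([
    [-1, -1,  0],
    [ 1,  0, -1],
    [ 0,  1,  1],
], dtype=float)

# Edge-triangle boundary operator B2 (rows: edges; column: triangle (0,1,2)),
# from the oriented boundary of [0,1,2] = [1,2] - [0,2] + [0,1].
B2 = np.array([
    [ 1],
    [-1],
    [ 1],
], dtype=float)

# Hodge Laplacians: L0 acts on node (0-simplex) signals, L1 on edge (1-simplex) signals.
L0 = B1 @ B1.T
L1 = B1.T @ B1 + B2 @ B2.T

def hl_filter(L, x, theta):
    """Apply a polynomial Hodge-Laplacian filter: sum_p theta[p] * L^p @ x.

    A polynomial in L is one common way to realize a spectral filter without
    an eigendecomposition; theta is fixed here for illustration, not learned.
    """
    y = np.zeros_like(x)
    Lp_x = x.copy()          # L^0 @ x
    for t in theta:
        y += t * Lp_x
        Lp_x = L @ Lp_x      # advance to the next power of L
    return y

# Example: apply the same filter shape to a node signal and an edge signal.
node_signal = np.array([1.0, 0.0, -1.0])
edge_signal = np.array([0.5, -0.5, 1.0])
theta = [1.0, -0.2, 0.05]    # illustrative coefficients

print(hl_filter(L0, node_signal, theta))
print(hl_filter(L1, edge_signal, theta))

In a trained HL-HGAT, the filter coefficients would be learned per layer, and the boundary operators would be built from the data's simplicial complex (and updated during pooling, as described in panel (b)) rather than hand-constructed as in this toy example.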