GNNVis: Visualize Large-Scale Data by Learning a Graph Neural Network Representation
Yajun Huang1; Jingbin Zhang2; Yiyang Yang3; Zhiguo Gong2; Zhifeng Hao4
Conference Name: Conference on Information and Knowledge Management
Source Publication: International Conference on Information and Knowledge Management, Proceedings
Conference Date: October 19–23, 2020
Conference Place: Ireland

Many achievements have been made in visualizing large-scale, high-dimensional data in 2D or 3D space. Normally, such a process is performed through a non-parametric (unsupervised) approach, which is limited in handling unseen data. In this work, we study a parametric (supervised) model that is capable of learning a mapping from the high-dimensional data space R^d to a low-dimensional latent space R^s (s ≪ d) while preserving the similarity structure of R^d. We propose GNNVis, a framework that applies the idea of Graph Neural Networks (GNNs) to this parametric learning process; the learned mapping serves as a Visualizer (Vis) that computes the low-dimensional embeddings of unseen data online. In our framework, the features of data nodes, as well as the (hidden) information of their neighbors, are fused to conduct dimension reduction. To the best of our knowledge, no existing visualization work has studied how to combine such information into the learned representation. Moreover, the learning process of GNNVis is designed in an end-to-end manner and can easily be extended to arbitrary dimension reduction methods provided the corresponding objective function is given. Based on GNNVis, several typical dimension reduction methods, t-SNE, LargeVis, and UMAP, are investigated. As a parametric framework, GNNVis is an inherently efficient Visualizer capable of computing the embeddings of large-scale unseen data. To guarantee its scalability in the training stage, a novel training strategy with Subgraph Negative Sampling (SNS) is employed to reduce the corresponding cost. Experimental results on real datasets demonstrate the advantages of GNNVis: its visualization quality outperforms that of state-of-the-art parametric models and is comparable to that of non-parametric models.
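The abstract's core idea, a parametric mapping that fuses a node's own features with aggregated neighbor features and can therefore embed unseen points online, can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the function names (`knn_graph`, `gnn_embed`), the mean-aggregation rule, the single message-passing layer, and the random weights stand in for the authors' actual architecture and trained parameters, which the abstract does not specify.

```python
import numpy as np

rng = np.random.default_rng(0)

def knn_graph(X, k=3):
    """Build a symmetric k-nearest-neighbour adjacency over the rows of X.
    (Hypothetical graph construction; the paper's scheme may differ.)"""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)          # no self-loops
    A = np.zeros_like(d2)
    for i, nbrs in enumerate(np.argsort(d2, axis=1)[:, :k]):
        A[i, nbrs] = 1.0
    return np.maximum(A, A.T)             # symmetrize

def gnn_embed(X, A, W_self, W_nbr):
    """One GNN-style layer: fuse a node's own features with the mean of its
    neighbours' features, then project to the low-dimensional space R^s."""
    deg = A.sum(1, keepdims=True).clip(min=1.0)
    H_nbr = (A @ X) / deg                 # mean-aggregate neighbour features
    return X @ W_self + H_nbr @ W_nbr

# Toy data: 10 points in R^5 mapped to R^2 (s = 2 << d = 5).
X = rng.normal(size=(10, 5))
A = knn_graph(X, k=3)
W_self = rng.normal(scale=0.1, size=(5, 2))   # untrained stand-in weights
W_nbr = rng.normal(scale=0.1, size=(5, 2))
Y = gnn_embed(X, A, W_self, W_nbr)
print(Y.shape)  # (10, 2)

# Because the mapping is parametric, an unseen point can be embedded online:
# attach it to the graph via its nearest neighbours and reuse the weights.
x_new = rng.normal(size=(1, 5))
X_aug = np.vstack([X, x_new])
A_aug = knn_graph(X_aug, k=3)
Y_new = gnn_embed(X_aug, A_aug, W_self, W_nbr)[-1:]
print(Y_new.shape)  # (1, 2)
```

In the real framework the weights would be trained end-to-end against a dimension-reduction objective (t-SNE, LargeVis, or UMAP, per the abstract), with Subgraph Negative Sampling reducing the training cost; this sketch only shows the forward mapping and the unseen-data property.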

Keywords: Big Data; Graph Neural Networks; High-dimensional Data; Neural Networks; Semi-supervised Learning; Visualization
Scopus ID: 2-s2.0-85095864770
Citation statistics
Cited Times [WOS]: 1
Document Type: Conference paper
Affiliations:
1. Dingxiang Technologies, Hangzhou, China
2. University of Macau, Macao
3. Guangdong University of Technology, Guangzhou, China
4. Foshan University, Foshan, China
Recommended Citation
GB/T 7714
Yajun Huang, Jingbin Zhang, Yiyang Yang, et al. GNNVis: Visualize Large-Scale Data by Learning a Graph Neural Network Representation[C], 2020: 545-554.
APA
Yajun Huang, Jingbin Zhang, Yiyang Yang, Zhiguo Gong, & Zhifeng Hao. (2020). GNNVis: Visualize Large-Scale Data by Learning a Graph Neural Network Representation. International Conference on Information and Knowledge Management, Proceedings, 545-554.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.