Network Embedding with Topology-Aware Textual Representations.
2021
Textual network embedding aims to learn meaningful low-dimensional representations for vertices while taking the associated texts into account. When learning representations for texts in network embedding, most existing methods exploit only information from neighboring texts (i.e., contexts), while rarely taking advantage of the valuable network topological (structural) information. To bridge this gap, this paper develops a model based on adaptive-filter convolutional neural networks (CNNs), in which the filters are adapted to local network topologies rather than clamped to fixed values as in traditional CNNs. This dependency enables the learned text representations to be aware of local network topologies. It is shown that the proposed topology-aware representations can be viewed as a complement to existing context-aware ones. When the two are used together, experimental results on three real-world benchmarks demonstrate significant performance improvements on the tasks of link prediction and vertex classification.
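The abstract describes conv filters that are conditioned on a vertex's local topology instead of being fixed. Below is a minimal, hypothetical sketch of that idea, assuming each vertex's neighborhood is summarized as a dense topology embedding and a small filter-generating network (a hypernetwork) maps it to the convolution kernel; the class name, dimensions, and pooling choice are illustrative assumptions, not the paper's exact architecture.

```python
# Hypothetical sketch: a 1-D text CNN whose filters are generated from a
# vertex's topology embedding, so the text representation becomes topology-aware.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveFilterCNN(nn.Module):
    """Text encoder with topology-conditioned convolution filters (illustrative)."""

    def __init__(self, word_dim=300, topo_dim=128, num_filters=100, kernel_size=3):
        super().__init__()
        self.num_filters = num_filters
        self.kernel_size = kernel_size
        self.word_dim = word_dim
        # Hypernetwork: topology embedding -> flattened convolution kernels.
        self.filter_gen = nn.Linear(topo_dim, num_filters * word_dim * kernel_size)

    def forward(self, word_embs, topo_emb):
        # word_embs: (seq_len, word_dim) token embeddings of one vertex's text
        # topo_emb:  (topo_dim,) embedding summarizing the vertex's local topology
        kernels = self.filter_gen(topo_emb).view(
            self.num_filters, self.word_dim, self.kernel_size
        )
        x = word_embs.t().unsqueeze(0)            # (1, word_dim, seq_len)
        feats = F.relu(F.conv1d(x, kernels))      # convolve with adapted filters
        # Max-pool over positions -> fixed-size topology-aware text representation.
        return feats.max(dim=2).values.squeeze(0)  # (num_filters,)


if __name__ == "__main__":
    enc = AdaptiveFilterCNN()
    text = torch.randn(20, 300)   # toy: 20 tokens, 300-d word embeddings
    topo = torch.randn(128)       # toy: topology embedding of the vertex
    print(enc(text, topo).shape)  # torch.Size([100])
```

In this sketch, the same text would yield different representations for vertices with different neighborhood structures, which is the sense in which the representation is "topology-aware"; the complementary context-aware representation from existing methods could simply be concatenated with it downstream.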