A Knowledge Graph (KG) encodes human-readable information and knowledge in graph format. Triples, denoted by (h, r, t), are the basic elements of a KG, where h and t are the head and tail entities and r is the relation connecting them. Many existing Knowledge Graphs were built through a combination of manual curation by domain experts and automated information extraction algorithms. However, given the limited information accessible to any individual and the limitations of algorithms, it is nearly impossible for a Knowledge Graph to capture every fact about the world. As such, Knowledge Graphs are often incomplete, and many researchers have developed algorithms to predict their missing facts. Knowledge Graph Embedding (KGE) models were first proposed mainly to solve the Knowledge Graph Completion problem. Beyond completion, embedding models are also useful in many downstream tasks such as entity classification and entity alignment.
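To make the triple notation concrete, here is a minimal sketch of a KG as a set of (h, r, t) triples; the entity and relation names are purely illustrative:

```python
# A toy Knowledge Graph represented as a set of (head, relation, tail) triples.
kg = {
    ("Paris", "capital_of", "France"),
    ("France", "located_in", "Europe"),
    ("Berlin", "capital_of", "Germany"),
}

def holds(kg, h, r, t):
    """Check whether the triple (h, r, t) is present in the KG."""
    return (h, r, t) in kg

# KG completion asks a model to score triples that are absent from the graph,
# e.g. ("Germany", "located_in", "Europe"), and predict whether they are true.
```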

MCL has recently been working on effective Knowledge Graph Embedding. Translation, rotation, and scaling are three commonly used geometric operations in image processing, and some of them have been used successfully to build effective knowledge graph embedding (KGE) models such as TransE and RotatE. Inspired by this synergy, we propose a new KGE model that leverages all three operations. Since the translation, rotation, and scaling operations are cascaded to form a compound operation, the new model is named CompoundE. By casting CompoundE in the framework of group theory, we show that quite a few distance-based KGE models are special cases of CompoundE. CompoundE extends simple distance-based scoring functions to relation-dependent compound operations on the head and/or tail entities. To demonstrate its effectiveness, we conduct experiments on three popular knowledge graph completion datasets. Experimental results show that CompoundE consistently achieves state-of-the-art performance.
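The idea of cascading the three operations can be sketched as follows. This is a simplified illustration, not the paper's exact formulation: it applies a relation-specific scaling, a block-wise 2-D rotation, and a translation to the head embedding, then scores the triple by its (negative) distance to the tail embedding, in the spirit of distance-based models like TransE and RotatE. The function names and the L1 distance choice are assumptions for this sketch.

```python
import numpy as np

def rotate_2d(x, theta):
    """Rotate each consecutive pair of dimensions of x by its own angle.

    x has even dimension d; theta has d/2 angles, one per 2-D block.
    """
    pairs = x.reshape(-1, 2)
    c, s = np.cos(theta), np.sin(theta)
    rotated = np.stack([c * pairs[:, 0] - s * pairs[:, 1],
                        s * pairs[:, 0] + c * pairs[:, 1]], axis=1)
    return rotated.reshape(-1)

def compound_score(h, t, translation, theta, scale):
    """Score a triple by applying scale -> rotation -> translation to h.

    A smaller distance between the transformed head and the tail means
    the triple is more plausible; the score is the negative L1 distance.
    """
    transformed = rotate_2d(scale * h, theta) + translation
    return -np.linalg.norm(transformed - t, ord=1)
```

Fixing the scaling to 1 and the rotation angles to 0 recovers a TransE-style translation model, while fixing the translation to 0 and the scaling to 1 recovers a RotatE-style rotation model, which is the sense in which such models arise as special cases of the compound operation.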

[1] X. Ge, Y.-C. Wang, B. Wang, C.-C. J. Kuo, "CompoundE: Knowledge Graph Embedding with Translation, Rotation and Scaling Compound Operations."
[2] A. Bordes, N. Usunier, A. Garcia-Duran, J. Weston, O. Yakhnenko, "Translating Embeddings for Modeling Multi-relational Data," Advances in Neural Information Processing Systems, vol. 26, 2013.
[3] Z. Sun, Z.-H. Deng, J.-Y. Nie, J. Tang, "RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space," International Conference on Learning Representations, 2019.
[4] L. Chao, J. He, T. Wang, W. Chu, "PairRE: Knowledge Graph Embeddings via Paired Relation Vectors," Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021, pp. 4360-4369.

Image credits: Both figures are from [1].

– by Xiou Ge