Concept Representation by Learning Explicit and Implicit Concept Couplings

Publisher:
Institute of Electrical and Electronics Engineers
Publication Type:
Journal Article
Citation:
IEEE Intelligent Systems, 2021, 36, (1), pp. 6-15
Issue Date:
2021
Abstract:
Generating a precise semantic representation of a word/concept is a fundamental task in natural language processing. Recent studies that incorporate semantic knowledge into word embeddings have shown their potential for improving the semantic representation of a concept. However, existing approaches achieve only limited performance improvement because they usually (1) model a word's semantics from some explicit aspects while ignoring the intrinsic aspects of the word, (2) treat semantic knowledge as a supplement to word embeddings, and (3) consider only partial relations between concepts while ignoring the rich coupling relations between them, such as explicit concept co-occurrences in descriptive texts in a corpus, concept hyperlink relations in a knowledge network, and implicit couplings between these explicit relations. In human consciousness, concepts are associated through various coupling relations, which inspires us to capture as many concept couplings as possible to build a better concept representation. We thus propose a neural coupled concept representation (CoupledCR) framework and its instantiation: a coupled concept embedding (CCE) model. CCE first learns two types of explicit couplings, from concept co-occurrences and hyperlink relations respectively, and then learns a type of high-level implicit coupling between these two types of explicit couplings. Extensive experimental results on real-world datasets show that CCE significantly outperforms state-of-the-art semantic representation methods.
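As a rough illustration of the coupling idea sketched in the abstract, one might maintain two explicit embedding views of each concept (one from co-occurrences, one from hyperlinks) and fuse them with a small learned layer standing in for the implicit-coupling step. Everything below — the embedding tables, the fusion layer, and all dimensions — is a hypothetical toy sketch, not the authors' published CCE implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB, DIM = 100, 16  # toy concept vocabulary size and embedding dimension

# Two explicit coupling views of each concept (randomly initialized here):
# one that would be learned from concept co-occurrences in descriptive text,
# one that would be learned from hyperlink relations in a knowledge network.
E_cooc = rng.normal(size=(VOCAB, DIM))
E_link = rng.normal(size=(VOCAB, DIM))

# A single fusion layer standing in for the high-level implicit coupling:
# it maps the concatenated explicit views to one joint representation.
W = rng.normal(size=(2 * DIM, DIM))
b = np.zeros(DIM)

def coupled_embedding(concept_id: int) -> np.ndarray:
    """Fuse the two explicit views of a concept into a single vector."""
    x = np.concatenate([E_cooc[concept_id], E_link[concept_id]])
    # The nonlinearity lets the fused representation capture interactions
    # between the two views rather than a simple additive combination.
    return np.tanh(x @ W + b)

v = coupled_embedding(42)
print(v.shape)  # (16,)
```

In a trainable version, the two embedding tables and the fusion weights would be optimized jointly against corpus and network objectives, rather than sampled at random as here.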