If you use this code, please cite the following paper:
Yankai Lin, Zhiyuan Liu, Maosong Sun, Yang Liu, Xuan Zhu. Learning Entity and Relation Embeddings for Knowledge Graph Completion. The 29th AAAI Conference on Artificial Intelligence (AAAI'15).
ConceptNet Numberbatch consists of state-of-the-art semantic vectors (also known as word embeddings) that can be used directly as a representation of word meanings or as a starting point for further machine learning.
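As a minimal sketch of that direct use, the snippet below loads the English Numberbatch vectors with gensim and queries them. It assumes the release in word2vec text format has been downloaded and unzipped to numberbatch-en.txt; the file name, the query words, and gensim as the loader are illustrative assumptions, not part of the Numberbatch distribution itself.

```python
# Minimal sketch: loading ConceptNet Numberbatch vectors with gensim.
# Assumes the English release in word2vec text format has been
# downloaded and unzipped to "numberbatch-en.txt" (name illustrative).
from gensim.models import KeyedVectors

vectors = KeyedVectors.load_word2vec_format("numberbatch-en.txt", binary=False)

# Use a vector directly as a representation of a word's meaning ...
print(vectors["coffee"].shape)  # e.g. a 300-dimensional vector

# ... or query the embedding space, e.g. for nearest neighbors.
print(vectors.most_similar("coffee", topn=5))
```

The same KeyedVectors object can also serve as the frozen embedding layer of a downstream model, which is the "starting point for further machine learning" use mentioned above.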
In natural language understanding, there is a hierarchy of lenses through which we can extract meaning: from words to sentences to paragraphs to documents. At the document level, one of the most useful ways to understand text is by analyzing its topics.
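As a minimal sketch of document-level topic analysis, the snippet below fits a small LDA model with scikit-learn. The toy corpus, the choice of two topics, and scikit-learn itself are illustrative assumptions, not a specific method from the text above.

```python
# Minimal sketch: extracting topics from documents with LDA in
# scikit-learn. The toy corpus and parameters are illustrative.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "the cat sat on the mat with another cat",
    "dogs and cats make friendly pets",
    "stock markets fell as investors sold shares",
    "the central bank raised interest rates again",
]

# Bag-of-words counts: LDA models each document as a mixture of
# topics, and each topic as a distribution over words.
counts = CountVectorizer(stop_words="english").fit(docs)
X = counts.transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

# Show the top words for each discovered topic.
words = counts.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top = [words[j] for j in topic.argsort()[::-1][:4]]
    print(f"topic {i}: {top}")
```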
I gave an introductory talk on word embeddings some time ago, and this write-up is an extended version of the part about the philosophical ideas behind word vectors.
M. Artetxe, G. Labaka, I. Lopez-Gazpio, and E. Agirre. Proceedings of the 22nd Conference on Computational Natural Language Learning, pages 282-291. Association for Computational Linguistics, 2018.
S. Wang, J. Tang, C. Aggarwal, and H. Liu. Proceedings of the 25th ACM International Conference on Information and Knowledge Management. ACM, October 2016.