Word Sense Disambiguation (WSD) is the task of identifying the sense of a polysemous word in a given context. Recently, word embeddings have been applied to WSD as additional input features for a supervised classifier. However, previous approaches use word embeddings narrowly, only to represent the words surrounding the target word; they may not make sufficient use of word embeddings for representing other features such as dependency relations, word order, and global contexts (the whole document). In this work, we combine local and global features to perform WSD. We explore using word embeddings to leverage word-order and dependency features, and we also use word embeddings to represent global contexts as global features. Experiments show that our methods outperform state-of-the-art methods on Lexical Sample WSD datasets. Copyright © 2017 Springer International Publishing AG.
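A minimal sketch of the general idea behind combining local and global embedding features (not the authors' exact method; the toy embeddings, window size, and averaging scheme below are illustrative assumptions):

```python
# Hypothetical toy embeddings; a real system would load pretrained vectors.
EMB = {
    "bank": [0.1, 0.9], "river": [0.8, 0.2], "money": [0.2, 0.7],
    "deposit": [0.3, 0.6], "flows": [0.9, 0.1], "the": [0.5, 0.5],
}
DIM = 2

def avg_embedding(words):
    """Average the embeddings of known words (zero vector if none known)."""
    vecs = [EMB[w] for w in words if w in EMB]
    if not vecs:
        return [0.0] * DIM
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(DIM)]

def wsd_features(doc_tokens, target_idx, window=2):
    """Concatenate local features (a window around the target word)
    with global features (the whole document) for a supervised classifier."""
    lo, hi = max(0, target_idx - window), target_idx + window + 1
    local = [w for i, w in enumerate(doc_tokens[lo:hi], lo) if i != target_idx]
    return avg_embedding(local) + avg_embedding(doc_tokens)

doc = ["the", "river", "bank", "flows"]
feats = wsd_features(doc, target_idx=2)
print(len(feats))  # 4: local DIM + global DIM
```

The resulting feature vector would then be fed to a supervised classifier; the paper additionally explores word-order and dependency-based local features rather than a plain averaged window.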
Title of host publication: Web information systems engineering – WISE 2017: 18th International Conference, Puschino, Russia, October 7-11, 2017, Proceedings, Part II
Editors: Athman Bouguettaya, Yunjun Gao, Andrey Klimenko, Lu Chen, Xiangliang Zhang, Fedor Dzerzhinskiy, Weijia Jia, Stanislav V. Klimenko, Qing Li
Place of publication: Cham
Publication status: Published - 2017
Citation: Lei, X., Cai, Y., Li, Q., Xie, H., Leung, H.-F., & Wang, F. L. (2017). Combining local and global features in supervised word sense disambiguation. In A. Bouguettaya, Y. Gao, A. Klimenko, L. Chen, X. Zhang, F. Dzerzhinskiy, et al. (Eds.), Web information systems engineering – WISE 2017: 18th International Conference, Puschino, Russia, October 7-11, 2017, Proceedings, Part II (pp. 117-131). Cham: Springer.
- Word sense disambiguation
- Word embeddings
- Natural language processing