GEPC: Global Embeddings with PID Control
Computer Speech & Language (2021)
Abstract
Global vectors, or global embeddings, are important word representations for many natural language processing tasks. With the rise of dynamic embeddings (also known as contextual embeddings, such as ELMo and BERT) in recent years, attention has largely shifted away from global vectors. However, compared to dynamic embeddings, global embeddings are faster to train, more straightforward to interpret, and can be evaluated with many standard and credible intrinsic benchmarks (e.g., word similarity correlation and analogy accuracy). Thus, they remain widely used in numerous downstream applications. Nevertheless, the model design of global embeddings has some limitations that make the learned word representations suboptimal. In this paper, we propose a novel method to address these limitations using PID control. To the best of our knowledge, this is one of the first efforts to leverage PID control in word embedding research. Empirical results on standard intrinsic and extrinsic benchmarks show consistent performance gains for the proposed method, suggesting that it is a promising alternative for learning better word representations for downstream tasks.
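For readers unfamiliar with PID control, the sketch below shows a standard discrete PID controller, which combines proportional, integral, and derivative terms of an error signal into a single correction. This is only a generic illustration of the technique named in the abstract, not the paper's specific formulation; the class and parameter names are hypothetical.

```python
class PIDController:
    """Generic discrete PID controller:
    u_t = Kp * e_t + Ki * sum_{i<=t} e_i + Kd * (e_t - e_{t-1})."""

    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0        # running sum of errors (integral term)
        self.prev_error = None     # last error seen (for the derivative term)

    def update(self, error: float) -> float:
        self.integral += error
        derivative = 0.0 if self.prev_error is None else error - self.prev_error
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Usage: feed a per-step error signal and apply the returned correction.
controller = PIDController(kp=0.5, ki=0.01, kd=0.1)
correction = controller.update(error=0.8)
```

In a training context, such a controller would typically take some training-time error signal as input and output a correction applied to the update rule; how GEPC instantiates this is described in the paper itself.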
Keywords
Natural language processing, Representation learning, Word embedding, Global vectors