Covering rare words
Table of Contents

1. Covering rare words
   1.1. goal
   1.2. motivation
   1.3. ingredients
   1.4. steps
   1.5. outlook
   1.6. resources

goal

This week's blog post covers a new network architecture, pointer models, for handling rare words. We will dive into some details of the implementation and give a short analysis of the benefits of these kinds of models.

motivation

The motivation for introducing new architectures comes directly from shortcomings of RNN language models as well as encoder-decoder frameworks. Rare words, and named entities in particular, do not receive good word embeddings and hence do not lead to appropriate sentence embeddings, which might be used to initialize a decoder component for predicting an output sequence. Furthermore, the […]
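As a small teaser for the ingredients and steps discussed later, here is a minimal, illustrative sketch of the core idea behind pointer models: the output distribution mixes a standard softmax over a fixed vocabulary with a copy distribution induced by attention over the input, so a rare source word can be emitted by pointing at it instead of generating it from its (poor) embedding. This is not the exact formulation used in this post; all names here (pointer_mixture, p_gen, and so on) are illustrative assumptions.

import numpy as np

def pointer_mixture(vocab_dist, attention, src_token_ids, p_gen, vocab_size):
    """Combine a generation distribution over the vocabulary with a copy
    distribution induced by attention over the source tokens.

    vocab_dist    : (vocab_size,) softmax over the fixed output vocabulary
    attention     : (src_len,)    attention weights over the source positions
    src_token_ids : (src_len,)    vocabulary ids of the source tokens
    p_gen         : scalar in [0, 1], probability of generating vs. copying
    """
    copy_dist = np.zeros(vocab_size)
    # Scatter-add the attention mass onto the ids of the source tokens.
    np.add.at(copy_dist, src_token_ids, attention)
    return p_gen * vocab_dist + (1.0 - p_gen) * copy_dist

# Toy example: a rare source word (id 7) receives most of the attention,
# so it gets high output probability even if the generator ignores it.
vocab_size = 10
vocab_dist = np.full(vocab_size, 1.0 / vocab_size)   # uniform generator for illustration
attention = np.array([0.1, 0.8, 0.1])
src_token_ids = np.array([3, 7, 5])
final_dist = pointer_mixture(vocab_dist, attention, src_token_ids,
                             p_gen=0.4, vocab_size=vocab_size)
print(final_dist[7])  # boosted by the copy mechanism

The point of the sketch is only the mixture: as long as the attention can locate the rare word in the input, the model does not need a reliable embedding for it in the output vocabulary.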