Memory Neural Networks (MemNN)

Goal: This summary aims to provide a rough explanation of memory neural networks. In particular, we focus on existing architectures with external memory components.

Motivation: Many tasks, such as the bAbI tasks, require a long-term memory component in order to understand longer passages of text, like stories. More generally, QA tasks demand accessing memories in a wider context, such as past utterances dating back several days or even weeks.

Ingredients: External memory, RNN, LSTM, embedding model, scoring function, softmax, hops.

Steps: Neural networks in general rely on storing information about the training data in the weights of their hidden layers. However, current architectures, such as RNNs and LSTMs, limit access to information seen in the past […]
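To make the ingredients concrete, here is a minimal numpy sketch of a single memory "hop" in the spirit of end-to-end memory networks: sentences are embedded into memory slots, a question embedding is scored against them, a softmax turns the scores into attention weights, and the weighted response is combined with the question. The vocabulary size, dimensions, toy story, and all variable names are illustrative assumptions, not taken from the summary above.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
V, d = 10, 4                  # toy vocabulary size and embedding dimension
A = rng.normal(size=(V, d))   # memory (input) embedding matrix
C = rng.normal(size=(V, d))   # memory (output) embedding matrix
B = rng.normal(size=(V, d))   # question embedding matrix

def embed(sentence, E):
    """Bag-of-words sentence embedding: sum of the word vectors."""
    return E[sentence].sum(axis=0)

# External memory: three toy "sentences" (lists of word ids) and a question.
story = [[1, 2, 3], [4, 5], [6, 7, 8]]
question = [2, 6]

m = np.stack([embed(s, A) for s in story])   # memory keys
c = np.stack([embed(s, C) for s in story])   # memory values
u = embed(question, B)                       # question embedding

p = softmax(m @ u)                  # scoring + softmax: attention over slots
o = (p[:, None] * c).sum(axis=0)    # response: weighted sum of memory values
u_next = u + o                      # one hop; further hops repeat from here
print("attention over memories:", np.round(p, 3))
```

Stacking several such hops lets the model chain facts across sentences, which is what the longer bAbI stories require.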

Sequence-to-Sequence

In this summary I would like to provide a rough overview of sequence-to-sequence neural network architectures and the purposes they serve.

Motivation: A key observation when dealing with neural networks is that they can only handle objects of a fixed size. This means that the architecture has to be adapted if sequences such as sentences are to be processed. The same problem with objects of variable length also appears at the dialog level, where a variable number of utterances and responses are strung together. Besides dialog modeling, speech recognition and machine translation also demand advanced neural network architectures.

Ingredients: Deep neural network, hidden layer, recurrent neural network, encoder, decoder, LSTM, back-propagation, word embedding, sentence embedding.

Steps: As already stated, standard neural networks cannot deal with […]
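The following is a minimal numpy sketch of the encoder-decoder idea: the encoder compresses a variable-length token sequence into one fixed-size vector, and the decoder unrolls a new sequence from that vector. Plain tanh RNN cells stand in for the LSTMs used in practice, and all sizes, token ids, and names are toy assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
V, d = 8, 6  # toy vocabulary size and hidden dimension

def make_cell():
    """Parameters of a vanilla RNN cell (an LSTM in real systems)."""
    return {"Wx": rng.normal(scale=0.1, size=(d, d)),
            "Wh": rng.normal(scale=0.1, size=(d, d))}

E = rng.normal(scale=0.1, size=(V, d))   # word embedding matrix
Wo = rng.normal(scale=0.1, size=(d, V))  # output projection to the vocabulary
enc, dec = make_cell(), make_cell()

def step(cell, x, h):
    return np.tanh(cell["Wx"] @ x + cell["Wh"] @ h)

def encode(tokens):
    """Fold a variable-length input into one fixed-size state vector."""
    h = np.zeros(d)
    for t in tokens:
        h = step(enc, E[t], h)
    return h

def decode(h, max_len=5, eos=0):
    """Greedily unroll the decoder from the encoder's final state."""
    out, x = [], np.zeros(d)
    for _ in range(max_len):
        h = step(dec, x, h)
        tok = int(np.argmax(h @ Wo))  # pick the highest-scoring word
        out.append(tok)
        if tok == eos:                # toy end-of-sequence id
            break
        x = E[tok]                    # feed the prediction back in
    return out

print(decode(encode([3, 1, 4, 1, 5])))  # untrained, so output is arbitrary
```

The fixed-size bottleneck between encode and decode is exactly what lets one network map a sentence of any length to a response of any length.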

Popular Technologies in Deep Learning

For more info: NN for Language. Presenter: Juno. This seminar introduces several popular technologies in deep learning and neural networks for natural language processing, such as distributed representations of words, neural network language models, recurrent neural networks, and LSTMs (long short-term memory). In particular, recent advances and a couple of application examples will be discussed in the last part.
– Reference: Using Neural Networks for Modeling and Representing Natural Languages, by Tomas Mikolov (Facebook)
– Reference: Recurrent Nets and LSTM, by Nando de Freitas (Oxford Univ.)