
Theano: Theano Python Language Library

Goal
In this ongoing summary we give a first introduction to the Theano library, its basic functionality, and its usage in the field of neural networks.

Motivation
The need for such a library arises from the requirement to easily handle tensorial objects, such as multi-dimensional input data and the weights of neural networks.
Of particular interest are symbolic operations that can perform differentiation, as needed by the backpropagation algorithm in order to update the model according to the training data it has seen.
A core feature of such libraries nowadays is the use of GPUs to efficiently distribute the huge number of computing operations.

Ingredients
symbolic operations, NumPy, Python

Steps
The Theano library is an open-source project led by the machine learning group at the Université de Montréal.
It integrates nicely with the NumPy library, widely used in the scientific community, and makes use of the computational power of GPUs.

Among its basic features are algebraic operations on vectors, matrices and multi-dimensional arrays.
These objects have to be defined abstractly in advance, together with the operations in which they appear.
Theano then allows these objects and the computational steps to be passed to a function that is first compiled and can later be called on concrete instances of the abstract objects.
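A minimal sketch of this workflow (the variable names and shapes below are chosen purely for illustration): declare abstract objects, define an operation on them, and compile the result into a callable function.

```python
import numpy as np
import theano
import theano.tensor as T

# Declare abstract symbolic objects: a matrix and a vector of doubles.
X = T.dmatrix('X')
w = T.dvector('w')

# Define the computation symbolically; nothing is evaluated yet.
y = T.dot(X, w)

# Compile the symbolic graph into a callable function.
f = theano.function(inputs=[X, w], outputs=y)

# Call the compiled function on concrete NumPy instances.
print(f(np.ones((2, 3)), np.array([1.0, 2.0, 3.0])))  # -> [ 6.  6.]
```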

An extremely useful type of object is the so-called shared variable.
Once defined, it can be used across multiple functions and updated accordingly.
This provides an easy way to implement an update mechanism for the weights of a neural network.
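A small sketch of this mechanism (the weight matrix W and its shape are hypothetical) could look as follows:

```python
import numpy as np
import theano
import theano.tensor as T

# A shared variable holds state, e.g. network weights, that persists
# across compiled functions and can be updated in place.
W = theano.shared(np.zeros((3, 2)), name='W')

delta = T.dmatrix('delta')

# The `updates` argument replaces W by W + delta on every call.
update_W = theano.function(inputs=[delta],
                           outputs=W,
                           updates=[(W, W + delta)])

update_W(np.ones((3, 2)))
print(W.get_value())  # the shared state now contains ones
```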

A major advantage of the theano library is its use of symbolic operations for differentiation.
In particular, theano.grad allows previously defined expressions to be differentiated with respect to any of their parameters.
This is especially important in neural networks, where the optimization of the weights is handled by the backpropagation algorithm, which calculates gradients through the layers according to the chain rule.
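As an illustrative sketch only (the toy model, squared-error cost, and learning rate are assumptions, not taken from the text), theano.grad can provide the gradient for such a weight update:

```python
import numpy as np
import theano
import theano.tensor as T

# Hypothetical single-layer model with a shared weight matrix.
W = theano.shared(np.random.randn(3, 1), name='W')
x = T.dmatrix('x')   # input batch
t = T.dmatrix('t')   # target values

# Forward pass and a squared-error cost.
y = T.nnet.sigmoid(T.dot(x, W))
cost = T.mean((y - t) ** 2)

# Symbolic differentiation of the cost with respect to the weights.
g_W = theano.grad(cost, W)

# One gradient-descent step per call (learning rate 0.1 assumed).
train = theano.function(inputs=[x, t],
                        outputs=cost,
                        updates=[(W, W - 0.1 * g_W)])
```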

Finally, the scan function takes care of loops.
The advantage of this function over standard Python looping is its handling of memory allocation.
Furthermore, the outputs at each time step are gathered into new objects, which is necessary for building recurrent layers.
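A minimal sketch of scan, here accumulating a running sum over a sequence (the step function and names are illustrative, much as the hidden state of a recurrent layer would be threaded through the loop):

```python
import numpy as np
import theano
import theano.tensor as T

x_seq = T.dvector('x_seq')

# The step function receives the current element and the previous output.
def step(x_t, acc_prev):
    return acc_prev + x_t

# scan threads the running total through `outputs_info` and collects
# the output of every time step into `results`.
results, updates = theano.scan(fn=step,
                               sequences=x_seq,
                               outputs_info=T.constant(0.0))

cumsum = theano.function(inputs=[x_seq], outputs=results, updates=updates)
print(cumsum(np.array([1.0, 2.0, 3.0])))  # -> [ 1.  3.  6.]
```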

Outlook
The Theano library is very useful for building neural networks from the ground up.
For quick implementations of standard neural networks, one might want to rely on neural network libraries like Keras, which are built on top of Theano.

However, certain neural network architectures proposed in the latest research papers exceed the functionality of such libraries, and Theano is then a good starting point.

Resources
"Theano" (web). Theano. Accessed 24 March 2016.
"Deep Learning Tutorials" (web). Deep Learning Tutorials. Accessed 24 March 2016.
