Luka Murn (2014) Parallel implementation of improved neural network classifiers and their experimental assessment on biomedical data sets. EngD thesis.
Abstract
The field of artificial neural networks has seen a surge of activity in recent years, and many new methods have been proposed to improve the classification accuracy of neural networks. We present three such methods in this thesis: dropout, stacked denoising autoencoders (SdAs) and stacked restricted Boltzmann machines (SRBMs). To date, these methods have mostly been applied to large datasets. In this thesis, we test them on small datasets from the field of molecular biology. Because training artificial neural networks is very time consuming, our implementation runs on the GPU using the Theano Python library. Our results show that while the proposed methods increase the classification accuracy of neural networks, they still fall behind classic machine learning models, such as logistic regression, on small datasets. We also show that the parallel implementation greatly reduces the time needed to train a model, and we present a library that is usable for larger datasets as well.
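To illustrate the kind of GPU-backed training the abstract refers to, the following is a minimal sketch of a one-hidden-layer classifier with dropout written directly in Theano. It is not the thesis's library: the layer sizes, variable names (W_h, b_h, dropout, train_fn) and hyperparameters are illustrative assumptions, and a real SdA or SRBM model would add unsupervised pre-training on top of a network like this.

```python
import numpy as np
import theano
import theano.tensor as T
from theano.tensor.shared_randomstreams import RandomStreams

rng = np.random.RandomState(0)
srng = RandomStreams(seed=0)

# Illustrative sizes; a biomedical dataset would set n_in to its feature count.
n_in, n_hidden, n_out = 100, 64, 2


def shared_weights(n_rows, n_cols, name):
    """Small uniformly initialized weight matrix stored on the GPU (if enabled)."""
    values = rng.uniform(-0.1, 0.1, (n_rows, n_cols)).astype(theano.config.floatX)
    return theano.shared(values, name=name)


def dropout(activations, p_drop=0.5):
    """Inverted dropout: zero units with probability p_drop, rescale the rest."""
    mask = srng.binomial(n=1, p=1 - p_drop, size=activations.shape,
                         dtype=theano.config.floatX)
    return activations * mask / (1 - p_drop)


W_h = shared_weights(n_in, n_hidden, 'W_h')
b_h = theano.shared(np.zeros(n_hidden, dtype=theano.config.floatX), name='b_h')
W_o = shared_weights(n_hidden, n_out, 'W_o')
b_o = theano.shared(np.zeros(n_out, dtype=theano.config.floatX), name='b_o')

x = T.matrix('x')     # minibatch of input feature vectors
y = T.ivector('y')    # integer class labels

hidden = T.tanh(T.dot(x, W_h) + b_h)
hidden_dropped = dropout(hidden, p_drop=0.5)      # applied only during training
p_y_given_x = T.nnet.softmax(T.dot(hidden_dropped, W_o) + b_o)

# Negative log-likelihood of the correct classes.
cost = -T.mean(T.log(p_y_given_x)[T.arange(y.shape[0]), y])

params = [W_h, b_h, W_o, b_o]
grads = T.grad(cost, params)
learning_rate = 0.1
updates = [(p, p - learning_rate * g) for p, g in zip(params, grads)]

# Compiling this function builds a computation graph that Theano can
# execute on the GPU when device=gpu/cuda is configured.
train_fn = theano.function([x, y], cost, updates=updates)
```

A training loop would simply call train_fn on minibatches of the dataset; at test time the dropout mask is omitted (or the activations are used without rescaling), which is the standard dropout procedure rather than anything specific to this thesis.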