Thesis Details

Paralelní trénování hlubokých neuronových sítí

Master's Thesis
Student: Šlampa Ondřej
Academic Year: 2016/2017
Supervisor: Hradiš Michal, Ing., Ph.D.
English title
Parallel Deep Learning
Language
Czech
Abstract

The aim of this thesis is to propose how to evaluate the benefit of parallel deep learning. In the thesis I analyze parallel deep learning and focus on its duration, taking into account the time spent on gradient computation and on weight transfer. The result is a set of proposed equations that estimate the speedup achieved with multiple workers. These equations can be used to determine the ideal number of workers for training.
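
The thesis's own speedup equations are not reproduced on this page, so the sketch below is only illustrative, under assumed conditions: synchronous data-parallel training in which each step pays a gradient-computation time t_grad plus a weight-transfer cost growing linearly with the number of workers (t_comm per additional worker), and an "ideal" worker count defined by a parallel-efficiency threshold. The function names, the linear communication model, and the efficiency criterion are assumptions, not the thesis's formulas.

```python
# Illustrative sketch only -- assumed model, not the equations from the thesis.
# Model: one training step on N workers processes N batches and takes
#   t_grad + t_comm * (N - 1)   seconds,
# while a single worker needs t_grad seconds per batch.

def estimated_speedup(t_grad: float, t_comm: float, workers: int) -> float:
    """Estimated speedup of training on `workers` machines.

    t_grad  -- gradient-computation time of one step on one worker [s]
    t_comm  -- weight-transfer time added per extra worker [s] (assumed linear)
    """
    return workers * t_grad / (t_grad + t_comm * (workers - 1))


def ideal_worker_count(t_grad: float, t_comm: float,
                       min_efficiency: float = 0.75,
                       max_workers: int = 64) -> int:
    """Largest worker count whose parallel efficiency (speedup / workers)
    stays at or above the chosen threshold (threshold is an assumption)."""
    best = 1
    for n in range(1, max_workers + 1):
        if estimated_speedup(t_grad, t_comm, n) / n >= min_efficiency:
            best = n
    return best


if __name__ == "__main__":
    # Example: 200 ms gradient step, 20 ms transfer cost per extra worker.
    for n in (1, 2, 4, 8, 16):
        print(n, round(estimated_speedup(0.2, 0.02, n), 2))
    print("ideal:", ideal_worker_count(0.2, 0.02))
```

With these example numbers the estimated speedup grows sub-linearly (about 1.8x on 2 workers, 6.4x on 16), and the efficiency threshold picks 4 workers as the last point where at least 75 % of the hardware is still usefully employed.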

Keywords

Neural networks, convolutional neural networks, training, soft computing, computation time estimation, distributed computing, parallel computing, computer networks, TensorFlow, Python.

Department
Degree Programme
Information Technology, Field of Study Intelligent Systems
Status
defended, grade D
Date
22 June 2017
Reviewer
Committee
Zbořil František V., doc. Ing., CSc. (DITS FIT BUT), chair
Hrubý Martin, Ing., Ph.D. (DITS FIT BUT), member
Janoušek Vladimír, doc. Ing., Ph.D. (DITS FIT BUT), member
Jaroš Jiří, doc. Ing., Ph.D. (DCSY FIT BUT), member
Matyska Luděk, prof. RNDr., CSc. (FI MUNI), member
Peringer Petr, Dr. Ing. (DITS FIT BUT), member
Citation
ŠLAMPA, Ondřej. Paralelní trénování hlubokých neuronových sítí. Brno, 2017. Master's thesis. Brno University of Technology, Faculty of Information Technology. Defended 22 June 2017. Supervised by Michal Hradiš. Available from: https://www.fit.vut.cz/study/thesis/19669/
BibTeX
@mastersthesis{FITMT19669,
    author = "Ond\v{r}ej \v{S}lampa",
    type = "Master's thesis",
    title = "Paraleln\'{i} tr\'{e}nov\'{a}n\'{i} hlubok\'{y}ch neuronov\'{y}ch s\'{i}t\'{i}",
    school = "Brno University of Technology, Faculty of Information Technology",
    year = 2017,
    location = "Brno, CZ",
    language = "czech",
    url = "https://www.fit.vut.cz/study/thesis/19669/"
}