Introduction to Applied Deep Learning

Prerequisites (knowledge of topic)

There are no specific prerequisites for this course other than a lot of curiosity, interest in the topic, and persistence. In terms of practical skills, prior experience in programming with Python is required (functional programming and OOP; data processing and analysis).

 

Hardware

During the course you will work on your own laptop, so you need a device that is reasonably fast and has sufficient memory. Any standard Mac or Windows device that is no older than three years will suffice. Windows, macOS, and Linux are all fine as operating systems; iPads (iOS) and Android devices are not suitable. Heavy computations will be outsourced to external servers (e.g. AWS).

 

Software

Python 3+; libraries: TensorFlow, Keras

 

Course content

Deep learning is a subset of machine learning in which artificial neural networks, algorithms inspired by the human brain, learn from large amounts of data. The world generates over 2.6 quintillion bytes of data daily, and with the computing power available today, artificial intelligence is becoming integrated into many areas of research and organizations without a large initial investment. Deep learning allows machines to solve complex problems even when the data set is very diverse, unstructured, and interconnected.

 

In this course, you will acquire fundamental knowledge as well as practical skills in deep learning. We approach the topic from a problem-solving perspective: short input lectures, with the focus on programming and practical applications of the models.

 

We introduce the open-source deep learning library TensorFlow 2 and its Keras API and work our way through the necessary concepts. Given different contexts, we explain how to construct and train models, specifically (a short illustrative sketch follows the list):

- convolutional neural networks (e.g. keras.layers.Conv2D)

- recurrent neural networks (e.g. keras.layers.SimpleRNN) and long short-term memory networks (e.g. keras.layers.LSTM)

- and word embeddings for applications related to text classification and summarization, speech recognition, tagging, and so on.
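To give a feel for what these building blocks look like in code, here is a minimal, illustrative sketch in TensorFlow 2 / Keras. The input shapes, layer sizes, and vocabulary size are placeholders chosen for the example, not the architectures used in the course.

```python
from tensorflow import keras

# A small convolutional model (keras.layers.Conv2D) for 28x28 grayscale images.
cnn = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    keras.layers.Conv2D(filters=16, kernel_size=3, activation="relu"),
    keras.layers.MaxPooling2D(pool_size=2),
    keras.layers.Flatten(),
    keras.layers.Dense(10, activation="softmax"),
])

# A small recurrent model: a word-embedding layer feeding an LSTM
# (keras.layers.SimpleRNN could be swapped in for keras.layers.LSTM).
rnn = keras.Sequential([
    keras.Input(shape=(100,), dtype="int32"),                 # sequences of 100 token ids
    keras.layers.Embedding(input_dim=10000, output_dim=32),   # word embeddings
    keras.layers.LSTM(units=64),
    keras.layers.Dense(1, activation="sigmoid"),
])

cnn.summary()
rnn.summary()
```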

 

Course Objectives: Create machine learning models in TensorFlow and Keras to solve numerical problems; that is, define the model architecture, create layers, train and evaluate a deep learning model, and deploy it on external servers.
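As a rough sketch of that workflow (not the course's actual assignment), the example below defines, trains, evaluates, and saves a tiny Keras model on synthetic data. The feature count, labels, and file name are invented for the illustration; deployment is only hinted at by the final save step, which produces a file that could be copied to an external server.

```python
import numpy as np
from tensorflow import keras

# Synthetic stand-in data: 1000 samples, 20 numeric features, a binary label.
x = np.random.rand(1000, 20).astype("float32")
y = (x.sum(axis=1) > 10).astype("float32")

# 1. Define the architecture and create the layers.
model = keras.Sequential([
    keras.Input(shape=(20,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])

# 2. Configure training: optimizer, loss function, metrics.
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# 3. Train in mini-batches, holding out 20% of the samples for validation.
model.fit(x, y, epochs=5, batch_size=32, validation_split=0.2)

# 4. Evaluate; in practice this would use a separate test set.
loss, acc = model.evaluate(x, y)
print(f"loss={loss:.3f}, accuracy={acc:.3f}")

# 5. Save the trained model so it can be copied to an external server (e.g. AWS).
model.save("toy_model.keras")
```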

 

Structure

Day 1

Introduction to deep learning through handwritten digit recognition. Explore key concepts such as learning rate decay, ReLU, dropout, softmax, cross-entropy, mini-batches, and overfitting.
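A minimal sketch of how those concepts appear in Keras code, assuming the MNIST dataset that ships with Keras; the layer sizes, dropout rate, and decay schedule are illustrative choices, not the exact settings used in class.

```python
from tensorflow import keras

# MNIST handwritten digits, with pixel values scaled to [0, 1].
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# Dense network: ReLU activations, dropout against overfitting,
# softmax output over the 10 digit classes.
model = keras.Sequential([
    keras.Input(shape=(28, 28)),
    keras.layers.Flatten(),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dropout(0.2),
    keras.layers.Dense(10, activation="softmax"),
])

# Learning rate decay: start at 1e-3 and shrink exponentially as training steps accumulate.
lr_schedule = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-3, decay_steps=1000, decay_rate=0.9)

# Cross-entropy loss on integer labels; training runs in mini-batches of 64.
model.compile(optimizer=keras.optimizers.Adam(learning_rate=lr_schedule),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, batch_size=64, validation_split=0.1)

# Comparing training and test accuracy is a first check for overfitting.
model.evaluate(x_test, y_test)
```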

 

Day 2

Convolutional Neural Network (CNN) application for image recognition

 

Day 3

Recurrent Neural Network (RNN) and Long Short-Term Memory (LSTM)

 

Day 4

Natural Language Processing (feature generation, word embeddings, etc.) for operations on text
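As a hedged illustration of these two steps (the toy corpus and dimensions are invented for the example, and a recent TensorFlow 2 version is assumed so that keras.layers.TextVectorization is available), the sketch below builds integer token features and looks up dense word vectors with keras.layers.Embedding.

```python
from tensorflow import keras

# Toy corpus; in the course the texts would come from a real dataset.
texts = ["deep learning is fun", "neural networks learn from data"]

# Feature generation: build a vocabulary and map each word to an integer index.
vectorizer = keras.layers.TextVectorization(output_sequence_length=6)
vectorizer.adapt(texts)
token_ids = vectorizer(texts)          # shape (2, 6), zero-padded

# Word embeddings: map each token index to a dense 8-dimensional vector.
embedding = keras.layers.Embedding(input_dim=vectorizer.vocabulary_size(),
                                   output_dim=8)
vectors = embedding(token_ids)         # shape (2, 6, 8)
print(vectors.shape)
```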

 

Day 5

The problems of neural networks and how to gain insights into the black box.

Afternoon session: Examination

 

Literature

 

Mandatory readings before course start

Jake VanderPlas. Python Data Science Handbook (Chapters 2–4). Free book.

 

Supplementary / voluntary

Goodfellow, I., Bengio, Y. & Courville, A. (2016). Deep Learning. Free HTML book. GitHub repository.

 

Li Deng & Yang Liu (2018). Deep Learning in Natural Language Processing.

Thushan Ganegedara (2018). Natural Language Processing with TensorFlow. GitHub repository.

 

Keras: The Python Deep Learning Library. Documentation.

 

Examination part (1/2)

 

Examination time and form

Decentral – Oral participation (20%)

 

Examination-aid rule

Open Book

 

Students are free to choose aids but will have to comply with the following restrictions:

·         At such examinations, all the pocket calculators of the Texas Instruments TI‑30 series are admissible. Any other pocket calculator models are inadmissible.

·         Students are themselves responsible for the procurement of examination aids.

 

Supplementary aids

Lecture notes, online material, optional reading.

 

Examination languages

Question language: English

Answer language: English

 

Examination part (2/2)

 

Examination time and form

Decentral – Written homework (80%)

 

Remark

Written homework, which has to be submitted within 2–3 weeks after the course.

 

Examination-aid rule

Open Book

 

Students are free to choose aids but will have to comply with the following restrictions:

·         Term papers must be written without anyone else's help and in accordance with the known quotation standards, and they must contain a declaration of authorship.

·         The documentation of sources (quotations, bibliography) has to be done throughout and consistently in accordance with the APA or MLA standards. The indications of the sources of information taken over verbatim or in paraphrase (quotations) must be integrated into the text in accordance with the precepts of the applicable quotation standard, while informative and bibliographical notes must be added as footnotes (recommendations and standards can be found, for example, in METZGER, C. (2015), Lern‑ und Arbeitsstrategien (11th ed., 4th printing). Aarau: Sauerländer).

·         For any work written at the HSG, the indication of the page numbers both according to the MLA and the APA standard is never optional.

·         Where there are no page numbers in sources, precise references must be provided in a different way: titles of chapters or sections, section numbers, acts, scenes, verses, etc.

·         For papers in law, the legal standard is recommended (by way of example, cf. FORSTMOSER, P., OGOREK, R. & SCHINDLER, B. (2014). Juristisches Arbeiten: Eine Anleitung für Studierende (5th ed.). Zürich: Schulthess, or the recommendations of the Law School).

 

Supplementary aids

Lecture notes, online material, optional reading.

 

Examination languages

Question language: English

Answer language: English

 

Literature

The assignments are based on the slides, the script files with the code we work on during the course, and your own notes from the course. You are also free (and sometimes required) to use online resources.