CS 6890: Deep Learning
Time and Location: Tue, Thu, Fri 9:00 – 10:20am, ARC 121
Instructor: Razvan Bunescu
Office: Stocker 341
Office Hours: Tue, Thu 10:30 – 11:00am, or by email appointment
Email: bunescu @ ohio edu
Textbook: Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville. MIT Press, 2016.
Recommended introductory material:
Machine Learning @ Coursera video lectures and exercises.
Machine Learning @ Stanford video lectures and exercises.
Machine Learning @ Ohio course website.
Supplemental deep learning lectures:
Deep Learning course sequence @ Coursera.
Deep Learning and Reinforcement Learning summer school, 2017 @ Montreal Institute for Learning Algorithms.
CS 231n: Convolutional Neural Networks for Visual Recognition, 2017 @ Stanford.
CS 598: Cutting-Edge Trends in Deep Learning and Recognition, 2017 @ Illinois.
This course will introduce the multi-layer perceptron, a common deep learning architecture, and its gradient-based training through the backpropagation algorithm. Fully connected neural networks will be followed by more specialized architectures such as convolutional neural networks (for images), recurrent neural networks (for sequences), and memory-augmented neural networks. The latter part of the course will explore more advanced topics, such as generative adversarial networks and deep reinforcement learning. The lectures will cover theoretical aspects of deep learning models, while homework assignments will give students the opportunity to build and experiment with shallow and deep learning models, for which skeleton code will be provided.
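As a small preview of the opening topics, a two-layer fully connected network trained with backpropagation fits in a few lines of NumPy. This is an illustrative toy sketch, not the course's skeleton code; the data, network size, and hyperparameters below are all made up for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 100 points with 3 features, binary labels from a linear rule.
X = rng.normal(size=(100, 3))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

# One hidden layer of 8 sigmoid units, sigmoid output.
W1 = rng.normal(scale=0.5, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(2000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)          # hidden activations
    p = sigmoid(h @ W2 + b2)          # predicted probabilities
    # Backward pass: gradients of the mean cross-entropy loss.
    dz2 = (p - y) / len(X)            # dL/dz at the output (sigmoid + CE)
    dW2 = h.T @ dz2; db2 = dz2.sum(0)
    dz1 = (dz2 @ W2.T) * h * (1 - h)  # chain rule through the hidden sigmoid
    dW1 = X.T @ dz1; db1 = dz1.sum(0)
    # Gradient descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Training accuracy after the final update.
h = sigmoid(X @ W1 + b1)
p = sigmoid(h @ W2 + b2)
acc = ((p > 0.5) == y).mean()
```

The backward pass is just the chain rule applied layer by layer, which is the view of backpropagation the early lectures develop in detail.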
Logistic and Softmax Regression, Feed-Forward Neural Networks, Backpropagation, Vectorization, PCA and Whitening, Deep Networks, Convolution and Pooling, Recurrent Neural Networks, Long Short-Term Memory, Gated Recurrent Units, Neural Attention Models, Sequence-to-Sequence Models, Memory Networks, Distributional Representations, Generative Adversarial Networks, Deep Reinforcement Learning.
Previous exposure to basic concepts in machine learning, such as: supervised vs. unsupervised learning, classification vs. regression, linear regression, logistic and softmax regression, cost functions, overfitting and regularization, gradient-based optimization. Substantial experience with programming and familiarity with basic concepts in linear algebra and statistics.
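For students checking their background, several of the prerequisite concepts above (logistic regression, a regularized cost function, gradient-based optimization) fit in one short NumPy sketch. The data and hyperparameters here are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic linearly separable data: 200 points, 2 features.
X = rng.normal(size=(200, 2))
y = (X @ np.array([2.0, -1.0]) > 0).astype(float)

w = np.zeros(2); b = 0.0
lam = 0.01   # L2 regularization strength
lr = 0.1     # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(300):
    p = sigmoid(X @ w + b)
    # Gradient of the L2-regularized mean cross-entropy cost.
    grad_w = X.T @ (p - y) / len(X) + lam * w
    grad_b = (p - y).mean()
    w -= lr * grad_w
    b -= lr * grad_b

acc = ((sigmoid(X @ w + b) > 0.5) == y).mean()
```

If each line here is familiar, the prerequisites should pose no problem; the Coursera and Stanford materials above cover any gaps.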
- Syllabus & Introduction
- Logistic Regression, Softmax Regression, and Gradient Descent
- Linear algebra and optimization in Python
- Machine Learning with PyTorch
- Feed-Forward Neural Networks and Backpropagation
- Unsupervised Feature Learning with Autoencoders
- PCA, PCA whitening, and ZCA whitening
- Variational Autoencoders
- Assignment 1 and code.
- Assignment 2 and code.
- Assignment 3, code, and data.
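Among the lecture topics above, PCA and ZCA whitening are easy to demonstrate concretely. A minimal NumPy sketch, with made-up correlated data, might look like this:

```python
import numpy as np

rng = np.random.default_rng(2)
# Correlated 2-D data: isotropic Gaussian samples pushed through a linear map.
X = rng.normal(size=(500, 2)) @ np.array([[2.0, 1.2], [0.0, 0.5]])

Xc = X - X.mean(axis=0)                 # center the data
cov = Xc.T @ Xc / len(Xc)               # sample covariance
eigvals, U = np.linalg.eigh(cov)        # eigendecomposition (columns of U: principal axes)

eps = 1e-5                              # small constant for numerical stability
# PCA whitening: rotate onto the principal axes, rescale each to unit variance.
X_pca = (Xc @ U) / np.sqrt(eigvals + eps)
# ZCA whitening: rotate back, keeping the whitened data close to the original.
X_zca = X_pca @ U.T

# The covariance of the whitened data is (approximately) the identity.
cov_white = X_pca.T @ X_pca / len(X_pca)
```

PCA whitening decorrelates the features and equalizes their variances, a common preprocessing step before the unsupervised feature learning covered in the autoencoder lectures.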
Online reading materials: