Studying Multi-head Attention in the Style Tokens Paper

I am trying to understand the multi-head attention introduced in the paper “Attention Is All You Need”. The purpose of studying multi-head attention is to understand the style token layer, which contains multi-head attention and was introduced in the paper “Style Tokens: Unsupervised Style Modeling, Control and Transfer in End-to-End Speech Synthesis”. Multi-head attention is […]
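Since the excerpt is cut off, here is a minimal sketch of the mechanism the post studies, written in PyTorch (the paper itself is framework-agnostic; the class name and dimensions below are illustrative assumptions, not from either paper). Queries, keys, and values are linearly projected, split into heads, attended with scaled dot products, and concatenated back:

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadAttention(nn.Module):
    """Scaled dot-product attention split across several heads,
    in the style of "Attention Is All You Need" (Vaswani et al., 2017)."""

    def __init__(self, d_model=256, num_heads=8):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        # Learned projections for queries, keys, values, and the output.
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(d_model, d_model)
        self.w_v = nn.Linear(d_model, d_model)
        self.w_o = nn.Linear(d_model, d_model)

    def forward(self, query, key, value):
        # query: (batch, len_q, d_model); key/value: (batch, len_kv, d_model)
        b, len_q, _ = query.shape
        len_kv = key.shape[1]
        # Project, then split the model dimension into independent heads.
        q = self.w_q(query).view(b, len_q, self.num_heads, self.d_head).transpose(1, 2)
        k = self.w_k(key).view(b, len_kv, self.num_heads, self.d_head).transpose(1, 2)
        v = self.w_v(value).view(b, len_kv, self.num_heads, self.d_head).transpose(1, 2)
        # Per-head scaled dot-product attention: softmax(QK^T / sqrt(d_head)) V.
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_head)
        weights = F.softmax(scores, dim=-1)
        out = weights @ v                      # (b, heads, len_q, d_head)
        # Concatenate heads back into the model dimension and mix them.
        out = out.transpose(1, 2).reshape(b, len_q, self.num_heads * self.d_head)
        return self.w_o(out)
```

In the style token layer, the reference encoder output would play the role of the query, with the bank of style tokens serving as keys and values.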

Sequence Modeling | Deep Learning Specialization | Coursera

Course planning Week 1: Recurrent neural networks Learn about recurrent neural networks. This type of model has been proven to perform extremely well on temporal data. It has several variants, including LSTMs, GRUs, and Bidirectional RNNs, which you are going to learn about in this section. Lectures: Recurrent neural networks C5W1L01 Why sequence models C5W1L02 […]
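As a quick companion to that Week 1 outline, the sketch below instantiates the three named variants with PyTorch's built-in recurrent layers; all shapes here are hypothetical:

```python
import torch
import torch.nn as nn

# Hypothetical input: a batch of 4 sequences, 10 time steps, 8 features each.
x = torch.randn(4, 10, 8)

rnn   = nn.RNN(input_size=8, hidden_size=16, batch_first=True)
lstm  = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)
gru   = nn.GRU(input_size=8, hidden_size=16, batch_first=True)
birnn = nn.RNN(input_size=8, hidden_size=16, batch_first=True, bidirectional=True)

out, h = rnn(x)        # out: (4, 10, 16), one hidden state per time step
out, (h, c) = lstm(x)  # an LSTM also carries a cell state c
out, h = gru(x)        # a GRU merges the gates into a single hidden state
out, h = birnn(x)      # out: (4, 10, 32), forward/backward states concatenated
```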

Convolutional Neural Networks | Deep Learning Specialization | Coursera

Course Planning Week 1: Foundations of convolutional neural networks Learn to implement the foundational layers of CNNs (pooling, convolutions) and to stack them properly in a deep network to solve multi-class image classification problems. Convolutional neural networks C4W1L01 Computer vision C4W1L02 Edge detection example C4W1L03 More edge detection C4W1L04 Padding C4W1L05 Strided convolutions C4W1L06 Convolutions over […]
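As a companion to that description, here is a minimal, hypothetical PyTorch stack of the foundational layers named above (convolution and pooling), ending in a multi-class classifier head; the layer sizes are illustrative, not taken from the course:

```python
import torch
import torch.nn as nn

# Convolution -> nonlinearity -> pooling, repeated, then a classifier head.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),  # convolution layer
    nn.ReLU(),
    nn.MaxPool2d(2),                             # pooling halves spatial size
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 8 * 8, 10),                   # 10-way multi-class output
)

logits = model(torch.randn(1, 3, 32, 32))  # e.g. one 32x32 RGB image
```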

Neural Networks and Deep Learning | Deep Learning Specialization | Coursera

Lecture Planning Week 1: Introduction to Deep Learning Welcome to the Deep Learning Specialization C1W1L01 Welcome Introduction to Deep Learning C1W1L02 Welcome C1W1L03 What is a neural network? C1W1L04 Supervised Learning with Neural Networks C1W1L05 Why is Deep Learning taking off? C1W1L06 About this Course C1W1R1 Frequently Asked Questions C1W1L07 Course Resources C1W1R2 How to use […]

Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization | Deep Learning Specialization | Coursera

Brief information Course name: Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization Instructor: Andrew Ng Institution: deeplearning.ai Media: Coursera Specialization: Deep Learning Duration: 3 weeks About this Course This course will teach you the “magic” of getting deep learning to work well. Rather than the deep learning process being a black box, you will understand […]

Structuring Machine Learning Projects | Deep Learning Specialization | Coursera

Brief information Course name: Structuring Machine Learning Projects Instructor: Andrew Ng Institution: deeplearning.ai Media: Coursera Specialization: Deep Learning Duration: 2 weeks About this Course You will learn how to build a successful machine learning project. If you aspire to be a technical leader in AI, and know how to set direction for your team’s work, this […]

Conditional Generative Adversarial Nets | M. Mirza, S. Osindero | 2014

Introduction A conditional version of Generative Adversarial Nets (GANs) in which both the generator and the discriminator are conditioned on some extra data y (a class label, or data from some other modality). Architecture Feed y into both the generator and the discriminator as an additional input layer, so that y and the usual input are combined in a joint hidden representation.
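A minimal sketch of that architecture in PyTorch, assuming y is a class label: each network concatenates its usual input with a learned embedding of y before the first hidden layer. The layer sizes are illustrative only (the paper's MNIST experiments used one-hot labels with maxout units, not these exact layers):

```python
import torch
import torch.nn as nn

class ConditionalGenerator(nn.Module):
    """Maps noise z and label y to a sample; z and an embedding of y
    are concatenated into a joint hidden representation."""
    def __init__(self, z_dim=100, num_classes=10, out_dim=784):
        super().__init__()
        self.embed = nn.Embedding(num_classes, num_classes)
        self.net = nn.Sequential(
            nn.Linear(z_dim + num_classes, 256),
            nn.ReLU(),
            nn.Linear(256, out_dim),
            nn.Tanh(),
        )

    def forward(self, z, y):
        return self.net(torch.cat([z, self.embed(y)], dim=1))

class ConditionalDiscriminator(nn.Module):
    """Scores a sample x as real or fake, given the same label y."""
    def __init__(self, in_dim=784, num_classes=10):
        super().__init__()
        self.embed = nn.Embedding(num_classes, num_classes)
        self.net = nn.Sequential(
            nn.Linear(in_dim + num_classes, 256),
            nn.LeakyReLU(0.2),
            nn.Linear(256, 1),
            nn.Sigmoid(),
        )

    def forward(self, x, y):
        return self.net(torch.cat([x, self.embed(y)], dim=1))
```

Because the discriminator also sees y, the generator is penalized for producing a plausible sample of the wrong class, which is what makes the conditioning stick.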

Studying Generative Adversarial Networks (GANs)

References Lecture 13: Generative Models. CS231n: Convolutional Neural Networks for Visual Recognition. Spring 2017. [SLIDE][VIDEO] Generative Adversarial Nets. Goodfellow et al. NIPS 2014. [LINK][arXiv] How to Train a GAN? Tips and tricks to make GANs work. Soumith Chintala. GitHub. [LINK] The GAN Zoo. Avinash Hindupur. GitHub. [LINK]

Inception Module | Summary

References Udacity (2016. 6. 6.). Inception Module. YouTube. [LINK] Udacity (2016. 6. 6.). 1×1 Convolutions. YouTube. [LINK] Tommy Mulc (2016. 9. 25.). Inception modules: explained and implemented. [LINK] Szegedy et al. (2015). Going Deeper with Convolutions. CVPR 2015. [arXiv] Summary History The inception module was first introduced in GoogLeNet for the ILSVRC’14 competition. Key concept Let a convolutional network decide […]
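To make the key concept concrete, here is a sketch of an inception module in PyTorch: parallel 1×1, 3×3, and 5×5 convolutions plus max pooling, with 1×1 convolutions reducing channel counts before the larger filters, all concatenated along the channel axis. The filter counts are free parameters chosen per block in GoogLeNet:

```python
import torch
import torch.nn as nn

class InceptionModule(nn.Module):
    """GoogLeNet-style inception block: four parallel branches whose
    outputs are concatenated along the channel dimension."""
    def __init__(self, in_ch, c1, c3_red, c3, c5_red, c5, pool_proj):
        super().__init__()
        self.branch1 = nn.Conv2d(in_ch, c1, kernel_size=1)
        self.branch3 = nn.Sequential(
            nn.Conv2d(in_ch, c3_red, kernel_size=1),  # 1x1 reduces channels first
            nn.ReLU(),
            nn.Conv2d(c3_red, c3, kernel_size=3, padding=1),
        )
        self.branch5 = nn.Sequential(
            nn.Conv2d(in_ch, c5_red, kernel_size=1),
            nn.ReLU(),
            nn.Conv2d(c5_red, c5, kernel_size=5, padding=2),
        )
        self.branch_pool = nn.Sequential(
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
            nn.Conv2d(in_ch, pool_proj, kernel_size=1),
        )

    def forward(self, x):
        # Every branch preserves the spatial size, so outputs can be concatenated.
        return torch.cat([self.branch1(x), self.branch3(x),
                          self.branch5(x), self.branch_pool(x)], dim=1)

# Channel counts from GoogLeNet's inception (3a) block: 192 in, 256 out.
block = InceptionModule(192, 64, 96, 128, 16, 32, 32)
y = block(torch.randn(1, 192, 28, 28))  # y: (1, 256, 28, 28)
```

Letting the network weight these branches itself is the "decide" in the key concept: rather than committing to one filter size per layer, the module learns how much of each to use.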