Studying Multi-head Attention in the Style Tokens Paper

I am trying to understand the multi-head attention mechanism introduced in the paper “Attention Is All You Need”. The reason for studying multi-head attention is to understand the style token layer, which contains multi-head attention and was introduced in the paper “Style Tokens: Unsupervised Style Modeling, Control and Transfer in End-to-End Speech Synthesis”. Multi-head attention is […]
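As a quick sketch of what multi-head attention computes, here is a minimal NumPy version of scaled dot-product attention split across heads. It omits the learned projection matrices (W^Q, W^K, W^V, W^O) from the paper and simply slices the model dimension per head; all names and shapes are illustrative, not the paper's implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(Q, K, V, num_heads):
    # Q, K, V: (seq_len, d_model); d_model must be divisible by num_heads.
    seq_len, d_model = Q.shape
    d_k = d_model // num_heads
    out = np.zeros((seq_len, d_model))
    for h in range(num_heads):
        s = slice(h * d_k, (h + 1) * d_k)          # this head's slice of d_model
        q, k, v = Q[:, s], K[:, s], V[:, s]
        scores = softmax(q @ k.T / np.sqrt(d_k))   # (seq_len, seq_len) attention weights
        out[:, s] = scores @ v                     # weighted sum of value vectors
    return out
```

Each head attends over the same sequence but in its own subspace, which is the property the style token layer exploits when attending over a bank of style tokens.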

Digital Signal Processing | Coursera

1.2.a Discrete-time signals. Discrete-time signal := a sequence of complex numbers. Dimension = 1 (for now). Notation: x[n], where n is an integer. Two-sided sequences: n is one-dimensional “time”. Analysis (periodic measurement approach): discrete-time signals can be created by an analysis process in which we take periodic measurements of a physical phenomenon. Synthesis (stream of generated samples): Delta […]
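The synthesis view above can be sketched in a few lines of NumPy: generate a two-sided integer time axis and build sequences on it, here the unit delta and a sampled cosine. The signal choices are illustrative, not taken from the course materials.

```python
import numpy as np

n = np.arange(-5, 6)                  # two-sided "time" index; n is an integer
delta = (n == 0).astype(float)        # unit delta: 1 at n = 0, 0 elsewhere
x = np.cos(2 * np.pi * 0.1 * n)       # a synthesized discrete-time signal (sampled cosine)
```

Note that x[n] is just a sequence indexed by the integer n; the physical sampling period only enters when the analysis process maps real time to the index n.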

Studying Variational Autoencoders

References: Arxiv Insights (2018, Feb. 25). Variational Autoencoders. YouTube. [YouTube] Diederik P. Kingma and Max Welling (2014). Auto-Encoding Variational Bayes. ICLR 2014. [paper] Higgins, I., Matthey, L., Pal, A., Burgess, C., Glorot, X., Botvinick, M., … & Lerchner, A. (2017). beta-VAE: Learning Basic Visual Concepts with a Constrained Variational Framework. ICLR 2017. [paper] Higgins, I., […]

Computational Neuroscience | Coursera

Brief information Instructors: Rajesh P. N. Rao, Adrienne Fairhall About this course: This course provides an introduction to basic computational methods for understanding what nervous systems do and for determining how they function. We will explore the computational principles governing various aspects of vision, sensory-motor control, learning, and memory. Specific topics that will be covered […]

Survey on knowledge graph embedding

Papers Q. Wang, Z. Mao, B. Wang and L. Guo, “Knowledge Graph Embedding: A Survey of Approaches and Applications,” in IEEE Transactions on Knowledge and Data Engineering, vol. 29, no. 12, pp. 2724-2743, 1 Dec. 2017. Maximilian Nickel, Kevin Murphy, Volker Tresp, Evgeniy Gabrilovich. A Review of Relational Machine Learning for Knowledge Graphs. Proc. IEEE, […]

[YouTube] Demis Hassabis, CEO, DeepMind Technologies – The Theory of Everything

Worth studying: physics, neuroscience. “What I cannot build, I do not understand.” – Richard Feynman. Theme Park: one of the games Demis made. Demis’ interest areas in his Ph.D. course: imagination and memory. DeepMind, founded in 2010, is an Apollo programme for AI (>100 scientists from machine learning and neuroscience fields). Neuroscience-inspired […]

Sequence Modeling | Deep Learning Specialization | Coursera

Course planning Week 1: Recurrent neural networks Learn about recurrent neural networks. This type of model has been proven to perform extremely well on temporal data. It has several variants, including LSTMs, GRUs and bidirectional RNNs, which you are going to learn about in this section. Lectures: Recurrent neural networks C5W1L01 Why sequence models C5W1L02 […]

Convolutional Neural Networks | Deep Learning Specialization | Coursera

Course Planning Week 1: Foundations of convolutional neural networks Learn to implement the foundational layers of CNNs (pooling, convolutions) and to stack them properly in a deep network to solve multi-class image classification problems. Convolutional neural networks C4W1L01 Computer vision C4W1L02 Edge detection example C4W1L03 More edge detection C4W1L04 Padding C4W1L05 Strided convolutions C4W1L06 Convolutions over […]

Neural Networks and Deep Learning | Deep Learning Specialization | Coursera

Lecture Planning Week 1: Introduction to Deep Learning Welcome to the Deep Learning Specialization C1W1L01 Welcome Introduction to Deep Learning C1W1L02 Welcome C1W1L03 What is a neural network? C1W1L04 Supervised Learning with Neural Networks C1W1L05 Why is Deep Learning taking off? C1W1L06 About this Course C1W1R1 Frequently Asked Questions C1W1L07 Course Resources C1W1R2 How to use […]