Studying Knowledge Distillation
References: Knowledge Distillation, Keras documentation [link] (intro); Knowledge Distillation: Simplified [link] (intro); Distilling the Knowledge in a Neural Network [arXiv] (official original paper)
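The original paper cited above (Hinton et al., 2015) trains a small student network to match the teacher's temperature-softened output distribution. A minimal NumPy sketch of that soft-target loss (function names and the default T are my own choices, not from the paper):

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; larger T yields a softer distribution."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()                      # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between softened teacher and student distributions.

    The T**2 factor (Hinton et al., 2015) keeps the soft-target gradients
    at a comparable magnitude as T varies.
    """
    p = softmax(teacher_logits, T)    # teacher's soft targets
    q = softmax(student_logits, T)    # student's softened predictions
    return T * T * np.sum(p * (np.log(p) - np.log(q)))
```

In practice this term is combined with the ordinary cross-entropy on the hard labels, as in the Keras example cited above.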
I am trying to understand the multi-head attention introduced in the paper “Attention Is All You Need”. My purpose in understanding multi-head attention is to understand the style token layer, which contains multi-head attention and was introduced in the paper “Style Tokens: Unsupervised Style Modeling, Control and Transfer in End-to-End Speech Synthesis”. Multi-head attention is […]
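As a working reference, here is a minimal NumPy sketch of multi-head self-attention as defined in “Attention Is All You Need” — scaled dot-product attention per head, then concatenation and an output projection. The random matrices stand in for the learned projections W^Q, W^K, W^V, W^O:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V, batched over the leading (head) axis."""
    d_k = Q.shape[-1]
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)       # (heads, n, n)
    scores -= scores.max(axis=-1, keepdims=True)           # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)         # rows sum to 1
    return weights @ V                                     # (heads, n, d_k)

def multi_head_attention(x, num_heads, rng):
    """Self-attention with several heads: project, attend per head, concat, project."""
    n, d_model = x.shape
    d_k = d_model // num_heads
    # Random matrices as placeholders for the learned projections.
    Wq, Wk, Wv, Wo = [rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                      for _ in range(4)]
    split = lambda M: M.reshape(n, num_heads, d_k).transpose(1, 0, 2)
    heads = scaled_dot_product_attention(split(x @ Wq), split(x @ Wk), split(x @ Wv))
    concat = heads.transpose(1, 0, 2).reshape(n, d_model)  # concat along features
    return concat @ Wo
```

Each head attends in a lower-dimensional subspace (d_k = d_model / h), which is what lets different heads specialize, e.g. on different style tokens in the GST layer.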
1.2.a Discrete-time signals Discrete-time signal := a sequence of complex numbers. Dimension = 1 (for now). Notation: x[n], where n is an integer. Two-sided sequences: n is one-dimensional “time”. Analysis: periodic measurement approach — discrete-time signals can be created by an analysis process where we take periodic measurements of a physical phenomenon. Synthesis: stream of generated samples. Delta […]
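The analysis and synthesis views above can be illustrated in a few lines of NumPy; the specific signal (a 5 Hz sinusoid sampled at 100 Hz) is my own toy choice, not from the lecture:

```python
import numpy as np

# Analysis: periodic measurements of a physical phenomenon —
# a 5 Hz sinusoid sampled at Fs = 100 Hz gives x[n] = sin(2*pi*5*n/Fs).
Fs = 100                              # sampling rate (samples per second)
n = np.arange(100)                    # integer time index n
x = np.sin(2 * np.pi * 5 * n / Fs)    # one second of the discrete-time signal

# Synthesis: generate a sequence directly, e.g. the unit impulse delta[n].
delta = (n == 0).astype(float)        # 1 at n = 0, 0 elsewhere
```

Here x[n] is periodic with period Fs / 5 = 20 samples, and delta[n] is the building block from which any sequence can be synthesized by shifted, scaled copies.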
References Arxiv Insights (2018, Feb 25). Variational Autoencoders. YouTube. [YouTube] Kingma, D. P., & Welling, M. (2014). Auto-Encoding Variational Bayes. ICLR 2014. [paper] Higgins, I., Matthey, L., Pal, A., Burgess, C., Glorot, X., Botvinick, M., … & Lerchner, A. (2017). beta-VAE: Learning Basic Visual Concepts with a Constrained Variational Framework. ICLR 2017. [paper] Higgins, I., […]
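Two pieces of the cited Kingma & Welling paper lend themselves to a short sketch: the reparameterization trick, which makes sampling differentiable, and the closed-form KL term for a diagonal Gaussian encoder. A minimal NumPy version (function names are mine):

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    """Reparameterization trick: z = mu + sigma * eps with eps ~ N(0, I),
    so the sample is a differentiable function of mu and log_var."""
    eps = rng.standard_normal(np.shape(mu))
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    """Closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian encoder
    (Kingma & Welling, 2014, Appendix B)."""
    return -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var))
```

In the beta-VAE of Higgins et al., the same KL term is simply weighted by a coefficient beta > 1 in the training objective.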
Brief information Instructors: Rajesh P. N. Rao, Adrienne Fairhall About this course: This course provides an introduction to basic computational methods for understanding what nervous systems do and for determining how they function. We will explore the computational principles governing various aspects of vision, sensory-motor control, learning, and memory. Specific topics that will be covered […]
Number Sense | Wikipedia Number sense can refer to “an intuitive understanding of numbers, their magnitude, relationships, and how they are affected by operations”. Psychologists believe that the number sense in humans can be differentiated into the approximate number system and the parallel individuation system. The approximate number system is a system that supports the […]
Lecture 1 | Introduction “Freud was inspired by the theory of thermodynamics and used the term psychodynamics to describe the processes of the mind as flows of psychological energy (libido or psi) in an organically complex brain.” [Psychodynamics – Wikipedia] Lecture 2 | Linear models What is a linear model? If the derivative of a […]
Brief Information Name (en): Seminar in Methodology on Experimental Psychology (Fundamentals and Applications of Cognitive Modeling) Name (ko): 실험심리방법론세미나 (인지모델링의 기초와 응용) Lecturer: Koh, Sungryong (고성룡) Semester: 2018 Fall Major: MS, Cognitive Science Textbook: Busemeyer, J. R., & Diederich, A. (2010). Cognitive modeling. Sage. Syllabus: 2018-2_Seminar-in-Methodology-on-Experimental-Psychology.pdf In short: To learn cognitive modeling and its […]
Course planning Week 1: Recurrent neural networks Learn about recurrent neural networks. This type of model has been proven to perform extremely well on temporal data. It has several variants including LSTMs, GRUs and Bidirectional RNNs, which you are going to learn about in this section. Lectures: Recurrent neural networks C4W1L01 Why sequence models C4W1L02 […]
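The recurrence at the heart of the Week 1 material can be sketched in a few lines; this is a plain vanilla RNN cell in NumPy (the LSTM, GRU, and bidirectional variants named above build on the same step), with toy dimensions and random weights of my own choosing:

```python
import numpy as np

def rnn_step(x_t, h_prev, Wxh, Whh, bh):
    """One recurrence step: h_t = tanh(Wxh @ x_t + Whh @ h_prev + bh)."""
    return np.tanh(Wxh @ x_t + Whh @ h_prev + bh)

# Run a toy 4-step sequence through the cell (random weights as placeholders).
rng = np.random.default_rng(0)
d_in, d_h = 3, 5
Wxh = 0.1 * rng.standard_normal((d_h, d_in))   # input-to-hidden weights
Whh = 0.1 * rng.standard_normal((d_h, d_h))    # hidden-to-hidden weights
bh = np.zeros(d_h)
h = np.zeros(d_h)                              # initial hidden state
for x_t in rng.standard_normal((4, d_in)):     # sequence of 4 input vectors
    h = rnn_step(x_t, h, Wxh, Whh, bh)         # h carries context forward
```

The same weights are reused at every time step, which is what lets the model handle temporal data of varying length.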