Studying Multi-head Attention in the Style Tokens Paper

I am trying to understand the multi-head attention introduced in the paper “Attention Is All You Need”. My purpose in studying multi-head attention is to understand the style token layer, which contains multi-head attention and was introduced in the paper “Style Tokens: Unsupervised Style Modeling, Control and Transfer in End-to-End Speech Synthesis”. Multi-head attention is […]
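
As a quick study aid, below is a minimal NumPy sketch of scaled dot-product multi-head attention as described in “Attention Is All You Need”: each head projects the inputs, attends, and the head outputs are concatenated and projected once more. The function name, the random stand-in projection matrices, and the toy dimensions are my own illustration, not code from either paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(query, key, value, num_heads, rng=None):
    """Minimal multi-head attention sketch.

    query: (T_q, d_model); key, value: (T_k, d_model).
    The projection matrices are random stand-ins purely for
    illustration; a real model would learn them.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    d_model = query.shape[-1]
    assert d_model % num_heads == 0
    d_k = d_model // num_heads

    head_outputs = []
    for _ in range(num_heads):
        # per-head projections (randomly initialized here)
        W_q = rng.normal(scale=d_model ** -0.5, size=(d_model, d_k))
        W_k = rng.normal(scale=d_model ** -0.5, size=(d_model, d_k))
        W_v = rng.normal(scale=d_model ** -0.5, size=(d_model, d_k))

        Q, K, V = query @ W_q, key @ W_k, value @ W_v   # (T, d_k)
        scores = Q @ K.T / np.sqrt(d_k)                 # (T_q, T_k)
        weights = softmax(scores, axis=-1)              # attention weights
        head_outputs.append(weights @ V)                # (T_q, d_k)

    # concatenate heads and apply the output projection
    concat = np.concatenate(head_outputs, axis=-1)      # (T_q, d_model)
    W_o = rng.normal(scale=d_model ** -0.5, size=(d_model, d_model))
    return concat @ W_o

# Toy usage: in the style token layer, the query would roughly correspond
# to the reference encoder output and the keys/values to the style token
# embeddings; here they are just random arrays.
q = np.random.default_rng(1).normal(size=(1, 128))    # one query vector
kv = np.random.default_rng(2).normal(size=(10, 128))  # e.g. 10 style tokens
out = multi_head_attention(q, kv, kv, num_heads=8)
print(out.shape)  # (1, 128)
```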

Studying Variational Autoencoders

References Arxiv Insights (Feb. 25, 2018). Variational Autoencoders. YouTube. [YouTube] Diederik P. Kingma and Max Welling (2014). Auto-Encoding Variational Bayes. ICLR 2014. [paper] Higgins, I., Matthey, L., Pal, A., Burgess, C., Glorot, X., Botvinick, M., … & Lerchner, A. (2017). beta-VAE: Learning Basic Visual Concepts with a Constrained Variational Framework. ICLR 2017. [paper] Higgins, I., […]

Studying Number Sense

Number Sense | Wikipedia Number sense can refer to “an intuitive understanding of numbers, their magnitude, relationships, and how they are affected by operations”. Psychologists believe that the number sense in humans can be differentiated into the approximate number system and the parallel individuation system. The approximate number system is a system that supports the […]

Studying Generative Adversarial Networks (GANs)

References Lecture 13: Generative Models. CS231n: Convolutional Neural Networks for Visual Recognition. Spring 2017. [SLIDE][VIDEO] Generative Adversarial Nets. Goodfellow et al. NIPS 2014. [LINK][arXiv] How to Train a GAN? Tips and tricks to make GANs work. Soumith Chintala. GitHub. [LINK] The GAN Zoo. Avinash Hindupur. GitHub. [LINK]

Convolutional Neural Networks | Study

References L. Fei-Fei, Justin Johnson (Spring 2017). CS231n: Convolutional Neural Networks for Visual Recognition. [LINK] Jefkine (5 September 2016). Backpropagation In Convolutional Neural Networks. [LINK] Convnet: Implementing Convolution Layer with Numpy. [LINK] Backpropagation in CNNs (CNN의 역전파). [LINK]

Studying ‘Cognitive Science’

References Cognitive Science | Stanford Encyclopedia of Philosophy. Suilin Lavelle, Kenny Smith, Mark Sprevak, David Carmel, Andy Clark & Barbara Webb (Mar. 2017). Philosophy and the Sciences: Introduction to the Philosophy of Cognitive Sciences. The University of Edinburgh. Coursera. 이정모 (2009). Cognitive Science: Principles and Applications of Interdisciplinary Convergence (인지과학: 학문 간 융합의 원리와 응용). Seoul: Sungkyunkwan University Press. 이정모 (2010). Cognitive Science: Past, Present, and Future (인지과학: 과거-현재-미래). Seoul: Hakjisa. 장병탁, […]

Studying ‘Deep Learning’

References Lectures Hinton, G. (2013). Neural Networks for Machine Learning. Coursera. Deep Learning Nanodegree Foundations. Udacity. CS231n: Convolutional Neural Networks for Visual Recognition. Stanford University. CS224d: Deep Learning for Natural Language Processing. Stanford University. CS 294-131: Special Topics in Deep Learning. UC Berkeley. CS 294: Deep Reinforcement Learning, Spring 2017. UC Berkeley. Vanhoucke, V. Deep Learning. Udacity. Books Goodfellow, […]

Studying ‘Linguistics’

Fields of Linguistics
Category 1: Time
Synchronic/descriptive linguistics (공시언어학): studies the state of a language at a single point in time
Diachronic/historical linguistics (통시언어학/역사언어학): studies how a language changes over time
Category 2: Object of Study
Phonetics (음성학): the physical properties of speech sounds
Phonology (음운론): the sounds (phonemes) that speakers psychologically distinguish when speaking
Morphology (형태론): the internal structure of words
Syntax (통사론): the internal structure of sentences
Semantics (의미론): the meanings of words and how meaning changes when words are combined
Pragmatics (화용론): in conversation, the speaker's […]