Studying Generative Adversarial Networks (GANs)

References Lecture 13: Generative Models. CS231n: Convolutional Neural Networks for Visual Recognition. Spring 2017. [SLIDE][VIDEO] Generative Adversarial Nets. Goodfellow et al. NIPS 2014. [LINK][arXiv] How to Train a GAN? Tips and tricks to make GANs work. Soumith Chintala. GitHub. [LINK] The GAN Zoo. Avinash Hindupur. GitHub. [LINK]

Inception Module | Summary

References Udacity (2016. 6. 6.). Inception Module. YouTube. [LINK] Udacity (2016. 6. 6.). 1×1 Convolutions. YouTube. [LINK] Tommy Mulc (2016. 9. 25.). Inception modules: explained and implemented. [LINK] Szegedy et al. (2015). Going Deeper with Convolutions. CVPR 2015. [arXiv] Summary History The inception module was first introduced in GoogLeNet for the ILSVRC'14 competition. Key concept Let a convolutional network decide […]
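
As a concrete illustration of that key concept, here is a minimal sketch of an inception-style block (my own example, not code from the referenced post), assuming the standard four-branch GoogLeNet layout and the tf.keras API; the filter counts mirror the 3a block but are otherwise arbitrary.

```python
import tensorflow as tf

def inception_block(x, f1=64, f3_reduce=96, f3=128, f5_reduce=16, f5=32, pool_proj=32):
    """Inception-style block: parallel 1x1, 3x3, 5x5 and pooling branches,
    concatenated along the channel axis."""
    # 1x1 branch
    b1 = tf.keras.layers.Conv2D(f1, 1, padding='same', activation='relu')(x)
    # 1x1 reduction followed by a 3x3 convolution
    b3 = tf.keras.layers.Conv2D(f3_reduce, 1, padding='same', activation='relu')(x)
    b3 = tf.keras.layers.Conv2D(f3, 3, padding='same', activation='relu')(b3)
    # 1x1 reduction followed by a 5x5 convolution
    b5 = tf.keras.layers.Conv2D(f5_reduce, 1, padding='same', activation='relu')(x)
    b5 = tf.keras.layers.Conv2D(f5, 5, padding='same', activation='relu')(b5)
    # 3x3 max pooling followed by a 1x1 projection
    bp = tf.keras.layers.MaxPooling2D(3, strides=1, padding='same')(x)
    bp = tf.keras.layers.Conv2D(pool_proj, 1, padding='same', activation='relu')(bp)
    # The network "decides" which filter sizes matter by learning the branch weights.
    return tf.keras.layers.Concatenate(axis=-1)([b1, b3, b5, bp])

inputs = tf.keras.Input(shape=(28, 28, 192))
outputs = inception_block(inputs)  # shape (None, 28, 28, 256) = 64 + 128 + 32 + 32 channels
```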

Batch Normalization | Summary

References Sergey Ioffe, Christian Szegedy (2015). Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. ICML 2015. [ICML][arXiv] Lecture 6: Training Neural Networks, Part 1. CS231n: Convolutional Neural Networks for Visual Recognition. 48:52~1:04:39 [YouTube] Choung young jae (2017. 7. 2.). PR-021: Batch Normalization. YouTube. [YouTube] tf.nn.batch_normalization. TensorFlow. [LINK] Rui Shu (27 DEC 2016). A GENTLE […]
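
For reference, the training-time transform that tf.nn.batch_normalization applies can be sketched in a few lines of NumPy, following the formulation of Ioffe & Szegedy (2015). This is only an illustration: running statistics for inference and the backward pass are omitted, and the (N, D) batch layout is an assumption of the example.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Training-time batch normalization for an (N, D) mini-batch:
    normalize each feature by the mini-batch mean and variance,
    then apply the learned scale (gamma) and shift (beta)."""
    mu = x.mean(axis=0)                    # per-feature mini-batch mean
    var = x.var(axis=0)                    # per-feature mini-batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalized activations
    return gamma * x_hat + beta            # scaled and shifted output

x = np.random.randn(32, 4) * 5.0 + 3.0     # batch of 32 examples, 4 features
out = batch_norm_forward(x, gamma=np.ones(4), beta=np.zeros(4))
print(out.mean(axis=0), out.std(axis=0))   # approximately 0 and 1 per feature
```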

Convolutional Neural Networks | Study

References L. Fei-Fei, Justin Johnson (Spring 2017). CS231n: Convolutional Neural Networks for Visual Recognition. [LINK] Jefkine (5 September 2016). Backpropagation In Convolutional Neural Networks. [LINK] Convnet: Implementing Convolution Layer with Numpy. [LINK] Backpropagation in CNNs (in Korean). [LINK]

CS231n: Convolutional Neural Networks for Visual Recognition | Course

Lecture 6 | Training Neural Networks I Sigmoid Problems of the sigmoid activation function Problem 1: Saturated neurons kill the gradients. Problem 2: Sigmoid outputs are not zero-centered. Suppose a feed-forward neural network has hidden layers whose activation functions are all sigmoid. Then every layer except the first receives only positive inputs. […]
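
A quick numerical check makes both problems concrete (my own illustration, not code from the lecture): the sigmoid gradient collapses once the input saturates, and because every sigmoid output is positive, the layers after the first never see zero-centered inputs.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(z):
    s = sigmoid(z)
    return s * (1.0 - s)   # maximum value 0.25, reached at z = 0

# Problem 1: saturated neurons kill the gradient.
for z in [0.0, 5.0, 10.0]:
    print(f"z={z:5.1f}  sigmoid'={sigmoid_grad(z):.6f}")
# z=  0.0  sigmoid'=0.250000
# z=  5.0  sigmoid'=0.006648
# z= 10.0  sigmoid'=0.000045

# Problem 2: outputs are not zero-centered.
h = sigmoid(np.random.randn(1000))
print(h.min() > 0, h.mean())   # True, mean near 0.5: every layer after
                               # the first receives only positive inputs.
```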

Neural Networks and Learning Machines. 3rd Ed. Simon O. Haykin. Pearson. 2008

Chapter 8. Principal-Components Analysis 8.1 Introduction 8.2 Principles of Self-Organization Principle 1. Self-Amplification Principle 2. Competition Principle 3. Cooperation Principle 4. Structural Information 8.3 Self-Organized Feature Analysis 8.4 Principal-Components Analysis: Perturbation Theory 8.5 Hebbian-Based Maximum Eigenfilter 8.6 Hebbian-Based Principal-Components Analysis 8.7 Case Study: Image Coding 8.8 Kernel Principal-Components Analysis 8.9 Basic Issues Involved in […]

Computational Neuroscience | Course | MS CogSci

Range: 8.1~8.7, 9.1~9.10, 10.1~10.14, 10.19~10.21 Chapter 8. Principal-Components Analysis 8.1. Introduction Self-organized learning Self-organized learning is a type of unsupervised learning. Locality of learning 8.2. Principles of Self-Organization Principle 1: self-amplification The following rule is based on Hebb's postulate of learning. If the two neurons on either side of a synapse are activated simultaneously, then the synaptic strength is selectively […]
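
Written out in Haykin's usual notation (presynaptic signal $x_j$, postsynaptic signal $y_k$, learning-rate parameter $\eta$; the excerpt is cut off, so this notation is an assumption), Hebb's postulate gives the basic self-amplification update

\[ \Delta w_{kj}(n) = \eta\, y_k(n)\, x_j(n), \]

and the normalized form that keeps the weights from growing without bound, which underlies the Hebbian-based maximum eigenfilter of Section 8.5, is

\[ \Delta w_{kj}(n) = \eta\, y_k(n)\bigl(x_j(n) - y_k(n)\, w_{kj}(n)\bigr). \]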

Sequence to Sequence Learning with Neural Networks | Summary

References Ilya Sutskever, Oriol Vinyals, Quoc V. Le (2014). “Sequence to Sequence Learning with Neural Networks”. NIPS 2014: 3104-3112. [PDF] Sequence-to-Sequence Models (the official tutorial for sequence-to-sequence models). TensorFlow. [LINK] Seq2seq Library (contrib). TensorFlow. [LINK] Translation with a Sequence to Sequence Network and Attention. PyTorch. [LINK]

Deep Learning | Udacity

Brief Information Instructor: Vincent Vanhoucke (Principal Scientist at Google Brain) Platform: Udacity Course homepage: https://www.udacity.com/course/deep-learning--ud730 Duration: 2017-08-24~25. Took Lessons 1 and 3-7 without programming assignments. Course Overview Lesson 1: From Machine Learning to Deep Learning Lesson 2: Assignment: notMNIST Lesson 3: Deep Neural Networks Lesson 4: Convolutional Neural Networks Lesson 5: Deep Models for Text and Sequences Lesson […]

Neural Networks for Machine Learning | by Geoffrey Hinton | Coursera

Brief Information Course name: Neural Networks for Machine Learning Lecturer: Geoffrey Hinton Duration: Syllabus Record Certificate Learning outcome About this course Learn about artificial neural networks and how they’re being used for machine learning, as applied to speech and object recognition, image segmentation, modeling language and human motion, etc. We’ll emphasize both the basic algorithms […]