Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization | Deep Learning Specialization | Coursera

Brief information
Course name: Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization
Instructor: Andrew Ng
Institution: deeplearning.ai
Platform: Coursera
Specialization: Deep Learning
Duration: 3 weeks

About this Course
This course will teach you the “magic” of getting deep learning to work well. Rather than treating the deep learning process as a black box, you will understand […]

Structuring Machine Learning Projects | Deep Learning Specialization | Coursera

Brief information
Course name: Structuring Machine Learning Projects
Instructor: Andrew Ng
Institution: deeplearning.ai
Platform: Coursera
Specialization: Deep Learning
Duration: 2 weeks

About this Course
You will learn how to build a successful machine learning project. If you aspire to be a technical leader in AI and to know how to set direction for your team’s work, this […]

Conditional Generative Adversarial Nets | M. Mirza, S. Osindero | 2014

Introduction
A conditional version of Generative Adversarial Nets (GANs) in which both the generator and the discriminator are conditioned on some extra data y (a class label, or data from some other modality).

Architecture
Feed y into both the generator and the discriminator as an additional input layer, so that y and the original input are combined in a joint hidden representation.
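As a rough illustration of this conditioning scheme, here is a minimal PyTorch sketch; the layer sizes, the MNIST-like 784-dimensional inputs, and the 10-way one-hot labels are assumptions for illustration, not the exact configuration from the paper.

```python
# Minimal conditional-GAN sketch (illustrative sizes, not the paper's):
# z and y each get their own input layer, and their activations are
# concatenated into a joint hidden representation.
import torch
import torch.nn as nn

class ConditionalGenerator(nn.Module):
    def __init__(self, z_dim=100, y_dim=10, x_dim=784):
        super().__init__()
        self.z_in = nn.Linear(z_dim, 200)   # input layer for the noise z
        self.y_in = nn.Linear(y_dim, 200)   # input layer for the condition y
        self.joint = nn.Sequential(nn.ReLU(), nn.Linear(400, x_dim), nn.Sigmoid())

    def forward(self, z, y):
        h = torch.cat([self.z_in(z), self.y_in(y)], dim=1)  # joint representation
        return self.joint(h)

class ConditionalDiscriminator(nn.Module):
    def __init__(self, x_dim=784, y_dim=10):
        super().__init__()
        self.x_in = nn.Linear(x_dim, 200)   # input layer for the sample x
        self.y_in = nn.Linear(y_dim, 200)   # input layer for the condition y
        self.joint = nn.Sequential(nn.ReLU(), nn.Linear(400, 1), nn.Sigmoid())

    def forward(self, x, y):
        h = torch.cat([self.x_in(x), self.y_in(y)], dim=1)  # joint representation
        return self.joint(h)
```

Sampling a specific class then amounts to fixing y (e.g. a one-hot digit label) while varying z.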

Studying Generative Adversarial Networks (GANs)

References
Lecture 13: Generative Models. CS231n: Convolutional Neural Networks for Visual Recognition. Spring 2017. [SLIDE][VIDEO]
Goodfellow et al. (2014). Generative Adversarial Nets. NIPS 2014. [LINK][arXiv]
Soumith Chintala. How to Train a GAN? Tips and tricks to make GANs work. GitHub. [LINK]
Avinash Hindupur. The GAN Zoo. GitHub. [LINK]
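To fix ideas, here is a minimal sketch of one alternating training step in the spirit of Goodfellow et al. (2014), using the non-saturating generator loss recommended in Chintala's tips; the generator G, discriminator D (assumed to output a sigmoid probability of shape (n, 1)), and the two optimizers are hypothetical and defined elsewhere.

```python
# One alternating GAN training step: update D on real vs. fake,
# then update G to make D label its samples as real.
import torch
import torch.nn.functional as F

def gan_step(G, D, opt_g, opt_d, real, z_dim=100):
    n = real.size(0)
    z = torch.randn(n, z_dim)

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    opt_d.zero_grad()
    d_loss = (F.binary_cross_entropy(D(real), torch.ones(n, 1))
              + F.binary_cross_entropy(D(G(z).detach()), torch.zeros(n, 1)))
    d_loss.backward()
    opt_d.step()

    # Generator step (non-saturating loss): push D(G(z)) toward 1.
    opt_g.zero_grad()
    g_loss = F.binary_cross_entropy(D(G(z)), torch.ones(n, 1))
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```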

Inception Module | Summary

References
Udacity (2016. 6. 6.). Inception Module. YouTube. [LINK]
Udacity (2016. 6. 6.). 1×1 Convolutions. YouTube. [LINK]
Tommy Mulc (2016. 9. 25.). Inception modules: explained and implemented. [LINK]
Szegedy et al. (2015). Going Deeper with Convolutions. CVPR 2015. [arXiv]

Summary
History: The inception module was first introduced in GoogLeNet for the ILSVRC’14 competition.
Key concept: Let a convolutional network decide […]
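A minimal PyTorch sketch of that key concept: instead of committing to one filter size, run parallel 1×1, 3×3, and 5×5 convolutions plus a pooling branch and concatenate them along the channel axis, with 1×1 bottlenecks keeping the computation cheap. The channel counts below are illustrative, not GoogLeNet's.

```python
# Inception-style block: the network "decides" how much each filter
# size matters by learning the weights of the parallel branches.
import torch
import torch.nn as nn

class InceptionBlock(nn.Module):
    def __init__(self, in_ch):
        super().__init__()
        self.branch1 = nn.Conv2d(in_ch, 16, kernel_size=1)
        self.branch3 = nn.Sequential(
            nn.Conv2d(in_ch, 16, kernel_size=1),            # 1x1 bottleneck
            nn.Conv2d(16, 24, kernel_size=3, padding=1))
        self.branch5 = nn.Sequential(
            nn.Conv2d(in_ch, 16, kernel_size=1),            # 1x1 bottleneck
            nn.Conv2d(16, 24, kernel_size=5, padding=2))
        self.branch_pool = nn.Sequential(
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
            nn.Conv2d(in_ch, 24, kernel_size=1))

    def forward(self, x):
        # Same spatial size on every branch, so outputs concatenate cleanly.
        return torch.cat([self.branch1(x), self.branch3(x),
                          self.branch5(x), self.branch_pool(x)], dim=1)
```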

Batch Normalization | Summary

References
Sergey Ioffe, Christian Szegedy (2015). Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. ICML 2015. [ICML][arXiv]
Lecture 6: Training Neural Networks, Part 1. CS231n: Convolutional Neural Networks for Visual Recognition. 48:52~1:04:39. [YouTube]
Choung young jae (2017. 7. 2.). PR-021: Batch Normalization. YouTube. [YouTube]
tf.nn.batch_normalization. TensorFlow. [LINK]
Rui Shu (27 Dec 2016). A Gentle […]
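As a quick reminder of what the Ioffe & Szegedy transform computes, here is a minimal numpy sketch of the training-time forward pass; the inference-time running statistics and the backward pass are omitted.

```python
# Batch norm, training-time forward pass: normalize each feature over
# the mini-batch, then scale and shift with learned gamma and beta.
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """x: (N, D) mini-batch; gamma, beta: (D,) learned parameters."""
    mu = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                    # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalized activations
    return gamma * x_hat + beta            # scale and shift (y = BN(x))
```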

Convolutional Neural Networks | Study

References
L. Fei-Fei, Justin Johnson (Spring 2017). CS231n: Convolutional Neural Networks for Visual Recognition. [LINK]
Jefkine (5 September 2016). Backpropagation In Convolutional Neural Networks. [LINK]
Convnet: Implementing Convolution Layer with Numpy. [LINK]
Backpropagation in CNNs (in Korean). [LINK]
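In the spirit of the numpy-implementation references above, a minimal sketch of a naive 2-D convolution (strictly, cross-correlation, as in most deep-learning libraries) for a single channel with no stride or padding; efficient schemes such as im2col are left to the references.

```python
# Naive single-channel 2-D convolution: slide the filter over the
# input and take a dot product at each position.
import numpy as np

def conv2d_naive(x, w):
    """x: (H, W) input; w: (kH, kW) filter -> (H-kH+1, W-kW+1) output."""
    H, W = x.shape
    kH, kW = w.shape
    out = np.zeros((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kH, j:j + kW] * w)
    return out
```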

Neural Networks and Learning Machines. 3rd Ed. Simon O. Haykin. Pearson. 2008

Chapter 8. Principal-Components Analysis
8.1 Introduction
8.2 Principles of Self-Organization
Principle 1. Self-Amplification
Principle 2. Competition
Principle 3. Cooperation
Principle 4. Structural Information
8.3 Self-Organized Feature Analysis
8.4 Principal-Components Analysis: Perturbation Theory
8.5 Hebbian-Based Maximum Eigenfilter
8.6 Hebbian-Based Principal-Components Analysis
8.7 Case Study: Image Coding
8.8 Kernel Principal-Components Analysis
8.9 Basic Issues Involved in […]

Computational Neuroscience | Course | MS CogSci

Range
8.1~8.7, 9.1~9.10, 10.1~10.14, 10.19~10.21

Chapter 8. Principal-Components Analysis
8.1. Introduction
Self-organized learning is a type of unsupervised learning; a key property is the locality of learning.
8.2. Principles of Self-Organization
Principle 1: Self-amplification. The following rule is based on Hebb’s postulate of learning: if two neurons of a synapse are activated simultaneously, then the synaptic strength is selectively […]
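A minimal numpy sketch of the self-amplification principle: a plain Hebbian update (Δw = η·y·x) grows without bound, so Oja's normalized variant, which underlies the Hebbian-based maximum eigenfilter of Section 8.5, adds a decay term, and the weight vector then converges to the principal eigenvector of the input covariance. The data and learning rate below are illustrative.

```python
# Oja's rule: a Hebbian term (eta * y * x) stabilized by a decay term,
# so the neuron learns the first principal component of its input.
import numpy as np

rng = np.random.default_rng(0)
# 2-D inputs whose first component has the larger variance (9 vs. 1).
X = rng.normal(size=(10000, 2)) * np.array([3.0, 1.0])

w = rng.normal(size=2)
eta = 0.01
for x in X:
    y = w @ x                    # local: uses only this neuron's x, y, w
    w += eta * y * (x - y * w)   # Hebbian growth minus normalizing decay

print(w / np.linalg.norm(w))     # ~ +/-[1, 0], the principal eigenvector
```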

Sequence to Sequence Learning with Neural Networks | Summary

References
Ilya Sutskever, Oriol Vinyals, Quoc V. Le (2014). Sequence to Sequence Learning with Neural Networks. NIPS 2014: 3104-3112. [PDF]
Sequence-to-Sequence Models (the official tutorial for sequence-to-sequence models). TensorFlow. [LINK]
Seq2seq Library (contrib). TensorFlow. [LINK]
Translation with a Sequence to Sequence Network and Attention. PyTorch. [LINK]
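A minimal PyTorch sketch of the encoder-decoder idea from the Sutskever et al. paper: the encoder LSTM's final (hidden, cell) state is a fixed-size summary of the source sequence and initializes the decoder LSTM over the target sequence. The vocabulary sizes and dimensions are illustrative, and the paper's deep 4-layer LSTMs, input-sequence reversal, and beam-search decoding are omitted.

```python
# Encoder-decoder ("seq2seq") skeleton for teacher-forced training.
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, src_vocab=1000, tgt_vocab=1000, emb=64, hid=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.LSTM(emb, hid, batch_first=True)
        self.decoder = nn.LSTM(emb, hid, batch_first=True)
        self.out = nn.Linear(hid, tgt_vocab)

    def forward(self, src, tgt):
        # The encoder's final state is the fixed-size summary of src.
        _, state = self.encoder(self.src_emb(src))
        # The decoder starts from that state and reads the (shifted) target.
        dec_out, _ = self.decoder(self.tgt_emb(tgt), state)
        return self.out(dec_out)   # logits over the target vocabulary
```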