François Chung, Ph.D.

Tag: neural network

Deep learning and TensorFlow

Cognitive Class training, MOOC (2020). This learning path presents the basic concepts of deep learning and TensorFlow, with hands-on experience in solving problems. Throughout the training, TensorFlow is used for curve fitting, regression, classification and the minimization of error functions. These concepts are then carried over to deep learning, where TensorFlow applies backpropagation to tune the weights and biases while the neural networks are being trained.
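
As a concrete illustration of the curve fitting and error minimization mentioned above, the sketch below fits a quadratic curve with TensorFlow by minimizing a mean squared error with gradient descent; the synthetic data, coefficient names and learning rate are illustrative assumptions, not material taken from the course.

import numpy as np
import tensorflow as tf

# Synthetic data around y = 2x^2 - 3x + 1 with a little noise (illustrative).
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200).astype(np.float32)
y = 2.0 * x**2 - 3.0 * x + 1.0 + rng.normal(0.0, 0.1, x.shape).astype(np.float32)

# Trainable coefficients of the model y_hat = a*x^2 + b*x + c.
a = tf.Variable(0.0)
b = tf.Variable(0.0)
c = tf.Variable(0.0)
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

for step in range(500):
    with tf.GradientTape() as tape:
        y_hat = a * x**2 + b * x + c
        loss = tf.reduce_mean(tf.square(y_hat - y))   # mean squared error
    grads = tape.gradient(loss, [a, b, c])            # backpropagation
    optimizer.apply_gradients(zip(grads, [a, b, c]))  # gradient descent step

print(a.numpy(), b.numpy(), c.numpy())  # should approach 2, -3, 1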

Course 1: Deep learning fundamentals

Main topics:

  • Introduction to deep learning;
  • Deep learning models;
  • Additional deep learning models;
  • Deep learning platforms and libraries.

Course 2: Deep learning with TensorFlow

Main topics:

  • Introduction to TensorFlow;
  • CNN - Convolutional Neural Network (see the sketch after this list);
  • RNN - Recurrent Neural Network;
  • Unsupervised learning.
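
As a companion to the CNN topic in the list above, here is a minimal sketch of a convolutional network built with tf.keras; the layer sizes and the 28x28 grayscale input shape are illustrative assumptions rather than the architecture used in the course.

import tensorflow as tf

# Minimal convolutional network for 28x28 grayscale images (illustrative shape).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(32, kernel_size=3, activation="relu"),
    tf.keras.layers.MaxPooling2D(pool_size=2),
    tf.keras.layers.Conv2D(64, kernel_size=3, activation="relu"),
    tf.keras.layers.MaxPooling2D(pool_size=2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),  # 10 classes, e.g. digits
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()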

References

Training

Deep learning fundamentals (course certificate)
Deep Learning Essentials (certification badge)
Deep learning with TensorFlow (course certificate)
Deep Learning using TensorFlow (certification badge)

Neural networks and deep learning

Coursera training, MOOC (2018). Given online by Stanford University (US), this training introduces the foundations of deep learning. The main objectives are to understand the major technology trends driving deep learning, to be able to build, train and apply fully connected deep neural networks, to know how to implement them, and to understand their key parameters. The training aims to teach how deep learning actually works, rather than giving only a surface-level description.

Week 1: Introduction to deep learning

Main topics:

  • What is a neural network?
  • Supervised learning with neural networks;
  • Why is deep learning taking off?

Week 2: Neural networks basics

Main topics:

  • Binary classification;
  • Logistic regression;
  • Gradient descent;
  • Derivatives with a computation graph;
  • Vectorizing logistic regression (see the sketch after this list).
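
To make the logistic regression, gradient descent and vectorization topics concrete, the sketch below trains a vectorized logistic regression with plain NumPy; the synthetic data, learning rate and iteration count are illustrative assumptions, not the course's reference code.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic binary classification data: X has shape (n_features, m_examples).
rng = np.random.default_rng(0)
m, n = 500, 2
X = rng.normal(size=(n, m))
Y = (X[0] + X[1] > 0).astype(float).reshape(1, m)

# Parameters initialized to zero (sufficient for logistic regression).
w = np.zeros((n, 1))
b = 0.0
alpha = 0.5  # learning rate

for _ in range(1000):
    # Forward pass, vectorized over all m examples at once.
    A = sigmoid(w.T @ X + b)    # shape (1, m)
    # Gradients of the cross-entropy cost.
    dZ = A - Y                  # shape (1, m)
    dw = (X @ dZ.T) / m         # shape (n, 1)
    db = float(np.sum(dZ)) / m
    # Gradient descent update.
    w -= alpha * dw
    b -= alpha * db

accuracy = float(np.mean((sigmoid(w.T @ X + b) > 0.5) == (Y > 0.5)))
print(f"training accuracy: {accuracy:.3f}")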

Week 3: Shallow neural networks

Main topics:

  • Neural network representation;
  • Computing a neural network's output;
  • Vectorizing across multiple examples;
  • Activation functions and their derivatives (see the sketch after this list);
  • Gradient descent for neural networks;
  • Random initialization.
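
As a small companion to the activation-function topic above, the sketch below shows common activations and their derivatives as they would be used in a backpropagation step; the particular functions shown (sigmoid, tanh, ReLU) are a standard choice and only an assumption about the exact course material.

import numpy as np

# Common activation functions and their derivatives.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

def tanh_prime(z):
    return 1.0 - np.tanh(z) ** 2

def relu(z):
    return np.maximum(0.0, z)

def relu_prime(z):
    return (z > 0).astype(float)

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sigmoid(z), sigmoid_prime(z))
print(np.tanh(z), tanh_prime(z))
print(relu(z), relu_prime(z))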

Week 4: Deep neural networks

Main topics:

  • Multilayer neural network;
  • Forward and backward propagation (see the sketch after this list);
  • Building blocks of deep neural networks;
  • Parameters vs hyperparameters.
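
The sketch below walks through the forward and backward propagation building blocks for a small two-layer network, and makes the parameters (weights and biases) versus hyperparameters (layer sizes, learning rate, iterations) distinction visible; the architecture, activation choices and synthetic data are illustrative assumptions, not the course's reference implementation.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Two-layer network: input -> hidden (ReLU) -> output (sigmoid).
# Hyperparameters: layer sizes, learning rate, number of iterations.
n_x, n_h, n_y, m = 2, 4, 1, 200
alpha = 0.5
X = rng.normal(size=(n_x, m))
Y = (X[0] * X[1] > 0).astype(float).reshape(1, m)

# Parameters: weights randomly initialized, biases set to zero.
W1 = rng.normal(size=(n_h, n_x)) * 0.01
b1 = np.zeros((n_h, 1))
W2 = rng.normal(size=(n_y, n_h)) * 0.01
b2 = np.zeros((n_y, 1))

for _ in range(2000):
    # Forward propagation.
    Z1 = W1 @ X + b1
    A1 = np.maximum(0.0, Z1)        # ReLU
    Z2 = W2 @ A1 + b2
    A2 = sigmoid(Z2)

    # Backward propagation (gradients of the cross-entropy cost).
    dZ2 = A2 - Y
    dW2 = (dZ2 @ A1.T) / m
    db2 = np.sum(dZ2, axis=1, keepdims=True) / m
    dZ1 = (W2.T @ dZ2) * (Z1 > 0)   # ReLU derivative
    dW1 = (dZ1 @ X.T) / m
    db1 = np.sum(dZ1, axis=1, keepdims=True) / m

    # Parameter update (gradient descent).
    W1 -= alpha * dW1
    b1 -= alpha * db1
    W2 -= alpha * dW2
    b2 -= alpha * db2

accuracy = float(np.mean((A2 > 0.5) == (Y > 0.5)))
print(f"training accuracy: {accuracy:.3f}")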
