13 February 2018

[Data Science] Thoughts After Completing Coursera's Deep Learning 5-Course Specialisation

I have just completed Coursera's 5-Course Specialisation in Deep Learning taught by Andrew Ng. I had been looking forward to this specialisation ever since taking Ng's Machine Learning course, also on Coursera, and wanted to learn a little more about Deep Learning.

Course Structure

The specialisation consists of 5 courses:

1. Neural Networks and Deep Learning 
This course introduces Neural Networks and Deep Learning. Since I had taken the Machine Learning course, the first half of this course was not totally new to me. Deep Learning is a neural network on steroids: more hidden layers, each of which can hold many units. The exciting part about this course is that I got to code Deep Learning algorithms by hand using Python and NumPy - no Tensorflow or Keras, yet.
2. Improving Deep Neural Networks: Hyperparameter tuning, Regularisation and Optimization 
One of the powerful features of Deep Nets is the ability to learn complex relationships. The trade-off for this power is the large number of knobs, or hyperparameters, that the algorithm needs tuned. This course is devoted to tuning those hyperparameters. It can be quite dry. The only respite is when Ng starts to touch on optimisation algorithms. I found this part very interesting, and I now understand what goes on behind the scenes when I type something like opt = AdamOptimizer(...)
3. Structuring Machine Learning Projects 
This is one of the shorter courses, and also one of the drier ones, so I shall not describe it in much detail. The next two courses are the "specialisations" of the specialisation.
4. Convolutional Neural Networks 
Or ConvNets for short. They are used for image recognition (or computer vision). Ever wondered how a security camera is able to pick up faces, verify objects or flag suspicious articles? That is likely the work of ConvNets. It took me a while to gain even a fuzzy understanding of this powerful application of Deep Nets.
5. Sequence Models 
What happens when a time dimension is added to the data? The data becomes a time series, or a sequence. The Recurrent Neural Network is an application of Deep Nets to time-series or sequence data. This is also a short course, but a pretty heavy one. It deals mostly with Natural Language Processing (NLP) and Machine Translation. There is also a side project on jazz music improvisation, which I found interesting too.
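To make the optimisation discussion in course 2 concrete: here is a minimal NumPy sketch of a single Adam update step, the sort of thing a call like opt = AdamOptimizer(...) hides. The function name, variable names and the toy objective are my own, not from the course.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: momentum and RMS estimates with bias correction."""
    m = beta1 * m + (1 - beta1) * grad       # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2  # second-moment (RMS) estimate
    m_hat = m / (1 - beta1 ** t)             # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Minimise the toy objective f(theta) = theta^2, whose gradient is 2*theta
theta, m, v = 5.0, 0.0, 0.0
for t in range(1, 201):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t, lr=0.1)
print(theta)  # close to the minimum at 0
```

The bias-correction terms are what distinguish Adam from plain momentum-plus-RMSprop: they compensate for m and v starting at zero.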
The programming exercises are coded in Python using Jupyter Notebooks. They are interesting! Along the way, there are opportunities to use Deep Learning frameworks like Keras and Tensorflow. Keras, by the way, is a simpler but less flexible high-level API that runs on top of Tensorflow.
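As a flavour of the by-hand NumPy style of the exercises, here is a naive sketch of the "valid" convolution at the heart of ConvNets. This is my own simplified version (no padding, stride 1, single channel), not code from the course.

```python
import numpy as np

def conv2d(image, kernel):
    """Naive 'valid' 2D convolution: slide the kernel over the image,
    no padding, stride 1."""
    h, w = kernel.shape
    out_h = image.shape[0] - h + 1
    out_w = image.shape[1] - w + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + h, j:j + w] * kernel)
    return out

# A 1x2 edge filter on an image that is bright on the left, dark on the right
image = np.array([[10, 10, 0, 0]] * 4, dtype=float)
kernel = np.array([[1, -1]], dtype=float)
print(conv2d(image, kernel))  # each row is [0., 10., 0.] - the edge lights up
```

Frameworks like Tensorflow vectorise this over channels, filters and batches, but the sliding-window idea is the same.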


Take-Away

The reasons I took this specialisation up are:

  1. Deep Learning is a building block of artificial intelligence (AI), a topic I have grown very interested in. I have always believed that AI, if managed properly, can bring more equity to mankind.
  2. My current work involves analysis tasks which I think can be automated. With some knowledge of Deep Learning, I believe I can build something AI-ish, or at least a proof-of-concept, to improve work efficiency. If complicated tasks like image recognition, jazz improvisation and even translation can be accomplished with Deep Learning, I believe my work can benefit from it too; honestly, I do not think the work I do is more complicated than what Deep Learning can accomplish.
  3. Well, "Andrew Ng" has become something of a household name in machine learning.

I gained a better understanding of Deep Learning and its related concepts. However, I have come to realise a few "inconvenient" aspects of learning from online resources, based on my personal experience:

  1. In online courses, the data are provided - cleaned, transformed and ready to use. In real life, data can be a pain to obtain, let alone clean and validate, and all of that consumes much time.
  2. Applying the concepts requires much self-study, especially to learn to use Keras and Tensorflow. This mirrors real life. Stack Overflow remains my best friend.

Nevertheless, the learning continues.

~ZF