I’ve been taking the Deep Learning Specialization on Coursera, starting with the first course, Neural Networks and Deep Learning. This is a revamped specialization following the original Machine Learning course presented by Dr. Andrew Ng, a globally recognized leader in Artificial Intelligence.
Having completed the first course on Neural Networks and Deep Learning, here are some pros and cons of the new series (so far).
Pros
- The course now uses Python instead of Octave. While Octave was a decent platform, it always felt kludgy having to install a separate IDE bundled with the Octave interpreter. Ultimately, I felt I would have to relearn everything in a more relevant framework if I ever wanted to start exploring ML in my actual work.
- Along with the switch to Python, you now learn the ML tooling that comes with it, like numpy. Granted, I have never been a big numpy user, and dataframes are still very foreign to me. But remember, I’m the SQL Tech Lead, so I have very little use for manipulating data in dataframes when I could do it more quickly and efficiently in SQL. That being said, this course really shows the power of numpy for Machine Learning (see the sketch after this list). I have also been coming across more SQL-for-Machine-Learning content, so perhaps there is hope for a dinosaur like myself to eventually break into this field.
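For anyone who, like me, hasn’t spent much time with numpy, here is a minimal sketch of the vectorized style the course leans on. This is my own toy example, not taken from the course; the shapes and variable names are just illustrative. The point is that one matrix product scores thousands of examples at once, with no explicit Python loop.

```python
import numpy as np

# Toy setup: 10,000 "images" flattened into feature vectors (made-up sizes).
num_features, num_examples = 64 * 64 * 3, 10_000
X = np.random.rand(num_features, num_examples)   # each column is one flattened image
w = np.random.rand(num_features, 1)              # weight vector
b = 0.5                                          # bias term

# Vectorized forward pass: one line covers all 10,000 examples.
z = np.dot(w.T, X) + b          # shape (1, num_examples)
a = 1 / (1 + np.exp(-z))        # elementwise sigmoid, also vectorized
```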
Cons
- The course is still a little too math-y for my taste. I didn’t complete the original course, probably because of the combination of the math-heaviness and having to work in Octave. At least this time around the Octave issue has been solved. Don’t get me wrong, I’m not one to shy away from a good math kick in the butt. But it would be good to tie the math more closely to the process of deciding whether we’re looking at a picture of a cat or a not-cat.
- To expand on the math-heaviness: I was able to complete all of the assignments with a good understanding of how the math translates to code (see the sketch after this list), but with very little understanding of how the math or the code actually recognizes a cat or not-cat in a picture. The code just worked as long as you paid close attention to the formulas.
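To make the “formulas translate to code” point concrete, here is roughly what a single forward/backward pass of logistic regression for cat vs. not-cat looks like in numpy. The function name, shapes, and variable names are my own choices for illustration, not the graded assignment itself.

```python
import numpy as np

def propagate(w, b, X, Y):
    """One forward/backward pass of logistic regression for cat vs. not-cat.

    X is (num_features, m) with one flattened image per column;
    Y is (1, m) with labels 1 = cat, 0 = not-cat.
    Each line maps almost one-to-one onto a formula from the lectures.
    """
    m = X.shape[1]
    A = 1 / (1 + np.exp(-(np.dot(w.T, X) + b)))                   # sigmoid(w.T x + b)
    cost = -np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A)) / m   # cross-entropy cost
    dw = np.dot(X, (A - Y).T) / m                                 # gradient w.r.t. w
    db = np.sum(A - Y) / m                                        # gradient w.r.t. b
    return dw, db, cost
```

You can follow every line back to a formula, which is exactly why the assignments are passable without ever quite seeing why the result recognizes cats.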
All that said, I’ve learned a great deal about logistic regression and neural networks. I’m looking forward to the remaining courses in the specialization, and hopefully these gaps in my understanding will eventually be filled.