In Which Instructions were Sought, Part 3
Digging Deeper
*He shovels well, he shovels VERY well. (Wikipedia link, IMDB link)*
As I mentioned in the last blog (In Which Instructions were Sought, Part 2, link), I wanted to get more experience with some of the modern frameworks that are commonly used today. The Deep Learning Specialization covered the theoretical background well, and its assignments involved implementing the algorithms directly, which helped solidify my understanding. However, it did not go in depth into any of the frameworks, so in the many production environments where frameworks are used to speed up implementation, I would have no relevant experience. Having enjoyed the Coursera Deep Learning Specialization offered by deeplearning.ai, I wanted to take another course on the Coursera platform. I found two specializations which interested me:
TensorFlow in Practice Specialization, offered by deeplearning.ai (Coursera link)
Duration: 1 month for 4 courses (13 hours a week)
Machine Learning with TensorFlow on Google Cloud Platform Specialization, offered by Google (Coursera link)
Duration: 1 month for 5 courses (15 hours a week)
The courses offered by Google were promising since they were the ones who created TensorFlow. However, I didn't want to be limited to the Google Cloud Platform, even though there is probably some way to port anything I wrote to other platforms. I had a good experience with the last deeplearning.ai specialization, so I decided to take the TensorFlow specialization from deeplearning.ai as well.
Coursera TensorFlow in Practice Specialization Review
*"Only dead fish go with the flow" -Andy Hunt*
This specialization is much more practically oriented than the deep learning specialization. The instructor, Laurence Moroney, an AI Advocate at Google, spends hardly any time before diving right into the code, which was exactly what I was looking for. There are plenty of examples of training neural networks on test datasets, which can then be generalized to real-world cases in your own work. The courses also explain how to use the tools and set up your environment. Below is a short summary of the material that the courses cover.
______________________________________________________
Course 1: Introduction to TensorFlow for Artificial Intelligence, Machine Learning, and Deep Learning (4 weeks)
Week 1: A New Programming Paradigm (14 minutes of video)
Week 2: Introduction to Computer Vision (13 minutes of video)
Week 3: Enhancing Vision with Convolutional Neural Networks (15 minutes of video)
Week 4: Using Real-world Images (22 minutes of video)
Course 1 covers image recognition and classification using neural networks, with fashion items and horse/human classification as test cases.
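To give a flavor of what that looks like in code, here is a minimal sketch of the kind of classifier the course builds, using the Fashion MNIST dataset that ships with Keras. The layer sizes and epoch count are my own choices for illustration, not necessarily the ones used in the lectures.

```python
# A minimal Fashion MNIST classifier sketch (hyperparameters are my own choices).
import tensorflow as tf

# Load and normalize the 28x28 grayscale images to the [0, 1] range.
(train_images, train_labels), (test_images, test_labels) = \
    tf.keras.datasets.fashion_mnist.load_data()
train_images, test_images = train_images / 255.0, test_images / 255.0

# A simple dense network: flatten the image, one hidden layer, 10-class softmax output.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(train_images, train_labels, epochs=5)
model.evaluate(test_images, test_labels)
```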
______________________________________________________
Course 2: Convolutional Neural Networks in TensorFlow (4 weeks)
Week 1: Exploring a Larger Dataset (21 minutes of video)
Week 2: Augmentation: A technique to avoid overfitting (10 minutes of video)
Week 3: Transfer Learning (11 minutes of video)
Week 4: Multiclass Classifications (9 minutes of video)
Course 2 uses more complex neural networks to classify images of cats and dogs. It covers how to automatically generate augmented data, how to use transfer learning to take advantage of a pre-trained Inception V3 model to learn a classification quickly, and finally how to expand classification to multiple classes, in this case rock paper scissors.
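As a rough illustration of those two main ideas, here is a hedged sketch of on-the-fly augmentation with ImageDataGenerator and transfer learning from a pre-trained Inception V3 base. The directory path and the classification head are placeholders of my own, not the course's exact setup.

```python
# Sketch of data augmentation plus transfer learning from Inception V3.
# 'cats_and_dogs/train' is a hypothetical path, not the course's actual dataset location.
import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.applications.inception_v3 import InceptionV3

# Augmentation: random rotations, shifts, flips, and zooms are generated as images are read.
train_datagen = ImageDataGenerator(
    rescale=1.0 / 255,
    rotation_range=40,
    width_shift_range=0.2,
    height_shift_range=0.2,
    horizontal_flip=True,
    zoom_range=0.2)
train_generator = train_datagen.flow_from_directory(
    'cats_and_dogs/train', target_size=(150, 150),
    batch_size=32, class_mode='binary')

# Transfer learning: freeze the pre-trained convolutional base, train a new head on top.
base = InceptionV3(input_shape=(150, 150, 3), include_top=False, weights='imagenet')
base.trainable = False

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(256, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),  # binary output: cat vs. dog
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(train_generator, epochs=5)
```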
______________________________________________________
Course 3: Natural Language Processing in TensorFlow (4 weeks)
Week 1: Sentiment in text (24 minutes of video)
Week 2: Word Embeddings (35 minutes of video)
Week 3: Sequence models (13 minutes of video)
Week 4: Sequence models and literature (22 minutes of video)
Course 3 moves on to sequential data, in this case text, and goes over the nitty-gritty of how to preprocess a stream of text so that it is ready for consumption by a neural network. The neural networks are then used to predict sentiment and to generate text based on a corpus.
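Here is a small sketch of that preprocessing pipeline using the Keras Tokenizer and pad_sequences utilities. The example sentences, vocabulary size, and the little sentiment model on the end are made up purely for illustration.

```python
# Sketch of text preprocessing: tokenize, pad, then feed to an embedding-based model.
import numpy as np
import tensorflow as tf
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

sentences = ['I loved this movie', 'I did not like this movie at all']
labels = np.array([1, 0])  # toy sentiment labels

# Build a word index, convert sentences to integer sequences, pad to equal length.
tokenizer = Tokenizer(num_words=1000, oov_token='<OOV>')
tokenizer.fit_on_texts(sentences)
sequences = tokenizer.texts_to_sequences(sentences)
padded = pad_sequences(sequences, maxlen=10, padding='post')

# A small sentiment model: embedding layer, pooled, then a sigmoid output.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(1000, 16),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(padded, labels, epochs=5)
```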
______________________________________________________
Course 4: Sequences, Time Series and Prediction (4 weeks)
Week 1: Sequences and Prediction (29 minutes of video)
Week 2: Deep Neural Networks for Time Series (24 minutes of video)
Week 3: Recurrent Neural Networks for Time Series (16 minutes of video)
Week 4: Real-world time series data (20 minutes of video)
Course 4 looks at time-based sequential data. Some time is spent on techniques for generating artificial data with seasonality, trend, and noise, and on how to tune hyperparameters. Sunspot activity logs are used as the dataset that brings everything together in the final lesson on making predictions on real-world data.
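To illustrate the windowing idea that underpins the time series work, here is a minimal sketch that turns a one-dimensional series into (window, next value) training pairs with tf.data. The synthetic trend-plus-noise series is my own stand-in for the course's sunspot data, and the model is deliberately tiny.

```python
# Sketch of windowing a time series for supervised learning with tf.data.
import numpy as np
import tensorflow as tf

# Synthetic series: upward trend plus Gaussian noise (stand-in for real data).
time = np.arange(1000, dtype=np.float32)
series = 0.05 * time + np.random.normal(scale=1.0, size=1000).astype(np.float32)

def windowed_dataset(series, window_size=20, batch_size=32):
    ds = tf.data.Dataset.from_tensor_slices(series)
    # Slide a window of window_size + 1 values over the series, then split each
    # window into (first window_size values, last value) pairs.
    ds = ds.window(window_size + 1, shift=1, drop_remainder=True)
    ds = ds.flat_map(lambda w: w.batch(window_size + 1))
    ds = ds.map(lambda w: (w[:-1], w[-1]))
    return ds.shuffle(1000).batch(batch_size).prefetch(1)

dataset = windowed_dataset(series)

# A small dense model that predicts the next value from the previous 20.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation='relu', input_shape=(20,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer='adam', loss='mse')
model.fit(dataset, epochs=5)
```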
______________________________________________________
The courses follow the typical format of a series of video lectures every week, followed by a quiz (with unlimited attempts to get all the answers correct), followed by a programming assignment.
The instructor is friendly and easy to understand; however, the course material feels like it has not gone through the same thorough testing and review that the deep learning specialization had. That's understandable, considering that Andrew Ng has refined his material over many years. Still, it's disappointing when the instructor sometimes seems to stumble through a demo, figuring out how to optimize the neural network without a rigorous mental framework for why he is applying the specific changes he does. The courses also carry the underlying assumption that anything not explained has already been covered by Andrew Ng's courses, even when this is not specifically pointed out.
This specialization relies much more on you exploring the code yourself, figuring out what each part is doing, and working out how you can change and improve the code to increase performance. There is also a lot more reading involved: you are expected to read the manual pages for the various functions and algorithms you will be using, and figure out which options to use and which parameters to set.
Similar to the deep learning specialization, the assignments are Jupyter notebooks which run on Google Colab for free. The assignments were helpful in that they showed the use of many TensorFlow, Keras, and NumPy features, which you can follow up on with your own research to understand fully. The assignments are ungraded, however, with the solution available for you to look at when you are ready. This is less than ideal compared with the deep learning specialization, which has an automatic grader, and it gives this course a feeling of being unpolished or lower tech. However, I have to admit that having the solution available helped a few times when I ran into issues and there were no answers in the forums; posting a question in the forum is hit-or-miss as to whether it gets answered quickly, or answered at all, considering how few people were active at the time.
The last course in the specialization was only fully uploaded a few weeks ago, which means that its forums are mostly empty and there are very few students answering questions. At the time I took the course, there was no mentor monitoring the forums, so we students were left to figure out problems on our own.
In Closing
*That's all folks (for this entry)! (Wikipedia link, Goodreads link)*
Unfortunately, I cannot recommend this specialization very highly at this point. Perhaps in the future there will be enough updates to the course to make it worthwhile, but as of now, there just isn't enough meat for the courses to be worth the subscription fee (although you can always take the course for free). I suspect taking the fast.ai courses might have been a better use of my time, and I will give a more definitive answer after I have finished those courses.
However, at this point I feel like I have spent enough time studying, and want to get my hands dirty doing a project of my own. Follow along in my next blog, In Which a Project was Undertaken, Part 1 (link), as I mull over which project to pursue, and watch me flounder as I train a neural network of my own design for the first time!