Learning deep learning : theory and practice of neural networks, computer vision, natural language processing, and transformers using TensorFlow / Magnus Ekman
Material type:
- ISBN: 9780137470358
- LC call number: Q 325.5 .E36 2022

Item type | Current library | Home library | Collection | Call number | Copy number | Status | Date due | Barcode
---|---|---|---|---|---|---|---|---
 | National University - Manila | LRC - Main, General Circulation | Machine Learning | GC Q 325.5 .E36 2022 | c.1 | Available | | NULIB000019603
Browsing LRC - Main shelves, Shelving location: General Circulation, Collection: Machine Learning:

- GC Q 325.5 .B47 2021 : Introduction to machine learning /
- GC Q 325.5 .B53 2019 c.1 : Machine learning A-Z : introduction to AI digital brains of the future /
- GC Q 325.5 .C43 2017 : Machine learning algorithms : fundamental algorithms for supervised and unsupervised learning /
- GC Q 325.5 .E36 2022 : Learning deep learning : theory and practice of neural networks, computer vision, natural language processing, and transformers using tensorflow /
- GC Q 325.5 .G47 2017 : Hands-on machine learning with Scikit-Learn and TensorFlow : concepts, tools, and techniques to build intelligent systems /
- GC Q 325.5 .H69 2020 : Deep learning for coders with fastai and pytorch : AI applications without a PhD /
- GC Q 325.5 .K45 2019 : Deep learning /
Includes index.
Learning Deep Learning is a complete guide to DL. Illuminating both the core concepts and the hands-on programming techniques needed to succeed, this text is suitable for students with prior programming experience but no prior machine learning or statistics background. After introducing the essential building blocks of deep neural networks, such as artificial neurons and fully connected, convolutional, and recurrent layers, Ekman shows how to use them to build advanced architectures, including the Transformer.
1. The Rosenblatt Perceptron -- 2. Gradient-Based Learning -- 3. Sigmoid Neurons and Backpropagation -- 4. Fully Connected Networks Applied to Multiclass Classification -- 5. Toward DL: Frameworks and Network Tweaks -- 6. Fully Connected Networks Applied to Regression -- 7. Convolutional Neural Networks Applied to Image Classification -- 8. Deeper CNNs and Pretrained Models -- 9. Predicting Time Sequences with Recurrent Neural Networks -- 10. Long Short-Term Memory -- 11. Text Autocompletion with LSTM and Beam Search -- 12. Neural Language Models and Word Embeddings -- 13. Word Embeddings from word2vec and GloVe -- 14. Sequence-to-Sequence Networks and Natural Language Translation -- 15. Attention and the Transformer -- 16. One-to-Many Network for Image Captioning -- 17. Medley of Additional Topics -- 18. Summary and Next Steps.
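The opening chapter's topic, the Rosenblatt perceptron, lends itself to a tiny worked example. The sketch below (illustrative only, not code from the book; function names are my own) trains a perceptron with the classic mistake-driven update rule to learn the logical AND function in plain Python:

```python
# Minimal Rosenblatt perceptron learning logical AND.
# Illustrative sketch; function names and hyperparameters are assumptions.

def train_perceptron(samples, epochs=10, lr=0.1):
    """Train bias + two weights with the perceptron learning rule."""
    w = [0.0, 0.0, 0.0]  # w[0] is the bias weight
    for _ in range(epochs):
        for (x1, x2), target in samples:
            z = w[0] + w[1] * x1 + w[2] * x2
            pred = 1 if z > 0 else 0     # sign-threshold activation
            err = target - pred          # nonzero only on a misclassification
            w[0] += lr * err             # bias update (input fixed at 1)
            w[1] += lr * err * x1
            w[2] += lr * err * x2
    return w

def predict(w, x1, x2):
    return 1 if w[0] + w[1] * x1 + w[2] * x2 > 0 else 0

# AND truth table as (inputs, label) pairs
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = train_perceptron(data)
print([predict(w, x1, x2) for (x1, x2), _ in data])  # prints [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this rule finds a separating set of weights in finitely many updates; later chapters replace the hard threshold with differentiable activations so gradient-based learning applies.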