
Learning Deep Learning: Theory and Practice of Neural Networks, Computer Vision, NLP, and Transformers using TensorFlow

Description

NVIDIA’s Full-Color Guide to Deep Learning with TensorFlow: All You Need to Get Started and Get Results

Deep learning is a key component of today’s exciting advances in machine learning and artificial intelligence. Learning Deep Learning is a complete guide to deep learning with TensorFlow, the #1 Python library for building these breakthrough applications. Illuminating both the core concepts and the hands-on programming techniques needed to succeed, this book is ideal for developers, data scientists, analysts, and others, including those with no prior machine learning or statistics experience.

After introducing the essential building blocks of deep neural networks, Magnus Ekman shows how to use fully connected feedforward networks and convolutional networks to solve real problems, such as predicting housing prices or classifying images. You’ll learn how to represent words from a natural language, capture semantics, and develop a working natural language translator. With that foundation in place, Ekman then guides you through building a system that inputs images and describes them in natural language.
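To give a flavor of the kind of model the book starts with, here is a minimal sketch of a fully connected feedforward network trained by gradient descent for a regression task. This is an illustrative sketch only, not code from the book: it uses plain NumPy rather than TensorFlow, and the synthetic data, layer sizes, and learning rate are made up for the example.

```python
import numpy as np

# Hypothetical synthetic regression data (stand-in for something like
# housing prices): 200 samples, 3 features, a known linear target.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.3

# One hidden layer: 3 inputs -> 8 tanh units -> 1 linear output.
W1 = rng.normal(0, 0.5, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.1

for _ in range(2000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    pred = (h @ W2 + b2).ravel()
    # Gradient of mean squared error w.r.t. the predictions
    g_pred = (2.0 / len(y)) * (pred - y)[:, None]
    # Backpropagation through both layers
    gW2 = h.T @ g_pred
    gb2 = g_pred.sum(axis=0)
    g_h = (g_pred @ W2.T) * (1.0 - h ** 2)   # tanh derivative
    gW1 = X.T @ g_h
    gb1 = g_h.sum(axis=0)
    # Gradient-descent parameter updates
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Final training error after fitting
h = np.tanh(X @ W1 + b1)
pred = (h @ W2 + b2).ravel()
mse = float(np.mean((pred - y) ** 2))
```

In the book itself the same kind of network is expressed far more compactly with the Keras API, which handles the forward pass, backpropagation, and updates automatically.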

Throughout, Ekman provides concise, well-annotated code examples using TensorFlow and the Keras API. (For comparison and easy migration between frameworks, complementary PyTorch examples are provided online.) He concludes by previewing trends in deep learning, exploring important ethical issues, and providing resources for further learning.

  • Master core concepts: perceptrons, gradient-based learning, sigmoid neurons, and backpropagation
  • See how frameworks make it easier to develop more robust and useful neural networks
  • Discover how convolutional neural networks (CNNs) revolutionize classification and analysis
  • Use recurrent neural networks (RNNs) to model text, speech, and other variable-length sequences
  • Master long short-term memory (LSTM) techniques for natural language generation and other applications
  • Move further into natural language processing (NLP), including understanding and translation
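The first bullet's starting point, the perceptron learning rule, can be sketched in a few lines. This is a hedged illustration, not the book's code: the toy data, labels, and epoch count are invented for the example.

```python
import numpy as np

# Hypothetical linearly separable toy data: 100 points in 2D,
# labeled +1/-1 by which side of the line x1 + x2 = 0 they fall on.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(100, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)

# Classic perceptron learning rule: on each misclassified point,
# nudge the weights toward (or away from) that point.
w = np.zeros(2)
b = 0.0
for _ in range(1000):                     # epoch cap
    errors = 0
    for xi, yi in zip(X, y):
        if yi * (xi @ w + b) <= 0:        # misclassified
            w += yi * xi
            b += yi
            errors += 1
    if errors == 0:                       # converged: all points correct
        break

accuracy = float(np.mean(np.sign(X @ w + b) == y))
```

Because the data is linearly separable, the perceptron convergence theorem guarantees the rule eventually classifies every training point correctly; gradient-based learning and backpropagation, covered next in the book, generalize this idea to multi-layer networks.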