Deep Learning (The MIT Press Essential Knowledge series)

1.299,00 EGP

Description




Customers say

Customers find the book’s introduction and explanations enlightening, and the content valuable for a broad audience. The pacing is described as good, with some advanced topics included as well. Overall, customers describe it as an excellent primer and a good refresher for machine learning.

AI-generated from the text of customer reviews


This Post Has 13 Comments

  1. Excellent but not so easy.
    The back cover indicates “An accessible introduction to AI …” OK, it is accessible if you have a pretty good background in calculus (including partial derivatives and the chain rule), regression, matrix algebra, advanced geometry, etc. You get the picture. But that is not the author’s fault. This is the cognitive entry gate to understanding DNNs. You need a foundation going in.

    I have read several books on DNNs, and I taught myself how to develop such DNN models. Many of the books I had read before invariably combined some contextual material with some software code to get you going. Although many of these books were between good and very good, it was refreshing to pick up a book solely concentrated on making you understand the underlying math of DNNs. Be warned, the author leaves no stone unturned. If you are just after a high-level understanding of how DNNs work, a couple of good articles on Medium may suffice. This book is a lot more than that. The author drills down on the subject.

    The author also has a pretty original approach to the subject that is much more geometry-based than anything I had read elsewhere. He talks of mappings and different types of spaces. He represents a lot of decisions along two-dimensional graphs in ways I had not seen done by other authors.

    This book is very comparable and competitive with “Neural Networks, a Visual Introduction for Beginners” by Michael Taylor. I think for those with a pretty good background in math, but below that of a college grad or a master’s in math, Taylor’s book is much more accessible and actually teaches you a lot. However, while Taylor is a very good teacher at the introductory level, Kelleher is an excellent one at the more advanced level. They approach the subject differently and at different levels, and you will learn a lot from both.

    From Taylor, I got a pretty good understanding of DNNs, and I got to develop some pretty good DNNs to explain and simulate the stock market (with only a mediocre level of success, so I still have to keep my day job). From Kelleher, I learned that the DNN structure I was using, with sigmoid activation functions, was really outdated, and that I have to learn how to develop DNNs that use long short-term memory (LSTM) with the rectified linear unit (ReLU) instead of sigmoid. This will be an ambitious undertaking, as I will have to graduate from a very simple R package (deepnet), which lets you specify a traditional DNN in essentially a single line of code with all the arguments you need, to Python with Keras and TensorFlow, a far more complex undertaking. Nevertheless, Kelleher imparted to me extensive theoretical knowledge on why I have to move away from sigmoid activation and towards ReLU with LSTM. Given that, I could not ask more from Kelleher. He greatly raised my understanding of the subject.

    If you are in a similar boat, you will appreciate this book a lot. As you will see, or as you know already, learning DNNs is an ongoing process. There is no clear finish line. This is unlike many other model structures, such as ARIMA, ECM, or VAR, where what you see is what you get; those model structures have an end point, and once you reach it, you know and understand them. With DNNs, there is always either a topic you thought you understood but discover you actually do not, or a subject you don’t even know of, as the field is evolving rapidly in ever more complex and diversified directions. I think DNNs will keep mathematicians busy for a pretty long time. And that is kind of exciting in itself. When you uncover a quantitative method that always seems to have room to evolve, it is pretty cool stuff.
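The reviewer’s reason for moving from sigmoid to ReLU is the vanishing-gradient effect that motivates much of the book’s discussion of activation functions: the sigmoid’s derivative never exceeds 0.25, so backpropagation through many sigmoid layers multiplies the gradient toward zero, while ReLU’s derivative is 1 for positive inputs. A minimal illustrative sketch in plain Python (the depth and the pre-activation value x = 3 are arbitrary choices, not from the book):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative of sigmoid: s(x) * (1 - s(x)); its maximum is 0.25 at x = 0.
    s = sigmoid(x)
    return s * (1.0 - s)

def relu_grad(x):
    # Derivative of ReLU: 1 for positive inputs, 0 otherwise.
    return 1.0 if x > 0.0 else 0.0

# Backprop contributes one derivative factor per layer; compare a
# 10-layer chain at an illustrative pre-activation value of x = 3.
depth = 10
sig_factor = sigmoid_grad(3.0) ** depth   # ~3.5e-14: the gradient vanishes
relu_factor = relu_grad(3.0) ** depth     # 1.0: the gradient survives

print(f"sigmoid chain factor: {sig_factor:.1e}")
print(f"ReLU chain factor:    {relu_factor:.1f}")
```

This is only a caricature of a real network (actual gradients also involve weights), but it shows why deep sigmoid stacks train poorly and why ReLU became the default.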

  2. Overall, a nice and accessible introduction to Deep Learning
    A decent introduction to the history and technical aspects of deep learning. It is not for someone who wants to know how to use deep learning in practice, nor is it sufficient on its own for someone who wants to write their own deep learning algorithms. It is not appropriate for the former, but a wonderful primer and starting point for the latter. After reading this text, you will be ready for the recommended additional reading at the end of the book.

  3. Compact intro to deep learning
    This book is short and concise, making it a compact intro to the subject. It assumes relatively little background in math (if you’re like me, you might want to skip the parts that go through basic concepts of multivariable calculus, linear algebra, etc.), and the exposition is very clear. The diagrams are helpful, too. A good intro and historical overview of this young (and rapidly growing) field that prepares you for a deeper dive.

  4. a gentle, solid and modern introduction to deep learning
    The author has provided, in this book, a modern (as of 2019) introduction to deep learning. The focus is on a limited number of topics, such as backpropagation, treated very deeply but with few assumptions about technical preparation. In addition, Kelleher has given a pretty up-to-date perspective on the subject. In recent years, due to a number of factors, such as good matrix-calculation hardware, deep learning and neural networks have shot into the vanguard of interest for weak AI. Therefore, Kelleher’s expert presentation, and careful “hand-holding” as he proceeds through some of the important topics, like the evolution of threshold functions, is particularly timely. I think the very minimal level of linear algebra and calculus necessary to grasp the technical aspects of his discussion makes this a very valuable book for a broad audience, such as software engineers at a beginning level in this area, and technical staff generally. Short of a good course, this summary overview is about the best one could hope for in a high-level technical introduction. I strongly recommend this book as a very easy, short read that will be informative about some important basics. With software and hardware improvements over the next twenty or thirty years, like quantum computers, deep learning is very likely to remain a significant tool in many technical fields, including physics (which is my primary area of interest).

  5. Fantastic intro to the math
    I saw this book recommended in various places and it did not disappoint. It lays a foundation that takes the mystery out of neural nets. It’s been many years for me since college math, so I found a few parts challenging, but the math really isn’t very hard. This book was written before the rise of transformers but it’s an amazing intro to the fundamentals. Start here and move on to other books if you still feel the need.

  6. Best Introduction to Deep Learning
    I reviewed several books to get an introduction to this subject, but most of them dive into the calculus and/or code right from the start. I’m comfortable with that, but this one starts with basic concepts and builds from the ground up, introducing the math starting with relatively simple ideas like the equation for a line. I think you could do very well reading this book with just a basic foundation in algebra. Calculus would be a plus, but not required.

  7. A quality book about machine learning that exceeded my expectations. I congratulate the author, John D. Kelleher.

  8. This book is by far the best introduction to deep learning I have read. It’s very descriptive with useful diagrams and a gentle approach to the mathematics. If I’d started with this book two years ago when I set out to learn NN programming, it would have saved me a lot of time and internet searches. Highly recommended to anyone who wants to learn how to program neural networks.

  9. A book that introduces deep learning from a fundamental point of view. It is easy for a general reader with some math background to follow, yet remarkably complete and deep.
