S**G
A surprisingly poor book--who is the audience?
I am surprised by how poorly written this book is. I eagerly bought it based on all the positive reviews it had received. Bad mistake. Only a few of the reviews clearly state the obvious problems of this book. Oddly enough, these informative reviews tend to attract aggressively negative comments of an almost personal nature. The disconnect between the majority of cloyingly effusive reviews of this book and the reality of how it is written is quite flabbergasting. I do not wish to speculate on the reason for this, but it does sometimes occur with a first book in an important area, or when dealing with pioneer authors with a cult following.

First of all, it is not clear who the audience is--the writing does not provide details at the level one expects from a textbook. It also does not provide a good overview ("big picture" thinking). Advanced readers would not gain much either, because it is too superficial when it comes to the advanced topics (the final 35% of the book). More than half of this book reads like the bibliographic notes section of a book, and the authors seem to have no understanding of the didactic intention of a textbook (beyond a collation or importance sampling of various topics). In other words, these portions read like a prose description of a bibliography, with equations thrown in for annotation. In several chapters, the level of detail is closer to an expanded ACM Computing Surveys article than to a textbook. At the other extreme of audience expectation, we have a review of linear algebra at the beginning, which is a waste of useful space that could have been spent on actual explanations in other chapters. If you don't know linear algebra already, you cannot really hope to follow anything (especially the way the book is written). In any case, the linear algebra introduced in that chapter is too poorly written even to brush up on known material--so who is it for?

As a practical matter, Part I of the book is mostly redundant/off-topic for a neural network book (containing linear algebra, probability, and so on), and Part III is written in a superficial way--so only a third of the book is remotely useful. Other than a chapter on optimization algorithms (a good description of algorithms like Adam; see the short sketch at the end of this review for how compactly such an algorithm can be stated), I do not see even a single chapter that has done a half-decent job of presenting algorithms with the proper conceptual framework. The presentation style is unnecessarily terse and dry, and is stylistically closer to a research paper than to a book. It is understood that any machine learning book will have some mathematical sophistication, but the main problem is caused by a lack of concern on the part of the authors for promoting readability and an inability to put themselves in the reader's shoes (surprisingly enough, some defensive responses to negative reviews tend to place the blame on math-phobic readers). At the end of the day, it is the authors' responsibility to make notational and organizational choices that are likely to maximize understanding. Good mathematicians have excellent manners when choosing notation (you don't use nested subscripts/superscripts/functions if you possess the clarity to do it more simply). And no, math equations are not the same as algorithms--they are only a small part. Where is the rest? Where is the algorithm described? Where is the conceptual framework? Where is the intuition? Where are the pseudocodes? Where are the illustrations? Where are the examples? No, I am not asking for recipes or Python code.
Just some decent writing, details, and explanations. The sections on applications, LSTMs, and convolutional neural networks are hand-wavy in places and read like "you can do this to achieve that." It is impossible to fully reconstruct the methods from the description provided. A large part of the book (including restricted Boltzmann machines) is so tightly integrated with probabilistic graphical models (PGM) that it loses its neural network focus. This portion also falls in the latter part of the book, which is written in a rather superficial way, and it therefore implicitly creates another prerequisite of being very comfortable with PGM (sort of knowing it wouldn't be enough). Keep in mind that the PGM view of neural networks is not the dominant view today, from either a practitioner or a research point of view. So why the focus on PGM, if they don't have the space to elaborate?

On the one hand, the authors make a futile attempt at promoting accessibility by discussing redundant prerequisites like basic linear algebra and probability. On the other hand, the PGM-heavy approach implicitly increases the prerequisites to include an even more advanced machine learning topic than neural networks (with a 1200+ page book of its own). What the authors are doing is the equivalent of trying to teach someone how to multiply two numbers as a special case of tensor multiplication. Even for RNNs with deterministic hidden states, they feel the need to couch it as a graphical model. It is useful to connect areas, but mixing them is a bad idea. Look at Hinton's course. It explains the connection between Boltzmann machines and PGM very nicely, but one can easily follow RBMs without having to bear the constant burden of a PGM-centric view.

One fact that I think played a role in these types of strategic errors of judgement is that the lead author is a fresh PhD graduate. There is no substitute for experience when it comes to maturity in writing ability (irrespective of how good a researcher someone is). Mature writers have the ability to put themselves in the reader's shoes and have a good sense of what is conceptually important. The authors clearly miss the forest for the trees, with chapter titles like "Confronting the Partition Function." The book is an example of the fact that a first book in an important area with the name of a pioneer author on it is not necessarily a qualification for being considered a good book. I am not hesitant to call it out. The emperor has no clothes.
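P.S. To be concrete about the pseudocode I keep asking for: the Adam update, which the optimization chapter does describe well, fits in a handful of lines. Below is my own rough NumPy paraphrase of the standard update rule (my illustration, not code or notation from the book):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: moving moment estimates, bias correction, parameter step."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)                # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)                # bias-corrected second moment
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

A few lines of update plus two of bias correction: that is the level of algorithmic summary I would have liked to see alongside every major method in the book.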
D**E
Not very good :(
Have I been reading Amazon book reviews backwards, and you are supposed to count the white stars? This book is not going to teach you machine learning, and I don't even know why they bothered including the math sections, because they just restate definitions of varying relevance, which you may or may not know, in a confusing way. It isn't going to teach you the math or even serve as a refresher on the math. At best, if you already know the math, you can decode what they are saying and nod along.

It feels like the book is compressed. They write out overly elaborate mathematical symbols, and then you just have to think it through and remember that Andrew Ng video where he actually explained the concept. So, in short, the math is overly elaborate and it really doesn't explain anything. The math review section is worthless. They don't have examples or practice problems. They expect you to do all the work, which you should--with another book.
M**R
A rushed, poorly written guide showing that the "experts" can't really explain what Deep Learning is
This book, in every sense of the word, is rushed. I think the authors wanted to establish themselves as leaders of this young-ish field, but they do so by sacrificing quality. It also shows that Deep Learning theory has been around for a long time under another name: Neural Networks. The interesting algorithms are the MLP, backpropagation, and the classical neural networks. The optimization methods such as Adam are the ones that are new and interesting, and the only ones worthy of inclusion in this book. So, essentially, what you get from this book is "use A for X, B for Y, and C for Z"--a dry, unintuitive, badly written waste of paper.

As for the structure of the book, it's an example of how not to structure a book. It has some linear algebra and probability at the start (not good enough; it confuses more people and wastes paper). It goes on to prove other algorithms such as PCA (yeah, ok!). Then it talks about how this architecture works for this and that architecture.

So, yeah, if you really want to try out deep learning, don't buy this book. Set up TensorFlow/PyTorch/another library, run the tutorials, find an architecture for the problem you are interested in, and start tweaking it (a minimal sketch of what I mean follows this review). You will have far more fun and will have saved your money. The praise that this book gets is beyond me. Did Musk even read this book? I doubt it.
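By "run the tutorials and start tweaking," I mean something as simple as the sketch below, using standard PyTorch APIs (this is my own toy illustration, not anything from the book; the data, model, and hyperparameters are placeholders meant to be changed):

```python
import torch
import torch.nn as nn

# Toy data standing in for whatever problem you actually care about.
X = torch.randn(256, 20)          # 256 samples, 20 features
y = torch.randint(0, 2, (256,))   # binary class labels

# A small MLP -- the part you "start tweaking" (depth, widths, activations).
model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Linear(64, 2),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Minimal training loop: forward pass, loss, backward pass, parameter update.
for epoch in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()
```

Swap the toy tensors for your own dataset and change the layers or optimizer; that loop is where you will actually learn how these models behave.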
A**X
Unclear who the intended audience is — deep learning practitioners aren’t it, though...
The book was frankly a disappointment. It was unclear who the intended audience was. If it is people who want to learn the academic background behind Deep Learning, then it is too superficial for them. In some cases, it's not even clear who would benefit from the presented material, or how. (For instance, whom was the Linear Algebra chapter written for? It's woefully impossible to understand if you don't already know linear algebra. And if you already know it, it's unnecessary. And if you sort of know it and wanted to brush up, what linear algebra they present in the chapter is not enough to get through the math in the book. So…)

If it's for people who want to get started with deep learning, it's completely off topic, since it presents the mathematical nitty-gritty of the deep learning algorithms without mentioning any specifics of how to train a conv-net, for example (see the sketch below for the level of specifics I mean). The amount of information on convolutional networks and LSTMs is worse than on any number of blogs on deep learning, or Wikipedia.

If you're really interested in the math behind Deep Learning out of curiosity (perhaps you're a mathematician who wants to know what this deep learning thing is all about), perhaps this is the book for you. Otherwise, do yourself a favor and watch/read Andrej Karpathy's Stanford class.
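For reference, the kind of "specifics" I'm talking about fits in a few lines of standard Keras (my own minimal sketch, not material from the book; the architecture and the MNIST-style input shape are just placeholders):

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# A tiny conv-net for 28x28 grayscale images, e.g. MNIST digits.
model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# model.fit(x_train, y_train, epochs=5)  # with your own training data
```

The book gives you the convolution math; none of it tells you this is roughly what training a conv-net looks like in practice.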
A**R
It is not worth the money, given the quality of the print.
I suspect it is pirated; I can't imagine how it is an original copy. Leaving the detached cover aside, the binding is poor and the print quality of most pages is distorted. I know the content of the book (I've read it), and I can't believe it is being sold like this.
R**T
Torn pages, damaged corners, corrupted figures but sold as new: are you kidding me?
The book came in a protective box and a protective plastic film, but it still arrived damaged on every corner. The book itself is advertised as hardcover, but it is made of really cheap cardboard that folds very easily. The paper is also a really cheap material.

The worst of all is not that but the content. The figures are corrupted in multiple places (I have included a couple of pictures, but there are many more), in such a way that it is difficult or impossible to decipher what could have been written. Overall it looks like the book was rushed out without proofreading or quality checks: it is the worst book I have bought in many years.
S**N
Wrong Pages!!
Very disappointing. Only after one month did I notice that pages 171-378 are missing; in their place is a repetition of pages 379-586. It does not have a refund option!
N**E
Comprehensive literature review of the state of the art
Deep Learning is a difficult field to follow because there is so much literature and the pace of development is so fast. This book summarises the state of the art in a textbook written by some of the leaders in the field. I particularly appreciated the applied math and machine learning basics section, which is very focused on the tools from linear algebra, probability, information theory, and numerical computation that are directly relevant to deep learning. It requires a solid undergrad maths background in stats/linear algebra, but you don't need to be super comfortable with it, because they take you through everything if you are a bit rusty.

Of course, even though the book is long (700+ pages), it ends up feeling too short, because each sub-field might only get 20-30 pages. But the book does give a good enough grounding to start a literature search, highlighting the key papers and insights. For me, what has been most valuable is seeing how it all fits together, which is hard to get from reading a random paper.

Of course, there are lots of videos/courses/books out there, but this is the only one I've seen that is as comprehensive as this book. I've read it cover to cover, and it is proving a great foundation to go and read more practical books and do projects using TensorFlow/Keras etc. Highly recommend.
M**Z
Too theoretical
Very theoretical, with a steep learning curve. It would be much better if it had code and practical examples as well as exercises. Get Deep Learning with Python by Chollet for excellent practical examples using Keras, with applications you can code straight away. Alternatively, the O'Reilly book by Géron, which also has Jupyter Notebook examples and exercises; it is TensorFlow-centric, with good definitions and references too. The rewarding book is Chollet's. This one by Goodfellow is better suited to experts in the field, as an excellent reference book to have in their hands.
A**V
The book was "written by a robot".
The book was "written by a robot" in the sense that (if you will search inside) - you will never find the phrases like:"in practice this formula means that""This is widely used in the field, because..""this was fist proven as a success approach by..""this is our short review of the competition results..""the intuition behind this is..""this was never proven in practice..""from the common sense point of view..""technologically this is feasible since..""this approach allows to fit the batch into RAM memory.."This books just "mechanically" throws the math expressions at unsuspecting user.Never attempting to make a conceptual/functional/common sense introduction.You will not find a single line of Python/R code in the book.Not a single review of existing NN frameworks/libraries.You will not find anything about technology and/or methodology of data collection, data cleaning, outliers detection, practical hints on training etcNever trying to connect to any publications (i.e. other than the authors) in the field.Never ever bringing the meaning of the formulas into discussion.Deservedly, 3 out 5.Alexei