You don't always need an LLM!
The best AI Engineers know when to use what
Here's a simple roadmap to guide you 👇
If you're serious about NLP (basics → LLMs),
these are the only books you need to back up your theory.
NLP (basics to LLMs) roadmap:
1️⃣ Hands-On Machine Learning with Scikit-Learn, Keras & TensorFlow - Aurélien Géron
↳ Covers all core ML: regression, trees, SVMs, neural nets.
↳ End-to-end projects in TensorFlow & Keras.
(Note: PyTorch is more popular now, but the concepts are the same - see the classic-ML sketch below.)
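To the "you don't always need an LLM" point: a minimal sketch of the classic supervised workflow Géron's book drills, using scikit-learn (the dataset and model here are arbitrary placeholders):

```python
# Classic ML baseline: load data, split, fit, evaluate.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)  # placeholder dataset
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print(f"Test accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```

For many tabular problems, a baseline like this is cheaper and easier to debug than any LLM.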
2️⃣ Python Natural Language Processing Cookbook - Zhenya Antić, PhD & Saurabh Chakravarty, PhD
↳ Tokenization, n-grams, stop-words, stemming, TF-IDF, BM25, NER (TF-IDF sketch below).
↳ Hands-on with NLTK, spaCy, PyTorch, Hugging Face.
↳ word2vec, GloVe, FastText (pre-transformer must-knows).
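A minimal TF-IDF retrieval sketch with scikit-learn (the toy corpus and query are made up for illustration):

```python
# TF-IDF: weight terms by in-document frequency vs. rarity across the corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "the cat sat on the mat",
    "dogs and cats are pets",
    "transformers changed NLP",
]
vectorizer = TfidfVectorizer(stop_words="english")
doc_vectors = vectorizer.fit_transform(corpus)

query_vector = vectorizer.transform(["the cat and the mat"])
print(cosine_similarity(query_vector, doc_vectors)[0])
# The first document ("the cat sat on the mat") scores highest.
```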
3️⃣ Natural Language Processing with Transformers - Lewis Tunstall, Leandro von Werra, Thomas Wolf
↳ From BERT & DistilBERT to sentence-transformers.
↳ Hugging Face workflows for classification, QA (see the pipeline sketch below).
↳ Super clear on transformer internals.
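The Hugging Face classification workflow the book teaches fits in a few lines; a minimal sketch (this uses the library's default sentiment model - pin a specific checkpoint via model=... in real use):

```python
# Text classification with the transformers pipeline API.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a default checkpoint
print(classifier("This book roadmap is genuinely useful."))
# -> [{'label': 'POSITIVE', 'score': ...}]
```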
4️⃣ Hands-On Large Language Models - Jay Alammar & Maarten Grootendorst
↳ Build a transformer from scratch.
↳ Prompting, embeddings, RAG, evals (retrieval sketch below).
↳ Function calling, modern agent pipelines.
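A minimal sketch of the retrieval half of RAG with sentence-transformers (the model name and toy docs are assumptions; a real pipeline adds chunking, a vector store, and the LLM call on top):

```python
# Embed docs once, embed the query, retrieve the nearest doc,
# then pass the hits to an LLM as context (LLM call omitted here).
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

docs = [
    "TF-IDF is a classic lexical retrieval method.",
    "BERT is an encoder-only transformer.",
    "Function calling lets an LLM trigger external tools.",
]
doc_emb = model.encode(docs, convert_to_tensor=True)

query_emb = model.encode("How do LLMs use tools?", convert_to_tensor=True)
hits = util.semantic_search(query_emb, doc_emb, top_k=1)[0]
print(docs[hits[0]["corpus_id"]])  # best-matching context passage
```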
You could also add:
Build a Large Language Model (From Scratch) by Sebastian Raschka, PhD
That's really it!
(there will be some topic overlap across the books)
These books are enough for your theoretical backing; from there, work the principles into hands-on projects.
--
♻️ Repost if you found it helpful 🙏
→ Join 29,000+ AI/ML builders here: https://lnkd.in/ds_SzEUH