The Practical Guide to Large Language Models
Apress (publisher)
979-8-8688-2215-5 (ISBN)
The book is structured into three parts to facilitate a step-by-step learning journey. Part One covers building production-ready LLM solutions: it introduces the Hugging Face library and equips readers to solve most common NLP challenges without requiring deep knowledge of transformer internals. Part Two focuses on empowering LLMs with RAG and intelligent agents: it explores Retrieval-Augmented Generation (RAG), demonstrating how to enhance answer quality, and shows how to develop intelligent agents. Part Three covers LLM advances, focusing on expert topics such as model training, the principles of the transformer architecture, and other cutting-edge techniques related to the practical application of language models.
Each chapter includes practical examples, code snippets, and hands-on projects to ensure applicability to real-world scenarios. This book bridges the gap between theory and practice, providing professionals with the tools and insights to develop practical and efficient LLM solutions.
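To give a flavor of the hands-on material, the following minimal sketch (illustrative only, not taken from the book) shows the kind of Hugging Face pipeline usage that Part One builds on; the model checkpoint named below is an assumed example.

```python
# Minimal sketch (assumption, not from the book): solving a common NLP task
# with the Hugging Face transformers pipeline API.
from transformers import pipeline

# The checkpoint is an illustrative choice; any compatible text-classification
# model from the Hugging Face Hub would work here.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("Large language models make many NLP tasks approachable."))
# Output is a list of dicts such as [{'label': 'POSITIVE', 'score': ...}]
```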
What you will learn:
The different types of tasks modern LLMs can solve
How to select the most suitable pre-trained LLM for specific tasks
How to enrich an LLM with a custom knowledge base and build intelligent systems
The core principles of language models, and how to tune them
How to build robust LLM-based AI applications
Who this book is for:
Data scientists, machine learning engineers, and NLP specialists with basic Python skills, introductory PyTorch knowledge, and a foundational understanding of deep learning concepts, ready to start applying Large Language Models in practice.
Ivan Gridin is an artificial intelligence expert, researcher, and author with extensive experience in applying advanced machine-learning techniques in real-world scenarios. His expertise includes natural language processing (NLP), predictive time series modeling, automated machine learning (AutoML), reinforcement learning, and neural architecture search. He also has a strong foundation in mathematics, including stochastic processes, probability theory, optimization, and deep learning. In recent years, he has become a specialist in open-source large language models, including the Hugging Face framework. Building on this expertise, he continues to advance his work in developing intelligent, real-world applications powered by natural language processing. He is a loving husband and father and collector of old math books. You can learn more about him on LinkedIn: https://www.linkedin.com/in/survex/.
Part I: LLM Basics
Chapter 1. Discovering Transformers
Chapter 2. LLM Basics: Internals, Deployment and Evaluation
Chapter 3. Improving Chat Model Responses
Part II: Empowering LLM Applications with RAG and Intelligent Agents
Chapter 4. Enriching the Model’s Knowledge with Retrieval Augmented Generation
Chapter 5. Building Agent Systems
Part III: LLM Advances
Chapter 6. Mastering Model Training
Chapter 7. Unpacking the Transformers Architecture
| Publication date (per publisher) | 7 April 2026 |
|---|---|
| Additional information | 119 illustrations, color; 1 illustration, black and white |
| Place of publication | Berkley |
| Language | English |
| Dimensions | 178 x 254 mm |
| Subject area | Computer Science ► Theory / Studies ► Artificial Intelligence / Robotics |
| Keywords | Artificial Intelligence • ChatGPT • Hugging Face • Large Language Models • machine learning • Natural Language Processing • PyTorch |
| ISBN-13 | 979-8-8688-2215-5 / 9798868822155 |
| Condition | New |