LLMs in Production: From language models to successful products





Learn how to put Large Language Model-based applications into production safely and efficiently.

This practical book offers clear, example-rich explanations of how LLMs work, how you can interact with them, and how to integrate them into your own applications. Find out what makes LLMs so different from traditional software and ML, discover best practices for working with them outside the lab, and dodge common pitfalls with advice drawn from the authors' hands-on experience.

In LLMs in Production you will:

• Grasp the fundamentals of LLMs and the technology behind them
• Evaluate when to use a premade LLM and when to build your own
• Efficiently scale up an ML platform to handle the needs of LLMs
• Train LLM foundation models and fine-tune an existing LLM
• Deploy LLMs to the cloud and edge devices, applying parameter-efficient techniques such as PEFT and LoRA (see the sketch after this list)
• Build applications leveraging the strengths of LLMs while mitigating their weaknesses
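Parameter-efficient fine-tuning (PEFT), of which LoRA is the best-known method, keeps the base model's weights frozen and trains only small low-rank adapter matrices. Below is a minimal sketch of that setup using the Hugging Face transformers and peft libraries; the checkpoint name and LoRA hyperparameters are illustrative assumptions, not values taken from the book.

```python
# Minimal LoRA fine-tuning setup with Hugging Face peft (illustrative; not from the book).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_id = "meta-llama/Llama-3.2-1B"  # assumed example checkpoint; swap in your own
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)

# LoRA injects small trainable low-rank matrices into the attention projections
# while the original weights stay frozen.
lora_cfg = LoraConfig(
    r=8,                      # rank of the adapter matrices
    lora_alpha=16,            # scaling factor applied to the adapter output
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()  # typically well under 1% of the full parameter count
# From here, run your usual training loop; only the adapter weights receive gradients.
```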

LLMs in Production delivers the MLOps insights you need to guide an LLM smoothly into production use. Inside, you’ll find practical guidance on everything from acquiring an LLM-suitable training dataset and building a platform to compensating for these models’ immense size, plus tips and tricks for prompt engineering, retraining and load testing, handling costs, and ensuring security.
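Prompt engineering, one of the topics mentioned above, usually comes down to maintaining a reusable template that keeps instructions, context, and the user's question clearly separated, so the template can be versioned and load-tested like any other artifact. Here is a minimal sketch using the OpenAI Python client; the model name and template wording are illustrative assumptions, not recommendations from the book.

```python
# A minimal prompt-template pattern (illustrative; not taken from the book).
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SYSTEM_PROMPT = "You are a support assistant. Answer only from the provided context."

def answer(question: str, context: str) -> str:
    # Instructions, retrieved context, and the user question stay in separate slots,
    # which makes the template easy to version, test, and cost-estimate.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed example model
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
        temperature=0.2,
    )
    return response.choices[0].message.content
```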

Foreword by Joe Reis.

Purchase of the print book includes a free eBook in PDF and ePub formats from Manning Publications.

About the technology

Most business software is developed and improved iteratively, and can change significantly even after deployment. By contrast, because LLMs are expensive to create and difficult to modify, they require meticulous upfront planning, exacting data standards, and carefully executed technical implementation. Integrating an LLM into a production product impacts every aspect of your operations plan, including the application lifecycle, data pipeline, compute cost, security, and more. Get it wrong, and you may have a costly failure on your hands.

About the book

LLMs in Production teaches you how to develop an LLMOps plan that can take an AI app smoothly from design to delivery. You’ll learn techniques for preparing an LLM dataset, cost-efficient training hacks like LoRA and RLHF, and industry benchmarks for model evaluation. Along the way, you’ll put your new skills to use in three exciting example projects: creating and training a custom LLM, building a VS Code AI coding extension, and deploying a small model to a Raspberry Pi.

What’s inside

• Balancing cost and performance
• Retraining and load testing
• Optimizing models for commodity hardware (see the sketch after this list)
• Deploying on a Kubernetes cluster
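To give a flavor of what optimizing for commodity hardware involves: a 4-bit quantized GGUF model served through llama-cpp-python can run a small LLM in a few gigabytes of RAM on a laptop or a Raspberry Pi. The sketch below assumes you already have a quantized GGUF file on disk; the file path and generation settings are placeholders, not values from the book.

```python
# Running a 4-bit quantized model on CPU with llama-cpp-python (illustrative sketch).
from llama_cpp import Llama

llm = Llama(
    model_path="models/tinyllama-1.1b-q4_k_m.gguf",  # assumed example file; supply your own GGUF
    n_ctx=2048,    # context window in tokens
    n_threads=4,   # tune to the number of CPU cores available
)

out = llm(
    "Q: What does quantization trade away for smaller memory use?\nA:",
    max_tokens=64,
    stop=["\n"],
)
print(out["choices"][0]["text"].strip())
```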

About the reader

For data scientists and ML engineers who know Python and the basics of cloud deployment.

About the authors

Christopher Brousseau and Matt Sharp are experienced engineers who have led numerous successful large-scale LLM deployments.

Table of Contents

1 Words’ awakening: Why large language models have captured attention
2 Large language models: A deep dive into language modeling
3 Large language model operations: Building a platform for LLMs
4 Data engineering for large language models: Setting up for success
5 Training large language models: How to generate the generator
6 Large language model services: A practical guide
7 Prompt engineering: Becoming an LLM whisperer
8 Large language model applications: Building an interactive experience
9 Creating an LLM project: Reimplementing Llama 3
10 Creating a coding copilot project: This would have helped you earlier
11 Deploying an LLM on a Raspberry Pi: How low can you go?
12 Production, an ever-changing landscape: Things are just getting started
A History of linguistics
B Reinforcement learning with human feedback
C Multimodal latent spaces

From the Publisher

“Covers all the essential aspects of how to build and deploy LLMs. It goes into the deep and fascinating areas that most other books gloss over.”

Andrew Carr, Cartwheel

“A must-read for anyone looking to harness the potential of LLMs in production environments.”

Jepson Taylor, VEOX Inc.

“An exceptional guide that simplifies the building and deployment of complex LLMs.”

Arunkumar Gopalan, Microsoft UK

Publisher: Manning (February 11, 2025)
Language: English
Paperback: 456 pages
ISBN-10: 1633437205
ISBN-13: 978-1633437203
Item Weight: 1.65 pounds
Dimensions: 7.38 x 1 x 9.25 inches
