{ "item_title" : "Knowledge Distillation in Computer Vision", "item_author" : [" Linfeng Zhang "], "item_description" : "Discover the cutting-edge advancements in knowledge distillation for computer vision within this comprehensive monograph. As neural networks become increasingly complex, the demand for efficient and lightweight models grows critical, especially for real-world applications. This book uniquely bridges the gap between academic research and industrial implementation, exploring innovative methods to compress and accelerate deep neural networks without sacrificing accuracy. It addresses two fundamental problems in knowledge distillation: constructing effective student and teacher models and selecting the appropriate knowledge to distill. Presenting groundbreaking research on self-distillation and task-irrelevant knowledge distillation, the book offers new perspectives on model optimization. Readers will gain insights into applying these techniques across a wide range of visual tasks, from 2D and 3D object detection to image generation, effectively bridging the gap between AI research and practical deployment. By engaging with this text, readers will learn to enhance model performance, reduce computational costs, and improve model robustness. This book is ideal for researchers, practitioners, and advanced students with a background in computer vision and deep learning. Equip yourself with the knowledge to design and implement knowledge distillation, thereby improving the efficiency of computer vision models. ", "item_img_path" : "https://covers4.booksamillion.com/covers/bam/9/81/950/366/9819503663_b.jpg", "price_data" : { "retail_price" : "54.99", "online_price" : "54.99", "our_price" : "54.99", "club_price" : "54.99", "savings_pct" : "0", "savings_amt" : "0.00", "club_savings_pct" : "0", "club_savings_amt" : "0.00", "discount_pct" : "10", "store_price" : "" } }
Knowledge Distillation in Computer Vision
by Linfeng Zhang


Overview

Discover the cutting-edge advancements in knowledge distillation for computer vision in this comprehensive monograph. As neural networks become increasingly complex, the demand for efficient, lightweight models grows critical, especially in real-world applications. This book bridges the gap between academic research and industrial implementation, exploring innovative methods to compress and accelerate deep neural networks without sacrificing accuracy. It addresses two fundamental problems in knowledge distillation: constructing effective student and teacher models, and selecting the appropriate knowledge to distill. Presenting research on self-distillation and task-irrelevant knowledge distillation, the book offers new perspectives on model optimization. Readers will gain insights into applying these techniques across a wide range of visual tasks, from 2D and 3D object detection to image generation. By engaging with this text, readers will learn to enhance model performance, reduce computational costs, and improve model robustness. This book is ideal for researchers, practitioners, and advanced students with a background in computer vision and deep learning. Equip yourself with the knowledge to design and implement knowledge distillation, thereby improving the efficiency of computer vision models.
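The core idea the book builds on can be sketched in a few lines: a compact student model is trained to match the temperature-softened output distribution of a larger teacher, typically via a KL-divergence loss. The sketch below is an illustration of that standard recipe, not code from the book; the temperature value and logits are illustrative assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    # Soften the logits by dividing by the temperature, then normalize.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL divergence between the teacher's and student's softened
    # distributions, scaled by T^2 as in the classic formulation.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

# A student that matches the teacher exactly incurs zero loss.
print(distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0]))  # 0.0
```

In practice this soft-label term is combined with the usual cross-entropy loss on ground-truth labels; the book's self-distillation variants replace the separate teacher with deeper layers of the same network.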

This item is Non-Returnable

Details

  • ISBN-13: 9789819503667
  • ISBN-10: 9819503663
  • Publisher: Springer
  • Publish Date: January 2026
  • Dimensions: 9.19 x 6.35 x 0.34 inches
  • Shipping Weight: 0.49 pounds
  • Page Count: 140
  • List Price: $54.99
