Lectures on Nonsmooth Optimization

by Qinian Jin


Overview

This book provides an in-depth exploration of nonsmooth optimization, covering foundational algorithms, theoretical insights, and a wide range of applications. Nonsmooth optimization, characterized by nondifferentiable objective functions or constraints, plays a crucial role across fields including machine learning, imaging, inverse problems, statistics, optimal control, and engineering. Its scope and relevance continue to expand, as many real-world problems are inherently nonsmooth or benefit significantly from nonsmooth regularization techniques. The book covers a variety of foundational and recent algorithms for solving nonsmooth optimization problems. It first introduces basic facts of convex analysis and subdifferential calculus; it then discusses various algorithms, including subgradient methods, mirror descent methods, proximal algorithms, the alternating direction method of multipliers, primal-dual splitting methods, and semismooth Newton methods. Error bound conditions are also discussed, and the derivation of linear convergence is illustrated. A dedicated chapter treats first-order methods for nonconvex optimization problems satisfying the Kurdyka-Lojasiewicz condition. The book also addresses the rapid evolution of stochastic algorithms for large-scale optimization. It is written for a wide-ranging audience, including senior undergraduates, graduate students, researchers, and practitioners interested in gaining a comprehensive understanding of nonsmooth optimization.
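To give a concrete flavor of one algorithm family mentioned above, the following minimal sketch (illustrative only, not taken from the book) applies the proximal gradient method to the lasso problem, minimizing 0.5*||Ax - b||^2 + lam*||x||_1. The `soft_threshold` function is the proximal operator of the scaled l1 norm; the step size and iteration count are illustrative choices.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1 (componentwise soft-thresholding)
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def proximal_gradient(A, b, lam, step, iters=500):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by alternating a gradient
    # step on the smooth part with the proximal map of the nonsmooth part.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                     # gradient of the smooth term
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

For convergence of this basic scheme, `step` should not exceed 1/L, where L is the largest eigenvalue of A^T A (the Lipschitz constant of the gradient of the smooth part).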


Details

  • ISBN-13: 9783031914164
  • ISBN-10: 3031914163
  • Publisher: Springer
  • Publish Date: July 2025
  • Dimensions: 9.21 x 6.14 x 1.25 inches
  • Shipping Weight: 2.15 pounds
  • Page Count: 560
