Bültmann & Gerriets
A Tutorial on Hadamard Semidifferentials
by Kenneth Lange
Publisher: Now Publishers Inc
Paperback
ISBN: 978-1-63828-348-5
Published on 13 May 2024
Language: English
Dimensions: 234 mm (H) x 156 mm (W) x 5 mm (D)
Weight: 134 grams
Length: 78 pages

Price: €59.50
No shipping costs (domestic)


This title is printed on demand, i.e. only after it has been ordered. It is therefore expected to arrive at our store around 30 October.

Deliveries within the city are usually dispatched the same day.
Shipping elsewhere via Post/DHL usually takes 1-2 days.

Blurb

This tutorial presents a brief survey of semidifferentials in the familiar context of finite-dimensional Euclidean space. This restriction exposes the most critical ideas, important connections to convexity and optimization, and a few novel applications. The text delves more deeply into these topics and is highly recommended for a systematic course and self-study.
The main focus of this tutorial is on Hadamard semidifferentials. The Hadamard semidifferential is more general than the Fréchet differential, which is now dominant in undergraduate mathematics education. By slightly changing the definition of the forward directional derivative, the Hadamard semidifferential rescues the chain rule, enforces continuity, and permits differentiation across maxima and minima.
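For orientation, the change in definition alluded to here can be sketched with the standard textbook formulations (these are not excerpts from the tutorial itself). The ordinary forward directional derivative keeps the direction $v$ fixed,

$$ df(x; v) = \lim_{t \downarrow 0} \frac{f(x + t v) - f(x)}{t}, $$

whereas the Hadamard semidifferential requires the same difference quotient to converge even when the direction is perturbed along with the step size,

$$ d_H f(x; v) = \lim_{\substack{t \downarrow 0 \\ w \to v}} \frac{f(x + t w) - f(x)}{t}. $$

This stronger convergence requirement is what underlies the chain-rule and continuity properties mentioned above.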
The Hadamard semidifferential also plays well with convex analysis and naturally extends differentiation to smooth embedded submanifolds, topological vector spaces, and metric spaces of shapes and geometries. The current elementary exposition focuses on the more familiar territory of analysis in Euclidean spaces and applies the semidifferential to some representative problems in optimization and statistics. These include algorithms for proximal gradient descent, steepest descent in matrix completion, and variance components models.
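As a point of reference for the first of these applications, the generic proximal gradient iteration for minimizing a sum $g + h$, with $g$ smooth and $h$ convex but possibly nonsmooth, takes the standard form

$$ x_{k+1} = \operatorname{prox}_{t h}\big(x_k - t \nabla g(x_k)\big), \qquad \operatorname{prox}_{t h}(y) = \operatorname*{arg\,min}_{x} \Big\{ h(x) + \tfrac{1}{2t}\,\lVert x - y \rVert^2 \Big\}, $$

where $t > 0$ is the step size. This is quoted here only for orientation and is not necessarily the exact form developed in the tutorial.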
This tutorial will be of interest to students in advanced undergraduate and beginning graduate courses in the mathematical sciences.