itk::StochasticPreconditionedGradientDescentOptimizer Class Reference

#include <itkStochasticPreconditionedGradientDescentOptimizer.h>

Detailed Description

This class implements a gradient descent optimizer with a decaying gain and preconditioning.

If $C(x)$ is a cost function that has to be minimized, the following iterative algorithm is used to find the optimal parameters $x$:

\[ x(k+1) = x(k) - a(k) \, P \, \frac{dC}{dx} \]

The gain $a(k)$ at each iteration $k$ is defined by:

\[ a(k) = a / (A + k + 1)^\alpha \]
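As a minimal standalone sketch (not the class's actual implementation), the gain schedule that Compute_a() evaluates could look like the following; the free function and parameter names are illustrative, while the class exposes the parameters via SetParam_a(), SetParam_A(), and SetParam_alpha():

```cpp
#include <cassert>
#include <cmath>

// Decaying gain a(k) = a / (A + k + 1)^alpha, as in the formula above.
// Illustrative sketch; not the member function of the class itself.
double ComputeGain(double a, double A, double alpha, double k)
{
  return a / std::pow(A + k + 1.0, alpha);
}
```

With the default alpha of 0.602 the gain decays slowly enough for stochastic gradient estimates to average out over iterations, while still going to zero as k grows.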


This scheme is well suited to use with a stochastic estimate of the gradient $dC/dx$. For example, in image registration problems it is often advantageous to compute the metric derivative $dC/dx$ on a new set of randomly selected image samples in each iteration. You may set the parameter NewSamplesEveryIteration to "true" to achieve this effect. For more information on this strategy, see:

[1] S. Klein, M. Staring, J.P.W. Pluim, "Evaluation of Optimization Methods for Nonrigid Medical Image Registration using Mutual Information and B-Splines," IEEE Transactions on Image Processing, vol. 16, no. 12, December 2007.

This class also serves as a base class for other preconditioned GradientDescent type algorithms, like the AdaptiveStochasticPreconditionedGradientDescentOptimizer.

See also
StochasticPreconditionedGradientDescent, AdaptiveStochasticPreconditionedGradientDescentOptimizer

Definition at line 56 of file itkStochasticPreconditionedGradientDescentOptimizer.h.

Inheritance diagram for itk::StochasticPreconditionedGradientDescentOptimizer:

Public Types

using ConstPointer = SmartPointer< const Self >
using Pointer = SmartPointer< Self >
using PreconditionType = vnl_sparse_matrix< PreconditionValueType >
using PreconditionValueType = DerivativeType::ValueType
using ScaledCostFunctionPointer = ScaledCostFunctionType::Pointer
using ScaledCostFunctionType = ScaledSingleValuedCostFunction
using ScalesType = NonLinearOptimizer::ScalesType
using Self = StochasticPreconditionedGradientDescentOptimizer
enum  StopConditionType
using Superclass = PreconditionedGradientDescentOptimizer
- Public Types inherited from itk::PreconditionedGradientDescentOptimizer
using ConstPointer = SmartPointer< const Self >
using Pointer = SmartPointer< Self >
using PreconditionType = vnl_sparse_matrix< PreconditionValueType >
using PreconditionValueType = DerivativeType::ValueType
using ScaledCostFunctionPointer = ScaledCostFunctionType::Pointer
using ScaledCostFunctionType = ScaledSingleValuedCostFunction
using ScalesType = NonLinearOptimizer::ScalesType
using Self = PreconditionedGradientDescentOptimizer
enum  StopConditionType { MaximumNumberOfIterations , MetricError , MinimumStepSize }
using Superclass = ScaledSingleValuedNonLinearOptimizer
- Public Types inherited from itk::ScaledSingleValuedNonLinearOptimizer
using ConstPointer = SmartPointer< const Self >
using Pointer = SmartPointer< Self >
using ScaledCostFunctionPointer = ScaledCostFunctionType::Pointer
using ScaledCostFunctionType = ScaledSingleValuedCostFunction
using ScalesType = NonLinearOptimizer::ScalesType
using Self = ScaledSingleValuedNonLinearOptimizer
using Superclass = SingleValuedNonLinearOptimizer

Public Member Functions

virtual void AdvanceOneStep ()
virtual const char * GetClassName () const
virtual double GetCurrentTime () const
virtual double GetInitialTime () const
virtual double GetParam_a () const
virtual double GetParam_A () const
virtual double GetParam_alpha () const
 ITK_DISALLOW_COPY_AND_MOVE (StochasticPreconditionedGradientDescentOptimizer)
virtual void SetInitialTime (double _arg)
virtual void SetParam_a (double _arg)
virtual void SetParam_A (double _arg)
virtual void SetParam_alpha (double _arg)
virtual void StartOptimization ()
- Public Member Functions inherited from itk::PreconditionedGradientDescentOptimizer
virtual void AdvanceOneStep ()
const cholmod_common * GetCholmodCommon () const
const cholmod_factor * GetCholmodFactor () const
virtual const char * GetClassName () const
virtual double GetConditionNumber () const
virtual unsigned int GetCurrentIteration () const
virtual double GetDiagonalWeight () const
virtual const DerivativeType & GetGradient ()
virtual double GetLargestEigenValue () const
virtual const double & GetLearningRate ()
virtual double GetMinimumGradientElementMagnitude () const
virtual const unsigned long & GetNumberOfIterations ()
virtual const DerivativeType & GetSearchDirection ()
virtual double GetSparsity () const
virtual const StopConditionType & GetStopCondition ()
virtual const double & GetValue ()
 ITK_DISALLOW_COPY_AND_MOVE (PreconditionedGradientDescentOptimizer)
virtual void MetricErrorResponse (ExceptionObject &err)
virtual void ResumeOptimization ()
virtual void SetDiagonalWeight (double _arg)
virtual void SetLearningRate (double _arg)
virtual void SetMinimumGradientElementMagnitude (double _arg)
virtual void SetNumberOfIterations (unsigned long _arg)
virtual void SetPreconditionMatrix (PreconditionType &precondition)
virtual void StartOptimization ()
virtual void StopOptimization ()
- Public Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer
virtual const char * GetClassName () const
const ParametersType & GetCurrentPosition () const override
virtual bool GetMaximize () const
virtual const ScaledCostFunctionType * GetScaledCostFunction ()
virtual const ParametersType & GetScaledCurrentPosition ()
bool GetUseScales () const
virtual void InitializeScales ()
 ITK_DISALLOW_COPY_AND_MOVE (ScaledSingleValuedNonLinearOptimizer)
virtual void MaximizeOff ()
virtual void MaximizeOn ()
void SetCostFunction (CostFunctionType *costFunction) override
virtual void SetMaximize (bool _arg)
virtual void SetUseScales (bool arg)

Static Public Member Functions

static Pointer New ()
- Static Public Member Functions inherited from itk::PreconditionedGradientDescentOptimizer
static Pointer New ()
- Static Public Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer
static Pointer New ()

Protected Member Functions

virtual double Compute_a (double k) const
 StochasticPreconditionedGradientDescentOptimizer ()
virtual void UpdateCurrentTime ()
virtual ~StochasticPreconditionedGradientDescentOptimizer ()
- Protected Member Functions inherited from itk::PreconditionedGradientDescentOptimizer
virtual void CholmodSolve (const DerivativeType &gradient, DerivativeType &searchDirection, int solveType=CHOLMOD_A)
 PreconditionedGradientDescentOptimizer ()
void PrintSelf (std::ostream &os, Indent indent) const
virtual ~PreconditionedGradientDescentOptimizer ()
- Protected Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer
virtual void GetScaledDerivative (const ParametersType &parameters, DerivativeType &derivative) const
virtual MeasureType GetScaledValue (const ParametersType &parameters) const
virtual void GetScaledValueAndDerivative (const ParametersType &parameters, MeasureType &value, DerivativeType &derivative) const
void PrintSelf (std::ostream &os, Indent indent) const override
 ScaledSingleValuedNonLinearOptimizer ()
void SetCurrentPosition (const ParametersType &param) override
virtual void SetScaledCurrentPosition (const ParametersType &parameters)
 ~ScaledSingleValuedNonLinearOptimizer () override=default

Protected Attributes

double m_CurrentTime { 0.0 }
- Protected Attributes inherited from itk::PreconditionedGradientDescentOptimizer
cholmod_common * m_CholmodCommon
cholmod_factor * m_CholmodFactor { nullptr }
cholmod_sparse * m_CholmodGradient { nullptr }
double m_ConditionNumber { 1.0 }
DerivativeType m_Gradient
double m_LargestEigenValue { 1.0 }
double m_LearningRate { 1.0 }
DerivativeType m_SearchDirection
double m_Sparsity { 1.0 }
StopConditionType m_StopCondition { MaximumNumberOfIterations }
- Protected Attributes inherited from itk::ScaledSingleValuedNonLinearOptimizer
ScaledCostFunctionPointer m_ScaledCostFunction
ParametersType m_ScaledCurrentPosition

Private Attributes

double m_InitialTime { 0.0 }
double m_Param_a { 1.0 }
double m_Param_A { 1.0 }
double m_Param_alpha { 0.602 }

Additional Inherited Members

- Protected Types inherited from itk::PreconditionedGradientDescentOptimizer
using CInt = int

Member Typedef Documentation

◆ ConstPointer

◆ Pointer

◆ PreconditionType

Definition at line 86 of file itkPreconditionedGradientDescentOptimizer.h.

◆ PreconditionValueType

Some typedefs for computing the SelfHessian

Definition at line 82 of file itkPreconditionedGradientDescentOptimizer.h.

◆ ScaledCostFunctionPointer

Definition at line 79 of file itkScaledSingleValuedNonLinearOptimizer.h.

◆ ScaledCostFunctionType

Definition at line 78 of file itkScaledSingleValuedNonLinearOptimizer.h.

◆ ScalesType

using itk::ScaledSingleValuedNonLinearOptimizer::ScalesType = NonLinearOptimizer::ScalesType

Definition at line 77 of file itkScaledSingleValuedNonLinearOptimizer.h.

◆ Self

Standard ITK.

Definition at line 62 of file itkStochasticPreconditionedGradientDescentOptimizer.h.

◆ Superclass

Member Enumeration Documentation

◆ StopConditionType

Codes of stopping conditions. The MinimumStepSize stop condition never occurs in this class, but may be implemented in inheriting classes.

Definition at line 92 of file itkPreconditionedGradientDescentOptimizer.h.

Constructor & Destructor Documentation

◆ StochasticPreconditionedGradientDescentOptimizer()

itk::StochasticPreconditionedGradientDescentOptimizer::StochasticPreconditionedGradientDescentOptimizer ( )

◆ ~StochasticPreconditionedGradientDescentOptimizer()

virtual itk::StochasticPreconditionedGradientDescentOptimizer::~StochasticPreconditionedGradientDescentOptimizer ( )

Member Function Documentation

◆ AdvanceOneStep()

virtual void itk::StochasticPreconditionedGradientDescentOptimizer::AdvanceOneStep ( )

Sets a new LearningRate before calling the Superclass' implementation, and updates the current time.

Reimplemented from itk::PreconditionedGradientDescentOptimizer.
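A rough standalone sketch of one such step, assuming a diagonal preconditioner and the gain schedule $a(k)$ from the detailed description (names and signatures are illustrative, not the class's members; the real class applies a sparse precondition matrix via a Cholmod factorization):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// One preconditioned step x <- x - a(k) * P * dC/dx, with P diagonal for
// simplicity. The time update at the end mirrors the base-class behavior of
// UpdateCurrentTime(), which simply increments the current time by 1.
void AdvanceOneStepSketch(std::vector<double> & x,
                          const std::vector<double> & gradient,
                          const std::vector<double> & diagP,
                          double & currentTime,
                          double a, double A, double alpha)
{
  const double gain = a / std::pow(A + currentTime + 1.0, alpha);
  for (std::size_t i = 0; i < x.size(); ++i)
  {
    x[i] -= gain * diagP[i] * gradient[i];
  }
  currentTime += 1.0;
}
```

Repeatedly applying this step to, for example, the quadratic cost $C(x) = \tfrac{1}{2}\|x\|^2$ (whose gradient is simply $x$) drives $x$ toward the minimum while the gain decays.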

◆ Compute_a()

virtual double itk::StochasticPreconditionedGradientDescentOptimizer::Compute_a ( double  k) const

Function to compute the decaying gain $a(k)$ at time/iteration k.

◆ GetClassName()

virtual const char * itk::StochasticPreconditionedGradientDescentOptimizer::GetClassName ( ) const

◆ GetCurrentTime()

virtual double itk::StochasticPreconditionedGradientDescentOptimizer::GetCurrentTime ( ) const

Get the current time. This equals the CurrentIteration in this base class, but may be different in inheriting classes, such as the AcceleratedGradientDescent.

◆ GetInitialTime()

virtual double itk::StochasticPreconditionedGradientDescentOptimizer::GetInitialTime ( ) const

◆ GetParam_a()

virtual double itk::StochasticPreconditionedGradientDescentOptimizer::GetParam_a ( ) const

◆ GetParam_A()

virtual double itk::StochasticPreconditionedGradientDescentOptimizer::GetParam_A ( ) const

◆ GetParam_alpha()

virtual double itk::StochasticPreconditionedGradientDescentOptimizer::GetParam_alpha ( ) const


itk::StochasticPreconditionedGradientDescentOptimizer::ITK_DISALLOW_COPY_AND_MOVE ( StochasticPreconditionedGradientDescentOptimizer  )

◆ New()

static Pointer itk::StochasticPreconditionedGradientDescentOptimizer::New ( )

Method for creation through the object factory.

◆ SetInitialTime()

virtual void itk::StochasticPreconditionedGradientDescentOptimizer::SetInitialTime ( double  _arg)

Set/Get the initial time. Should be >= 0. This function is superfluous, since Param_A effectively does the same. However, in inheriting classes, like the AcceleratedGradientDescent, the initial time may have a different function than Param_A. Default: 0.0

◆ SetParam_a()

virtual void itk::StochasticPreconditionedGradientDescentOptimizer::SetParam_a ( double  _arg)

Set/Get a.

◆ SetParam_A()

virtual void itk::StochasticPreconditionedGradientDescentOptimizer::SetParam_A ( double  _arg)

Set/Get A.

◆ SetParam_alpha()

virtual void itk::StochasticPreconditionedGradientDescentOptimizer::SetParam_alpha ( double  _arg)

Set/Get alpha.

◆ StartOptimization()

virtual void itk::StochasticPreconditionedGradientDescentOptimizer::StartOptimization ( )

Set current time to 0 and call superclass' implementation.

Reimplemented from itk::PreconditionedGradientDescentOptimizer.

Reimplemented in elastix::PreconditionedGradientDescent< TElastix >.

◆ UpdateCurrentTime()

virtual void itk::StochasticPreconditionedGradientDescentOptimizer::UpdateCurrentTime ( )

Function to update the current time. This function simply increments the CurrentTime by 1. Inheriting classes may implement something smarter, for example, dependent on the progress.

Reimplemented in itk::AdaptiveStochasticPreconditionedGradientDescentOptimizer.
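To illustrate the two flavors of time update side by side, here is a hedged standalone sketch: the first function reproduces the base-class behavior described above, and the second is only an illustration of the kind of "smarter," progress-dependent rule an inheriting class could use (it is not the actual adaptive formula of AdaptiveStochasticPreconditionedGradientDescentOptimizer):

```cpp
#include <algorithm>
#include <cassert>

// Base-class behavior: time simply tracks the iteration count.
double UpdateTimeSimple(double t) { return t + 1.0; }

// Hypothetical progress-dependent rule: advance time (decaying the gain)
// when successive gradients agree, and back off (keeping the gain large)
// when they oppose each other, clamped so time never goes negative.
double UpdateTimeAdaptive(double t, double gradientInnerProduct)
{
  const double step = (gradientInnerProduct > 0.0) ? 1.0 : -0.5;
  return std::max(0.0, t + step);
}
```

Because GetCurrentTime() feeds directly into Compute_a(), slowing the advance of time keeps the gain, and hence the step size, larger for longer.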

Field Documentation

◆ m_CurrentTime

double itk::StochasticPreconditionedGradientDescentOptimizer::m_CurrentTime { 0.0 }

The current time, which serves as input for Compute_a.

Definition at line 141 of file itkStochasticPreconditionedGradientDescentOptimizer.h.

◆ m_InitialTime

double itk::StochasticPreconditionedGradientDescentOptimizer::m_InitialTime { 0.0 }


Definition at line 150 of file itkStochasticPreconditionedGradientDescentOptimizer.h.

◆ m_Param_a

double itk::StochasticPreconditionedGradientDescentOptimizer::m_Param_a { 1.0 }

Parameters, as described by Spall.

Definition at line 145 of file itkStochasticPreconditionedGradientDescentOptimizer.h.

◆ m_Param_A

double itk::StochasticPreconditionedGradientDescentOptimizer::m_Param_A { 1.0 }

◆ m_Param_alpha

double itk::StochasticPreconditionedGradientDescentOptimizer::m_Param_alpha { 0.602 }

Generated on Wed 12 Apr 2023 for elastix by doxygen 1.9.6