#include <itkStandardStochasticGradientDescentOptimizer.h>
This class implements a gradient descent optimizer with a decaying gain.
The optimizer takes a step at each iteration k:

    x(k+1) = x(k) - a(k) dC/dx

The gain a(k) decays with the iteration number:

    a(k) = a / (A + k + 1)^alpha
It is very suitable for use in combination with a stochastic estimate of the gradient dC/dx. For example, you may set the parameter NewSamplesEveryIteration to "true" to achieve this effect. For more information on this strategy, you may have a look at:
S. Klein, M. Staring, J.P.W. Pluim, "Comparison of gradient approximation techniques for optimisation of mutual information in nonrigid registration", in: SPIE Medical Imaging: Image Processing, J.M. Fitzpatrick and J.M. Reinhardt (eds.), Proceedings of SPIE, vol. 5747, SPIE Press, 2005, pp. 192-203.
Or:
S. Klein, M. Staring, J.P.W. Pluim, "Evaluation of Optimization Methods for Nonrigid Medical Image Registration using Mutual Information and B-Splines", IEEE Transactions on Image Processing, vol. 16, no. 12, December 2007.
This class also serves as a base class for other stochastic gradient descent type algorithms, such as the AcceleratedStochasticGradientOptimizer.
Definition at line 64 of file itkStandardStochasticGradientDescentOptimizer.h.
Static Public Member Functions | |
| static Pointer | New () |
| Static Public Member Functions inherited from itk::StochasticGradientDescentOptimizer | |
| static Pointer | New () |
| Static Public Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer | |
| static Pointer | New () |
Protected Member Functions | |
| virtual double | Compute_a (double k) const |
| virtual double | Compute_beta (double k) const |
| StandardStochasticGradientOptimizer () | |
| virtual void | UpdateCurrentTime () |
| ~StandardStochasticGradientOptimizer () override=default | |
| Protected Member Functions inherited from itk::StochasticGradientDescentOptimizer | |
| void | PrintSelf (std::ostream &os, Indent indent) const override |
| StochasticGradientDescentOptimizer () | |
| ~StochasticGradientDescentOptimizer () override=default | |
| Protected Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer | |
| virtual void | GetScaledDerivative (const ParametersType &parameters, DerivativeType &derivative) const |
| virtual MeasureType | GetScaledValue (const ParametersType &parameters) const |
| virtual void | GetScaledValueAndDerivative (const ParametersType &parameters, MeasureType &value, DerivativeType &derivative) const |
| void | PrintSelf (std::ostream &os, Indent indent) const override |
| ScaledSingleValuedNonLinearOptimizer () | |
| void | SetCurrentPosition (const ParametersType &param) override |
| virtual void | SetScaledCurrentPosition (const ParametersType &parameters) |
| ~ScaledSingleValuedNonLinearOptimizer () override=default | |
Private Attributes | |
| double | m_InitialTime { 0.0 } |
| double | m_Param_A { 1.0 } |
| double | m_Param_a { 1.0 } |
| double | m_Param_alpha { 0.602 } |
| double | m_Param_beta {} |
Additional Inherited Members | |
| Protected Types inherited from itk::StochasticGradientDescentOptimizer | |
| using | ThreadInfoType = MultiThreaderBase::WorkUnitInfo |
| using itk::StandardStochasticGradientOptimizer::ConstPointer = SmartPointer<const Self> |
Definition at line 73 of file itkStandardStochasticGradientDescentOptimizer.h.
| using itk::StandardStochasticGradientOptimizer::Pointer = SmartPointer<Self> |
Definition at line 72 of file itkStandardStochasticGradientDescentOptimizer.h.
Standard ITK.
Definition at line 70 of file itkStandardStochasticGradientDescentOptimizer.h.
Definition at line 71 of file itkStandardStochasticGradientDescentOptimizer.h.
Codes of stopping conditions. The MinimumStepSize stop condition never occurs in this class, but may be implemented in inheriting classes.
Definition at line 82 of file itkStochasticGradientDescentOptimizer.h.
|
protected |
|
override protected default
|
override virtual
Sets a new LearningRate before calling the Superclass' implementation, and updates the current time.
Reimplemented from itk::StochasticGradientDescentOptimizer.
|
protected virtual
Function to compute the step size for SGD at time/iteration k.
|
protected virtual
Function to compute the step size for SQN at time/iteration k.
|
virtual |
Get the current time. This equals the CurrentIteration in this base class, but may be different in inheriting classes, such as the AcceleratedStochasticGradientOptimizer.
| itk::StandardStochasticGradientOptimizer::ITK_DISALLOW_COPY_AND_MOVE | ( | StandardStochasticGradientOptimizer | ) |
| itk::StandardStochasticGradientOptimizer::itkOverrideGetNameOfClassMacro | ( | StandardStochasticGradientOptimizer | ) |
Run-time type information (and related methods).
|
static |
Method for creation through the object factory.
|
inline virtual
Set the current time to the initial time. This can be useful to 'reset' the optimisation, for example if you changed the cost function while optimising. Be careful with this function.
Definition at line 132 of file itkStandardStochasticGradientDescentOptimizer.h.
|
virtual |
Set/Get the initial time. Should be >= 0. This function is superfluous, since Param_A effectively does the same. However, in inheriting classes, such as the AcceleratedStochasticGradientOptimizer, the initial time may have a different function than Param_A. Default: 0.0.
|
virtual |
Set/Get A.
|
virtual |
Set/Get a.
|
virtual |
Set/Get alpha.
|
virtual |
Set/Get beta.
|
override |
Set current time to 0 and call superclass' implementation.
|
protected virtual
Function to update the current time. This function just increments the CurrentTime by 1. Inheriting classes may implement something smarter, for example dependent on the progress.
Reimplemented in itk::AdaptiveStochasticLBFGSOptimizer.
|
protected |
The current time, which serves as input for Compute_a.
Definition at line 157 of file itkStandardStochasticGradientDescentOptimizer.h.
|
private |
Settings
Definition at line 170 of file itkStandardStochasticGradientDescentOptimizer.h.
|
private |
Definition at line 166 of file itkStandardStochasticGradientDescentOptimizer.h.
|
private |
Parameters, as described by Spall.
Definition at line 164 of file itkStandardStochasticGradientDescentOptimizer.h.
|
private |
Definition at line 167 of file itkStandardStochasticGradientDescentOptimizer.h.
|
private |
Definition at line 165 of file itkStandardStochasticGradientDescentOptimizer.h.
|
protected |
Step size setting: either a constant step size, or, in other schedules, a value that varies with the iteration number k.
Definition at line 160 of file itkStandardStochasticGradientDescentOptimizer.h.
Generated for elastix by Doxygen 1.15.0