shark::IRpropPlus Class Reference

This class implements the improved Resilient Backpropagation algorithm (iRprop+) with weight-backtracking. More...

#include <shark/Algorithms/GradientDescent/Rprop.h>

+ Inheritance diagram for shark::IRpropPlus:

Public Member Functions

SHARK_EXPORT_SYMBOL IRpropPlus ()
 
std::string name () const
 From INameable: return the class name. More...
 
SHARK_EXPORT_SYMBOL void init (ObjectiveFunctionType const &objectiveFunction, SearchPointType const &startingPoint)
 initializes the optimizer using a predefined starting point More...
 
SHARK_EXPORT_SYMBOL void init (ObjectiveFunctionType const &objectiveFunction, SearchPointType const &startingPoint, double initDelta)
 
SHARK_EXPORT_SYMBOL void step (ObjectiveFunctionType const &objectiveFunction)
 Carry out one step of the optimizer for the supplied objective function. More...
 
SHARK_EXPORT_SYMBOL void setDerivativeThreshold (double derivativeThreshold)
 
SHARK_EXPORT_SYMBOL void read (InArchive &archive)
 Read the component from the supplied archive. More...
 
SHARK_EXPORT_SYMBOL void write (OutArchive &archive) const
 Write the component to the supplied archive. More...
 
- Public Member Functions inherited from shark::RpropPlus
SHARK_EXPORT_SYMBOL RpropPlus ()
 
- Public Member Functions inherited from shark::RpropMinus
SHARK_EXPORT_SYMBOL RpropMinus ()
 
void setEtaMinus (double etaMinus)
 set decrease factor More...
 
void setEtaPlus (double etaPlus)
 set increase factor More...
 
void setMaxDelta (double d)
 set upper limit on update More...
 
void setMinDelta (double d)
 set lower limit on update More...
 
double maxDelta () const
 return the maximal step size component More...
 
RealVector const & derivative () const
 Returns the derivative at the current point. Can be used for stopping criteria. More...
 
- Public Member Functions inherited from shark::AbstractSingleObjectiveOptimizer< RealVector >
std::size_t numInitPoints () const
 By default most single objective optimizers only require a single point. More...
 
virtual void init (ObjectiveFunctionType const &function, std::vector< SearchPointType > const &initPoints)
 Initialize the optimizer for the supplied objective function using a set of initialisation points. More...
 
virtual const SolutionType & solution () const
 returns the current solution of the optimizer More...
 
- Public Member Functions inherited from shark::AbstractOptimizer< RealVector, double, SingleObjectiveResultSet< RealVector > >
const Features & features () const
 
virtual void updateFeatures ()
 
bool requiresValue () const
 
bool requiresFirstDerivative () const
 
bool requiresSecondDerivative () const
 
bool canSolveConstrained () const
 
bool requiresClosestFeasible () const
 
virtual ~AbstractOptimizer ()
 
virtual void init (ObjectiveFunctionType const &function)
 Initialize the optimizer for the supplied objective function. More...
 
- Public Member Functions inherited from shark::INameable
virtual ~INameable ()
 
- Public Member Functions inherited from shark::ISerializable
virtual ~ISerializable ()
 Virtual d'tor. More...
 
void load (InArchive &archive, unsigned int version)
 Versioned loading of components, calls read(...). More...
 
void save (OutArchive &archive, unsigned int version) const
 Versioned storing of components, calls write(...). More...
 
 BOOST_SERIALIZATION_SPLIT_MEMBER ()
 

Protected Attributes

double m_oldError
 The error of the last iteration. More...
 
double m_derivativeThreshold
 A threshold below which partial derivatives are set to zero. More...
 
- Protected Attributes inherited from shark::RpropPlus
RealVector m_deltaw
 The final update values for all weights. More...
 
- Protected Attributes inherited from shark::RpropMinus
ObjectiveFunctionType::FirstOrderDerivative m_derivative
 
double m_increaseFactor
 The increase factor \( \eta^+ \), set to 1.2 by default. More...
 
double m_decreaseFactor
 The decrease factor \( \eta^- \), set to 0.5 by default. More...
 
double m_maxDelta
 The upper limit of the increments \( \Delta w_i^{(t)} \), set to 1e100 by default. More...
 
double m_minDelta
 The lower limit of the increments \( \Delta w_i^{(t)} \), set to 0.0 by default. More...
 
size_t m_parameterSize
 
RealVector m_oldDerivative
 The last error gradient. More...
 
RealVector m_delta
 The absolute update values (increment) for all weights. More...
 
- Protected Attributes inherited from shark::AbstractSingleObjectiveOptimizer< RealVector >
SolutionType m_best
 Current solution of the optimizer. More...
 
- Protected Attributes inherited from shark::AbstractOptimizer< RealVector, double, SingleObjectiveResultSet< RealVector > >
Features m_features
 

Additional Inherited Members

- Public Types inherited from shark::AbstractSingleObjectiveOptimizer< RealVector >
typedef base_type::SearchPointType SearchPointType
 
typedef base_type::SolutionType SolutionType
 
typedef base_type::ResultType ResultType
 
typedef base_type::ObjectiveFunctionType ObjectiveFunctionType
 
- Public Types inherited from shark::AbstractOptimizer< RealVector, double, SingleObjectiveResultSet< RealVector > >
enum  Feature
 Models features that the optimizer requires from the objective function. More...
 
typedef RealVector SearchPointType
 
typedef double ResultType
 
typedef SingleObjectiveResultSet< RealVector > SolutionType
 
typedef AbstractObjectiveFunction< RealVector, ResultType > ObjectiveFunctionType
 
typedef TypedFlags< Feature > Features
 
typedef TypedFeatureNotAvailableException< Feature > FeatureNotAvailableException
 
- Protected Member Functions inherited from shark::AbstractOptimizer< RealVector, double, SingleObjectiveResultSet< RealVector > >
void checkFeatures (ObjectiveFunctionType const &objectiveFunction)
 Convenience function that checks whether the features of the supplied objective function match with the required features of the optimizer. More...
 

Detailed Description

This class implements the improved Resilient Backpropagation algorithm (iRprop+) with weight-backtracking.

The Rprop algorithm is an improvement over algorithms with adaptive learning rates (such as the Adaptive Backpropagation algorithm by Silva and Almeida; see AdpBP.h for a description of how such an algorithm works), in that the increments used to update the weights are independent of the absolute values of the partial derivatives. This is sensible: in large flat regions of the search space (plateaus) the absolute partial derivatives are small, so the increments would be chosen small, although large increments are needed to cross the plateau. Conversely, the absolute partial derivatives are very large at the slopes of narrow canyons, which would lead to large increments that skip the minimum at the bottom of the canyon, where small increments would be the better choice.
Hence, the Rprop algorithm uses only the signs of the partial derivatives, not their absolute values, to adapt the parameters.
Instead of individual learning rates, it uses the parameter \(\Delta_i^{(t)}\) for weight \(w_i,\ i = 1, \dots, n\) in iteration \(t\), which is adapted before the weights are changed:

\( \Delta_i^{(t)} = \Bigg\{ \begin{array}{ll} min( \eta^+ \cdot \Delta_i^{(t-1)}, \Delta_{max} ), & \mbox{if\ } \frac{\partial E^{(t-1)}}{\partial w_i} \cdot \frac{\partial E^{(t)}}{\partial w_i} > 0 \\ max( \eta^- \cdot \Delta_i^{(t-1)}, \Delta_{min} ), & \mbox{if\ } \frac{\partial E^{(t-1)}}{\partial w_i} \cdot \frac{\partial E^{(t)}}{\partial w_i} < 0 \\ \Delta_i^{(t-1)}, & \mbox{otherwise} \end{array} \)

The parameters \(\eta^+ > 1\) and \(0 < \eta^- < 1\) control the speed of the adaptation. To stabilize the increments, they are restricted to the interval \([\Delta_{min}, \Delta_{max}]\).
After the adaptation of the \(\Delta_i\), the weight update is calculated as

\( \Delta w_i^{(t)} := - \mbox{sign} \left( \frac{\partial E^{(t)}}{\partial w_i}\right) \cdot \Delta_i^{(t)} \)
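The two rules above can be sketched in standalone C++. This is an illustrative implementation of the basic sign-based Rprop update, not the Shark code; all names are hypothetical, and the default factors mirror the values documented for the protected attributes below (\(\eta^+ = 1.2\), \(\eta^- = 0.5\), \(\Delta_{max} = 10^{100}\), \(\Delta_{min} = 0\)):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Hypothetical standalone sketch of the Rprop step-size adaptation and
// sign-based weight update; names are illustrative, not the Shark API.
struct RpropState {
    std::vector<double> delta;    // per-weight step sizes Delta_i
    std::vector<double> oldGrad;  // gradient of the previous iteration
};

void rpropAdaptAndUpdate(
    std::vector<double>& w, std::vector<double> const& grad,
    RpropState& s,
    double etaPlus = 1.2, double etaMinus = 0.5,
    double minDelta = 0.0, double maxDelta = 1e100)
{
    for (std::size_t i = 0; i < w.size(); ++i) {
        double prod = s.oldGrad[i] * grad[i];
        if (prod > 0)        // same sign as before: accelerate
            s.delta[i] = std::min(etaPlus * s.delta[i], maxDelta);
        else if (prod < 0)   // sign change: a minimum was skipped, slow down
            s.delta[i] = std::max(etaMinus * s.delta[i], minDelta);
        // the update uses only the sign of the current partial derivative
        if (grad[i] > 0)      w[i] -= s.delta[i];
        else if (grad[i] < 0) w[i] += s.delta[i];
        s.oldGrad[i] = grad[i];
    }
}
```

On the one-dimensional objective \(E(w) = w^2\), the step size grows geometrically across the initial plateau-like descent and shrinks after each overshoot, so the iterate settles near the minimum.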

Furthermore, weight-backtracking takes place to increase the stability of the method. In contrast to the original Rprop algorithm with weight-backtracking (see RpropPlus), this scheme is improved by additionally taking the error of the last iteration \(t - 1\) into account.
The idea behind this modification is that a sign change of the partial derivative \(\frac{\partial E}{\partial w_i}\) only indicates that a minimum was skipped, not whether the step moved the weights closer to the minimum.
By using the old error value, the improved weight-backtracking only undoes changes when the error has increased, and only those parameters \(w_i\) for which a sign change of \(\frac{\partial E}{\partial w_i}\) occurred are reset to their old values.
So the new weight-backtracking rule is:

\( \mbox{if\ } \frac{\partial E^{(t-1)}}{\partial w_i} \cdot \frac{\partial E^{(t)}}{\partial w_i} < 0 \mbox{\ then} \{ \)

\( \Delta w_i^{(t)} := \Bigg\{ \begin{array}{ll} - \Delta w_i^{(t-1)}, & \mbox{if\ } E^{(t)} > E^{(t - 1)} \\ 0, & \mbox{otherwise} \end{array} \)

\( \frac{\partial E^{(t)}}{\partial w_i} := 0 \)

\(\}\)

where assigning zero to the partial derivative of the error freezes the increment \(\Delta_i\) in the next iteration.
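Under the same assumptions as above, the backtracking rule can be folded into a complete iRprop+ step. This is a hedged, self-contained sketch (hypothetical names, not the Shark implementation): on a sign change, the previous update is undone only if the error increased, and the stored gradient is zeroed so that the step size is frozen for one iteration:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Illustrative state for an iRprop+ step; not the Shark class layout.
struct IRpropPlusState {
    std::vector<double> delta;    // step sizes Delta_i
    std::vector<double> deltaw;   // last applied updates Delta w_i
    std::vector<double> oldGrad;  // gradient of the previous iteration
    double oldError;              // E^{(t-1)}
};

void irpropPlusStep(std::vector<double>& w,
                    std::vector<double> grad, double error,
                    IRpropPlusState& s,
                    double etaPlus = 1.2, double etaMinus = 0.5,
                    double minDelta = 0.0, double maxDelta = 1e100)
{
    for (std::size_t i = 0; i < w.size(); ++i) {
        double prod = s.oldGrad[i] * grad[i];
        if (prod > 0) {
            s.delta[i] = std::min(etaPlus * s.delta[i], maxDelta);
            s.deltaw[i] = (grad[i] > 0 ? -s.delta[i] : s.delta[i]);
        } else if (prod < 0) {
            s.delta[i] = std::max(etaMinus * s.delta[i], minDelta);
            // improved backtracking: undo the last step only if the error
            // increased, otherwise leave the weight unchanged
            s.deltaw[i] = (error > s.oldError) ? -s.deltaw[i] : 0.0;
            grad[i] = 0.0;  // freezes the increment in the next iteration
        } else {
            s.deltaw[i] = (grad[i] > 0 ? -s.delta[i]
                         : grad[i] < 0 ?  s.delta[i] : 0.0);
        }
        w[i] += s.deltaw[i];
        s.oldGrad[i] = grad[i];
    }
    s.oldError = error;
}
```

On \(E(w) = (w - 3)^2\), this sketch overshoots the minimum once, undoes the harmful step, and then closes in with shrinking increments.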

This modification of the weight-backtracking leads to better optimization on artificial, paraboloidal error surfaces.

For further information about the algorithm, please refer to:

Christian Igel and Michael Hüsken,
"Empirical Evaluation of the Improved Rprop Learning Algorithm".
Neurocomputing, 50:105-123, 2003

Author
C. Igel, M. Hüsken
Date
1999
Changes:
none
Status:
stable

Definition at line 387 of file Rprop.h.

Constructor & Destructor Documentation

◆ IRpropPlus()

SHARK_EXPORT_SYMBOL shark::IRpropPlus::IRpropPlus ( )

Member Function Documentation

◆ init() [1/2]

SHARK_EXPORT_SYMBOL void shark::IRpropPlus::init ( ObjectiveFunctionType const &  function,
SearchPointType const &  startingPoint 
)
virtual

initializes the optimizer using a predefined starting point

Reimplemented from shark::RpropPlus.

Referenced by main(), run_one_trial(), and trainProblem().

◆ init() [2/2]

SHARK_EXPORT_SYMBOL void shark::IRpropPlus::init ( ObjectiveFunctionType const &  objectiveFunction,
SearchPointType const &  startingPoint,
double  initDelta 
)
virtual

Reimplemented from shark::RpropPlus.

◆ name()

std::string shark::IRpropPlus::name ( ) const
inlinevirtual

From INameable: return the class name.

Reimplemented from shark::RpropPlus.

Definition at line 393 of file Rprop.h.

References shark::RpropMinus::init(), shark::RpropMinus::read(), SHARK_EXPORT_SYMBOL, shark::RpropMinus::step(), and shark::RpropMinus::write().

◆ read()

SHARK_EXPORT_SYMBOL void shark::IRpropPlus::read ( InArchive &  archive)
virtual

Read the component from the supplied archive.

Parameters
[in,out] archive The archive to read from.

Reimplemented from shark::RpropPlus.

◆ setDerivativeThreshold()

SHARK_EXPORT_SYMBOL void shark::IRpropPlus::setDerivativeThreshold ( double  derivativeThreshold)

◆ step()

SHARK_EXPORT_SYMBOL void shark::IRpropPlus::step ( ObjectiveFunctionType const &  function)
virtual

Carry out one step of the optimizer for the supplied objective function.

Parameters
[in] function The objective function to optimize.

Reimplemented from shark::RpropPlus.

Referenced by main(), run_one_trial(), and trainProblem().

◆ write()

SHARK_EXPORT_SYMBOL void shark::IRpropPlus::write ( OutArchive &  archive) const
virtual

Write the component to the supplied archive.

Parameters
[in,out] archive The archive to write to.

Reimplemented from shark::RpropPlus.

Member Data Documentation

◆ m_derivativeThreshold

double shark::IRpropPlus::m_derivativeThreshold
protected

A threshold below which partial derivatives are set to zero.

Definition at line 411 of file Rprop.h.

◆ m_oldError

double shark::IRpropPlus::m_oldError
protected

The error of the last iteration.

Definition at line 409 of file Rprop.h.


The documentation for this class was generated from the following file:

shark/Algorithms/GradientDescent/Rprop.h