PLearn 0.1
PLearn::Optimizer Class Reference

#include <Optimizer.h>

Inheritance diagram for PLearn::Optimizer: (diagram omitted)
Collaboration diagram for PLearn::Optimizer: (diagram omitted)


Public Member Functions

 Optimizer ()
 Default constructor.
virtual void build ()
 Post-constructor.
virtual void reset ()
virtual void setToOptimize (const VarArray &the_params, Var the_cost, VarArray the_other_costs=VarArray(0), TVec< VarArray > the_other_params=TVec< VarArray >(0), real the_other_weight=1)
void remote_setToOptimize (const VarArray &params, Var cost)
 Remote version of setToOptimize.
virtual Optimizer * deepCopy (CopiesMap &copies) const
virtual void makeDeepCopyFromShallowCopy (CopiesMap &copies)
 Does the necessary operations to transform a shallow copy (this) into a deep copy by deep-copying all the members that need to be deep-copied.
virtual bool optimizeN (VecStatsCollector &stats_coll)=0
 Main optimization method, to be defined in subclasses.
bool remote_optimizeN (PP< VecStatsCollector > stats_coll)
void verifyGradient (real minval, real maxval, real step)
 Verify the gradient with uniform random initialization of the parameters, using step for the finite-difference approximation of the gradient.
void verifyGradient (real step)
 Verify the gradient at the current value of the parameters, using step for the finite-difference approximation of the gradient.
virtual void setPartialUpdateVars (const VarArray &the_partial_update_vars)
void computeRepartition (Vec v, int n, real mini, real maxi, Vec res, int &noutliers)
 Compute the repartition of v by splitting the interval [mini,maxi] into n intervals.
real collectGradientStats (const Vec &gradient)
 Collect various statistics on the gradient.
void computeGradient (const Vec &gradient)
 Given an optimizer, compute the gradient of the cost function and store it in the "gradient" Vec.
void computeOppositeGradient (const Vec &gradient)
 Given an optimizer, compute the opposite of the gradient of the cost function and store it in the "gradient" Vec.

Static Public Member Functions

static string _classname_ ()
static OptionList & _getOptionList_ ()
static RemoteMethodMap & _getRemoteMethodMap_ ()
static bool _isa_ (const Object *o)
static void _static_initialize_ ()
static const PPath & declaringFile ()

Public Attributes

VarArray params
Var cost
VarArray partial_update_vars
 Vars that are partially updated.
VarArray proppath
bool early_stop
 Boolean used in subclasses to notify of early stopping.
int nstages
 Number of steps to perform when calling optimizeN().
int stage
 Current number of steps already performed.
VarArray other_costs
 Other costs (for regularization, for example).
TVec< VarArray > other_params
 Parameters of the other costs to update (usually a subset of params).
TVec< VarArray > other_proppaths
 Propagation paths of other_costs.
real other_weight
 Weight for all the other costs.

Static Public Attributes

static StaticInitializer _static_initializer_

Static Protected Member Functions

static void declareOptions (OptionList &ol)
 Declare options (data fields) for the class.
static void declareMethods (RemoteMethodMap &rmm)
 Declare the methods that are remote-callable.

Private Types

typedef Object inherited

Private Member Functions

void build_ ()
 Object-specific post-constructor.

Detailed Description

Definition at line 60 of file Optimizer.h.


Member Typedef Documentation

typedef Object PLearn::Optimizer::inherited [private]

Constructor & Destructor Documentation

PLearn::Optimizer::Optimizer ( )

Default constructor.

Definition at line 53 of file Optimizer.cc.

Optimizer::Optimizer():
    early_stop(false),
    nstages(1),
    stage(0)
{}

Member Function Documentation

string PLearn::Optimizer::_classname_ ( ) [static]
OptionList & PLearn::Optimizer::_getOptionList_ ( ) [static]
RemoteMethodMap & PLearn::Optimizer::_getRemoteMethodMap_ ( ) [static]
bool PLearn::Optimizer::_isa_ ( const Object * o) [static]
StaticInitializer PLearn::Optimizer::_static_initializer_ [static]
void PLearn::Optimizer::_static_initialize_ ( ) [static]
void PLearn::Optimizer::build ( ) [virtual]

Post-constructor.

The normal implementation should simply call inherited::build(), then this class's build_(). This method should be callable again at later times, after modifying some option fields to change the "architecture" of the object.

Reimplemented from PLearn::Object.

Reimplemented in PLearn::AdaptGradientOptimizer, PLearn::ConjGradientOptimizer, PLearn::AutoScaledGradientOptimizer, PLearn::OnlineGramNaturalGradientOptimizer, and PLearn::GradientOptimizer.

Definition at line 66 of file Optimizer.cc.

References PLearn::Object::build(), and build_().
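
The build()/build_() convention is easiest to see in a subclass skeleton. The following is a minimal sketch, not actual PLearn code: MyOptimizer and its learning_rate option are illustrative.

    #include <Optimizer.h>

    namespace PLearn {

    class MyOptimizer : public Optimizer
    {
        typedef Optimizer inherited;

    public:
        real learning_rate;   // hypothetical build option

        //! Post-constructor: delegate upward first, then finish locally.
        virtual void build()
        {
            inherited::build();   // runs the parent's build_() chain
            build_();             // then this class's own build_()
        }

    private:
        //! May assume the parent's build_() has already run.
        void build_()
        {
            if (learning_rate <= 0)
                PLERROR("MyOptimizer::build_: learning_rate must be positive");
        }
    };

    } // namespace PLearn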

void PLearn::Optimizer::build_ ( ) [private]

Object-specific post-constructor.

This method should be redefined in subclasses and do the actual building of the object according to previously set option fields. Constructors can just set option fields, and then call build_. This method is NOT virtual, and will typically be called only from three places: a constructor, the public virtual build() method, and possibly the public virtual read method (which calls its parent's read). build_() can assume that its parent's build_() has already been called.

Reimplemented from PLearn::Object.

Reimplemented in PLearn::AdaptGradientOptimizer, PLearn::ConjGradientOptimizer, PLearn::AutoScaledGradientOptimizer, PLearn::OnlineGramNaturalGradientOptimizer, and PLearn::GradientOptimizer.

Definition at line 75 of file Optimizer.cc.

References cost, other_costs, other_params, other_weight, params, and setToOptimize().

Referenced by build().


real PLearn::Optimizer::collectGradientStats ( const Vec & gradient)

Collect various statistics on the gradient.

void PLearn::Optimizer::computeGradient ( const Vec & gradient)

Given an optimizer, compute the gradient of the cost function and store it in the "gradient" Vec.

Definition at line 245 of file Optimizer.cc.

References PLearn::VarArray::clearGradient(), PLearn::VarArray::copyGradientTo(), cost, PLearn::VarArray::fbprop(), params, and proppath.

Referenced by PLearn::ConjGradientOptimizer::computeCostAndDerivative(), and PLearn::ConjGradientOptimizer::computeDerivative().

void Optimizer::computeGradient(const Vec& gradient)
{
    // Clear whatever is left over from previous computations
    this->proppath.clearGradient();
    this->params.clearGradient();
    // Seed the cost's gradient, then forward/backward propagate
    this->cost->gradient[0] = 1;
    this->proppath.fbprop();
    this->params.copyGradientTo(gradient);
}
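
A hedged usage sketch, assuming opt is a built Optimizer on which setToOptimize() has already been called (the names are illustrative):

    Vec g(opt->params.nelems());   // one entry per scalar parameter
    opt->computeGradient(g);       // g now holds dC/dparams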


void PLearn::Optimizer::computeOppositeGradient ( const Vec & gradient)

Given an optimizer, compute the opposite of the gradient of the cost function and store it in the "gradient" Vec.

Definition at line 261 of file Optimizer.cc.

References PLearn::VarArray::clearGradient(), PLearn::VarArray::copyGradientTo(), cost, PLearn::VarArray::fbprop(), params, and proppath.

Referenced by PLearn::AdaptGradientOptimizer::build_(), and PLearn::ConjGradientOptimizer::optimizeN().

void Optimizer::computeOppositeGradient(const Vec& gradient)
{
    // Clear whatever is left over from previous computations
    this->proppath.clearGradient();
    this->params.clearGradient();
    // We want the opposite of the gradient, hence the -1
    this->cost->gradient[0] = -1;
    this->proppath.fbprop();
    this->params.copyGradientTo(gradient);
#ifdef DEBUGCG
    gs->setcolor("blue");
    gs->drawCircle(this->params[0]->value[0], this->params[0]->value[1], 0.02);
#endif
}


void PLearn::Optimizer::computeRepartition ( Vec  v,
int  n,
real  mini,
real  maxi,
Vec  res,
int &  noutliers 
)

Compute the repartition of v by splitting the interval [mini,maxi] into n intervals.

The result is stored into res.

Definition at line 219 of file Optimizer.cc.

References PLearn::TVec< T >::clear(), i, j, PLearn::TVec< T >::length(), and n.

void Optimizer::computeRepartition(Vec v, int n, real mini, real maxi,
                                   Vec res, int& noutliers)
{
    res.clear();
    noutliers = 0;
    for (int i=0; i<v.length(); i++) {
        real k = (v[i] - mini) / (maxi - mini);
        int j = int(k*n);
        if (j >= n) {
            noutliers++;
            j = n-1;
        }
        if (j < 0) {
            noutliers++;
            j = 0;
        }
        res[j]++;
    }
    for (int i = 0; i<n; i++) {
        res[i] /= v.length();
    }
}
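
For example, the following sketch bins four values into two intervals over [0,1]; it assumes res is pre-sized to n entries, since the body above only zeroes and increments it:

    Vec v(4);
    v[0] = 0.1; v[1] = 0.4; v[2] = 0.6; v[3] = 1.5;  // 1.5 lies outside [0,1]
    Vec res(2);                                      // one slot per interval
    int noutliers;
    opt->computeRepartition(v, 2, 0.0, 1.0, res, noutliers);
    // res == [0.5, 0.5]: v[0] and v[1] fall in [0,0.5), v[2] in [0.5,1],
    // and the outlier 1.5 is clamped into the last bin; noutliers == 1.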

void PLearn::Optimizer::declareMethods ( RemoteMethodMap & rmm) [static, protected]

Declare the methods that are remote-callable.

Reimplemented from PLearn::Object.

Definition at line 108 of file Optimizer.cc.

References PLearn::Object::_getRemoteMethodMap_(), PLearn::declareMethod(), PLearn::RemoteMethodMap::inherited(), remote_optimizeN(), and remote_setToOptimize().

{
    // Insert a backpointer to remote methods; note that this is
    // different from declareOptions().
    rmm.inherited(inherited::_getRemoteMethodMap_());
        
    declareMethod(rmm, "setToOptimize", &Optimizer::remote_setToOptimize,
            (BodyDoc("Set cost to minimize with respect to given parameters"),
             ArgDoc("params", "List of parameters (variables) to optimize"),
             ArgDoc("cost", "Cost to be minimized")));

    declareMethod(rmm, "optimizeN", &Optimizer::remote_optimizeN,
            (BodyDoc("Launch nstages steps of optimization."),
             ArgDoc("stats", "VecStatsCollector to collect training statistics"),
             RetDoc("Boolean value indicating whether a stopping criterion "
                    "has been met.")));
}

void PLearn::Optimizer::declareOptions ( OptionList & ol) [static, protected]

Declare options (data fields) for the class.

Redefine this in subclasses: call declareOption(...) for each option, and then call inherited::declareOptions(options). Please call the inherited method AT THE END to get the options listed in a consistent order (from most recently defined to least recently defined).

  void MyDerivedClass::declareOptions(OptionList& ol)
  {
      declareOption(ol, "inputsize", &MyDerivedClass::inputsize_,
                    OptionBase::buildoption,
                    "The size of the input; it must be provided");
      declareOption(ol, "weights", &MyDerivedClass::weights,
                    OptionBase::learntoption,
                    "The learned model weights");
      inherited::declareOptions(ol);
  }
Parameters:
    ol  List of options that is progressively being constructed for the current class.

Reimplemented from PLearn::Object.

Reimplemented in PLearn::AdaptGradientOptimizer, PLearn::ConjGradientOptimizer, PLearn::AutoScaledGradientOptimizer, PLearn::OnlineGramNaturalGradientOptimizer, and PLearn::GradientOptimizer.

Definition at line 93 of file Optimizer.cc.

References PLearn::OptionBase::buildoption, PLearn::declareOption(), PLearn::Object::declareOptions(), early_stop, PLearn::OptionBase::learntoption, and nstages.

Referenced by PLearn::OnlineGramNaturalGradientOptimizer::declareOptions(), PLearn::GradientOptimizer::declareOptions(), PLearn::ConjGradientOptimizer::declareOptions(), PLearn::AutoScaledGradientOptimizer::declareOptions(), and PLearn::AdaptGradientOptimizer::declareOptions().

{
    declareOption(ol, "nstages", &Optimizer::nstages, OptionBase::buildoption, 
        "Number of iterations to perform on the next call to optimizeN(..).");

    declareOption(ol, "early_stop", &Optimizer::early_stop,
                  OptionBase::learntoption, 
        "Whether an early stopping criterion has been met.");

    inherited::declareOptions(ol);
}


static const PPath& PLearn::Optimizer::declaringFile ( ) [inline, static]
Optimizer * PLearn::Optimizer::deepCopy ( CopiesMap & copies) const [virtual]
void PLearn::Optimizer::makeDeepCopyFromShallowCopy ( CopiesMap & copies) [virtual]

Does the necessary operations to transform a shallow copy (this) into a deep copy by deep-copying all the members that need to be deep-copied.

This needs to be overridden by every class that adds "complex" data members to the class, such as Vec, Mat, PP<Something>, etc. Typical implementation:

  void CLASS_OF_THIS::makeDeepCopyFromShallowCopy(CopiesMap& copies)
  {
      inherited::makeDeepCopyFromShallowCopy(copies);
      deepCopyField(complex_data_member1, copies);
      deepCopyField(complex_data_member2, copies);
      ...
  }
Parameters:
    copies  A map used by the deep-copy mechanism to keep track of already-copied objects.

Reimplemented from PLearn::Object.

Reimplemented in PLearn::AdaptGradientOptimizer, PLearn::ConjGradientOptimizer, PLearn::AutoScaledGradientOptimizer, PLearn::OnlineGramNaturalGradientOptimizer, and PLearn::GradientOptimizer.

Definition at line 190 of file Optimizer.cc.

References cost, PLearn::deepCopyField(), PLearn::Object::makeDeepCopyFromShallowCopy(), other_costs, other_params, other_proppaths, params, partial_update_vars, proppath, and PLearn::varDeepCopyField().

Referenced by PLearn::OnlineGramNaturalGradientOptimizer::makeDeepCopyFromShallowCopy(), and PLearn::ConjGradientOptimizer::makeDeepCopyFromShallowCopy().
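
Based on the References list above, Optimizer's override plausibly looks like the following sketch (a reconstruction, not the verbatim body from Optimizer.cc):

    void Optimizer::makeDeepCopyFromShallowCopy(CopiesMap& copies)
    {
        inherited::makeDeepCopyFromShallowCopy(copies);
        varDeepCopyField(cost, copies);          // single Var member
        deepCopyField(params, copies);           // VarArray and TVec members
        deepCopyField(partial_update_vars, copies);
        deepCopyField(proppath, copies);
        deepCopyField(other_costs, copies);
        deepCopyField(other_params, copies);
        deepCopyField(other_proppaths, copies);
    }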


virtual bool PLearn::Optimizer::optimizeN ( VecStatsCollector & stats_coll) [pure virtual]

Main optimization method, to be defined in subclasses.

Return true iff no further optimization is possible.

Implemented in PLearn::AdaptGradientOptimizer, PLearn::ConjGradientOptimizer, PLearn::AutoScaledGradientOptimizer, PLearn::OnlineGramNaturalGradientOptimizer, and PLearn::GradientOptimizer.
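
As a rough illustration of the contract, here is a hedged sketch of a plain gradient-descent override. It is not PLearn's GradientOptimizer: learning_rate, the params << p write-back, and the stats_coll.update(...) call are assumptions about the surrounding API.

    bool MyOptimizer::optimizeN(VecStatsCollector& stats_coll)
    {
        Vec p(params.nelems());
        Vec direction(params.nelems());
        int stage_max = stage + nstages;        // perform nstages steps
        for (; stage < stage_max && !early_stop; stage++) {
            computeOppositeGradient(direction); // direction = -dC/dparams
            params >> p;                        // read current parameter values
            for (int i = 0; i < p.length(); i++)
                p[i] += learning_rate * direction[i];
            params << p;                        // write updated values back
            stats_coll.update(cost->value);     // record the training cost
        }
        return early_stop;                      // true iff no further optimization
    }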

bool PLearn::Optimizer::remote_optimizeN ( PP< VecStatsCollector > stats_coll) [inline]

Definition at line 133 of file Optimizer.h.

References PLearn::PP< T >::isNotNull(), and PLASSERT.

Referenced by declareMethods().

bool Optimizer::remote_optimizeN(PP<VecStatsCollector> stats_coll)
{
    PLASSERT( stats_coll.isNotNull() );
    return optimizeN(*stats_coll);
}


void PLearn::Optimizer::remote_setToOptimize ( const VarArray & params,
Var  cost 
)

Remote version of setToOptimize.

Definition at line 150 of file Optimizer.cc.

References setToOptimize().

Referenced by declareMethods().


void PLearn::Optimizer::reset ( ) [virtual]

Reimplemented in PLearn::ConjGradientOptimizer.

Definition at line 84 of file Optimizer.cc.

References early_stop, and stage.

Referenced by PLearn::ConjGradientOptimizer::reset().

{
    stage = 0;
    early_stop = false;
}

virtual void PLearn::Optimizer::setPartialUpdateVars ( const VarArray & the_partial_update_vars) [inline, virtual]

Definition at line 146 of file Optimizer.h.

    {
        partial_update_vars = the_partial_update_vars;
    }
void PLearn::Optimizer::setToOptimize ( const VarArray & the_params,
Var  the_cost,
VarArray  the_other_costs = VarArray(0),
TVec< VarArray > the_other_params = TVec<VarArray>(0),
real  the_other_weight = 1 
) [virtual]

Reimplemented in PLearn::AutoScaledGradientOptimizer.

Definition at line 129 of file Optimizer.cc.

References cost, PLearn::VarArray::fprop(), i, PLearn::TVec< T >::length(), other_costs, other_params, other_proppaths, other_weight, params, PLearn::propagationPath(), PLearn::propagationPathToParentsOfPath(), proppath, and PLearn::TVec< T >::resize().

Referenced by build_(), remote_setToOptimize(), and PLearn::AutoScaledGradientOptimizer::setToOptimize().

{
    params = the_params;//displayVarGraph(params, true, 333, "p1", false);
    cost = the_cost;//displayVarGraph(cost[0], true, 333, "c1", false);
    proppath = propagationPath(params,cost);//displayVarGraph(proppath, true, 333, "x1", false);
    VarArray path_from_all_sources_to_direct_parents = propagationPathToParentsOfPath(params, cost);
    path_from_all_sources_to_direct_parents.fprop();//displayVarGraph(path_from_all_sources_to_direct_parents, true, 333, "x1", false);

    // This is probably not complete. Maybe a 
    // path_from_all_sources_to_direct_parents should also be computed and fproped
    other_costs = the_other_costs;
    other_params = the_other_params;
    other_proppaths.resize(other_costs.length());
    for(int i=0; i<other_proppaths.length(); i++)
        other_proppaths[i] = propagationPath(other_params[i],other_costs[i]);
    other_weight = the_other_weight;
}
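
A hedged end-to-end sketch of how a caller typically wires things up; model_params and training_cost stand in for an existing Var graph and are not defined here:

    PP<Optimizer> opt = new GradientOptimizer();
    opt->nstages = 100;                // steps per optimizeN() call
    opt->build();                      // finish construction
    opt->setToOptimize(model_params, training_cost);
    VecStatsCollector stats;
    bool done = opt->optimizeN(stats); // true once a stopping criterion is met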


void PLearn::Optimizer::verifyGradient ( real  step)

Verify the gradient at the current value of the parameters, using step for the finite-difference approximation of the gradient.

Definition at line 208 of file Optimizer.cc.

References cost, PLearn::VarArray::nelems(), and params.

{
    Func f(params,cost);
    Vec p(params.nelems());
    params >> p;
    f->verifyGradient(p, step);
}

void PLearn::Optimizer::verifyGradient ( real  minval,
real  maxval,
real  step 
)

Verify the gradient with uniform random initialization of the parameters, using step for the finite-difference approximation of the gradient.

Definition at line 202 of file Optimizer.cc.

References cost, and params.

{
    Func f(params,cost);
    f->verifyGradient(minval, maxval, step);
}
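
Both overloads serve as quick sanity checks once setToOptimize() has been called; a sketch (the step size 1e-6 is an arbitrary choice):

    opt->verifyGradient(1e-6);              // at the current parameter values
    opt->verifyGradient(-1.0, 1.0, 1e-6);   // at uniform random values in [-1,1]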

Member Data Documentation

bool PLearn::Optimizer::early_stop

Boolean used in subclasses to notify of early stopping.

Definition at line 77 of file Optimizer.h.

Referenced by PLearn::AdaptGradientOptimizer::build_(), declareOptions(), PLearn::AdaptGradientOptimizer::optimizeN(), PLearn::ConjGradientOptimizer::optimizeN(), and reset().

TVec< VarArray > PLearn::Optimizer::other_params

Parameters of the other costs to update (usually a subset of params).

Definition at line 84 of file Optimizer.h.

Referenced by build_(), makeDeepCopyFromShallowCopy(), PLearn::GradientOptimizer::optimizeN(), and setToOptimize().

TVec< VarArray > PLearn::Optimizer::other_proppaths

Propagation paths of other_costs.

Definition at line 86 of file Optimizer.h.

Referenced by makeDeepCopyFromShallowCopy(), PLearn::GradientOptimizer::optimizeN(), and setToOptimize().

real PLearn::Optimizer::other_weight

Weight for all the other costs.

Definition at line 88 of file Optimizer.h.

Referenced by build_(), PLearn::GradientOptimizer::optimizeN(), and setToOptimize().

VarArray PLearn::Optimizer::partial_update_vars

Vars that are partially updated.

Definition at line 72 of file Optimizer.h.

Referenced by makeDeepCopyFromShallowCopy(), and PLearn::GradientOptimizer::optimizeN().


The documentation for this class was generated from the following files:
 Optimizer.h
 Optimizer.cc