PLearn 0.1
PLearn::RBMGenericParameters Class Reference

Stores and learns the parameters between two layers of an RBM.

#include <RBMGenericParameters.h>

Inheritance diagram for PLearn::RBMGenericParameters: (diagram not shown)
Collaboration diagram for PLearn::RBMGenericParameters: (diagram not shown)


Public Member Functions

 RBMGenericParameters (real the_learning_rate=0)
 Default constructor.
 RBMGenericParameters (string down_types, string up_types, real the_learning_rate=0)
 Constructor from two string prototypes.
virtual void accumulatePosStats (const Vec &down_values, const Vec &up_values)
 Accumulates positive phase statistics to *_pos_stats.
virtual void accumulateNegStats (const Vec &down_values, const Vec &up_values)
 Accumulates negative phase statistics to *_neg_stats.
virtual void update ()
 Updates parameters according to the contrastive divergence gradient.
virtual void clearStats ()
 Clears all statistics accumulated so far.
virtual void computeUnitActivations (int start, int length, const Vec &activations) const
 Computes the activation vectors of "length" units, starting from "start", and concatenates them into "activations".
virtual void bpropUpdate (const Vec &input, const Vec &output, Vec &input_gradient, const Vec &output_gradient)
 Adapt based on the output gradient: this method should only be called just after a corresponding fprop; it should be called with the same arguments as fprop for the first two arguments (and output should not have been modified since then).
virtual void forget ()
 Resets the parameters to the state they would be in BEFORE starting training.
virtual int nParameters (bool share_up_params, bool share_down_params) const
 Returns the number of parameters (with flags to specify whether the up-parameters and/or the down-parameters should be counted).
virtual Vec makeParametersPointHere (const Vec &global_parameters, bool share_up_params, bool share_down_params)
 Make the parameters data be sub-vectors of the given global_parameters.
virtual string classname () const
virtual OptionList & getOptionList () const
virtual OptionMap & getOptionMap () const
virtual RemoteMethodMap & getRemoteMethodMap () const
virtual RBMGenericParameters * deepCopy (CopiesMap &copies) const
virtual void build ()
 Post-constructor.
virtual void makeDeepCopyFromShallowCopy (CopiesMap &copies)
 Transforms a shallow copy into a deep copy.

Static Public Member Functions

static string _classname_ ()
static OptionList & _getOptionList_ ()
static RemoteMethodMap & _getRemoteMethodMap_ ()
static Object * _new_instance_for_typemap_ ()
static bool _isa_ (const Object *o)
static void _static_initialize_ ()
static const PPath & declaringFile ()

Public Attributes

Mat weights
 Matrix containing unit-to-unit weights (output_size × input_size)
TVec< Vec > up_units_params
 Element i contains inner parameters (like the bias) of up unit i.
TVec< Vec > down_units_params
 Element i contains inner parameters (like the bias) of down unit i.
Mat weights_pos_stats
 Accumulates positive contribution to the weights' gradient.
Mat weights_neg_stats
 Accumulates negative contribution to the weights' gradient.
TVec< Vec > up_units_params_pos_stats
 Accumulates positive contribution to the gradient of up_units_params.
TVec< Vec > up_units_params_neg_stats
 Accumulates negative contribution to the gradient of up_units_params.
TVec< Vec > down_units_params_pos_stats
 Accumulates positive contribution to the gradient of down_units_params.
TVec< Vec > down_units_params_neg_stats
 Accumulates negative contribution to the gradient of down_units_params.

Static Public Attributes

static StaticInitializer _static_initializer_

Protected Member Functions

virtual void computeLinearUnitActivations (int i, const Vec &activations) const
 Computes the activations vector of unit "i", assuming it is linear. "i" indexes an up unit if "going_up", else a down unit.
virtual void computeQuadraticUnitActivations (int i, const Vec &activations) const
 Computes the activations vector of unit "i", assuming it is quadratic. "i" indexes an up unit if "going_up", else a down unit.

Static Protected Member Functions

static void declareOptions (OptionList &ol)
 Declares the class options.

Private Types

typedef RBMParameters inherited

Private Member Functions

void build_ ()
 This does the actual building.

Detailed Description

Stores and learns the parameters between two layers of an RBM.

Todo:
yes
Deprecated:
Use ../RBMMixedConnection.h instead

Definition at line 56 of file RBMGenericParameters.h.


Member Typedef Documentation

typedef RBMParameters PLearn::RBMGenericParameters::inherited [private]

Reimplemented from PLearn::RBMParameters.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 58 of file RBMGenericParameters.h.


Constructor & Destructor Documentation

PLearn::RBMGenericParameters::RBMGenericParameters ( real  the_learning_rate = 0)

Default constructor.

Definition at line 53 of file RBMGenericParameters.cc.

RBMGenericParameters::RBMGenericParameters( real the_learning_rate ) :
    inherited(the_learning_rate)
{
}
PLearn::RBMGenericParameters::RBMGenericParameters ( string  down_types,
string  up_types,
real  the_learning_rate = 0 
)

Constructor from two string prototypes.

Definition at line 58 of file RBMGenericParameters.cc.

References build().

RBMGenericParameters::RBMGenericParameters( string down_types,
                                            string up_types,
                                            real the_learning_rate ) :
    inherited( down_types, up_types, the_learning_rate )
{
    // We're not sure inherited::build() has been called
    build();
}
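
A usage sketch (illustration only, not taken from the PLearn sources): each character of the type strings describes one unit, 'l' for a linear unit and 'q' for a quadratic one, as interpreted by build_() below. The layer sizes are assumed to be derived from these strings by the inherited RBMParameters constructor, and PP<> is PLearn's reference-counted smart pointer.

    // Hypothetical example: 4 linear "down" units, 2 linear and 1 quadratic "up" unit,
    // with learning rate 0.01. The constructor calls build() itself.
    PP<RBMGenericParameters> params =
        new RBMGenericParameters( "llll",   // down_types: one char per down unit
                                  "llq",    // up_types:   one char per up unit
                                  0.01 );   // the_learning_rate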


Member Function Documentation

string PLearn::RBMGenericParameters::_classname_ ( ) [static]

Reimplemented from PLearn::RBMParameters.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 51 of file RBMGenericParameters.cc.

OptionList & PLearn::RBMGenericParameters::_getOptionList_ ( ) [static]

Reimplemented from PLearn::RBMParameters.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 51 of file RBMGenericParameters.cc.

RemoteMethodMap & PLearn::RBMGenericParameters::_getRemoteMethodMap_ ( ) [static]

Reimplemented from PLearn::RBMParameters.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 51 of file RBMGenericParameters.cc.

bool PLearn::RBMGenericParameters::_isa_ ( const Object * o) [static]

Reimplemented from PLearn::RBMParameters.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 51 of file RBMGenericParameters.cc.

Object * PLearn::RBMGenericParameters::_new_instance_for_typemap_ ( ) [static]

Reimplemented from PLearn::Object.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 51 of file RBMGenericParameters.cc.

void PLearn::RBMGenericParameters::_static_initialize_ ( ) [static]

Reimplemented from PLearn::RBMParameters.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 51 of file RBMGenericParameters.cc.

void PLearn::RBMGenericParameters::accumulateNegStats ( const Vec & down_values,
const Vec & up_values 
) [virtual]

Accumulates negative phase statistics to *_neg_stats.

Implements PLearn::RBMParameters.

Definition at line 226 of file RBMGenericParameters.cc.

References PLearn::RBMParameters::down_layer_size, down_units_params, down_units_params_neg_stats, PLearn::RBMParameters::down_units_types, PLearn::externalProductAcc(), i, PLearn::RBMParameters::neg_count, PLearn::RBMParameters::up_layer_size, up_units_params, up_units_params_neg_stats, PLearn::RBMParameters::up_units_types, and weights_neg_stats.

{
    // weights_neg_stats += up_values * down_values'
    externalProductAcc( weights_neg_stats, up_values, down_values );

    for( int i=0 ; i<down_layer_size ; i++ )
    {
        // the bias is updated the same way for 'l' and 'g' units
        down_units_params_neg_stats[i][0] += down_values[i];

        // update also 'g' units' quadratic term
        if( down_units_types[i] == 'g' )
            down_units_params_neg_stats[i][1] +=
                2 * down_units_params[i][1] * down_values[i] * down_values[i];
    }

    for( int i=0 ; i<up_layer_size ; i++ )
    {
        // the bias is updated the same way for 'l' and 'g' units
        up_units_params_neg_stats[i][0] += up_values[i];

        // update also 'g' units' quadratic term
        if( up_units_types[i] == 'g' )
            up_units_params_neg_stats[i][1] +=
                2 * up_units_params[i][1] * up_values[i] * up_values[i];
    }

    neg_count++;
}

void PLearn::RBMGenericParameters::accumulatePosStats ( const Vec & down_values,
const Vec & up_values 
) [virtual]

Accumulates positive phase statistics to *_pos_stats.

Implements PLearn::RBMParameters.

Definition at line 195 of file RBMGenericParameters.cc.

References PLearn::RBMParameters::down_layer_size, down_units_params, down_units_params_pos_stats, PLearn::RBMParameters::down_units_types, PLearn::externalProductAcc(), i, PLearn::RBMParameters::pos_count, PLearn::RBMParameters::up_layer_size, up_units_params, up_units_params_pos_stats, PLearn::RBMParameters::up_units_types, and weights_pos_stats.

{
    // weights_pos_stats += up_values * down_values'
    externalProductAcc( weights_pos_stats, up_values, down_values );

    for( int i=0 ; i<down_layer_size ; i++ )
    {
        // the bias is updated the same way for 'l' and 'g' units
        down_units_params_pos_stats[i][0] += down_values[i];

        // update also 'g' units' quadratic term
        if( down_units_types[i] == 'g' )
            down_units_params_pos_stats[i][1] +=
                2 * down_units_params[i][1] * down_values[i] * down_values[i];
    }

    for( int i=0 ; i<up_layer_size ; i++ )
    {
        // the bias is updated the same way for 'l' and 'g' units
        up_units_params_pos_stats[i][0] += up_values[i];

        // update also 'g' units' quadratic term
        if( up_units_types[i] == 'g' )
            up_units_params_pos_stats[i][1] +=
                2 * up_units_params[i][1] * up_values[i] * up_values[i];
    }

    pos_count++;
}
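
Together, the two accumulators implement the usual contrastive-divergence bookkeeping. A hedged sketch of one statistics pass, assuming params was built as in the constructor example, v_data/h_data hold positive-phase (data-driven) down/up values and v_model/h_model hold negative-phase (reconstruction) values of the appropriate layer sizes:

    params->accumulatePosStats( v_data,  h_data  );   // updates *_pos_stats and pos_count
    params->accumulateNegStats( v_model, h_model );   // updates *_neg_stats and neg_count
    params->update();       // contrastive-divergence step from the accumulated statistics
    params->clearStats();   // reset accumulators before the next batch (assumed to be the caller's job)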

void PLearn::RBMGenericParameters::bpropUpdate ( const Vec & input,
const Vec & output,
Vec & input_gradient,
const Vec & output_gradient 
) [virtual]

Adapt based on the output gradient: this method should only be called just after a corresponding fprop; it should be called with the same arguments as fprop for the first two arguments (and output should not have been modified since then).

This version also computes the input gradient.

Since sub-classes are supposed to learn ONLINE, the object is 'ready-to-be-used' just after any bpropUpdate. N.B. A DEFAULT IMPLEMENTATION IS PROVIDED IN THE SUPER-CLASS, WHICH JUST CALLS bpropUpdate(input, output, input_gradient, output_gradient) AND IGNORES THE INPUT GRADIENT; FOR THIS VERSION, THE DEFAULT IMPLEMENTATION IN THE SUPER-CLASS JUST RAISES A PLERROR.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 396 of file RBMGenericParameters.cc.

References PLERROR.

{
    PLERROR( "RBMGenericParameters::bpropUpdate() not implemented yet.\n"
             "If you only have linear units on up and down layer, you should\n"
             "consider using RBMLLParameters instead.\n" );
}
void PLearn::RBMGenericParameters::build ( ) [virtual]

Post-constructor.

The normal implementation simply calls inherited::build(), then this class's build_(). This method should be callable again at later times, after modifying some option fields to change the "architecture" of the object.

Reimplemented from PLearn::RBMParameters.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 173 of file RBMGenericParameters.cc.

References PLearn::RBMParameters::build(), and build_().

Referenced by PLearn::RBMJointGenericParameters::build(), and RBMGenericParameters().

void PLearn::RBMGenericParameters::build_ ( ) [private]

This does the actual building.

Reimplemented from PLearn::RBMParameters.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 97 of file RBMGenericParameters.cc.

References clearStats(), PLearn::RBMParameters::down_layer_size, down_units_params, down_units_params_neg_stats, down_units_params_pos_stats, PLearn::RBMParameters::down_units_types, forget(), i, PLearn::TMat< T >::length(), PLearn::OnlineLearningModule::output_size, PLERROR, PLearn::TVec< T >::resize(), PLearn::TMat< T >::resize(), PLearn::RBMParameters::up_layer_size, up_units_params, up_units_params_neg_stats, up_units_params_pos_stats, PLearn::RBMParameters::up_units_types, weights, weights_neg_stats, weights_pos_stats, and PLearn::TMat< T >::width().

Referenced by build().

{
    if( up_layer_size == 0 || down_layer_size == 0 )
        return;

    output_size = 0;
    bool needs_forget = false; // do we need to reinitialize the parameters?

    if( weights.length() != up_layer_size ||
        weights.width() != down_layer_size )
    {
        weights.resize( up_layer_size, down_layer_size );
        needs_forget = true;
    }

    weights_pos_stats.resize( up_layer_size, down_layer_size );
    weights_neg_stats.resize( up_layer_size, down_layer_size );

    down_units_params.resize( down_layer_size );
    down_units_params_pos_stats.resize( down_layer_size );
    down_units_params_neg_stats.resize( down_layer_size );
    for( int i=0 ; i<down_layer_size ; i++ )
    {
        char dut_i = down_units_types[i];
        if( dut_i == 'l' ) // linear activation unit
        {
            down_units_params[i].resize(1);
            down_units_params_pos_stats[i].resize(1);
            down_units_params_neg_stats[i].resize(1);
        }
        else if( dut_i == 'q' ) // quadratic
        {
            down_units_params[i].resize(2);
            down_units_params_pos_stats[i].resize(2);
            down_units_params_neg_stats[i].resize(2);
        }
        else
            PLERROR( "RBMGenericParameters::build_() - value '%c' for"
                     " down_units_types[%d]\n"
                     "is unknown. Supported values are 'l' and 'q'.\n",
                     dut_i, i );
    }

    up_units_params.resize( up_layer_size );
    up_units_params_pos_stats.resize( up_layer_size );
    up_units_params_neg_stats.resize( up_layer_size );
    for( int i=0 ; i<up_layer_size ; i++ )
    {
        char uut_i = up_units_types[i];
        if( uut_i == 'l' ) // linear activation unit
        {
            up_units_params[i].resize(1);
            up_units_params_pos_stats[i].resize(1);
            up_units_params_neg_stats[i].resize(1);
            output_size += 1;
        }
        else if( uut_i == 'q' )
        {
            up_units_params[i].resize(2);
            up_units_params_pos_stats[i].resize(2);
            up_units_params_neg_stats[i].resize(2);
            output_size += 2;
        }
        else
            PLERROR( "RBMGenericParameters::build_() - value '%c' for"
                     " up_units_types[%d]\n"
                     "is unknown. Supported values are 'l' and 'q'.\n",
                     uut_i, i );
    }

    if( needs_forget )
        forget();

    clearStats();
}
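
For example, following the counting in the loop above, up_units_types == "llq" gives

    output_size = 1 + 1 + 2 = 4

since each 'l' unit reserves one activation slot and each 'q' unit reserves two.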

string PLearn::RBMGenericParameters::classname ( ) const [virtual]

Reimplemented from PLearn::Object.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 51 of file RBMGenericParameters.cc.

void PLearn::RBMGenericParameters::clearStats ( ) [virtual]

Clears all statistics accumulated so far.

void PLearn::RBMGenericParameters::computeLinearUnitActivations ( int  i,
const Vec & activations 
) const [protected, virtual]

Computes the activations vector of unit "i", assuming it is linear. "i" indexes an up unit if "going_up", else a down unit.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 304 of file RBMGenericParameters.cc.

References i, PLearn::TVec< T >::length(), PLASSERT, PLearn::product(), and PLearn::transposeProduct().

{
    PLASSERT( activations.length() == 1 );

    if( going_up )
    {
        PLASSERT( up_units_types[i] == 'l' );

        // activations[0] = sum_j weights(i,j) input_vec[j] + b[i]
        product( activations, weights.subMatRows(i,1), input_vec );
        activations[0] += up_units_params[i][0];
    }
    else
    {
        PLASSERT( down_units_types[i] == 'l' );

        // activations[0] = sum_j weights(j,i) input_vec[j] + b[i]
        transposeProduct( activations, weights.subMatColumns(i,1), input_vec );
        activations[0] += down_units_params[i][0];
    }
}

void PLearn::RBMGenericParameters::computeQuadraticUnitActivations ( int  i,
const Vec & activations 
) const [protected, virtual]

Computes the activations vector of unit "i", assuming it is quadratic. "i" indexes an up unit if "going_up", else a down unit.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 327 of file RBMGenericParameters.cc.

References i, PLearn::TVec< T >::length(), PLASSERT, PLearn::product(), and PLearn::transposeProduct().

{
    PLASSERT( activations.length() == 2 );

    if( going_up )
    {
        PLASSERT( up_units_types[i] == 'q' );

        // activations[0] = -(sum_j weights(i,j) input_vec[j] + b[i])
        //                    / (2 * up_units_params[i][1]^2)
        product( activations, weights.subMatRows(i,1), input_vec );
        real a_i = up_units_params[i][1];
        activations[0] = -(activations[0] + up_units_params[i][0])
                           / (2 * a_i * a_i);

        // activations[1] = 1 / (2 * up_units_params[i][1]^2)
        activations[1] = 1. / (2. * a_i * a_i);
    }
    else
    {
        PLASSERT( down_units_types[i] == 'q' );

        // activations[0] = -(sum_j weights(j,i) input_vec[j] + b[i])
        //                    / (2 * down_units_params[i][1]^2)
        transposeProduct( activations, weights.subMatColumns(i,1), input_vec );
        real a_i = down_units_params[i][1];
        activations[0] = -(activations[0] + down_units_params[i][0])
                           / (2 * a_i * a_i);

        // activations[1] = 1 / (2 * down_units_params[i][1]^2)
        activations[1] = 1. / (2. * a_i * a_i);
    }
}

void PLearn::RBMGenericParameters::computeUnitActivations ( int  start,
int  length,
const Vec & activations 
) const [virtual]

Computes the activation vectors of "length" units, starting from "start", and concatenates them into "activations".

"start" indexes an up unit if "going_up", else a down unit.

Implements PLearn::RBMParameters.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 363 of file RBMGenericParameters.cc.

References i, PLASSERT, PLERROR, and PLearn::TVec< T >::subVec().

{
    string units_types;
    if( going_up )
        units_types = up_units_types;
    else
        units_types = down_units_types;

    PLASSERT( start+length <= (int) units_types.length() );
    int cur_pos = 0; // position index inside activations

    for( int i=start ; i<start+length ; i++ )
    {
        char ut_i = units_types[i];
        if( ut_i == 'l' )
        {
            computeLinearUnitActivations( i, activations.subVec(cur_pos, 1) );
            cur_pos++;
        }
        else if( ut_i == 'q' )
        {
            computeQuadraticUnitActivations( i,
                                             activations.subVec(cur_pos, 2) );
            cur_pos += 2;
        }
        else
            PLERROR( "RBMGenericParameters::computeUnitActivations():\n"
                     "value '%c' for units_types[%d] is unknown.\n"
                     "Supported values are 'l' and 'q'.\n", ut_i, i );
    }
}
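
The "activations" Vec passed in must therefore provide one slot per 'l' unit and two per 'q' unit in the requested range, exactly as cur_pos advances above. A hypothetical helper (illustration only, not part of PLearn) that computes the required length:

    int activationsSize( const string& units_types, int start, int length )
    {
        int size = 0;
        for( int i = start ; i < start + length ; i++ )
            size += ( units_types[i] == 'q' ) ? 2 : 1;  // 'l' -> 1 slot, 'q' -> 2 slots
        return size;
    }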

void PLearn::RBMGenericParameters::declareOptions ( OptionList & ol) [static, protected]

Declares the class options.

Reimplemented from PLearn::RBMParameters.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 66 of file RBMGenericParameters.cc.

References PLearn::declareOption(), PLearn::RBMParameters::declareOptions(), down_units_params, PLearn::OptionBase::learntoption, up_units_params, and weights.

Referenced by PLearn::RBMJointGenericParameters::declareOptions().

{
    // ### Declare all of this object's options here.
    // ### For the "flags" of each option, you should typically specify
    // ### one of OptionBase::buildoption, OptionBase::learntoption or
    // ### OptionBase::tuningoption. If you don't provide one of these three,
    // ### this option will be ignored when loading values from a script.
    // ### You can also combine flags, for example with OptionBase::nosave:
    // ### (OptionBase::buildoption | OptionBase::nosave)

    declareOption(ol, "weights", &RBMGenericParameters::weights,
                  OptionBase::learntoption,
                  "Matrix containing unit-to-unit weights (output_size ×"
                  " input_size)");

    declareOption(ol, "up_units_params",
                  &RBMGenericParameters::up_units_params,
                  OptionBase::learntoption,
                  "Element i contains inner parameters (like the bias) of up"
                  " unit i");

    declareOption(ol, "down_units_params",
                  &RBMGenericParameters::down_units_params,
                  OptionBase::learntoption,
                  "Element i contains inner parameters (like the bias) of down"
                  " unit i");

    // Now call the parent class' declareOptions
    inherited::declareOptions(ol);
}

static const PPath& PLearn::RBMGenericParameters::declaringFile ( ) [inline, static]

Reimplemented from PLearn::RBMParameters.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 174 of file RBMGenericParameters.h.


RBMGenericParameters * PLearn::RBMGenericParameters::deepCopy ( CopiesMap & copies) const [virtual]

Reimplemented from PLearn::RBMParameters.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 51 of file RBMGenericParameters.cc.

void PLearn::RBMGenericParameters::forget ( ) [virtual]

Resets the parameters to the state they would be in BEFORE starting training.

Note that this method is necessarily called from build().

Implements PLearn::OnlineLearningModule.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 407 of file RBMGenericParameters.cc.

References PLearn::clear(), PLearn::TMat< T >::clear(), clearStats(), d, PLearn::RBMParameters::down_layer_size, down_units_params, PLearn::RBMParameters::initialization_method, PLearn::max(), PLearn::OnlineLearningModule::random_gen, PLearn::sqrt(), PLearn::RBMParameters::up_layer_size, up_units_params, and weights.

Referenced by build_().

{
    if( initialization_method == "zero" )
        weights.clear();
    else
    {
        if( !random_gen )
            random_gen = new PRandom();

        real d = 1. / max( down_layer_size, up_layer_size );
        if( initialization_method == "uniform_sqrt" )
            d = sqrt( d );

        random_gen->fill_random_uniform( weights, -d, d );
    }

    for( int i=0 ; i<down_layer_size ; i++ )
        down_units_params[i].clear();

    for( int i=0 ; i<up_layer_size ; i++ )
        up_units_params[i].clear();

    clearStats();
}
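
Restated as a formula (from the code above): when initialization_method is not "zero", the weights are drawn i.i.d. as

    weights(i,j) ~ Uniform(-d, d),   d = 1 / max( down_layer_size, up_layer_size )

with d replaced by sqrt(d) when initialization_method == "uniform_sqrt". All entries of up_units_params and down_units_params are cleared to zero, and the accumulated statistics are reset via clearStats().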

OptionList & PLearn::RBMGenericParameters::getOptionList ( ) const [virtual]

Reimplemented from PLearn::Object.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 51 of file RBMGenericParameters.cc.

OptionMap & PLearn::RBMGenericParameters::getOptionMap ( ) const [virtual]

Reimplemented from PLearn::Object.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 51 of file RBMGenericParameters.cc.

RemoteMethodMap & PLearn::RBMGenericParameters::getRemoteMethodMap ( ) const [virtual]

Reimplemented from PLearn::Object.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 51 of file RBMGenericParameters.cc.

void PLearn::RBMGenericParameters::makeDeepCopyFromShallowCopy ( CopiesMap & copies) [virtual]

Transforms a shallow copy into a deep copy.

Vec PLearn::RBMGenericParameters::makeParametersPointHere ( const Vec & global_parameters,
bool  share_up_params,
bool  share_down_params 
) [virtual]

Make the parameters data be sub-vectors of the given global_parameters.

The argument should have size >= nParameters(share_up_params, share_down_params). The result is a Vec that starts just after this object's parameters end, i.e. n = nParameters(share_up_params, share_down_params); result = global_parameters.subVec(n, global_parameters.size()-n). This makes it easy to chain calls of this method over multiple RBMParameters.

Implements PLearn::RBMParameters.

Definition at line 462 of file RBMGenericParameters.cc.

References PLearn::TVec< T >::data(), down_units_params, i, PLearn::TVec< T >::length(), m, PLearn::TMat< T >::makeSharedValue(), PLearn::TVec< T >::makeSharedValue(), n, nParameters(), PLERROR, PLearn::TVec< T >::size(), PLearn::TMat< T >::size(), PLearn::TVec< T >::subVec(), up_units_params, and weights.

{
    int n = nParameters(share_up_params,share_down_params);
    int m = global_parameters.size();
    if (m<n)
        PLERROR("RBMLLParameters::makeParametersPointHere: argument has length %d, should be longer than nParameters()=%d",m,n);
    real* p = global_parameters.data();
    weights.makeSharedValue(p,weights.size());
    p+=weights.size();
    if(share_up_params)
        for (int i=0;i<up_units_params.length();i++)
        {
            up_units_params[i].makeSharedValue(p,up_units_params[i].size());
            p+=up_units_params[i].size();
        }
    if (share_down_params)
        for (int i=0;i<down_units_params.length();i++)
        {
            down_units_params[i].makeSharedValue(p,down_units_params[i].size());
            p+=down_units_params[i].size();
        }
    return global_parameters.subVec(n,m-n);
}
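
A hedged sketch of the chaining described above, for two hypothetical parameter objects params1 and params2 stacked as in a DBN; the share_* flags chosen here (not re-counting params2's shared down-parameters) are an assumption about the caller's sharing scheme:

    int n1 = params1->nParameters( true, true );
    int n2 = params2->nParameters( true, false );   // assume params2's down params are shared
    Vec global( n1 + n2 );
    Vec rest = params1->makeParametersPointHere( global, true, true );
    rest     = params2->makeParametersPointHere( rest,   true, false );
    // if the counts are consistent, rest is empty here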

int PLearn::RBMGenericParameters::nParameters ( bool  share_up_params,
bool  share_down_params 
) const [virtual]

Returns the number of parameters (with flags to specify whether the up-parameters and/or the down-parameters should be counted).

Implements PLearn::RBMParameters.

Definition at line 445 of file RBMGenericParameters.cc.

References down_units_params, PLearn::TVec< T >::length(), m, PLearn::TMat< T >::size(), up_units_params, and weights.

Referenced by makeParametersPointHere().

{
    int m = weights.size();
    if (share_up_params)
        for (int i=0;i<up_units_params.length();i++)
            m += up_units_params[i].size();
    if (share_down_params)
        for (int i=0;i<down_units_params.length();i++)
            m += down_units_params[i].size();
    return m;
}

void PLearn::RBMGenericParameters::update ( ) [virtual]

Updates parameters according to the contrastive divergence gradient.


Member Data Documentation

StaticInitializer PLearn::RBMGenericParameters::_static_initializer_ [static]

Reimplemented from PLearn::RBMParameters.

Reimplemented in PLearn::RBMJointGenericParameters.

Definition at line 174 of file RBMGenericParameters.h.

TVec< Vec > PLearn::RBMGenericParameters::down_units_params_neg_stats

Accumulates negative contribution to the gradient of down_units_params.

Definition at line 89 of file RBMGenericParameters.h.

Referenced by accumulateNegStats(), PLearn::RBMJointGenericParameters::build_(), build_(), clearStats(), makeDeepCopyFromShallowCopy(), and update().

TVec< Vec > PLearn::RBMGenericParameters::down_units_params_pos_stats

Accumulates positive contribution to the gradient of down_units_params.

Definition at line 87 of file RBMGenericParameters.h.

Referenced by accumulatePosStats(), PLearn::RBMJointGenericParameters::build_(), build_(), clearStats(), makeDeepCopyFromShallowCopy(), and update().

TVec< Vec > PLearn::RBMGenericParameters::up_units_params_neg_stats

Accumulates negative contribution to the gradient of up_units_params.

Definition at line 85 of file RBMGenericParameters.h.

Referenced by accumulateNegStats(), PLearn::RBMJointGenericParameters::build_(), build_(), clearStats(), makeDeepCopyFromShallowCopy(), and update().

TVec< Vec > PLearn::RBMGenericParameters::up_units_params_pos_stats

Accumulates positive contribution to the gradient of up_units_params.

Definition at line 83 of file RBMGenericParameters.h.

Referenced by accumulatePosStats(), PLearn::RBMJointGenericParameters::build_(), build_(), clearStats(), makeDeepCopyFromShallowCopy(), and update().

Mat PLearn::RBMGenericParameters::weights_neg_stats

Accumulates negative contribution to the weights' gradient.

Definition at line 80 of file RBMGenericParameters.h.

Referenced by accumulateNegStats(), PLearn::RBMJointGenericParameters::build_(), build_(), clearStats(), makeDeepCopyFromShallowCopy(), and update().

Mat PLearn::RBMGenericParameters::weights_pos_stats

Accumulates positive contribution to the weights' gradient.

Definition at line 77 of file RBMGenericParameters.h.

Referenced by accumulatePosStats(), PLearn::RBMJointGenericParameters::build_(), build_(), clearStats(), makeDeepCopyFromShallowCopy(), and update().


The documentation for this class was generated from the following files:

RBMGenericParameters.h
RBMGenericParameters.cc