PLearn 0.1
PLearn::DivisiveNormalizationKernel Class Reference

#include <DivisiveNormalizationKernel.h>

Inheritance: PLearn::DivisiveNormalizationKernel derives from PLearn::SourceKernel, which itself derives from PLearn::Kernel.


Public Member Functions

 DivisiveNormalizationKernel ()
 Default constructor.
 DivisiveNormalizationKernel (Ker the_source, bool the_remove_bias=false)
 Created from an existing kernel.
virtual void build ()
 Simply calls inherited::build() then build_().
virtual void makeDeepCopyFromShallowCopy (CopiesMap &copies)
 Transforms a shallow copy into a deep copy.
virtual string classname () const
virtual OptionList & getOptionList () const
virtual OptionMap & getOptionMap () const
virtual RemoteMethodMap & getRemoteMethodMap () const
virtual DivisiveNormalizationKernel * deepCopy (CopiesMap &copies) const
virtual real evaluate (const Vec &x1, const Vec &x2) const
 Overridden.
virtual real evaluate_i_j (int i, int j) const
 returns evaluate(data(i),data(j))
virtual real evaluate_i_x (int i, const Vec &x, real squared_norm_of_x=-1) const
 Return evaluate(data(i),x).
virtual real evaluate_x_i (const Vec &x, int i, real squared_norm_of_x=-1) const
 returns evaluate(x,data(i)) [default version calls evaluate_i_x if kernel is_symmetric]
virtual real evaluate_i_x_again (int i, const Vec &x, real squared_norm_of_x=-1, bool first_time=false) const
 Return evaluate(data(i),x), where x is the same as in the previous call to this same function (unless 'first_time' is true).
virtual real evaluate_x_i_again (const Vec &x, int i, real squared_norm_of_x=-1, bool first_time=false) const
virtual void computeGramMatrix (Mat K) const
 Call evaluate_i_j to fill each of the entries (i,j) of symmetric matrix K.
virtual void setDataForKernelMatrix (VMat the_data)
 Subclasses may override this method to provide efficient kernel matrix access.

Static Public Member Functions

static string _classname_ ()
static OptionList & _getOptionList_ ()
static RemoteMethodMap & _getRemoteMethodMap_ ()
static Object * _new_instance_for_typemap_ ()
static bool _isa_ (const Object *o)
static void _static_initialize_ ()
static const PPath & declaringFile ()

Public Attributes

bool data_will_change
bool remove_bias

Static Public Attributes

static StaticInitializer _static_initializer_

Protected Member Functions

real computeAverage (const Vec &x, bool on_row, real squared_norm_of_x=-1) const
 Return the average of K(x,x_i) or K(x_i,x), depending on the value of 'on_row' (true or false, respectively).

Static Protected Member Functions

static void declareOptions (OptionList &ol)
 Declares this class' options.

Protected Attributes

Vec average_col
Vec average_row
real avg_evaluate_i_x_again
 The last average computed in evaluate_i_x_again().
real avg_evaluate_x_i_again
 The last average computed in evaluate_x_i_again().

Private Types

typedef SourceKernel inherited

Private Member Functions

void build_ ()
 This does the actual building.

Private Attributes

Vec all_k_x
 Used to store the values of the source kernel.

Detailed Description

Definition at line 52 of file DivisiveNormalizationKernel.h.
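
This kernel wraps an existing source kernel and divisively normalizes its values by the average of the source kernel over the data set with setDataForKernelMatrix(). In LaTeX notation (a sketch derived from the evaluate() and computeAverage() implementations below, with \tilde{K} denoting the normalized kernel and \bar{K} the data average of the source kernel K):

    \tilde{K}(x, y) = \frac{K(x, y)}{\sqrt{\bar{K}(x)\,\bar{K}(y)}},
    \qquad
    \bar{K}(x) = \frac{1}{n} \sum_{i=1}^{n} K(x, x_i)

When the 'remove_bias' option is set, the diagonal terms K(x_i, x_i) are excluded from the averages computed over the stored data (see setDataForKernelMatrix()).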


Member Typedef Documentation

typedef SourceKernel PLearn::DivisiveNormalizationKernel::inherited [private]

Reimplemented from PLearn::SourceKernel.

Definition at line 57 of file DivisiveNormalizationKernel.h.


Constructor & Destructor Documentation

PLearn::DivisiveNormalizationKernel::DivisiveNormalizationKernel ( )

Default constructor.

Definition at line 52 of file DivisiveNormalizationKernel.cc.

    : data_will_change(false),
      remove_bias(false)
{}
PLearn::DivisiveNormalizationKernel::DivisiveNormalizationKernel ( Ker the_source, bool the_remove_bias = false )

Created from an existing kernel.

Definition at line 57 of file DivisiveNormalizationKernel.cc.

References build(), and PLearn::SourceKernel::source_kernel.

    : data_will_change(false),
      remove_bias(the_remove_bias)
{
    source_kernel = the_source;
    build();
}
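
A minimal usage sketch (hypothetical: the GaussianKernel source, the VMat 'train_set' and the indices are placeholders chosen for illustration; any PLearn Kernel subclass can serve as the source):

    Ker source = new GaussianKernel();                        // placeholder source kernel
    PP<DivisiveNormalizationKernel> normalized =
        new DivisiveNormalizationKernel(source, /* the_remove_bias = */ true);
    normalized->setDataForKernelMatrix(train_set);            // 'train_set' is a placeholder VMat
    real k01 = normalized->evaluate_i_j(0, 1);                // normalized kernel value between points 0 and 1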


Member Function Documentation

string PLearn::DivisiveNormalizationKernel::_classname_ ( ) [static]

Reimplemented from PLearn::SourceKernel.

Definition at line 72 of file DivisiveNormalizationKernel.cc.

OptionList & PLearn::DivisiveNormalizationKernel::_getOptionList_ ( ) [static]

Reimplemented from PLearn::SourceKernel.

Definition at line 72 of file DivisiveNormalizationKernel.cc.

RemoteMethodMap & PLearn::DivisiveNormalizationKernel::_getRemoteMethodMap_ ( ) [static]

Reimplemented from PLearn::SourceKernel.

Definition at line 72 of file DivisiveNormalizationKernel.cc.

bool PLearn::DivisiveNormalizationKernel::_isa_ ( const Object * o ) [static]

Reimplemented from PLearn::SourceKernel.

Definition at line 72 of file DivisiveNormalizationKernel.cc.

Object * PLearn::DivisiveNormalizationKernel::_new_instance_for_typemap_ ( ) [static]

Reimplemented from PLearn::SourceKernel.

Definition at line 72 of file DivisiveNormalizationKernel.cc.

void PLearn::DivisiveNormalizationKernel::_static_initialize_ ( ) [static]

Reimplemented from PLearn::SourceKernel.

Definition at line 72 of file DivisiveNormalizationKernel.cc.

void PLearn::DivisiveNormalizationKernel::build ( ) [virtual]

Simply calls inherited::build() then build_().

Reimplemented from PLearn::SourceKernel.

Definition at line 103 of file DivisiveNormalizationKernel.cc.

References PLearn::SourceKernel::build(), and build_().

Referenced by DivisiveNormalizationKernel().


void PLearn::DivisiveNormalizationKernel::build_ ( ) [private]

This does the actual building.

Reimplemented from PLearn::SourceKernel.

Definition at line 112 of file DivisiveNormalizationKernel.cc.

Referenced by build().

{
    // ### This method should do the real building of the object,
    // ### according to set 'options', in *any* situation. 
    // ### Typical situations include:
    // ###  - Initial building of an object from a few user-specified options
    // ###  - Building of a "reloaded" object: i.e. from the complete set of all serialised options.
    // ###  - Updating or "re-building" of an object after a few "tuning" options have been modified.
    // ### You should assume that the parent class' build_() has already been called.
}


string PLearn::DivisiveNormalizationKernel::classname ( ) const [virtual]

Reimplemented from PLearn::SourceKernel.

Definition at line 72 of file DivisiveNormalizationKernel.cc.

real PLearn::DivisiveNormalizationKernel::computeAverage ( const Vec & x, bool on_row, real squared_norm_of_x = -1 ) const [inline, protected]

Return the average of K(x,x_i) or K(x_i,x), depending on the value of 'on_row' (true or false, respectively).

Definition at line 126 of file DivisiveNormalizationKernel.cc.

References all_k_x, PLearn::Kernel::is_symmetric, PLearn::Kernel::n_examples, PLearn::TVec< T >::resize(), PLearn::SourceKernel::source_kernel, and PLearn::sum().

Referenced by evaluate(), evaluate_i_x(), evaluate_i_x_again(), evaluate_x_i(), and evaluate_x_i_again().

{
    all_k_x.resize(n_examples);
    if (is_symmetric || !on_row) {
        source_kernel->evaluate_all_i_x(x, all_k_x, squared_norm_of_x);
    } else {
        source_kernel->evaluate_all_x_i(x, all_k_x, squared_norm_of_x);
    }
    return sum(all_k_x) / real(n_examples);
}
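
In other words, computeAverage() returns the data average of the source kernel (LaTeX notation, matching the brief description above: 'on_row' selects whether x appears as the first or the second argument):

    \bar{K}_{\mathrm{row}}(x) = \frac{1}{n} \sum_{i=1}^{n} K(x, x_i) \quad (on\_row = true),
    \qquad
    \bar{K}_{\mathrm{col}}(x) = \frac{1}{n} \sum_{i=1}^{n} K(x_i, x) \quad (on\_row = false)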


void PLearn::DivisiveNormalizationKernel::computeGramMatrix ( Mat  K) const [virtual]

Call evaluate_i_j to fill each of the entries (i,j) of symmetric matrix K.

Reimplemented from PLearn::SourceKernel.

Definition at line 139 of file DivisiveNormalizationKernel.cc.

{
    // Uses default Kernel implementation.
    Kernel::computeGramMatrix(K);
}
void PLearn::DivisiveNormalizationKernel::declareOptions ( OptionList & ol ) [static, protected]

Declares this class' options.

Reimplemented from PLearn::SourceKernel.

Definition at line 77 of file DivisiveNormalizationKernel.cc.

References average_col, average_row, PLearn::OptionBase::buildoption, data_will_change, PLearn::declareOption(), PLearn::SourceKernel::declareOptions(), PLearn::OptionBase::learntoption, and remove_bias.

{
    // Build options.

    declareOption(ol, "data_will_change", &DivisiveNormalizationKernel::data_will_change, OptionBase::buildoption,
                  "If set to 1, then the Gram matrix will be always recomputed, even if\n"
                  "it's not completely sure the data has changed.");

    declareOption(ol, "remove_bias", &DivisiveNormalizationKernel::remove_bias, OptionBase::buildoption,
                  "If set to 1, then the bias induced by the K(x_i,x_i) will be removed.\n");

    // Learnt options.

    declareOption(ol, "average_col", &DivisiveNormalizationKernel::average_col, OptionBase::learntoption,
                  "The average of the underlying kernel over each column of the Gram matrix.");

    declareOption(ol, "average_row", &DivisiveNormalizationKernel::average_row, OptionBase::learntoption,
                  "The average of the underlying kernel over each row of the Gram matrix.");

    // Now call the parent class' declareOptions
    inherited::declareOptions(ol);
}


static const PPath& PLearn::DivisiveNormalizationKernel::declaringFile ( ) [inline, static]

Reimplemented from PLearn::SourceKernel.

Definition at line 129 of file DivisiveNormalizationKernel.h.

DivisiveNormalizationKernel * PLearn::DivisiveNormalizationKernel::deepCopy ( CopiesMap & copies ) const [virtual]

Reimplemented from PLearn::SourceKernel.

Definition at line 72 of file DivisiveNormalizationKernel.cc.

real PLearn::DivisiveNormalizationKernel::evaluate ( const Vec & x1, const Vec & x2 ) const [virtual]

Overridden.

Reimplemented from PLearn::SourceKernel.

Definition at line 147 of file DivisiveNormalizationKernel.cc.

References computeAverage(), PLearn::SourceKernel::source_kernel, and PLearn::sqrt().

{
    real avg_1 = computeAverage(x1, true);
    real avg_2 = computeAverage(x2, false);
    return source_kernel->evaluate(x1, x2) / sqrt(avg_1 * avg_2);
}


real PLearn::DivisiveNormalizationKernel::evaluate_i_j ( int i, int j ) const [virtual]

returns evaluate(data(i),data(j))

Reimplemented from PLearn::SourceKernel.

Definition at line 156 of file DivisiveNormalizationKernel.cc.

References average_col, average_row, PLearn::SourceKernel::source_kernel, and PLearn::sqrt().

{
    return source_kernel->evaluate_i_j(i,j) / sqrt(average_row[i] * average_col[j]);
}
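
Equivalently, writing r_i = average_row[i] and c_j = average_col[j] for the averages precomputed in setDataForKernelMatrix(), the normalized Gram matrix entry is (LaTeX notation):

    \tilde{K}_{ij} = \frac{K_{ij}}{\sqrt{r_i \, c_j}}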


real PLearn::DivisiveNormalizationKernel::evaluate_i_x ( int i, const Vec & x, real squared_norm_of_x = -1 ) const [virtual]

Return evaluate(data(i),x).

[squared_norm_of_x is just a hint that may speed up the computation if it is already known, but it is optional]

Reimplemented from PLearn::SourceKernel.

Definition at line 163 of file DivisiveNormalizationKernel.cc.

References average_row, computeAverage(), PLearn::SourceKernel::source_kernel, and PLearn::sqrt().

{
    return source_kernel->evaluate_i_x(i, x, squared_norm_of_x)
        / sqrt(average_row[i] * computeAverage(x, false, squared_norm_of_x));
}


real PLearn::DivisiveNormalizationKernel::evaluate_i_x_again ( int i, const Vec & x, real squared_norm_of_x = -1, bool first_time = false ) const [virtual]

Return evaluate(data(i),x), where x is the same as in the previous call to this same function (unless 'first_time' is true).

This can be used to speed up successive computations of K(x_i, x) (default version just calls evaluate_i_x).

Reimplemented from PLearn::Kernel.

Definition at line 171 of file DivisiveNormalizationKernel.cc.

References average_row, avg_evaluate_i_x_again, computeAverage(), PLearn::SourceKernel::source_kernel, and PLearn::sqrt().

{
    if (first_time) {
        avg_evaluate_i_x_again = computeAverage(x, false, squared_norm_of_x);
    }
    return source_kernel->evaluate_i_x_again(i, x, squared_norm_of_x, first_time)
        / sqrt(average_row[i] * avg_evaluate_i_x_again);
}
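
The intended calling pattern is sketched below (hypothetical: 'kernel', the query point 'x' and the loop bound 'n_points' are placeholders). The data average for x is computed once, on the first call, and reused for every subsequent index:

    // First call for a new point x: pass first_time = true so that the average
    // of the source kernel over the data is recomputed and cached.
    real k0 = kernel->evaluate_i_x_again(0, x, -1, /* first_time = */ true);
    // Subsequent calls with the same x reuse the cached average.
    for (int i = 1; i < n_points; i++) {
        real ki = kernel->evaluate_i_x_again(i, x, -1, false);
        // ... use ki ...
    }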


real PLearn::DivisiveNormalizationKernel::evaluate_x_i ( const Vec & x, int i, real squared_norm_of_x = -1 ) const [virtual]

returns evaluate(x,data(i)) [default version calls evaluate_i_x if kernel is_symmetric]

Reimplemented from PLearn::SourceKernel.

Definition at line 182 of file DivisiveNormalizationKernel.cc.

References average_col, computeAverage(), PLearn::SourceKernel::source_kernel, and PLearn::sqrt().

{
    return source_kernel->evaluate_x_i(x, i, squared_norm_of_x)
        / sqrt(average_col[i] * computeAverage(x, true, squared_norm_of_x));
}


real PLearn::DivisiveNormalizationKernel::evaluate_x_i_again ( const Vec & x, int i, real squared_norm_of_x = -1, bool first_time = false ) const [virtual]

Reimplemented from PLearn::Kernel.

Definition at line 190 of file DivisiveNormalizationKernel.cc.

References average_col, avg_evaluate_x_i_again, computeAverage(), PLearn::SourceKernel::source_kernel, and PLearn::sqrt().

{
    if (first_time) {
        avg_evaluate_x_i_again = computeAverage(x, true, squared_norm_of_x);
    }
    return source_kernel->evaluate_x_i_again(x, i, squared_norm_of_x, first_time)
        / sqrt(average_col[i] * avg_evaluate_x_i_again);
}


OptionList & PLearn::DivisiveNormalizationKernel::getOptionList ( ) const [virtual]

Reimplemented from PLearn::SourceKernel.

Definition at line 72 of file DivisiveNormalizationKernel.cc.

OptionMap & PLearn::DivisiveNormalizationKernel::getOptionMap ( ) const [virtual]

Reimplemented from PLearn::SourceKernel.

Definition at line 72 of file DivisiveNormalizationKernel.cc.

RemoteMethodMap & PLearn::DivisiveNormalizationKernel::getRemoteMethodMap ( ) const [virtual]

Reimplemented from PLearn::SourceKernel.

Definition at line 72 of file DivisiveNormalizationKernel.cc.

void PLearn::DivisiveNormalizationKernel::makeDeepCopyFromShallowCopy ( CopiesMap & copies ) [virtual]

Transforms a shallow copy into a deep copy.

Reimplemented from PLearn::SourceKernel.

Definition at line 201 of file DivisiveNormalizationKernel.cc.

References PLearn::SourceKernel::makeDeepCopyFromShallowCopy(), and PLERROR.

{
    inherited::makeDeepCopyFromShallowCopy(copies);

    // ### Call deepCopyField on all "pointer-like" fields 
    // ### that you wish to be deepCopied rather than 
    // ### shallow-copied.
    // ### ex:
    // deepCopyField(trainvec, copies);

    // ### Remove this line when you have fully implemented this method.
    PLERROR("DivisiveNormalizationKernel::makeDeepCopyFromShallowCopy not fully (correctly) implemented yet!");
}


void PLearn::DivisiveNormalizationKernel::setDataForKernelMatrix ( VMat  the_data) [virtual]

Subclasses may override this method to provide efficient kernel matrix access.

This method sets the data VMat that will be used to define the kernel matrix. It may precompute values from the data that later accelerate the evaluation of kernel matrix elements.

Reimplemented from PLearn::SourceKernel.

Definition at line 218 of file DivisiveNormalizationKernel.cc.

References average_col, average_row, PLearn::Kernel::data, data_will_change, PLearn::TVec< T >::fill(), i, PLearn::Kernel::is_symmetric, j, PLearn::TVec< T >::length(), PLearn::VMat::length(), n, remove_bias, PLearn::TVec< T >::resize(), PLearn::SourceKernel::setDataForKernelMatrix(), and PLearn::SourceKernel::source_kernel.

{
    bool there_was_data_and_it_changed = data && !(data->looksTheSameAs(the_data));
    // Set the data for this kernel as well as for the underlying kernel.
    inherited::setDataForKernelMatrix(the_data);
    // Check whether we need to recompute the Gram matrix and its average.
    int n = the_data->length();
    if (   data_will_change
           || average_row.length() != n
           || there_was_data_and_it_changed) {
        // Compute the underlying Gram matrix.
        Mat gram(n, n);
        source_kernel->computeGramMatrix(gram);
        // Compute the row (and column) average.
        average_row.resize(n);
        average_row.fill(0);
        if (is_symmetric) {
            average_col = average_row;
        } else {
            average_col.resize(n);
            average_col.fill(0);
        }
        real k_x_x;
        for (int i = 0; i < n; i++) {
            if (is_symmetric) {
                real v;
                k_x_x = gram(i,i);
                if (!remove_bias) {
                    average_row[i] += k_x_x;
                }
                for (int j = i + 1; j < n; j++) {
                    v = gram(i,j);
                    average_row[i] += v;
                    average_row[j] += v;
                }
            } else {
                for (int j = 0; j < n; j++) {
                    if (!remove_bias || j != i) {
                        average_row[i] += gram(i,j);
                        average_col[i] += gram(j,i);
                    }
                }
            }
        }
        real n_terms_in_sum;    // The number of terms summed in average_row.
        if (remove_bias) {
            // The diagonal terms were not added.
            n_terms_in_sum = real(n - 1);
        } else {
            n_terms_in_sum = real(n);
        }
        average_row /= n_terms_in_sum;
        if (!is_symmetric) {
            average_col /= n_terms_in_sum;
        }
    }
}
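
The cached averages can thus be written as (LaTeX notation; m = n, or m = n - 1 when 'remove_bias' excludes the diagonal term):

    r_i = \frac{1}{m} \sum_{j \in S_i} K(x_i, x_j),
    \qquad
    c_i = \frac{1}{m} \sum_{j \in S_i} K(x_j, x_i),
    \qquad
    S_i = \{1, \ldots, n\} \setminus \{i\} \text{ if remove\_bias, else } \{1, \ldots, n\}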



Member Data Documentation

StaticInitializer PLearn::DivisiveNormalizationKernel::_static_initializer_ [static]

Reimplemented from PLearn::SourceKernel.

Definition at line 129 of file DivisiveNormalizationKernel.h.

Vec PLearn::DivisiveNormalizationKernel::all_k_x [mutable, private]

Used to store the values of the source kernel.

Definition at line 60 of file DivisiveNormalizationKernel.h.

Referenced by computeAverage().

real PLearn::DivisiveNormalizationKernel::avg_evaluate_i_x_again [mutable, protected]

The last average computed in evaluate_i_x_again().

Definition at line 74 of file DivisiveNormalizationKernel.h.

Referenced by evaluate_i_x_again().

real PLearn::DivisiveNormalizationKernel::avg_evaluate_x_i_again [mutable, protected]

The last average computed in evaluate_x_i_again().

Definition at line 77 of file DivisiveNormalizationKernel.h.

Referenced by evaluate_x_i_again().

bool PLearn::DivisiveNormalizationKernel::data_will_change

Definition at line 85 of file DivisiveNormalizationKernel.h.

Referenced by declareOptions(), and setDataForKernelMatrix().

bool PLearn::DivisiveNormalizationKernel::remove_bias

Definition at line 86 of file DivisiveNormalizationKernel.h.

Referenced by declareOptions(), and setDataForKernelMatrix().


The documentation for this class was generated from the following files:

DivisiveNormalizationKernel.h
DivisiveNormalizationKernel.cc