PLearn 0.1
PLearn::PDistribution Class Reference

Base class for PLearn probability distributions. More...

#include <PDistribution.h>

Inheritance diagram for PLearn::PDistribution: (graph omitted)
Collaboration diagram for PLearn::PDistribution: (graph omitted)


Public Member Functions

 PDistribution ()
 Default constructor.
virtual void build ()
 Simply calls inherited::build() then build_().
virtual void makeDeepCopyFromShallowCopy (CopiesMap &copies)
 Transforms a shallow copy into a deep copy.
virtual string classname () const
virtual OptionList & getOptionList () const
virtual OptionMap & getOptionMap () const
virtual RemoteMethodMap & getRemoteMethodMap () const
virtual PDistribution * deepCopy (CopiesMap &copies) const
virtual int outputsize () const
 Returned value depends on outputs_def.
virtual void forget ()
 (Re-)initializes the PLearner in its fresh state (that state may depend on the 'seed' option) and sets 'stage' back to 0 (this is the stage of a fresh learner!).
virtual void train ()
 The role of the train method is to bring the learner up to stage==nstages, updating the train_stats collector with training costs measured on-line in the process.
virtual void computeOutput (const Vec &input, Vec &output) const
 Produce outputs according to what is specified in outputs_def.
virtual void computeCostsFromOutputs (const Vec &input, const Vec &output, const Vec &target, Vec &costs) const
 Computes negative log likelihood (NLL).
virtual bool setPredictorPredictedSizes (int the_predictor_size, int the_predicted_size, bool call_parent=true)
 Set the 'predictor' and 'predicted' sizes for this distribution.
virtual void setPredictor (const Vec &predictor, bool call_parent=true) const
 Set the value for the predictor part of a conditional probability.
virtual TVec< string > getTestCostNames () const
 Return [ "NLL" ] (the only cost computed by a PDistribution).
virtual TVec< string > getTrainCostNames () const
 Return [ ].
virtual real log_density (const Vec &y) const
 Return log of probability density log(p(y | x)).
virtual real density (const Vec &y) const
 Return probability density p(y | x) (default version simply returns exp(log_density(y))).
virtual real survival_fn (const Vec &y) const
 Return survival function: P(Y>y | x).
virtual real cdf (const Vec &y) const
 Return cdf: P(Y<y | x).
virtual void expectation (Vec &mu) const
 Return E[Y | x].
virtual void missingExpectation (const Vec &input, Vec &mu)
 Return E[Y | x] where Y is the missing part in the 'input' vector, and x is the observed part (thus discarding any current setting of predictor and predicted sizes).
virtual void variance (Mat &cov) const
 Return Var[Y | x].
virtual void resetGenerator (long g_seed)
 Reset the random number generator used by generate() using the given seed.
virtual void generate (Vec &y) const
 Return a pseudo-random sample generated from the conditional distribution, of density p(y | x).
Vec remote_generate ()
 Remote interface for 'generate'.
virtual void generateN (const Mat &Y) const
 Y must be an N x n_predicted matrix that will be filled.
virtual void generateJoint (Vec &xy)
 Generates a pseudo-random sample (x,y) from the JOINT distribution, of density p(x, y) i.e., generates a predictor and a predicted part, regardless of any previously set predictor.
void generateJoint (Vec &x, Vec &y)
 Generates a pseudo-random sample (x,y) from the JOINT distribution, of density p(x, y), split in two vectors i.e., generates a predictor and a predicted part, regardless of any previously set predictor.
virtual void generatePredictor (Vec &x)
 Generates a pseudo-random sample x from the marginal distribution of predictors, of density p(x), i.e., generates a predictor part, regardless of any previously set predictor.
virtual void generatePredicted (Vec &y)
 Generates a pseudo-random sample y from the marginal distribution of predicted parts, of density p(y) (and NOT p(y | x)).
virtual void generatePredictorGivenPredicted (Vec &x, const Vec &y)
 Generates a pseudo-random sample x from the reversed conditional distribution, of density p(x | y) (and NOT p(y | x)).
int getNPredicted () const
 'Get' accessor for n_predicted.
int getNPredictor () const
 'Get' accessor for n_predictor.

Static Public Member Functions

static string _classname_ ()
static OptionList & _getOptionList_ ()
static RemoteMethodMap & _getRemoteMethodMap_ ()
static Object * _new_instance_for_typemap_ ()
static bool _isa_ (const Object *o)
static void _static_initialize_ ()
static const PPath & declaringFile ()

Public Attributes

real lower_bound
real upper_bound
int n_curve_points
string outputs_def

Static Public Attributes

static StaticInitializer _static_initializer_

Protected Member Functions

void splitCond (const Vec &input) const
 Split an input into the part corresponding to the predictor (in 'predictor_part'), and the predicted (in 'predicted_part').
virtual void unknownOutput (char def, const Vec &input, Vec &output, int &k) const
 Called in computeOutput when an unknown character is found.

Static Protected Member Functions

static void declareOptions (OptionList &ol)
 Declares this class' options.
static void declareMethods (RemoteMethodMap &rmm)
 Declare the methods that are remote-callable.

Protected Attributes

Vec store_expect
 Global storage to save memory allocations.
Vec store_result
Mat store_cov
real delta_curve
 The step when plotting the curve (upper case outputs_def).
Vec predictor_part
 Used to store the x part in p(y|x).
Vec predicted_part
 Used to store the y part in p(y|x).
int predictor_size
 User-provided sizes of the 'predictor' and 'predicted' parts.
int predicted_size
int n_predictor
 Learnt sizes of the 'predictor' and 'predicted' parts.
int n_predicted

Private Types

typedef PLearner inherited

Private Member Functions

void build_ ()
 This does the actual building.

Detailed Description

Base class for PLearn probability distributions.

PDistributions derive from PLearner, as some of them may be fitted to data by training, but they have additional methods that allow, e.g., computing the density or generating data points.

By default, a PDistribution may be conditional on a predictor part x, in order to represent the conditional distribution P(Y | X = x). An unconditional distribution should instead derive from UnconditionalDistribution, as it has a simpler interface.

Since we want to be able to compute for instance P(Y = y | X = x), both the predictor part 'x' and the predicted part 'y' must be considered as input from the PLearner framework point of view. Thus one must specify the size of the predictor part with the 'predictor_size' option, and the size of the predicted part with the 'predicted_size' option, satisfying the following equality:

predictor_size + predicted_size == inputsize (1)

Optionally, 'predictor_size' or 'predicted_size' (but not both) may be set to -1, and the PDistribution will automatically guess the other size so that equation (1) is satisfied (actually, in order to preserve the user-provided values of 'predictor_size' and 'predicted_size', the guessed values are stored in the learnt options 'n_predictor' and 'n_predicted'). This way, unconditional distributions can be created by setting 'predictor_size' to 0 and 'predicted_size' to -1.
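For instance (illustrative values, not from the source): with inputsize() == 5 and predictor_size == 2, setting predicted_size to -1 yields n_predictor == 2 and n_predicted == 3, so that equation (1) holds.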

The default implementations of the learner-type methods for computing outputs and costs work as follows:

For conditional distributions, the input must always contain both the 'predictor' part (x) and the 'predicted' part (y), even if the output does not need the predicted part (e.g. to compute E[Y | X = x]). The exception is when computeOutput(..) needs to be called successively with the same value of 'x': in this case, after a first call with both 'x' and 'y', one may provide only 'y' as input in later calls, and 'x' will be assumed to be unchanged. Alternatively, one can set the 'predictor_part' option first, either through the options system or using the setPredictor(..) method.
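To make the workflow concrete, here is a minimal usage sketch in C++. It is only illustrative: 'MyCondDistr' is a hypothetical conditional PDistribution subclass and 'train_set' an assumed VMat with inputsize 5; the calls themselves (setOption, setTrainingSet, build, train, outputsize, computeOutput) are the standard Object/PLearner methods documented on this page or inherited from PLearner.

#include <PDistribution.h>
using namespace PLearn;

// Hypothetical usage; MyCondDistr and train_set are assumptions of this sketch.
void example(VMat train_set)   // train_set assumed to have inputsize() == 5
{
    PP<PDistribution> dist = new MyCondDistr();      // hypothetical subclass
    dist->setOption("predictor_size", "2");          // size of x in p(y|x)
    dist->setOption("predicted_size", "-1");         // guessed as 5 - 2 = 3
    dist->setOption("outputs_def", "l");             // output the log density
    dist->setTrainingSet(train_set);
    dist->build();
    dist->train();

    Vec input(5);                                    // first 2 values: x, last 3: y
    Vec output(dist->outputsize());                  // outputsize() == 1 for "l"
    dist->computeOutput(input, output);              // output[0] == log p(y | x)
}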

IMPLEMENTATION NOTES: Note that many methods are declared as 'const' because of the 'const' plague, but are actually not true 'const' methods. This is also why a lot of stuff is mutable. TODO: Would it be possible to remove some 'const' stuff for cleaner code?

Definition at line 99 of file PDistribution.h.


Member Typedef Documentation

typedef PLearner PLearn::PDistribution::inherited [private]

Constructor & Destructor Documentation

PLearn::PDistribution::PDistribution ( )

Default constructor.

Definition at line 53 of file PDistribution.cc.

References PLearn::PLearner::random_gen.

PDistribution::PDistribution():
    delta_curve(0.1),
    predictor_size(0),
    predicted_size(-1),
    n_predictor(-1),
    n_predicted(-1),
    lower_bound(0.),
    upper_bound(0.),
    n_curve_points(-1),
    outputs_def("l")
{
    random_gen = new PRandom();
}

Member Function Documentation

string PLearn::PDistribution::_classname_ ( ) [static]
OptionList & PLearn::PDistribution::_getOptionList_ ( ) [static]
RemoteMethodMap & PLearn::PDistribution::_getRemoteMethodMap_ ( ) [static]
bool PLearn::PDistribution::_isa_ ( const Object * o ) [static]
Object * PLearn::PDistribution::_new_instance_for_typemap_ ( ) [static]
StaticInitializer PDistribution::_static_initializer_ [static]
void PLearn::PDistribution::_static_initialize_ ( ) [static]
void PLearn::PDistribution::build ( ) [virtual]

Simply calls inherited::build() then build_().

Reimplemented from PLearn::PLearner.

Reimplemented in PLearn::ConditionalDensityNet, PLearn::GaussianContinuumDistribution, PLearn::GaussianProcessRegressor, PLearn::PConditionalDistribution, PLearn::LocallyMagnifiedDistribution, PLearn::NeighborhoodBoxVolumeDensityEstimator, PLearn::TransformationLearner, PLearn::GaussianDistribution, PLearn::GaussMix, PLearn::HistogramDistribution, PLearn::KernelDensityEstimator, PLearn::ManifoldParzen2, PLearn::MixtureDistribution, PLearn::NGramDistribution, PLearn::NonLocalManifoldParzen, PLearn::ParzenWindow, PLearn::RandomGaussMix, PLearn::RBMDistribution, PLearn::SpiralDistribution, PLearn::UnconditionalDistribution, PLearn::UniformDistribution, PLearn::GaussianDBNClassification, PLearn::GaussianDBNRegression, PLearn::GaussPartSupervisedDBN, PLearn::HintonDeepBeliefNet, PLearn::PartSupervisedDBN, PLearn::SupervisedDBN, PLearn::UnfrozenDeepBeliefNet, PLearn::GaussianProcessRegressor, and PLearn::ManifoldKNNDistribution.

Definition at line 219 of file PDistribution.cc.

References PLearn::PLearner::build(), and build_().

Referenced by PLearn::TransformationLearner::build(), PLearn::SupervisedDBN::build(), PLearn::PConditionalDistribution::build(), PLearn::PartSupervisedDBN::build(), PLearn::NGramDistribution::build(), PLearn::LocallyMagnifiedDistribution::build(), PLearn::HistogramDistribution::build(), PLearn::HintonDeepBeliefNet::build(), PLearn::GaussPartSupervisedDBN::build(), PLearn::GaussMix::build(), PLearn::GaussianDBNRegression::build(), PLearn::GaussianDBNClassification::build(), and PLearn::ConditionalDensityNet::build().

void PLearn::PDistribution::build_ ( ) [private]

This does the actual building.

Reimplemented from PLearn::PLearner.

Reimplemented in PLearn::ConditionalDensityNet, PLearn::GaussianContinuumDistribution, PLearn::GaussianProcessRegressor, PLearn::PConditionalDistribution, PLearn::LocallyMagnifiedDistribution, PLearn::NeighborhoodBoxVolumeDensityEstimator, PLearn::TransformationLearner, PLearn::GaussianDistribution, PLearn::GaussMix, PLearn::HistogramDistribution, PLearn::KernelDensityEstimator, PLearn::ManifoldParzen2, PLearn::MixtureDistribution, PLearn::NGramDistribution, PLearn::NonLocalManifoldParzen, PLearn::ParzenWindow, PLearn::RandomGaussMix, PLearn::RBMDistribution, PLearn::SpiralDistribution, PLearn::UnconditionalDistribution, PLearn::UniformDistribution, PLearn::GaussianDBNClassification, PLearn::GaussianDBNRegression, PLearn::GaussPartSupervisedDBN, PLearn::HintonDeepBeliefNet, PLearn::PartSupervisedDBN, PLearn::SupervisedDBN, PLearn::UnfrozenDeepBeliefNet, PLearn::GaussianProcessRegressor, and PLearn::ManifoldKNNDistribution.

Definition at line 228 of file PDistribution.cc.

References delta_curve, lower_bound, n_curve_points, predicted_size, predictor_part, predictor_size, resetGenerator(), PLearn::PLearner::seed_, setPredictor(), setPredictorPredictedSizes(), and upper_bound.

Referenced by build().

{
    // Reset the random number generator seed.
    resetGenerator(seed_);

    // Typical code for a PDistribution: the class makes the operations it
    // needs when the predictor and predicted sizes are defined, and when the
    // predictor is defined. In the build_() method, it should not call the
    // parent's methods since they should have already been called during the
    // parent's build.
    PDistribution::setPredictorPredictedSizes(predictor_size, predicted_size,
                                              false);
    PDistribution::setPredictor(predictor_part, false);

    // Set the step between two points in the output curve.
    if (n_curve_points > 0)
        delta_curve = (upper_bound - lower_bound) / real(n_curve_points);
}
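Following the comment above, a subclass build_() typically mirrors this pattern. The sketch below is hypothetical (MyCondDistr is not a PLearn class) and only illustrates the 'call_parent = false' convention:

void MyCondDistr::build_()
{
    // Call only this class' own versions: the parent versions have already
    // been run by the parent's build_().
    MyCondDistr::setPredictorPredictedSizes(predictor_size, predicted_size,
                                            false);
    MyCondDistr::setPredictor(predictor_part, false);
    // Subclass-specific precomputations would go here.
}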

real PLearn::PDistribution::cdf ( const Vec & y ) const [virtual]
string PLearn::PDistribution::classname ( ) const [virtual]
void PLearn::PDistribution::computeCostsFromOutputs ( const Vec & input, const Vec & output, const Vec & target, Vec & costs ) const [virtual]

Computes negative log likelihood (NLL).

If the first output is neither the log density nor the density, an error will be raised.

Implements PLearn::PLearner.

Reimplemented in PLearn::GaussianProcessRegressor, PLearn::GaussianDBNClassification, PLearn::GaussianDBNRegression, PLearn::GaussPartSupervisedDBN, PLearn::HintonDeepBeliefNet, PLearn::PartSupervisedDBN, PLearn::SupervisedDBN, and PLearn::GaussianProcessRegressor.

Definition at line 352 of file PDistribution.cc.

References c, outputs_def, pl_log, PLERROR, and PLearn::TVec< T >::resize().

Referenced by PLearn::GaussianDBNClassification::computeCostsFromOutputs(), PLearn::HintonDeepBeliefNet::computeCostsFromOutputs(), PLearn::PartSupervisedDBN::computeCostsFromOutputs(), PLearn::GaussPartSupervisedDBN::computeCostsFromOutputs(), PLearn::SupervisedDBN::computeCostsFromOutputs(), and PLearn::GaussianDBNRegression::computeCostsFromOutputs().

{
    costs.resize(1);
    char c = outputs_def[0];
    if(c == 'l')
    {
        costs[0] = -output[0];
    }
    else if(c == 'd')
    {
        costs[0] = -pl_log(output[0]);
    }
    else
        PLERROR("In PDistribution::computeCostsFromOutputs currently can only "
                "compute' NLL cost from log likelihood or density returned as "
                "first output");
}
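Usage sketch (hedged: 'dist', 'input' and 'target' are assumed to exist, and outputs_def is assumed to start with 'l' or 'd'; the test framework normally drives these calls itself). Note that 'target' is not used by this base implementation:

Vec output(dist->outputsize());
Vec costs;                                  // resized to 1 by the call below
dist->computeOutput(input, output);
dist->computeCostsFromOutputs(input, output, target, costs);
real nll = costs[0];                        // negative log likelihood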

void PLearn::PDistribution::computeOutput ( const Vec & input, Vec & output ) const [virtual]

Produce outputs according to what is specified in outputs_def.

Reimplemented from PLearn::PLearner.

Reimplemented in PLearn::GaussianContinuumDistribution, PLearn::GaussianProcessRegressor, PLearn::PConditionalDistribution, PLearn::ManifoldParzen2, PLearn::NonLocalManifoldParzen, and PLearn::GaussianProcessRegressor.

Definition at line 250 of file PDistribution.cc.

References cdf(), delta_curve, density(), expectation(), i, j, log_density(), lower_bound, n_curve_points, n_predicted, outputs_def, outputsize(), PLERROR, predicted_part, PLearn::TVec< T >::resize(), setPredictor(), splitCond(), PLearn::square(), store_cov, store_expect, store_result, PLearn::TVec< T >::subVec(), survival_fn(), PLearn::TVec< T >::toMat(), unknownOutput(), and variance().

Referenced by PLearn::NonLocalManifoldParzen::computeOutput(), PLearn::ManifoldParzen2::computeOutput(), and PLearn::GaussianContinuumDistribution::computeOutput().

{
    // TODO Add an output to generate samples.

    // Set the 'predictor' (x in P(Y = y| X=x)) and 'predicted' (y) parts.
    splitCond(input);

    string::size_type l = outputs_def.length();
    output.resize(outputsize());

    int k = 0;
    for(unsigned int i=0; i<l; i++)
    {
        switch(outputs_def[i])
        {
        case 'l':
            output[k++] = log_density(predicted_part);
            break;
        case 'd':
            output[k++] = density(predicted_part);
            break;
        case 'c':
            output[k++] = cdf(predicted_part);
            break;
        case 's':
            output[k++] = survival_fn(predicted_part);
            break;
        case 'e':
            store_expect = output.subVec(k, n_predicted);
            expectation(store_expect);
            k += n_predicted;
            break;
        case 'v':
            store_cov =
                output.subVec(k, square(n_predicted)).toMat(n_predicted,n_predicted);
            variance(store_cov);
            k += square(n_predicted);
            break;
        case 'E':
        case 'V':
            if (n_predicted > 1)
                PLERROR("In PDistribution::computeOutput - Can only plot "
                        "histogram of expectation or variance for "
                        "one-dimensional expected part");
            if (n_predicted == 0)
                PLERROR("In PDistribution::computeOutput - Cannot plot "
                        "histogram of expectation or variance for "
                        "unconditional distributions");
        case 'L':
        case 'D':
        case 'C':
        case 'S':
            real t;
            store_result.resize(1);
            store_result[0] = lower_bound;
            for (int j = 0; j < n_curve_points; j++) {
                switch(outputs_def[i]) {
                case 'L':
                    t = log_density(store_result);
                    break;
                case 'D':
                    t = density(store_result);
                    break;
                case 'C':
                    t = cdf(store_result);
                    break;
                case 'S':
                    t = survival_fn(store_result);
                    break;
                case 'E':
                    setPredictor(store_result);
                    expectation(store_expect);
                    t = store_expect[0];
                    break;
                case 'V':
                    setPredictor(store_result);
                    store_cov = store_expect.toMat(1,1);
                    variance(store_cov);
                    t = store_expect[0];
                    break;
                default:
                    PLERROR("In PDistribution::computeOutput - This should "
                            "never happen");
                    t = 0; // To make the compiler happy.
                }
                output[j + k] = t;
                store_result[0] += delta_curve;
            }
            k += n_curve_points;
            break;
        default:
            // Maybe a subclass knows about this output?
            // TODO This is quite ugly. See how to do this better.
            unknownOutput(outputs_def[i], input, output, k);
            break;
        }
    }
}

void PLearn::PDistribution::declareMethods ( RemoteMethodMap & rmm ) [static, protected]

Declare the methods that are remote-callable.

Reimplemented from PLearn::PLearner.

Reimplemented in PLearn::TransformationLearner, and PLearn::GaussianDistribution.

Definition at line 198 of file PDistribution.cc.

References PLearn::PLearner::_getRemoteMethodMap_(), PLearn::declareMethod(), PLearn::RemoteMethodMap::inherited(), log_density(), and remote_generate().

{
    // Insert a backpointer to remote methods; note that this is
    // different from declareOptions().
    rmm.inherited(inherited::_getRemoteMethodMap_());

    declareMethod(
        rmm, "log_density", &PDistribution::log_density,
        (BodyDoc("Compute the log density of a data point"),
         ArgDoc("sample", "The data point"),
         RetDoc("The log density")));

    declareMethod(
        rmm, "generate", &PDistribution::remote_generate,
        (BodyDoc("Generate a sample"),
         RetDoc("The generated sample")));
}

void PLearn::PDistribution::declareOptions ( OptionList & ol ) [static, protected]

Declares this class' options.

Reimplemented from PLearn::PLearner.

Reimplemented in PLearn::ConditionalDensityNet, PLearn::GaussianContinuumDistribution, PLearn::GaussianProcessRegressor, PLearn::PConditionalDistribution, PLearn::LocallyMagnifiedDistribution, PLearn::NeighborhoodBoxVolumeDensityEstimator, PLearn::TransformationLearner, PLearn::GaussianDistribution, PLearn::GaussMix, PLearn::HistogramDistribution, PLearn::KernelDensityEstimator, PLearn::ManifoldParzen2, PLearn::MixtureDistribution, PLearn::NGramDistribution, PLearn::NonLocalManifoldParzen, PLearn::ParzenWindow, PLearn::RandomGaussMix, PLearn::RBMDistribution, PLearn::SpiralDistribution, PLearn::UnconditionalDistribution, PLearn::UniformDistribution, PLearn::GaussianDBNClassification, PLearn::GaussianDBNRegression, PLearn::GaussPartSupervisedDBN, PLearn::HintonDeepBeliefNet, PLearn::PartSupervisedDBN, PLearn::SupervisedDBN, PLearn::UnfrozenDeepBeliefNet, PLearn::GaussianProcessRegressor, and PLearn::ManifoldKNNDistribution.

Definition at line 115 of file PDistribution.cc.

References PLearn::OptionBase::buildoption, PLearn::declareOption(), PLearn::PLearner::declareOptions(), PLearn::OptionBase::learntoption, lower_bound, n_curve_points, n_predicted, n_predictor, outputs_def, predicted_size, predictor_part, predictor_size, and upper_bound.

Referenced by PLearn::TransformationLearner::declareOptions(), PLearn::SupervisedDBN::declareOptions(), PLearn::PConditionalDistribution::declareOptions(), PLearn::PartSupervisedDBN::declareOptions(), PLearn::NGramDistribution::declareOptions(), PLearn::LocallyMagnifiedDistribution::declareOptions(), PLearn::HistogramDistribution::declareOptions(), PLearn::HintonDeepBeliefNet::declareOptions(), PLearn::GaussPartSupervisedDBN::declareOptions(), PLearn::GaussMix::declareOptions(), PLearn::GaussianDBNRegression::declareOptions(), PLearn::GaussianDBNClassification::declareOptions(), and PLearn::ConditionalDensityNet::declareOptions().

{

    // Build options.

    declareOption(
        ol, "outputs_def", &PDistribution::outputs_def,
                           OptionBase::buildoption,
        "Defines what will be given in output. This is a string where the\n"
        "characters have the following meaning:\n"
        "- 'l' : log_density\n"
        "- 'd' : density\n"
        "- 'c' : cdf\n"
        "- 's' : survival_fn\n"
        "- 'e' : expectation\n"
        "- 'v' : variance.\n"
        "\n"
        "If these options are specified in lower case they give the value\n"
        "associated with a given observation. In upper case, a curve is\n"
        "evaluated at regular intervals and produced in output (as a\n"
        "histogram). For 'L', 'D', 'C', 'S', it is the predicted part that\n"
        "varies, while for 'E' and 'V' it is the predictor part (for\n"
        "conditional distributions).\n"
        "The number of curve points is given by the 'n_curve_points' option.\n"
        "Note that the upper case letters only work for scalar variables, in\n"
        "order to produce a one-dimensional curve."
        );
    // TODO Make it TVec<string> for better clarity?

    declareOption(ol, "predictor_size",  &PDistribution::predictor_size,
                                  OptionBase::buildoption,
        "The (user-provided) size of the predictor x in p(y|x). A value of\n"
        "-1 means the algorithm should find it out by itself.");

    declareOption(ol, "predicted_size", &PDistribution::predicted_size,
                                        OptionBase::buildoption,
        "The (user-provided) size of the predicted y in p(y|x). A value of\n"
        "-1 means the algorithm should find it out by itself.");

    declareOption(ol, "predictor_part", &PDistribution::predictor_part,
                                        OptionBase::buildoption,
        "In conditional distributions, the predictor part (x in P(Y|X=x)).\n");

    declareOption(ol, "n_curve_points", &PDistribution::n_curve_points,
                                        OptionBase::buildoption,
        "The number of points for which the output is evaluated when\n"
        "outputs_defs is upper case (produces a histogram).\n"
        "The lower_bound and upper_bound options specify where the curve\n"
        "begins and ends. Note that these options (upper case letters) only\n"
        "work for scalar variables.");

    declareOption(ol, "lower_bound",  &PDistribution::lower_bound,
                                      OptionBase::buildoption,
        "The lower bound of scalar Y values to compute a histogram of the\n"
        "distribution when upper case outputs_def are specified.");

    declareOption(ol, "upper_bound",  &PDistribution::upper_bound,
                                      OptionBase::buildoption,
        "The upper bound of scalar Y values to compute a histogram of the\n"
        "distribution when upper case outputs_def are specified.");

    // Learnt options.

    declareOption(ol, "n_predictor",  &PDistribution::n_predictor,
                                      OptionBase::learntoption,
        "The (true) size of the predictor x in p(y|x). If 'predictor_size'\n"
        "is non-negative, 'n_predictor' is set to 'predictor_size'.\n"
        "Otherwise, it is set to the data dimension minus 'predicted_size'.");

    declareOption(ol, "n_predicted",  &PDistribution::n_predicted,
                                      OptionBase::learntoption,
        "The (true) size of the predicted y in p(y|x). If 'predicted_size'\n"
        "is non-negative, 'n_predicted' is set to 'predicted_size'.\n"
        "Otherwise, it is set to the data dimension minus 'predictor_size'.");

    // Now call the parent class' declareOptions
    inherited::declareOptions(ol);

}

static const PPath& PLearn::PDistribution::declaringFile ( ) [inline, static]
PDistribution * PLearn::PDistribution::deepCopy ( CopiesMap & copies ) const [virtual]
real PLearn::PDistribution::density ( const Vec & y ) const [virtual]

Return probability density p(y | x) (default version simply returns exp(log_density(y))).

Reimplemented in PLearn::HistogramDistribution, PLearn::NGramDistribution, PLearn::GaussianDBNClassification, PLearn::GaussianDBNRegression, PLearn::GaussPartSupervisedDBN, PLearn::HintonDeepBeliefNet, PLearn::PartSupervisedDBN, and PLearn::SupervisedDBN.

Definition at line 563 of file PDistribution.cc.

References PLearn::exp(), and log_density().

Referenced by computeOutput(), and PLearn::PConditionalDistribution::computeOutput().

{ return exp(log_density(y)); }

void PLearn::PDistribution::expectation ( Vec & mu ) const [virtual]
void PLearn::PDistribution::forget ( ) [virtual]
void PLearn::PDistribution::generate ( Vec & y ) const [virtual]
void PLearn::PDistribution::generateJoint ( Vec & xy ) [virtual]

Generates a pseudo-random sample (x,y) from the JOINT distribution, of density p(x, y) i.e., generates a predictor and a predicted part, regardless of any previously set predictor.

Reimplemented in PLearn::UnconditionalDistribution.

Definition at line 584 of file PDistribution.cc.

References generate(), n_predicted, n_predictor, and setPredictorPredictedSizes().

Referenced by generateJoint(), generatePredicted(), and generatePredictor().

{
    // get old sizes
    int old_n_predictor = n_predictor;
    int old_n_predicted = n_predicted;

    // set all inputs as predicted to generate a joint sample
    setPredictorPredictedSizes(0, -1);
    generate( xy );

    // restore old sizes
    setPredictorPredictedSizes(old_n_predictor, old_n_predicted);
}

void PLearn::PDistribution::generateJoint ( Vec & x, Vec & y )

Generates a pseudo-random sample (x,y) from the JOINT distribution, of density p(x, y), split in two vectors i.e., generates a predictor and a predicted part, regardless of any previously set predictor.

Definition at line 598 of file PDistribution.cc.

References generateJoint(), n_predicted, n_predictor, and PLearn::TVec< T >::subVec().

{
    Vec joint_sample;
    generateJoint( joint_sample );
    x = joint_sample.subVec(0, n_predictor);
    y = joint_sample.subVec(n_predictor, n_predicted);
}
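Usage sketch (assuming 'dist' is a built, trained conditional PDistribution):

Vec x, y;
dist->generateJoint(x, y);   // x: predictor part, y: predicted part,
                             // drawn jointly from p(x, y)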

void PLearn::PDistribution::generateN ( const Mat & Y ) const [virtual]

Y must be an N x n_predicted matrix that will be filled.

This will call generate() N times to fill the N rows of the matrix.

Reimplemented in PLearn::RBMDistribution.

Definition at line 395 of file PDistribution.cc.

References generate(), i, PLearn::TMat< T >::length(), N, n_predicted, PLERROR, PLearn::PLearner::report_progress, and PLearn::TMat< T >::width().

{
    Vec v;
    if (Y.width() != n_predicted)
        PLERROR("In PDistribution::generateN - Matrix width (%d) differs from "
                "n_predicted (%d)", Y.width(), n_predicted);
    int N = Y.length();
    PP<ProgressBar> pb =
        report_progress ? new ProgressBar("Generating samples", N)
                        : NULL;
    for(int i=0; i<N; i++)
    {
        v = Y(i);
        generate(v);
        if (pb)
            pb->update(i);
    }
}
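Usage sketch (assuming 'dist' is a built, trained PDistribution; the sample count is illustrative):

Mat samples(1000, dist->getNPredicted());
dist->generateN(samples);    // fills each of the 1000 rows with one sample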

void PLearn::PDistribution::generatePredicted ( Vec & y ) [virtual]

Generates a pseudo-random sample y from the marginal distribution of predicted parts, of density p(y) (and NOT p(y | x)).

i.e., generates a predicted part, regardless of any previously set predictor.

Reimplemented in PLearn::UnconditionalDistribution.

Definition at line 612 of file PDistribution.cc.

References generateJoint(), and x.

{
    Vec x;
    generateJoint(x, y);
}

void PLearn::PDistribution::generatePredictor ( Vec & x ) [virtual]

Generates a pseudo-random sample x from the marginal distribution of predictors, of density p(x), i.e., generates a predictor part, regardless of any previously set predictor.

Reimplemented in PLearn::UnconditionalDistribution.

Definition at line 606 of file PDistribution.cc.

References generateJoint().

{
    Vec y;
    generateJoint(x, y);
}

void PLearn::PDistribution::generatePredictorGivenPredicted ( Vec & x, const Vec & y ) [virtual]

Generates a pseudo-random sample x from the reversed conditional distribution, of density p(x | y) (and NOT p(y | x)).

i.e., generates a "predictor" part given a "predicted" part, regardless of any previously set predictor.

Reimplemented in PLearn::UnconditionalDistribution.

Definition at line 618 of file PDistribution.cc.

References PLERROR.

{ PLERROR("generatePredictorGivenPredicted not implemented for this\n"
          "PDistribution\n"); }
int PLearn::PDistribution::getNPredicted ( ) const [inline]

'Get' accessor for n_predicted.

Definition at line 340 of file PDistribution.h.

{ return n_predicted; }
int PLearn::PDistribution::getNPredictor ( ) const [inline]

'Get' accessor for n_predictor.

Definition at line 343 of file PDistribution.h.

{ return n_predictor; }
OptionList & PLearn::PDistribution::getOptionList ( ) const [virtual]
OptionMap & PLearn::PDistribution::getOptionMap ( ) const [virtual]
RemoteMethodMap & PLearn::PDistribution::getRemoteMethodMap ( ) const [virtual]
TVec< string > PLearn::PDistribution::getTestCostNames ( ) const [virtual]

Return [ "NLL" ] (the only cost computed by a PDistribution).

Implements PLearn::PLearner.

Reimplemented in PLearn::GaussianProcessRegressor, PLearn::GaussianDBNClassification, PLearn::GaussianDBNRegression, PLearn::GaussPartSupervisedDBN, PLearn::HintonDeepBeliefNet, PLearn::PartSupervisedDBN, PLearn::SupervisedDBN, and PLearn::GaussianProcessRegressor.

Definition at line 374 of file PDistribution.cc.

References PLearn::TVec< T >::append(), and PLearn::TVec< T >::isEmpty().

Referenced by PLearn::ConditionalDensityNet::getTrainCostNames().

{
    TVec<string> nll_cost;
    if (nll_cost.isEmpty())
        nll_cost.append("NLL");
    return nll_cost;
}

TVec< string > PLearn::PDistribution::getTrainCostNames ( ) const [virtual]

Return [ ].

Implements PLearn::PLearner.

Reimplemented in PLearn::ConditionalDensityNet, PLearn::GaussianProcessRegressor, PLearn::GaussMix, PLearn::GaussPartSupervisedDBN, PLearn::HintonDeepBeliefNet, PLearn::PartSupervisedDBN, PLearn::SupervisedDBN, and PLearn::GaussianProcessRegressor.

Definition at line 385 of file PDistribution.cc.

{
    // Default = no train cost computed. This may be overridden in subclasses.
    TVec<string> no_cost;
    return no_cost;
}
real PLearn::PDistribution::log_density ( const Vec & y ) const [virtual]
void PLearn::PDistribution::makeDeepCopyFromShallowCopy ( CopiesMap & copies ) [virtual]

Transforms a shallow copy into a deep copy.

Reimplemented from PLearn::PLearner.

Reimplemented in PLearn::ConditionalDensityNet, PLearn::GaussianContinuumDistribution, PLearn::GaussianProcessRegressor, PLearn::PConditionalDistribution, PLearn::LocallyMagnifiedDistribution, PLearn::NeighborhoodBoxVolumeDensityEstimator, PLearn::TransformationLearner, PLearn::GaussianDistribution, PLearn::GaussMix, PLearn::HistogramDistribution, PLearn::KernelDensityEstimator, PLearn::ManifoldParzen2, PLearn::MixtureDistribution, PLearn::NGramDistribution, PLearn::NonLocalManifoldParzen, PLearn::ParzenWindow, PLearn::RandomGaussMix, PLearn::RBMDistribution, PLearn::SpiralDistribution, PLearn::UnconditionalDistribution, PLearn::UniformDistribution, PLearn::GaussianDBNClassification, PLearn::GaussianDBNRegression, PLearn::GaussPartSupervisedDBN, PLearn::HintonDeepBeliefNet, PLearn::PartSupervisedDBN, PLearn::SupervisedDBN, PLearn::UnfrozenDeepBeliefNet, PLearn::GaussianProcessRegressor, and PLearn::ManifoldKNNDistribution.

Definition at line 417 of file PDistribution.cc.

References PLearn::deepCopyField(), PLearn::PLearner::makeDeepCopyFromShallowCopy(), predicted_part, predictor_part, store_cov, store_expect, and store_result.

Referenced by PLearn::TransformationLearner::makeDeepCopyFromShallowCopy(), PLearn::SupervisedDBN::makeDeepCopyFromShallowCopy(), PLearn::PConditionalDistribution::makeDeepCopyFromShallowCopy(), PLearn::PartSupervisedDBN::makeDeepCopyFromShallowCopy(), PLearn::NGramDistribution::makeDeepCopyFromShallowCopy(), PLearn::LocallyMagnifiedDistribution::makeDeepCopyFromShallowCopy(), PLearn::HistogramDistribution::makeDeepCopyFromShallowCopy(), PLearn::HintonDeepBeliefNet::makeDeepCopyFromShallowCopy(), PLearn::GaussPartSupervisedDBN::makeDeepCopyFromShallowCopy(), PLearn::GaussMix::makeDeepCopyFromShallowCopy(), PLearn::GaussianDBNRegression::makeDeepCopyFromShallowCopy(), PLearn::GaussianDBNClassification::makeDeepCopyFromShallowCopy(), and PLearn::ConditionalDensityNet::makeDeepCopyFromShallowCopy().

void PLearn::PDistribution::missingExpectation ( const Vec & input, Vec & mu ) [virtual]

Return E[Y | x] where Y is the missing part in the 'input' vector, and x is the observed part (thus discarding any current setting of predictor and predicted sizes).

The values in return vector 'mu' are ordered like the missing values in 'input'. This method must be implemented in sub-classes, as the default behavior is to throw an error.

Reimplemented in PLearn::GaussMix.

Definition at line 575 of file PDistribution.cc.

References PLERROR.

{ PLERROR("missingExpectation not implemented for this PDistribution"); }
int PLearn::PDistribution::outputsize ( ) const [virtual]

Returned value depends on outputs_def.

Implements PLearn::PLearner.

Reimplemented in PLearn::GaussianContinuumDistribution, PLearn::GaussianProcessRegressor, PLearn::GaussMix, PLearn::ManifoldParzen2, PLearn::NonLocalManifoldParzen, and PLearn::GaussianProcessRegressor.

Definition at line 430 of file PDistribution.cc.

References i, n_curve_points, n_predicted, outputs_def, and PLearn::square().

Referenced by computeOutput(), PLearn::GaussMix::outputsize(), PLearn::NonLocalManifoldParzen::outputsize(), and PLearn::GaussianContinuumDistribution::outputsize().

{
    int l = 0;
    for (size_t i=0; i<outputs_def.length(); i++) {
        if (outputs_def[i]=='L' || outputs_def[i]=='D' || outputs_def[i]=='C'
         || outputs_def[i]=='S' || outputs_def[i]=='E' || outputs_def[i]=='V')
            l+=n_curve_points;
        else if (outputs_def[i]=='e')
            l += n_predicted;
        else if (outputs_def[i]=='v')
            // Variance is full (n x n) matrix.
            l += square(n_predicted);
        else l++;
    }
    return l;
}
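For example (illustrative values, not taken from the source): with outputs_def == "lev", n_predicted == 3 and n_curve_points == 100, the output size is 1 + 3 + 3*3 = 13, while a single upper-case character such as 'L' would instead contribute 100 elements.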

Vec PLearn::PDistribution::remote_generate ( )

Remote interface for 'generate'.

Definition at line 451 of file PDistribution.cc.

References generate(), and PLearn::sample().

Referenced by declareMethods().

{
    Vec sample;
    generate(sample);
    return sample;
}

void PLearn::PDistribution::resetGenerator ( long  g_seed) [virtual]

Reset the random number generator used by generate() using the given seed.

Default behavior is to call random_gen->manual_seed(g_seed) and to save the given seed. This method is called in build(). Exception: if 'g_seed' is zero, then do nothing.

Reimplemented in PLearn::ConditionalDensityNet, PLearn::KernelDensityEstimator, PLearn::MixtureDistribution, PLearn::RBMDistribution, and PLearn::UniformDistribution.

Definition at line 461 of file PDistribution.cc.

References PLearn::PLearner::random_gen, and PLearn::PLearner::seed_.

Referenced by build_(), PLearn::SupervisedDBN::forget(), PLearn::HintonDeepBeliefNet::forget(), forget(), PLearn::PartSupervisedDBN::forget(), PLearn::GaussPartSupervisedDBN::forget(), PLearn::GaussianDBNClassification::forget(), PLearn::GaussianDBNRegression::forget(), PLearn::RBMDistribution::resetGenerator(), PLearn::KernelDensityEstimator::resetGenerator(), and PLearn::UniformDistribution::resetGenerator().

{
    if (g_seed != 0) {
        seed_ = g_seed;
        random_gen->manual_seed(g_seed);
    }
}

void PLearn::PDistribution::setPredictor ( const Vec & predictor, bool call_parent = true ) const [virtual]

Set the value for the predictor part of a conditional probability.

This needs to be implemented in subclasses if there is something special to do (like precomputing some data). The default behavior is just to fill 'predictor_part' with the first 'n_predictor' elements of 'predictor'. As with 'setPredictorPredictedSizes(..)', the boolean 'call_parent' indicates whether or not one should call inherited::setPredictor(..) with the same arguments.

Reimplemented in PLearn::GaussMix, PLearn::MixtureDistribution, PLearn::UnconditionalDistribution, PLearn::GaussianDBNClassification, PLearn::GaussianDBNRegression, PLearn::GaussPartSupervisedDBN, PLearn::HintonDeepBeliefNet, PLearn::PartSupervisedDBN, and PLearn::SupervisedDBN.

Definition at line 472 of file PDistribution.cc.

References PLearn::TVec< T >::length(), n_predictor, PLASSERT, predictor_part, and PLearn::TVec< T >::subVec().

Referenced by build_(), computeOutput(), PLearn::SupervisedDBN::setPredictor(), PLearn::GaussianDBNClassification::setPredictor(), PLearn::GaussPartSupervisedDBN::setPredictor(), PLearn::PartSupervisedDBN::setPredictor(), PLearn::HintonDeepBeliefNet::setPredictor(), PLearn::GaussMix::setPredictor(), PLearn::GaussianDBNRegression::setPredictor(), and splitCond().

{
    // Default behavior: only fill 'predictor_part' with first elements of
    // 'predictor'.
    PLASSERT( predictor.length()      >= n_predictor );
    PLASSERT( predictor_part.length() == n_predictor );
    if (predictor != predictor_part)
        predictor_part << predictor.subVec(0, n_predictor);
}
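Usage sketch tied to the 'successive calls with the same x' behaviour described in the detailed description ('dist', 'x' and 'Y_candidates' are assumptions of this sketch):

dist->setPredictor(x);                     // fix the x part of p(y | x)
for (int i = 0; i < Y_candidates.length(); i++) {
    Vec y = Y_candidates(i);               // one candidate predicted part
    real logp = dist->log_density(y);      // log p(y | x) for the fixed x
    // ... use logp ...
}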

bool PLearn::PDistribution::setPredictorPredictedSizes ( int the_predictor_size, int the_predicted_size, bool call_parent = true ) [virtual]

Set the 'predictor' and 'predicted' sizes for this distribution.

'the_predictor_size' is the size of the predictor, i.e. of x in p(y|x). 'the_predicted_size' is the size of the predicted, i.e. of y in p(y|x). This is a virtual method: if 'call_parent' is set to true, then the inherited::setPredictorPredictedSizes(..) method will also be called, with the same arguments: this is useful in the build process, where each class is able to call only its own method by setting 'call_parent' to false. If 'call_parent' is true, returns 'true' iff the predictor or predicted sizes have been modified from their previous value. If 'call_parent' is false, returns 'false'.

Reimplemented in PLearn::GaussMix, PLearn::MixtureDistribution, PLearn::GaussianDBNClassification, PLearn::GaussianDBNRegression, PLearn::GaussPartSupervisedDBN, PLearn::HintonDeepBeliefNet, PLearn::PartSupervisedDBN, and PLearn::SupervisedDBN.

Definition at line 485 of file PDistribution.cc.

References PLearn::PLearner::inputsize_, n_predicted, n_predictor, PLASSERT, PLERROR, predicted_part, predicted_size, predictor_part, predictor_size, and PLearn::TVec< T >::resize().

Referenced by build_(), generateJoint(), PLearn::GaussianDBNRegression::setPredictorPredictedSizes(), PLearn::GaussianDBNClassification::setPredictorPredictedSizes(), PLearn::HintonDeepBeliefNet::setPredictorPredictedSizes(), PLearn::GaussMix::setPredictorPredictedSizes(), PLearn::GaussPartSupervisedDBN::setPredictorPredictedSizes(), PLearn::PartSupervisedDBN::setPredictorPredictedSizes(), and PLearn::SupervisedDBN::setPredictorPredictedSizes().

{
    PLASSERT( (the_predictor_size  >= 0 || the_predictor_size  == -1) &&
            (the_predicted_size >= 0 || the_predicted_size == -1) );
    int backup_n_predictor = n_predictor;
    int backup_n_predicted = n_predicted;
    n_predictor = predictor_size = the_predictor_size;
    n_predicted = predicted_size = the_predicted_size;
    if (n_predictor < 0) {
        if (n_predicted < 0)
            PLERROR("In PDistribution::setPredictorPredictedSizes - You need"
                    "to specify at least one non-negative value");
        if (inputsize_ >= 0) {
            if (n_predicted > inputsize_)
                PLERROR("In PDistribution::setPredictorPredictedSizes - "
                        "'n_predicted' (%d) cannot be > inputsize (%d)",
                        n_predicted, inputsize_);
            n_predictor = inputsize_ - n_predicted;
        }
    } else if (n_predicted < 0) {
        if (inputsize_ >= 0) {
            if (n_predictor > inputsize_)
                PLERROR("In PDistribution::setPredictorPredictedSizes - "
                        "'n_predictor' (%d) cannot be > inputsize (%d)",
                        n_predictor, inputsize_);
            n_predicted = inputsize_ - n_predictor;
        }
    }
    if (inputsize_ >= 0 && n_predictor + n_predicted != inputsize_)
        PLERROR("In PDistribution::setPredictorPredictedSizes - n_predictor "
                "(%d) + n_predicted (%d) != inputsize (%d)",
                n_predictor, n_predicted, inputsize_);
    if (n_predictor >= 0)
        predictor_part.resize(n_predictor);
    if (n_predicted >= 0)
        predicted_part.resize(n_predicted);
    if (!call_parent)
        return false;
    else
        return (n_predictor != backup_n_predictor ||
                n_predicted != backup_n_predicted);
}
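Subclasses that override this method usually follow the contract above. The sketch below is hypothetical (MyCondDistr is not a PLearn class) and only illustrates the 'call_parent' convention:

bool MyCondDistr::setPredictorPredictedSizes(int the_predictor_size,
                                             int the_predicted_size,
                                             bool call_parent)
{
    bool sizes_changed = false;
    if (call_parent)
        sizes_changed = inherited::setPredictorPredictedSizes(
                the_predictor_size, the_predicted_size, true);
    // Resize subclass-specific storage using n_predictor / n_predicted here.
    return sizes_changed;
}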

void PLearn::PDistribution::splitCond ( const Vec & input ) const [protected]

Split an input into the part corresponding to the predictor (in 'predictor_part'), and the predicted (in 'predicted_part').

Also call setPredictor(..) with the new predictor part. If 'input' turns out to only have a predicted part (i.e. its length is equal to 'n_predicted'), then no predictor part will be set (it is assumed to stay the same as before).

Definition at line 533 of file PDistribution.cc.

References PLearn::TVec< T >::length(), n_predicted, n_predictor, PLASSERT, predicted_part, setPredictor(), and PLearn::TVec< T >::subVec().

Referenced by PLearn::GaussianDBNClassification::computeCostsFromOutputs(), PLearn::HintonDeepBeliefNet::computeCostsFromOutputs(), PLearn::PartSupervisedDBN::computeCostsFromOutputs(), PLearn::GaussPartSupervisedDBN::computeCostsFromOutputs(), PLearn::GaussianDBNRegression::computeCostsFromOutputs(), computeOutput(), PLearn::GaussianDBNRegression::fineTuneByGradientDescent(), PLearn::GaussPartSupervisedDBN::fineTuneByGradientDescent(), PLearn::HintonDeepBeliefNet::fineTuneByGradientDescent(), PLearn::GaussianDBNClassification::fineTuneByGradientDescent(), PLearn::PartSupervisedDBN::fineTuneByGradientDescent(), PLearn::SupervisedDBN::fineTuneByGradientDescent(), PLearn::GaussianDBNRegression::fineTuneByGradientDescentLastLayer(), PLearn::GaussianDBNRegression::train(), and PLearn::UnfrozenDeepBeliefNet::train().

{
    if (n_predictor == 0 || (n_predictor > 0 && input.length() == n_predicted))
    {
        // No predictor part provided: this means this is the same as before
        // (or that there is none at all).
        predicted_part << input;
    } else {
        PLASSERT( input.length() == n_predictor + n_predicted );
        predicted_part << input.subVec(n_predictor, n_predicted);
        setPredictor(input);
    }
}

real PLearn::PDistribution::survival_fn ( const Vec y) const [virtual]
void PLearn::PDistribution::train ( ) [virtual]
void PLearn::PDistribution::unknownOutput ( char def, const Vec & input, Vec & output, int & k ) const [protected, virtual]

Called in computeOutput when an unknown character is found.

Reimplemented in PLearn::GaussMix.

Definition at line 628 of file PDistribution.cc.

References PLERROR.

Referenced by computeOutput(), and PLearn::GaussMix::unknownOutput().

{
    // Default is to throw an error.
    // TODO Can we find a better way to do this?
    PLERROR("In PDistribution::unknownOutput - Unrecognized outputs_def "
            "character: '%c'", def);
}
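A hypothetical subclass hook (sketch only; 'MyCondDistr' and 'computeSomethingElse' are not PLearn names), showing how an extra outputs_def character could be handled while deferring to the parent otherwise:

void MyCondDistr::unknownOutput(char def, const Vec& input, Vec& output,
                                int& k) const
{
    if (def == 'p')                                  // custom output character
        output[k++] = computeSomethingElse(input);   // hypothetical helper
    else
        inherited::unknownOutput(def, input, output, k);
}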

void PLearn::PDistribution::variance ( Mat & cov ) const [virtual]

Member Data Documentation

real PLearn::PDistribution::delta_curve [protected]

The step when plotting the curve (upper case outputs_def).

Definition at line 114 of file PDistribution.h.

Referenced by build_(), and computeOutput().

int PLearn::PDistribution::n_predicted [protected]

Definition at line 133 of file PDistribution.h.

Referenced by PLearn::SupervisedDBN::build_layers(), PLearn::PartSupervisedDBN::build_layers(), PLearn::GaussPartSupervisedDBN::build_layers(), PLearn::GaussianDBNRegression::build_layers(), PLearn::GaussianDBNClassification::build_layers(), PLearn::HintonDeepBeliefNet::build_layers(), PLearn::PartSupervisedDBN::build_params(), PLearn::HintonDeepBeliefNet::build_params(), PLearn::GaussianDBNRegression::build_params(), PLearn::GaussPartSupervisedDBN::build_params(), PLearn::GaussianDBNClassification::build_params(), PLearn::PartSupervisedDBN::build_regressors(), PLearn::GaussPartSupervisedDBN::build_regressors(), PLearn::SupervisedDBN::build_regressors(), PLearn::GaussianDBNClassification::computeCostsFromOutputs(), PLearn::HintonDeepBeliefNet::computeCostsFromOutputs(), PLearn::PartSupervisedDBN::computeCostsFromOutputs(), PLearn::GaussPartSupervisedDBN::computeCostsFromOutputs(), PLearn::GaussMix::computeLogLikelihood(), computeOutput(), declareOptions(), PLearn::PartSupervisedDBN::density(), PLearn::GaussianDBNRegression::density(), PLearn::SupervisedDBN::density(), PLearn::HintonDeepBeliefNet::density(), PLearn::GaussPartSupervisedDBN::density(), PLearn::GaussianDBNClassification::density(), PLearn::GaussMix::expectation(), PLearn::GaussPartSupervisedDBN::fineTuneByGradientDescent(), PLearn::HintonDeepBeliefNet::fineTuneByGradientDescent(), PLearn::PartSupervisedDBN::fineTuneByGradientDescent(), PLearn::SupervisedDBN::fineTuneByGradientDescent(), forget(), PLearn::RBMDistribution::forget(), PLearn::GaussMix::generateFromGaussian(), generateJoint(), generateN(), PLearn::SupervisedDBN::greedyStep(), PLearn::PartSupervisedDBN::greedyStep(), PLearn::GaussPartSupervisedDBN::greedyStep(), PLearn::GaussPartSupervisedDBN::jointGreedyStep(), PLearn::GaussianDBNClassification::jointGreedyStep(), PLearn::HintonDeepBeliefNet::jointGreedyStep(), PLearn::PartSupervisedDBN::jointGreedyStep(), outputsize(), PLearn::GaussMix::resizeDataBeforeUsing(), PLearn::GaussMix::setPredictor(), PLearn::GaussianDBNRegression::setPredictorPredictedSizes(), PLearn::GaussianDBNClassification::setPredictorPredictedSizes(), PLearn::HintonDeepBeliefNet::setPredictorPredictedSizes(), PLearn::GaussPartSupervisedDBN::setPredictorPredictedSizes(), PLearn::PartSupervisedDBN::setPredictorPredictedSizes(), PLearn::SupervisedDBN::setPredictorPredictedSizes(), setPredictorPredictedSizes(), PLearn::GaussMix::setPredictorPredictedSizes_const(), splitCond(), and PLearn::GaussianDBNRegression::train().

int PLearn::PDistribution::n_predictor [protected]

Learnt sizes of the 'predictor' and 'predicted' parts.

These are the options to use in PDistribution subclass code. They always satisfy: n_predictor + n_predicted == inputsize().

Definition at line 132 of file PDistribution.h.

Referenced by PLearn::GaussMix::addToCovariance(), PLearn::PartSupervisedDBN::build_layers(), PLearn::SupervisedDBN::build_layers(), PLearn::GaussPartSupervisedDBN::build_layers(), PLearn::GaussianDBNRegression::build_layers(), PLearn::GaussianDBNClassification::build_layers(), PLearn::HintonDeepBeliefNet::build_layers(), PLearn::GaussMix::computeLogLikelihood(), declareOptions(), PLearn::RandomGaussMix::declareOptions(), PLearn::UnconditionalDistribution::declareOptions(), PLearn::GaussMix::expectation(), PLearn::SupervisedDBN::fineTuneByGradientDescent(), forget(), PLearn::GaussMix::generateFromGaussian(), generateJoint(), PLearn::SupervisedDBN::greedyStep(), PLearn::PartSupervisedDBN::greedyStep(), PLearn::GaussPartSupervisedDBN::greedyStep(), PLearn::GaussPartSupervisedDBN::jointGreedyStep(), PLearn::GaussianDBNClassification::jointGreedyStep(), PLearn::HintonDeepBeliefNet::jointGreedyStep(), PLearn::PartSupervisedDBN::jointGreedyStep(), PLearn::GaussMix::log_density(), PLearn::GaussMix::replaceGaussian(), PLearn::GaussMix::resizeDataBeforeUsing(), setPredictor(), PLearn::GaussMix::setPredictor(), PLearn::GaussianDBNClassification::setPredictorPredictedSizes(), PLearn::GaussianDBNRegression::setPredictorPredictedSizes(), PLearn::HintonDeepBeliefNet::setPredictorPredictedSizes(), PLearn::GaussPartSupervisedDBN::setPredictorPredictedSizes(), PLearn::PartSupervisedDBN::setPredictorPredictedSizes(), PLearn::SupervisedDBN::setPredictorPredictedSizes(), setPredictorPredictedSizes(), PLearn::GaussMix::setPredictorPredictedSizes_const(), splitCond(), PLearn::HintonDeepBeliefNet::train(), PLearn::GaussianDBNRegression::train(), and PLearn::GaussianDBNClassification::train().

Mat PLearn::PDistribution::store_cov [mutable, protected]

Definition at line 109 of file PDistribution.h.

Referenced by computeOutput(), and makeDeepCopyFromShallowCopy().

Definition at line 108 of file PDistribution.h.

Referenced by computeOutput(), and makeDeepCopyFromShallowCopy().


The documentation for this class was generated from the following files: PDistribution.h and PDistribution.cc.