PLearn 0.1
#include <RandomVar.h>
Public Member Functions

RandomVariable (int thelength, int thewidth=1)
    All these constructors give rise to a RV with no parents.
RandomVariable (const Vec &the_value)
RandomVariable (const Mat &the_value)
RandomVariable (const Var &the_value)
RandomVariable (const RVArray &parents, int thelength)
    These constructors give rise to a RV with parents.
RandomVariable (const RVArray &parents, int thelength, int thewidth)
virtual char * classname ()=0
virtual int length ()
virtual int width ()
int nelems ()
bool isScalar ()
bool isVec ()
bool isColumnVec ()
bool isRowVec ()
virtual bool isNonRandom ()=0
bool isConstant ()
    Non-random is not the same thing as constant (a Var is constant if it does not depend on other vars).
virtual bool isDiscrete ()=0
    True if a discrete random variable, false otherwise.
RandomVar subVec (int start, int length)
virtual void setValueFromParentsValue ()=0
void markRHSandSetKnownValues (const RVInstanceArray &RHS)
virtual void EMBprop (const Vec obs, real posterior)=0
virtual void EMUpdate ()
virtual bool canStopEM ()
virtual void EMTrainingInitialize (const RVArray &parameters_to_learn)
    Initialization of EM training (before all the iterations start).
virtual void EMEpochInitialize ()
    Initialization of an individual EM epoch.
virtual void mark (Var v)
    All below this is not necessary for ordinary users or subclass writers.
virtual void mark ()
virtual void unmark ()
virtual void clearEMmarks ()
virtual void unmarkAncestors ()
    Clear not only the marked field but also that of the parents.
virtual bool isMarked ()
virtual void setKnownValues ()
virtual Var logP (const Var &obs, const RVInstanceArray &RHS, RVInstanceArray *parameters_to_learn=0)=0
virtual Var P (const Var &obs, const RVInstanceArray &RHS)
virtual Var ElogP (const Var &obs, RVArray &parameters_to_learn, const RVInstanceArray &RHS)
virtual real EM (const RVArray &parameters_to_learn, VarArray &prop_path, VarArray &observedVars, VMat distr, int n_samples, int max_n_iterations, real relative_improvement_threshold, bool accept_worsening_likelihood=false)
virtual real epoch (VarArray &prop_path, VarArray &observed_vars, const VMat &distr, int n_samples, bool do_EM_learning=true)
virtual ~RandomVariable ()
Public Attributes

const RVArray parents
    other random variables, whose values give rise (stochastically or deterministically) to this value
Var value
    used either when marked, non-random, or to assess P(This=value)

Protected Attributes

const int rv_number
    rv_number is used to sort RVs in topological order, knowing that parents must be created before their descendants.
bool marked
bool EMmark
    mark used by EM to avoid repeated calls to EMBprop/EMUpdate
bool pmark
    yet another mark
bool * learn_the_parameters
    learn_the_parameters[i] says if parent[i] should be learned in the current call to EM (as a parameter of the distribution).

Static Private Attributes

static int rv_counter = 0
    used to assign a value to rv_number when a new RV is constructed

Friends

class RandomVar
class RVInstanceArray
class RVArray
Definition at line 545 of file RandomVar.h.
PLearn::RandomVariable::RandomVariable(int thelength, int thewidth = 1)
All these constructors give rise to a RV with no parents.
Definition at line 184 of file RandomVar.cc.
:rv_number(rv_counter++), value(thelength,thewidth),marked(false), EMmark(false), pmark(false), learn_the_parameters(0) { }
PLearn::RandomVariable::RandomVariable(const Vec & the_value)
Definition at line 190 of file RandomVar.cc.
:rv_number(rv_counter++), value(the_value),marked(false), EMmark(false) , pmark(false), learn_the_parameters(0) { }
PLearn::RandomVariable::RandomVariable(const Mat & the_value)
Definition at line 196 of file RandomVar.cc.
:rv_number(rv_counter++), value(the_value),marked(false), EMmark(false), pmark(false), learn_the_parameters(0) { }
PLearn::RandomVariable::RandomVariable(const Var & the_value)
Definition at line 202 of file RandomVar.cc.
:rv_number(rv_counter++), value(the_value),marked(false), EMmark(false), pmark(false), learn_the_parameters(0) { }
PLearn::RandomVariable::RandomVariable(const RVArray & the_parents, int thelength)
These constructors give rise to a RV with parents.
Definition at line 208 of file RandomVar.cc.
:rv_number(rv_counter++), parents(the_parents), value(thelength), marked(false), EMmark(false), pmark(false), learn_the_parameters(new bool[the_parents.size()]) { }
PLearn::RandomVariable::RandomVariable(const RVArray & the_parents, int thelength, int thewidth)
Definition at line 215 of file RandomVar.cc.
: rv_number(rv_counter++), parents(the_parents), value(thelength,thewidth),marked(false), EMmark(false), pmark(false), learn_the_parameters(new bool[the_parents.size()]) { }
PLearn::RandomVariable::~RandomVariable() [virtual]
Definition at line 396 of file RandomVar.cc.
References learn_the_parameters.
{ if (learn_the_parameters) delete learn_the_parameters; }
bool PLearn::RandomVariable::canStopEM() [virtual]
Has the distribution seen enough EM iterations to meaningfully stop EM iterative training? This is a way for a RandomVariable sub-class to force continuation of the EM iterations beyond the criteria given by the caller of EM. The default just propagates to the unmarked parents.
Reimplemented in PLearn::MixtureRandomVariable.
Definition at line 348 of file RandomVar.cc.
References parents, and PLearn::TVec< T >::size().
Referenced by EM().
{
  // propagate to parents
  bool can=true;
  for (int i=0;i<parents.size() && !can;i++)
    can = parents[i]->canStopEM();
  return can;
}
virtual char* PLearn::RandomVariable::classname() [pure virtual]
Implemented in PLearn::NonRandomVariable, PLearn::JointRandomVariable, PLearn::RandomElementOfRandomVariable, PLearn::RVArrayRandomElementRandomVariable, PLearn::NegRandomVariable, PLearn::ExpRandomVariable, PLearn::LogRandomVariable, PLearn::DiagonalNormalRandomVariable, PLearn::MixtureRandomVariable, PLearn::PlusRandomVariable, PLearn::MinusRandomVariable, PLearn::ElementWiseDivisionRandomVariable, PLearn::ProductRandomVariable, PLearn::SubVecRandomVariable, PLearn::MultinomialRandomVariable, PLearn::ExtendedRandomVariable, and PLearn::ConcatColumnsRandomVariable.
Referenced by ElogP().
void PLearn::RandomVariable::clearEMmarks() [virtual]
Reimplemented in PLearn::MixtureRandomVariable.
Definition at line 313 of file RandomVar.cc.
References EMmark, parents, and PLearn::TVec< T >::size().
Referenced by EM(), and epoch().
Var PLearn::RandomVariable::ElogP(const Var & obs, RVArray & parameters_to_learn, const RVInstanceArray & RHS) [virtual]
ElogP is like logP, but it represents the expected log-probability of obs given the RHS, where the expectation is over the "hidden" random variables of EM in mixtures, as a function of the values of the parameters_to_learn.
Definition at line 399 of file RandomVar.cc.
References classname(), and PLERROR.
real PLearn::RandomVariable::EM(const RVArray & parameters_to_learn, VarArray & prop_path, VarArray & observedVars, VMat distr, int n_samples, int max_n_iterations, real relative_improvement_threshold, bool accept_worsening_likelihood = false) [virtual]
Ordinary users don't need to use this method. Use the global function EM instead. Train with EM or analytical maximization of likelihood, and return training negative log-likelihood. The VMat provides the n_samples training examples, and EM is applied for at most max_n_iterations or until the relative improvement in neg-log-likelihood (improvement / previous_value) is less than relative_improvement_threshold. This method should in general NOT be redefined. Here prop_path contains the propagation path of Var's from the observedVars (sampled from VMat) to the negative-log-probability of the variable whose likelihood should be maximized.
NOTE NOTE NOTE:
THE ORDER OF THE VALUES IN THE DISTRIBUTION MUST BE: (1) conditioning variables (RHS), (2) output variables
Definition at line 229 of file RandomVar.cc.
References canStopEM(), clearEMmarks(), EMTrainingInitialize(), PLearn::endl(), and epoch().
{
  real avgnegloglik = 0;
  real previous_nll=FLT_MAX, nll_change;
  bool EMfinished= !(max_n_iterations>0);
  int n_epochs=0;
  EMTrainingInitialize(parameters_to_learn);
  clearEMmarks();
  while (!EMfinished)
  {
    avgnegloglik=epoch(prop_path, observedVars, distr,n_samples);
    cout << "EM epoch NLL = " << avgnegloglik << endl;
    nll_change = (previous_nll - avgnegloglik)/fabs(previous_nll);
    if (nll_change < -1e-4 && !accept_worsening_likelihood)
      printf("%s %s from %f to %f\n", "RandomVariable::EM",
             "An EM epoch yielded worse negative log-likelihood,",
             previous_nll, avgnegloglik);
    n_epochs++;
    EMfinished = canStopEM() &&
      ((n_epochs >= max_n_iterations) ||
       (fabs(nll_change) <= relative_improvement_threshold) ||
       (!accept_worsening_likelihood && nll_change <= relative_improvement_threshold));
    previous_nll=avgnegloglik;
  }
  return avgnegloglik;
}
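For illustration only, a direct call with the documented argument order might look like the sketch below; every name (model, params_to_learn, prop_path, observed_vars, training_set, n_samples) is a hypothetical placeholder, and ordinary users should prefer the global EM function mentioned above.
// Sketch only -- all names below are hypothetical placeholders, not PLearn API.
// 'model' is assumed to be a RandomVar; the other objects are assumed to have
// been built beforehand so that prop_path maps observed_vars to -log P.
real train_nll = model->EM(params_to_learn,   // RVArray of parameters to re-estimate
                           prop_path,         // propagation path of Var's
                           observed_vars,     // RHS (conditioning) values first, then outputs
                           training_set,      // VMat providing n_samples rows
                           n_samples,
                           20,                // max_n_iterations
                           1e-4);             // relative_improvement_threshold
cout << "training NLL after EM: " << train_nll << endl;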
virtual void PLearn::RandomVariable::EMBprop(const Vec obs, real posterior) [pure virtual]
Propagate posterior information to the parents in order to perform an EMUpdate at the end of an EM epoch. In the case of mixture-like RVs and their components, the posterior is the probability of the component "this" given the observation "obs".
Implemented in PLearn::NonRandomVariable, PLearn::JointRandomVariable, PLearn::RandomElementOfRandomVariable, PLearn::RVArrayRandomElementRandomVariable, PLearn::NegRandomVariable, PLearn::ExpRandomVariable, PLearn::LogRandomVariable, PLearn::DiagonalNormalRandomVariable, PLearn::MixtureRandomVariable, PLearn::PlusRandomVariable, PLearn::MinusRandomVariable, PLearn::ElementWiseDivisionRandomVariable, PLearn::ProductRandomVariable, PLearn::SubVecRandomVariable, PLearn::MultinomialRandomVariable, PLearn::ExtendedRandomVariable, and PLearn::ConcatColumnsRandomVariable.
Referenced by epoch().
void PLearn::RandomVariable::EMEpochInitialize() [virtual]
Initialization of an individual EMEpoch.
the default just propagates to the unmarked parents
Reimplemented in PLearn::DiagonalNormalRandomVariable, PLearn::MixtureRandomVariable, PLearn::PlusRandomVariable, PLearn::MinusRandomVariable, PLearn::ElementWiseDivisionRandomVariable, PLearn::ProductRandomVariable, and PLearn::MultinomialRandomVariable.
Definition at line 384 of file RandomVar.cc.
References EMmark, parents, and PLearn::TVec< T >::size().
Referenced by epoch().
{
  if (EMmark) return;
  EMmark=true;
  for (int i=0;i<parents.size();i++)
    parents[i]->EMEpochInitialize();
}
void PLearn::RandomVariable::EMTrainingInitialize(const RVArray & parameters_to_learn) [virtual]
Initialization of EM training (before all the iterations start).
the default just propagates to the unmarked parents
Reimplemented in PLearn::MixtureRandomVariable, PLearn::PlusRandomVariable, PLearn::MinusRandomVariable, PLearn::ElementWiseDivisionRandomVariable, and PLearn::ProductRandomVariable.
Definition at line 358 of file RandomVar.cc.
References PLearn::TVec< T >::contains(), EMmark, i, isConstant(), isNonRandom(), learn_the_parameters, parents, PLERROR, and PLearn::TVec< T >::size().
Referenced by EM().
{
  if (EMmark) return;
  EMmark=true;
  int n_can_learn=0;
  int n_random=0;
  for (int i=0;i<parents.size();i++)
  {
    if (parameters_to_learn.contains(parents[i]))
    {
      if (!parents[i]->isConstant())
        PLERROR("Trying to learn a parameter that is not constant!");
      learn_the_parameters[i] = true;
      n_can_learn++;
    }
    else learn_the_parameters[i] = false;
    if (!parents[i]->isNonRandom())
      n_random++;
  }
  if (n_can_learn>0 && n_random>0)
    PLERROR("RandomVariable: can't learn some parameter if others are random");
  for (int i=0;i<parents.size();i++)
    parents[i]->EMTrainingInitialize(parameters_to_learn);
}
void PLearn::RandomVariable::EMUpdate() [virtual]
Update the fixed (non-random) parameters using the internal learning mechanism, at the end of an EM epoch. The default just propagates to the unmarked parents.
Reimplemented in PLearn::DiagonalNormalRandomVariable, PLearn::MixtureRandomVariable, PLearn::PlusRandomVariable, PLearn::MinusRandomVariable, PLearn::ElementWiseDivisionRandomVariable, PLearn::ProductRandomVariable, and PLearn::MultinomialRandomVariable.
Definition at line 340 of file RandomVar.cc.
References EMmark, isConstant(), parents, and PLearn::TVec< T >::size().
Referenced by epoch().
{
  if (EMmark) return;
  EMmark=true;
  for (int i=0;i<parents.size();i++)
    if (!parents[i]->isConstant())
      parents[i]->EMUpdate();
}
real PLearn::RandomVariable::epoch(VarArray & prop_path, VarArray & observed_vars, const VMat & distr, int n_samples, bool do_EM_learning = true) [virtual]
Do one iteration of EM or compute the likelihood of the given data. This method should not be redefined for most types of random variables, but it may be necessary for things like sequential data. This method can be used for doing one epoch of EM (do_EM_learning=true) or simply for computing the likelihood, if do_EM_learning=false.
Definition at line 260 of file RandomVar.cc.
References clearEMmarks(), EMBprop(), EMEpochInitialize(), EMUpdate(), PLearn::endl(), PLearn::VarArray::fprop(), PLearn::TVec< T >::last(), PLearn::VarArray::printInfo(), and PLearn::VMat::width().
Referenced by EM().
{
  real avg_cost = 0;
  if (do_EM_learning)
  {
    EMEpochInitialize();
    clearEMmarks();
  }
  for (int i=0;i<n_samples;i++)
  {
    Vec sam(distr->width());
    distr->getRow(i,sam);
    observed_vars << sam;
    prop_path.fprop(); // computes logP in last element of prop_path
    Var logp = prop_path.last();
#if 0 // debugging
    cout << "at example i=" << i << endl;
    VarArray sources = logp->sources();
    logp->unmarkAncestors();
    sources.printInfo();
    prop_path.printInfo();
#endif
    avg_cost -= logp->value[0];
    if (do_EM_learning) // last = LHS observed value
      EMBprop(observed_vars.last()->value,1.0);
  }
  if (do_EM_learning)
  {
    EMUpdate();
    clearEMmarks();
  }
  return avg_cost / n_samples;
}
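As a hedged sketch of the do_EM_learning=false mode described above (all names are hypothetical placeholders), the same propagation path can be reused to evaluate the average negative log-likelihood on held-out data without updating any parameters:
// Sketch only -- 'model', 'prop_path', 'observed_vars', 'test_set' and
// 'n_test_samples' are hypothetical placeholders, not PLearn API.
real test_nll = model->epoch(prop_path, observed_vars,
                             test_set, n_test_samples,
                             false);   // do_EM_learning=false: evaluate only
cout << "held-out NLL: " << test_nll << endl;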
bool PLearn::RandomVariable::isColumnVec() [inline]
Definition at line 600 of file RandomVar.h.
{ return width()==1; }
bool PLearn::RandomVariable::isConstant() [inline]
Non-random is not the same thing as constant (a Var is constant if it does not depend on other vars).
Definition at line 612 of file RandomVar.h.
Referenced by PLearn::NegRandomVariable::EMBprop(), PLearn::PlusRandomVariable::EMBprop(), PLearn::DiagonalNormalRandomVariable::EMBprop(), PLearn::ProductRandomVariable::EMBprop(), PLearn::LogRandomVariable::EMBprop(), PLearn::ExpRandomVariable::EMBprop(), PLearn::MinusRandomVariable::EMBprop(), EMTrainingInitialize(), PLearn::MinusRandomVariable::EMUpdate(), PLearn::ProductRandomVariable::EMUpdate(), PLearn::DiagonalNormalRandomVariable::EMUpdate(), EMUpdate(), and PLearn::PlusRandomVariable::EMUpdate().
{ return isNonRandom() && value->isConstant(); }
virtual bool PLearn::RandomVariable::isDiscrete() [pure virtual]
true if a discrete random variable, false otherwise.
Most of the RVs of interest here are continuous, so this is the default.
Implemented in PLearn::StochasticRandomVariable, PLearn::FunctionalRandomVariable, PLearn::MixtureRandomVariable, and PLearn::MultinomialRandomVariable.
virtual bool PLearn::RandomVariable::isMarked() [inline, virtual]
Definition at line 690 of file RandomVar.h.
Referenced by PLearn::MixtureRandomVariable::ElogP(), PLearn::RVArrayRandomElementRandomVariable::logP(), PLearn::MixtureRandomVariable::logP(), PLearn::MultinomialRandomVariable::logP(), PLearn::FunctionalRandomVariable::logP(), and PLearn::DiagonalNormalRandomVariable::logP().
{ return marked; }
virtual bool PLearn::RandomVariable::isNonRandom() [pure virtual]
Random means that its value cannot be known deterministically. Note that StochasticRandomVariable's are never non-random, but a FunctionalRandomVariable is non-random if it has no parents (i.e. it is a NonRandomVariable) or if all its parents are non-random.
Implemented in PLearn::StochasticRandomVariable, and PLearn::FunctionalRandomVariable.
Referenced by EMTrainingInitialize().
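The rule described above can be read as the following sketch; this is an illustration of the stated semantics, not the actual PLearn::FunctionalRandomVariable code.
// Sketch of the documented rule, not the verbatim implementation:
// a FunctionalRandomVariable is non-random iff all of its parents are
// non-random (vacuously true when it has no parents, i.e. a NonRandomVariable).
bool FunctionalRandomVariable::isNonRandom()
{
    for (int i = 0; i < parents.size(); i++)
        if (!parents[i]->isNonRandom())
            return false;
    return true;
}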
bool PLearn::RandomVariable::isRowVec() [inline]
Definition at line 601 of file RandomVar.h.
{ return length()==1; }
bool PLearn::RandomVariable::isScalar() [inline]
Definition at line 598 of file RandomVar.h.
bool PLearn::RandomVariable::isVec() [inline]
Definition at line 599 of file RandomVar.h.
virtual int PLearn::RandomVariable::length() [inline, virtual]
Definition at line 595 of file RandomVar.h.
Referenced by PLearn::SubVecRandomVariable::EMBprop(), PLearn::DiagonalNormalRandomVariable::EMBprop(), PLearn::SubVecRandomVariable::invertible(), PLearn::FunctionalRandomVariable::logP(), and PLearn::RVArrayRandomElementRandomVariable::RVArrayRandomElementRandomVariable().
{ return value->length(); }
virtual Var PLearn::RandomVariable::logP(const Var & obs, const RVInstanceArray & RHS, RVInstanceArray * parameters_to_learn = 0) [pure virtual]
Construct a Var that computes logP(This = obs | RHS ). This function SHOULD NOT be used directly, but is called by the global function logP (same argument), which does proper massaging of the network before and after this call.
Implemented in PLearn::FunctionalRandomVariable, PLearn::RVArrayRandomElementRandomVariable, PLearn::DiagonalNormalRandomVariable, PLearn::MixtureRandomVariable, and PLearn::MultinomialRandomVariable.
Referenced by P().
virtual void PLearn::RandomVariable::mark(Var v) [inline, virtual]
ALL BELOW THIS IS NOT NECESSARY FOR ORDINARY USERS OR SUBCLASS WRITERS.
Definition at line 682 of file RandomVar.h.
virtual void PLearn::RandomVariable::mark() [inline, virtual]
Definition at line 683 of file RandomVar.h.
{ marked = true; }
void PLearn::RandomVariable::markRHSandSetKnownValues(const RVInstanceArray & RHS) [inline]
Definition at line 640 of file RandomVar.h.
References i, and PLearn::TVec< T >::size().
Referenced by PLearn::MixtureRandomVariable::ElogP().
int PLearn::RandomVariable::nelems() [inline]
Definition at line 597 of file RandomVar.h.
{ return value->nelems(); }
Var PLearn::RandomVariable::P(const Var & obs, const RVInstanceArray & RHS) [virtual]
Definition at line 392 of file RandomVar.cc.
References PLearn::exp(), and logP().
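The body is not reproduced here; given the referenced exp() and logP() calls, a minimal sketch of what it presumably computes (an assumption, not the verbatim source) is:
// Assumed behaviour based on the references above, not the verbatim source:
// P(This = obs | RHS) as the exponential of the corresponding log-probability.
Var PLearn::RandomVariable::P(const Var& obs, const RVInstanceArray& RHS)
{
    return exp(logP(obs, RHS));
}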
void PLearn::RandomVariable::setKnownValues() [virtual]
Traverse the graph of ancestors of this node and mark nodes which are deterministic descendants of marked nodes, while setting their "value" field as a function of their parents.
Reimplemented in PLearn::StochasticRandomVariable, and PLearn::MixtureRandomVariable.
Definition at line 323 of file RandomVar.cc.
References i, marked, parents, pmark, setValueFromParentsValue(), and PLearn::TVec< T >::size().
{
  if (!pmark && !marked)
  {
    pmark=true;
    bool all_parents_marked=true;
    for (int i=0;i<parents.size();i++)
    {
      parents[i]->setKnownValues();
      all_parents_marked &= parents[i]->isMarked();
    }
    setValueFromParentsValue();
    if (all_parents_marked)
      marked=true;
  }
}
virtual void PLearn::RandomVariable::setValueFromParentsValue() [pure virtual]
ALL BELOW THIS IS NOT NECESSARY FOR ORDINARY USERS but may be necessary when writing subclasses. Note however that normally the subclasses should not be direct subclasses of RandomVariable but rather subclasses of StochasticRandomVariable or FunctionalRandomVariable.
Define the formula that gives a value to this RV given its parents' values (sets the value field). If the RV is stochastic, the formula may also be "stochastic" (using SampleVariable's to define the Var).
Implemented in PLearn::FunctionalRandomVariable, PLearn::NonRandomVariable, PLearn::JointRandomVariable, PLearn::RandomElementOfRandomVariable, PLearn::RVArrayRandomElementRandomVariable, PLearn::NegRandomVariable, PLearn::ExpRandomVariable, PLearn::LogRandomVariable, PLearn::DiagonalNormalRandomVariable, PLearn::MixtureRandomVariable, PLearn::PlusRandomVariable, PLearn::MinusRandomVariable, PLearn::ElementWiseDivisionRandomVariable, PLearn::ProductRandomVariable, PLearn::SubVecRandomVariable, PLearn::MultinomialRandomVariable, PLearn::ExtendedRandomVariable, and PLearn::ConcatColumnsRandomVariable.
Referenced by setKnownValues(), and PLearn::StochasticRandomVariable::setKnownValues().
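As a rough illustration (not taken from the PLearn sources), a functional subclass representing the exponential of its single parent might implement this along the following lines; the name MyExpRandomVariable is hypothetical and exp(Var) is assumed to build the corresponding Var expression.
// Illustrative sketch only; 'MyExpRandomVariable' is a hypothetical subclass.
void MyExpRandomVariable::setValueFromParentsValue()
{
    // parents[0] is the RV whose exponential this RV represents;
    // exp(Var) is assumed to exist (cf. the reference to PLearn::exp() in P()).
    value = exp(parents[0]->value);
}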
RandomVar PLearn::RandomVariable::subVec(int start, int length)
Return a new RandomVar which is obtained by extracting a sub-vector of length "length" from the value of this, starting at position "start".
Definition at line 225 of file RandomVar.cc.
{ return new SubVecRandomVariable(this,start,length); }
virtual void PLearn::RandomVariable::unmark() [inline, virtual]
Definition at line 684 of file RandomVar.h.
{ marked = false; }
void PLearn::RandomVariable::unmarkAncestors() [virtual]
Clear not only the marked field but also that of the parents.
Reimplemented in PLearn::MixtureRandomVariable.
Definition at line 302 of file RandomVar.cc.
References marked, parents, pmark, and PLearn::TVec< T >::size().
{
  if (pmark)
  {
    marked=false;
    pmark=false;
    for (int i=0;i<parents.size();i++)
      parents[i]->unmarkAncestors();
  }
}
virtual int PLearn::RandomVariable::width() [inline, virtual]
Definition at line 596 of file RandomVar.h.
{ return value->width(); }
friend class RandomVar [friend]
Definition at line 547 of file RandomVar.h.
Referenced by PLearn::RVArrayRandomElementRandomVariable::logP().
friend class RVArray [friend]
Definition at line 549 of file RandomVar.h.
friend class RVInstanceArray [friend]
Definition at line 548 of file RandomVar.h.
bool PLearn::RandomVariable::EMmark [protected]
mark used by EM to avoid repeated calls to EMBprop/EMUpdate
Definition at line 576 of file RandomVar.h.
Referenced by PLearn::MixtureRandomVariable::clearEMmarks(), clearEMmarks(), PLearn::MinusRandomVariable::EMEpochInitialize(), EMEpochInitialize(), PLearn::MixtureRandomVariable::EMEpochInitialize(), PLearn::DiagonalNormalRandomVariable::EMEpochInitialize(), PLearn::ProductRandomVariable::EMEpochInitialize(), PLearn::MultinomialRandomVariable::EMEpochInitialize(), PLearn::PlusRandomVariable::EMEpochInitialize(), PLearn::MixtureRandomVariable::EMTrainingInitialize(), EMTrainingInitialize(), PLearn::MinusRandomVariable::EMUpdate(), PLearn::ProductRandomVariable::EMUpdate(), PLearn::MultinomialRandomVariable::EMUpdate(), PLearn::DiagonalNormalRandomVariable::EMUpdate(), PLearn::MixtureRandomVariable::EMUpdate(), EMUpdate(), and PLearn::PlusRandomVariable::EMUpdate().
bool* PLearn::RandomVariable::learn_the_parameters [protected]
learn_the_parameters[i] says if parent[i] should be learned in the current call to EM (as a parameter of the distribution).
Definition at line 581 of file RandomVar.h.
Referenced by EMTrainingInitialize(), and ~RandomVariable().
bool PLearn::RandomVariable::marked [protected]
Temporary flag used in various procedures that traverse the graphical model. For example, for logP it means "conditionally non-random".
Definition at line 574 of file RandomVar.h.
Referenced by setKnownValues(), PLearn::StochasticRandomVariable::setKnownValues(), PLearn::MixtureRandomVariable::setKnownValues(), PLearn::ExtendedRandomVariable::setValueFromParentsValue(), PLearn::RVArrayRandomElementRandomVariable::setValueFromParentsValue(), PLearn::NegRandomVariable::setValueFromParentsValue(), PLearn::SubVecRandomVariable::setValueFromParentsValue(), PLearn::RandomElementOfRandomVariable::setValueFromParentsValue(), PLearn::JointRandomVariable::setValueFromParentsValue(), PLearn::ProductRandomVariable::setValueFromParentsValue(), PLearn::LogRandomVariable::setValueFromParentsValue(), PLearn::ElementWiseDivisionRandomVariable::setValueFromParentsValue(), PLearn::ConcatColumnsRandomVariable::setValueFromParentsValue(), PLearn::ExpRandomVariable::setValueFromParentsValue(), PLearn::MinusRandomVariable::setValueFromParentsValue(), PLearn::PlusRandomVariable::setValueFromParentsValue(), unmarkAncestors(), and PLearn::MixtureRandomVariable::unmarkAncestors().
const RVArray PLearn::RandomVariable::parents
Other random variables, whose values give rise (stochastically or deterministically) to this value. This field basically defines the NETWORK of the model. Note that this oriented graphical model must not have cycles.
Definition at line 561 of file RandomVar.h.
Referenced by canStopEM(), clearEMmarks(), PLearn::ExtendedRandomVariable::EMBprop(), PLearn::SubVecRandomVariable::EMBprop(), PLearn::LogRandomVariable::EMBprop(), PLearn::ExpRandomVariable::EMBprop(), PLearn::NegRandomVariable::EMBprop(), PLearn::JointRandomVariable::EMBprop(), EMEpochInitialize(), EMTrainingInitialize(), EMUpdate(), PLearn::ExtendedRandomVariable::invertible(), PLearn::SubVecRandomVariable::invertible(), PLearn::JointRandomVariable::invertible(), PLearn::FunctionalRandomVariable::isDiscrete(), PLearn::FunctionalRandomVariable::isNonRandom(), PLearn::RVArrayRandomElementRandomVariable::logP(), PLearn::FunctionalRandomVariable::logP(), PLearn::StochasticRandomVariable::setKnownValues(), setKnownValues(), PLearn::ConcatColumnsRandomVariable::setValueFromParentsValue(), PLearn::ExtendedRandomVariable::setValueFromParentsValue(), PLearn::SubVecRandomVariable::setValueFromParentsValue(), PLearn::LogRandomVariable::setValueFromParentsValue(), PLearn::ExpRandomVariable::setValueFromParentsValue(), PLearn::NegRandomVariable::setValueFromParentsValue(), PLearn::RVArrayRandomElementRandomVariable::setValueFromParentsValue(), PLearn::JointRandomVariable::setValueFromParentsValue(), and unmarkAncestors().
bool PLearn::RandomVariable::pmark [protected]
yet another mark
Definition at line 577 of file RandomVar.h.
Referenced by setKnownValues(), PLearn::StochasticRandomVariable::setKnownValues(), PLearn::MixtureRandomVariable::setKnownValues(), unmarkAncestors(), and PLearn::MixtureRandomVariable::unmarkAncestors().
int PLearn::RandomVariable::rv_counter = 0 [static, private]
used to assign a value to rv_number when a new RV is constructed
Definition at line 551 of file RandomVar.h.
const int PLearn::RandomVariable::rv_number [protected]
rv_number is used to sort RVs in topological order, knowing that parents must be created before their descendants.
It takes the value of rv_counter upon construction.
Definition at line 558 of file RandomVar.h.
Var PLearn::RandomVariable::value
Used either when marked, non-random, or to assess P(This=value).
Definition at line 567 of file RandomVar.h.
Referenced by PLearn::MixtureRandomVariable::ElogP(), PLearn::MixtureRandomVariable::EMBprop(), PLearn::DiagonalNormalRandomVariable::EMBprop(), PLearn::MinusRandomVariable::EMBprop(), PLearn::PlusRandomVariable::EMBprop(), PLearn::MultinomialRandomVariable::EMUpdate(), PLearn::MixtureRandomVariable::EMUpdate(), PLearn::ProductRandomVariable::EMUpdate(), PLearn::ExtendedRandomVariable::invertible(), PLearn::ProductRandomVariable::invertible(), PLearn::MultinomialRandomVariable::logP(), PLearn::MixtureRandomVariable::logP(), PLearn::DiagonalNormalRandomVariable::logP(), PLearn::FunctionalRandomVariable::logP(), PLearn::ConcatColumnsRandomVariable::setValueFromParentsValue(), PLearn::ExtendedRandomVariable::setValueFromParentsValue(), PLearn::SubVecRandomVariable::setValueFromParentsValue(), PLearn::MultinomialRandomVariable::setValueFromParentsValue(), PLearn::MixtureRandomVariable::setValueFromParentsValue(), PLearn::DiagonalNormalRandomVariable::setValueFromParentsValue(), PLearn::ProductRandomVariable::setValueFromParentsValue(), PLearn::ElementWiseDivisionRandomVariable::setValueFromParentsValue(), PLearn::MinusRandomVariable::setValueFromParentsValue(), PLearn::PlusRandomVariable::setValueFromParentsValue(), PLearn::LogRandomVariable::setValueFromParentsValue(), PLearn::ExpRandomVariable::setValueFromParentsValue(), PLearn::NegRandomVariable::setValueFromParentsValue(), PLearn::RVArrayRandomElementRandomVariable::setValueFromParentsValue(), PLearn::RandomElementOfRandomVariable::setValueFromParentsValue(), and PLearn::JointRandomVariable::setValueFromParentsValue().