 * http://cogcomp.cs.illinois.edu/
 */
/**
 * <p>For sparse learners, it is often the case that the array of features you learn
 * contains only a subset of useful features. When we leave these features in the lexicon,
 * we end up with bloated lexicons and weight vectors. This leads to larger than necessary
 * models.</p>
 *
 * <p>This package contains an interface that defines the life cycle for the feature pruning
 * process, as well as some implementations: one that takes multiple weight vectors (for
 * multi-class network learners), and others that take only one weight vector.</p>
 *
 * <p>All optimizers should subclass {@link LexiconOptimizer}, which implements most of the
 * optimization. Subclasses will need to provide methods to compute the weight value to compare
 * against the threshold, a method to identify the useless features, and a method to prune
 * those features.</p>
 *
 * <p>The optimizers are invoked by the {@link edu.illinois.cs.cogcomp.lbjava.learn.Learner#doneTraining}
 * method of the Learner class when all learning is complete. For those learners that include a feature
 * pruning implementation, they must override this method to invoke the optimizer. In this way, during the
 * normal LBJava compile and model build cycle, the optimization is performed automatically. Those
 * who have built their own training procedure are required to invoke the doneTraining and
 * {@link edu.illinois.cs.cogcomp.lbjava.learn.Learner#startTraining} methods at appropriate points during
 * their training process.</p>
 *
 * <p>The pruning threshold value is provided by the specific learner, and should be parameterized
 * in one way or another. The learner classes typically have a parameter that can be set to change
 * the default feature pruning threshold to any value the user might choose, or set to 0.0 to
 * disable pruning entirely.</p>
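 *
 * <p>As an illustration only (the names below are hypothetical, not the actual
 * {@code LexiconOptimizer} API), threshold-based pruning of a single weight vector
 * amounts to dropping every feature whose absolute weight falls below the threshold:</p>
 * <pre>{@code
 * // Hypothetical sketch: keep only features whose |weight| >= threshold.
 * double threshold = 0.05;
 * double[] weights = {0.2, 0.001, -0.3, 0.0, 0.04};
 * List<Integer> keep = new ArrayList<>();
 * for (int i = 0; i < weights.length; i++)
 *     if (Math.abs(weights[i]) >= threshold)
 *         keep.add(i);
 * // keep now holds the indices of surviving features: [0, 2]
 * }</pre>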
 * @author redman
 */
package edu.illinois.cs.cogcomp.lbjava.learn.featurepruning;