Package weka.classifiers.meta

  • Description
    AdaBoostM1: Class for boosting a nominal class classifier using the AdaBoost M1 method.
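The boosting loop behind AdaBoost.M1 is short enough to sketch in plain Python. This is an illustrative sketch of the algorithm only, not Weka's Java implementation; `base_learner` stands for any routine that fits a weak classifier to weighted instances:

```python
import math

def adaboost_m1(X, y, base_learner, rounds=10):
    """Return a list of (hypothesis, vote_weight) pairs."""
    n = len(X)
    w = [1.0 / n] * n                          # uniform instance weights
    ensemble = []
    for _ in range(rounds):
        h = base_learner(X, y, w)              # weak hypothesis on weighted data
        miss = [h(xi) != yi for xi, yi in zip(X, y)]
        err = sum(wi for wi, m in zip(w, miss) if m)
        if err >= 0.5:                         # M1 stopping rule
            break
        if err == 0:                           # perfect hypothesis: keep it, stop
            ensemble.append((h, 1.0))
            break
        beta = err / (1.0 - err)
        ensemble.append((h, math.log(1.0 / beta)))
        # down-weight correctly classified instances, then renormalize
        w = [wi * (beta if not m else 1.0) for wi, m in zip(w, miss)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def vote(ensemble, x):
    """Weighted-majority prediction over the ensemble."""
    tally = {}
    for h, alpha in ensemble:
        tally[h(x)] = tally.get(h(x), 0.0) + alpha
    return max(tally, key=tally.get)
```

With a weak learner such as a decision stump, instances the current ensemble gets wrong receive more weight each round, forcing later hypotheses to concentrate on them.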
    AdditiveRegression: Meta classifier that enhances the performance of a regression base classifier.
    AttributeSelectedClassifier: Dimensionality of training and test data is reduced by attribute selection before being passed on to a classifier.
    Bagging: Class for bagging a classifier to reduce variance.
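Bagging itself is a small amount of machinery around any base learner: draw bootstrap resamples, train one model per resample, and vote. A minimal sketch (illustrative only, not Weka's implementation; `base_learner` is any function that trains a model from instances and labels):

```python
import random
from collections import Counter

def bagging(X, y, base_learner, n_models=10, seed=1):
    """Train each ensemble member on a bootstrap resample of the data."""
    rng = random.Random(seed)
    n = len(X)
    models = []
    for _ in range(n_models):
        idx = [rng.randrange(n) for _ in range(n)]   # sample with replacement
        models.append(base_learner([X[i] for i in idx],
                                   [y[i] for i in idx]))
    return models

def bagged_predict(models, x):
    """Plurality vote over the members' predictions."""
    return Counter(m(x) for m in models).most_common(1)[0][0]
```

Because each member sees a perturbed copy of the training set, averaging their votes reduces the variance of an unstable base learner.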
    ClassificationViaClustering: A simple meta-classifier that uses a clusterer for classification.
    ClassificationViaRegression: Class for doing classification using regression methods.
    CostSensitiveClassifier: A metaclassifier that makes its base classifier cost-sensitive.
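One common way a meta classifier makes predictions cost-sensitive is to minimize expected misclassification cost instead of picking the most probable class. A sketch of that decision rule (illustrative only; the actual classes in this package also support reweighting the training data):

```python
def min_expected_cost(class_probs, cost_matrix):
    """Pick the class whose expected misclassification cost is lowest.

    cost_matrix[i][j] is the cost of predicting class j when the truth
    is class i; class_probs comes from the base classifier.
    """
    n = len(class_probs)
    expected = [sum(class_probs[i] * cost_matrix[i][j] for i in range(n))
                for j in range(n)]
    return min(range(n), key=expected.__getitem__)
```

Note that with a sufficiently asymmetric cost matrix the chosen class can differ from the most probable one.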
    CVParameterSelection: Class for performing parameter selection by cross-validation for any classifier.

    For more information, see:

    R. Kohavi: Wrappers for Performance Enhancement and Oblivious Decision Graphs. PhD thesis, Stanford University, 1995.
    Dagging: This meta classifier creates a number of disjoint, stratified folds out of the data and feeds each chunk of data to a copy of the supplied base classifier.
    Decorate: DECORATE is a meta-learner for building diverse ensembles of classifiers by using specially constructed artificial training examples.
    END: A meta classifier for handling multi-class datasets with 2-class classifiers by building an ensemble of nested dichotomies.

    For more information, see:

    Lin Dong, Eibe Frank, Stefan Kramer: Ensembles of Balanced Nested Dichotomies for Multi-class Problems.
    FilteredClassifier: Class for running an arbitrary classifier on data that has been passed through an arbitrary filter.
    Grading: Implements Grading.
    GridSearch: Performs a grid search over parameter pairs for a classifier (Y-axis, default is LinearRegression with the "Ridge" parameter) and the PLSFilter (X-axis, "# of Components"), and chooses the best pair found for the actual prediction.

    The initial grid is explored with 2-fold CV to determine the values of the parameter pairs for the selected type of evaluation (e.g., accuracy).
    LogitBoost: Class for performing additive logistic regression.
    MetaCost: This metaclassifier makes its base classifier cost-sensitive using the method specified in

    Pedro Domingos: MetaCost: A general method for making classifiers cost-sensitive.
    MultiBoostAB: Class for boosting a classifier using the MultiBoosting method.

    MultiBoosting is an extension to the highly successful AdaBoost technique for forming decision committees.
    MultiClassClassifier: A metaclassifier for handling multi-class datasets with 2-class classifiers.
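Several decompositions exist for reducing a multi-class problem to binary ones (one-vs-rest, pairwise, error-correcting output codes). The simplest, one-vs-rest, can be sketched as follows (illustrative only; `binary_learner` is assumed to return a confidence-scoring function for the positive class):

```python
def one_vs_rest_train(X, y, binary_learner):
    """Train one binary model per class: that class vs. everything else."""
    return {c: binary_learner(X, [1 if yi == c else 0 for yi in y])
            for c in sorted(set(y))}

def one_vs_rest_predict(models, x):
    """Pick the class whose binary model is most confident."""
    return max(models, key=lambda c: models[c](x))
```

A k-class problem thus needs only k binary models, at the price of training each one on an imbalanced relabeling of the full dataset.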
    MultiScheme: Class for selecting a classifier from among several using cross-validation on the training data or the classifiers' performance on the training data.
    OrdinalClassClassifier: Meta classifier that allows standard classification algorithms to be applied to ordinal class problems.

    For more information, see:

    Eibe Frank, Mark Hall: A Simple Approach to Ordinal Classification.
    RacedIncrementalLogitBoost: Classifier for incremental learning of large datasets by way of racing logit-boosted committees.

    For more information, see:

    Eibe Frank, Geoffrey Holmes, Richard Kirkby, Mark Hall: Racing committees for large datasets.
    RandomCommittee: Class for building an ensemble of randomizable base classifiers.
    RandomSubSpace: This method constructs a decision tree based classifier that maintains highest accuracy on training data and improves on generalization accuracy as it grows in complexity.
    RegressionByDiscretization: A regression scheme that employs any classifier on a copy of the data that has the class attribute (equal-width) discretized.
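The equal-width discretization step that turns a numeric class into a nominal one can be sketched as follows (illustrative only; after training, a predicted bin is mapped back to a number, here via the bin center):

```python
def equal_width_bins(values, n_bins):
    """Map numeric class values to equal-width bin indices and bin centers."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins
    def bin_of(v):
        i = int((v - lo) / width)
        return min(i, n_bins - 1)          # the top edge falls in the last bin
    centers = [lo + (i + 0.5) * width for i in range(n_bins)]
    return [bin_of(v) for v in values], centers
```

The classifier is then trained on the bin labels, and its nominal predictions are converted back to numeric outputs, so regression accuracy is bounded by the bin resolution.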
    RotationForest: Class for constructing a Rotation Forest.
    Stacking: Combines several classifiers using the stacking method.
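The heart of stacking is building level-1 training data from out-of-fold predictions of the base classifiers, so the meta-classifier never sees a base learner predicting its own training instances. A simplified sketch (illustrative only; it splits folds by index modulo k, whereas a real implementation would use stratified cross-validation):

```python
def stack_features(X, y, base_learners, k=3):
    """Build level-1 training data for a stacking meta-classifier:
    each instance is described by the out-of-fold predictions of
    every base learner."""
    n = len(X)
    meta = [[None] * len(base_learners) for _ in range(n)]
    for fold in range(k):
        test_idx = [i for i in range(n) if i % k == fold]
        train_idx = [i for i in range(n) if i % k != fold]
        X_tr = [X[i] for i in train_idx]
        y_tr = [y[i] for i in train_idx]
        for j, learn in enumerate(base_learners):
            model = learn(X_tr, y_tr)
            for i in test_idx:
                meta[i][j] = model(X[i])
    return meta  # the meta-classifier is then trained on (meta, y)
```

Training the meta-classifier on honest out-of-fold predictions, rather than resubstitution predictions, is what keeps stacking from simply memorizing the base learners' training-set behavior.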
    StackingC: Implements StackingC (more efficient version of stacking).

    For more information, see:

    A.K. Seewald: How to Make Stacking Better and Faster While Also Taking Care of an Unknown Weakness.
    ThresholdSelector: A metaclassifier that selects a mid-point threshold on the probability output by a Classifier.
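The underlying idea, scanning candidate probability thresholds and keeping the one that scores best, can be sketched as follows (illustrative only; `metric` is any caller-supplied scoring function, whereas the actual class has its own configurable measures and evaluation modes):

```python
def best_threshold(probs, labels, metric):
    """Scan candidate thresholds on P(positive); return the best-scoring one."""
    best_t, best_score = 0.5, float('-inf')
    for t in sorted(set(probs)):
        preds = [1 if p >= t else 0 for p in probs]
        score = metric(preds, labels)
        if score > best_score:
            best_t, best_score = t, score
    return best_t
```

This matters for base classifiers whose probability estimates are poorly calibrated: the default 0.5 cut-off is often not the one that optimizes the chosen measure.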
    Vote: Class for combining classifiers.