
racedIncrementalLogitBoost from group nz.ac.waikato.cms.weka (version 1.0.2)

Classifier for incremental learning of large datasets by way of racing logit-boosted committees. For more information see: Eibe Frank, Geoffrey Holmes, Richard Kirkby, Mark Hall: Racing committees for large datasets. In: Proceedings of the 5th International Conference on Discovery Science, 153-164, 2002.

Group: nz.ac.waikato.cms.weka Artifact: racedIncrementalLogitBoost
Artifact: racedIncrementalLogitBoost
Group: nz.ac.waikato.cms.weka
Version: 1.0.2
Last update: 26 April 2012
Organization: University of Waikato, Hamilton, NZ
URL: http://weka.sourceforge.net/doc.packages/racedIncrementalLogitBoost
License: GNU General Public License 3
Dependencies (1): weka-dev
There may be transitive dependencies.
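
A minimal usage sketch is shown below. It assumes the package provides the class weka.classifiers.meta.RacedIncrementalLogitBoost and that a file named large-dataset.arff (an illustrative name) with a nominal class attribute is available.

    import java.util.Random;

    import weka.classifiers.Evaluation;
    import weka.classifiers.meta.RacedIncrementalLogitBoost;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class RacedLogitBoostDemo {
        public static void main(String[] args) throws Exception {
            // Load the data; the last attribute is assumed to be the class.
            Instances data = DataSource.read("large-dataset.arff");
            data.setClassIndex(data.numAttributes() - 1);

            // Racing committee of logit-boosted classifiers, default options.
            RacedIncrementalLogitBoost cls = new RacedIncrementalLogitBoost();

            // Estimate performance with 10-fold cross-validation.
            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(cls, data, 10, new Random(1));
            System.out.println(eval.toSummaryString());
        }
    }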

raceSearch from group nz.ac.waikato.cms.weka (version 1.0.2)

Races the cross-validation error of competing attribute subsets. Use in conjunction with a ClassifierSubsetEval. RaceSearch has four modes: forward selection races all single-attribute additions to a base set (initially no attributes), selects the winner to become the new base set, and then iterates until there is no improvement over the base set. Backward elimination is similar, but the initial base set has all attributes included and races all single-attribute deletions. Schemata search is a bit different: in each iteration a series of races is run in parallel. Each race in a set determines whether a particular attribute should be included or not, i.e. the race is between the attribute being "in" or "out". The other attributes for this race are included or excluded randomly at each point in the evaluation. As soon as one race has a clear winner (i.e. it has been decided whether a particular attribute should be in or not), the next set of races begins, using the result of the winning race from the previous iteration as the new base set. Rank race first ranks the attributes using an attribute evaluator and then races the ranking. The race includes no attributes, the top ranked attribute, the top two attributes, the top three attributes, and so on.

It is also possible to generate a ranked list of attributes through the forward racing process. If generateRanking is set to true, a complete forward race is run, that is, racing continues until all attributes have been selected. The order in which they are added determines a complete ranking of all the attributes.

Racing uses paired and unpaired t-tests on cross-validation errors of competing subsets. When there is a significant difference between the mean errors of two competing subsets, the poorer of the two can be eliminated from the race. Similarly, if there is no significant difference between the mean errors of two competing subsets and they are within some threshold of each other, then one can be eliminated from the race.

Group: nz.ac.waikato.cms.weka Artifact: raceSearch
Artifact: raceSearch
Group: nz.ac.waikato.cms.weka
Version: 1.0.2
Last update: 26 April 2012
Organization: University of Waikato, Hamilton, NZ
URL: http://weka.sourceforge.net/doc.packages/raceSearch
License: GNU General Public License 3
Dependencies (2): weka-dev, classifierBasedAttributeSelection
There may be transitive dependencies.
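
A sketch of using RaceSearch together with a ClassifierSubsetEval, as suggested above; the class names weka.attributeSelection.RaceSearch and weka.attributeSelection.ClassifierSubsetEval, the base classifier, and the file name dataset.arff are illustrative assumptions.

    import weka.attributeSelection.AttributeSelection;
    import weka.attributeSelection.ClassifierSubsetEval;
    import weka.attributeSelection.RaceSearch;
    import weka.classifiers.bayes.NaiveBayes;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class RaceSearchDemo {
        public static void main(String[] args) throws Exception {
            Instances data = DataSource.read("dataset.arff");
            data.setClassIndex(data.numAttributes() - 1);

            // Classifier-based subset evaluator that scores candidate subsets.
            ClassifierSubsetEval evaluator = new ClassifierSubsetEval();
            evaluator.setClassifier(new NaiveBayes());

            // Default racing mode; the other modes described above are
            // selected via the scheme's options.
            RaceSearch search = new RaceSearch();

            AttributeSelection selector = new AttributeSelection();
            selector.setEvaluator(evaluator);
            selector.setSearch(search);
            selector.SelectAttributes(data);
            System.out.println(selector.toResultsString());
        }
    }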

paceRegression from group nz.ac.waikato.cms.weka (version 1.0.2)

Class for building pace regression linear models and using them for prediction. Under regularity conditions, pace regression is provably optimal when the number of coefficients tends to infinity. It consists of a group of estimators that are either overall optimal or optimal under certain conditions. The current work on pace regression theory, and therefore also this implementation, does not handle:
- missing values
- non-binary nominal attributes
- the case that n - k is small, where n is the number of instances and k is the number of coefficients (the threshold used in this implementation is 20)
For more information see: Wang, Y. (2000). A new approach to fitting linear models in high dimensional spaces. Hamilton, New Zealand. Wang, Y., Witten, I. H.: Modeling for optimal probability prediction. In: Proceedings of the Nineteenth International Conference on Machine Learning, Sydney, Australia, 650-657, 2002.

Group: nz.ac.waikato.cms.weka Artifact: paceRegression
Artifact: paceRegression
Group: nz.ac.waikato.cms.weka
Version: 1.0.2
Last update: 26 April 2012
Organization: University of Waikato, Hamilton, NZ
URL: http://weka.sourceforge.net/doc.packages/paceRegression
License: GNU General Public License 3
Dependencies (1): weka-dev
There may be transitive dependencies.
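
A minimal sketch, assuming the class weka.classifiers.functions.PaceRegression and an illustrative numeric-class file housing.arff that satisfies the restrictions listed above.

    import weka.classifiers.functions.PaceRegression;
    import weka.core.Instance;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class PaceRegressionDemo {
        public static void main(String[] args) throws Exception {
            // The data must contain no missing values, no non-binary nominal
            // attributes, and clearly more instances than coefficients.
            Instances data = DataSource.read("housing.arff");
            data.setClassIndex(data.numAttributes() - 1);

            PaceRegression model = new PaceRegression();
            model.buildClassifier(data);
            System.out.println(model);  // prints the fitted linear model

            // Predict the numeric class value of the first instance.
            Instance first = data.instance(0);
            System.out.println("Prediction: " + model.classifyInstance(first));
        }
    }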

naiveBayesTree from group nz.ac.waikato.cms.weka (version 1.0.2)

Class for generating a decision tree with naive Bayes classifiers at the leaves. For more information, see Ron Kohavi: Scaling Up the Accuracy of Naive-Bayes Classifiers: A Decision-Tree Hybrid. In: Second International Conference on Knowledge Discovery and Data Mining, 202-207, 1996.

Group: nz.ac.waikato.cms.weka Artifact: naiveBayesTree
Artifact: naiveBayesTree
Group: nz.ac.waikato.cms.weka
Version: 1.0.2
Last update: 26 April 2012
Organization: University of Waikato, Hamilton, NZ
URL: http://weka.sourceforge.net/doc.packages/naiveBayesTree
License: GNU General Public License 3
Dependencies (1): weka-dev
There may be transitive dependencies.
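
A short sketch, assuming the package installs the class weka.classifiers.trees.NBTree (the ARFF file name is illustrative).

    import weka.classifiers.trees.NBTree;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class NBTreeDemo {
        public static void main(String[] args) throws Exception {
            Instances data = DataSource.read("dataset.arff");
            data.setClassIndex(data.numAttributes() - 1);

            // Decision tree whose leaves hold naive Bayes models.
            NBTree tree = new NBTree();
            tree.buildClassifier(data);

            // The textual model shows the tree structure and the naive Bayes
            // distributions at each leaf.
            System.out.println(tree);
        }
    }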

multiBoostAB from group nz.ac.waikato.cms.weka (version 1.0.2)

Class for boosting a classifier using the MultiBoosting method. MultiBoosting is an extension to the highly successful AdaBoost technique for forming decision committees. MultiBoosting can be viewed as combining AdaBoost with wagging. It is able to combine AdaBoost's high bias and variance reduction with wagging's superior variance reduction. Using C4.5 as the base learning algorithm, MultiBoosting is demonstrated to produce decision committees with lower error than either AdaBoost or wagging significantly more often than the reverse over a large representative cross-section of UCI data sets. It offers the further advantage over AdaBoost of suiting parallel execution. For more information, see Geoffrey I. Webb (2000). MultiBoosting: A Technique for Combining Boosting and Wagging. Machine Learning, 40(2).

Group: nz.ac.waikato.cms.weka Artifact: multiBoostAB
Artifact: multiBoostAB
Group: nz.ac.waikato.cms.weka
Version: 1.0.2
Last update: 26 April 2012
Organization: University of Waikato, Hamilton, NZ
URL: http://weka.sourceforge.net/doc.packages/multiBoostAB
License: GNU General Public License 3
Dependencies (1): weka-dev
There may be transitive dependencies.
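
The sketch below builds a MultiBoosting committee over C4.5 (J48 in Weka), mirroring the setup described above; the class name weka.classifiers.meta.MultiBoostAB, the iteration count, and the file name dataset.arff are illustrative assumptions.

    import java.util.Random;

    import weka.classifiers.Evaluation;
    import weka.classifiers.meta.MultiBoostAB;
    import weka.classifiers.trees.J48;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class MultiBoostDemo {
        public static void main(String[] args) throws Exception {
            Instances data = DataSource.read("dataset.arff");
            data.setClassIndex(data.numAttributes() - 1);

            // MultiBoosting committee with C4.5 (J48) as the base learner.
            MultiBoostAB committee = new MultiBoostAB();
            committee.setClassifier(new J48());
            committee.setNumIterations(10);  // number of committee members

            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(committee, data, 10, new Random(1));
            System.out.println(eval.toSummaryString());
        }
    }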

leastMedSquared from group nz.ac.waikato.cms.weka (version 1.0.2)

Implements least median squared linear regression, utilizing the existing Weka LinearRegression class to form predictions. Least-squares regression functions are generated from random subsamples of the data, and the least-squares regression with the lowest median squared error is chosen as the final model. The basis of the algorithm is Peter J. Rousseeuw, Annick M. Leroy (1987). Robust Regression and Outlier Detection.

Group: nz.ac.waikato.cms.weka Artifact: leastMedSquared
Artifact: leastMedSquared
Group: nz.ac.waikato.cms.weka
Version: 1.0.2
Last update: 26 April 2012
Organization: University of Waikato, Hamilton, NZ
URL: http://weka.sourceforge.net/doc.packages/leastMedSquared
License: GNU General Public License 3
Dependencies (1): weka-dev
There may be transitive dependencies.
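
A minimal sketch, assuming the class weka.classifiers.functions.LeastMedSq and an illustrative numeric-class file regression-data.arff; default options are kept.

    import weka.classifiers.functions.LeastMedSq;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class LeastMedSqDemo {
        public static void main(String[] args) throws Exception {
            Instances data = DataSource.read("regression-data.arff");
            data.setClassIndex(data.numAttributes() - 1);

            // Fits least-squares models on random subsamples and keeps the one
            // with the lowest median squared residual, giving robustness to outliers.
            LeastMedSq model = new LeastMedSq();
            model.buildClassifier(data);
            System.out.println(model);
        }
    }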

isotonicRegression from group nz.ac.waikato.cms.weka (version 1.0.2)

Learns an isotonic regression model. Picks the attribute that results in the lowest squared error. Missing values are not allowed. Can only deal with numeric attributes. Considers the monotonically increasing case as well as the monotonically decreasing case.

Group: nz.ac.waikato.cms.weka Artifact: isotonicRegression
Artifact: isotonicRegression
Group: nz.ac.waikato.cms.weka
Version: 1.0.2
Last update: 26 April 2012
Organization: University of Waikato, Hamilton, NZ
URL: http://weka.sourceforge.net/doc.packages/isotonicRegression
License: GNU General Public License 3
Dependencies (1): weka-dev
There may be transitive dependencies.
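
A minimal sketch, assuming the class weka.classifiers.functions.IsotonicRegression and an all-numeric data set with no missing values (the file name is illustrative).

    import weka.classifiers.functions.IsotonicRegression;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class IsotonicRegressionDemo {
        public static void main(String[] args) throws Exception {
            Instances data = DataSource.read("numeric-data.arff");
            data.setClassIndex(data.numAttributes() - 1);

            IsotonicRegression iso = new IsotonicRegression();
            iso.buildClassifier(data);

            // The model description shows which single attribute was picked and
            // the learned monotonic step function.
            System.out.println(iso);
        }
    }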

hyperPipes from group nz.ac.waikato.cms.weka (version 1.0.2)

Class implementing a HyperPipe classifier. For each category a HyperPipe is constructed that contains all points of that category (essentially recording the attribute bounds observed for each category). Test instances are classified according to the category that "most contains the instance". Does not handle a numeric class or missing values in test cases. An extremely simple algorithm, but it has the advantage of being extremely fast and works quite well when you have "smegloads" of attributes.

Group: nz.ac.waikato.cms.weka Artifact: hyperPipes
Artifact: hyperPipes
Group: nz.ac.waikato.cms.weka
Version: 1.0.2
Last update: 26 April 2012
Organization: University of Waikato, Hamilton, NZ
URL: http://weka.sourceforge.net/doc.packages/hyperPipes
License: GNU General Public License 3
Dependencies (1): weka-dev
There may be transitive dependencies.
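
A sketch of training and testing HyperPipes on a simple split, assuming the class weka.classifiers.misc.HyperPipes (the ARFF file name and split ratio are illustrative).

    import java.util.Random;

    import weka.classifiers.Evaluation;
    import weka.classifiers.misc.HyperPipes;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class HyperPipesDemo {
        public static void main(String[] args) throws Exception {
            Instances data = DataSource.read("dataset.arff");
            data.setClassIndex(data.numAttributes() - 1);
            data.randomize(new Random(1));

            // Simple two-thirds train, one-third test split.
            int trainSize = (int) (data.numInstances() * 0.66);
            Instances train = new Instances(data, 0, trainSize);
            Instances test = new Instances(data, trainSize, data.numInstances() - trainSize);

            // Each class gets a "pipe" recording the attribute bounds seen for it.
            HyperPipes hp = new HyperPipes();
            hp.buildClassifier(train);

            Evaluation eval = new Evaluation(train);
            eval.evaluateModel(hp, test);
            System.out.println(eval.toSummaryString());
        }
    }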

grading from group nz.ac.waikato.cms.weka (version 1.0.2)

Implements Grading. The base classifiers are "graded". For more information, see A.K. Seewald, J. Fuernkranz: An Evaluation of Grading Classifiers. In: Advances in Intelligent Data Analysis: 4th International Conference, Berlin/Heidelberg/New York/Tokyo, 115-124, 2001.

Group: nz.ac.waikato.cms.weka Artifact: grading
Artifact: grading
Group: nz.ac.waikato.cms.weka
Version: 1.0.2
Last update: 26 April 2012
Organization: University of Waikato, Hamilton, NZ
URL: http://weka.sourceforge.net/doc.packages/grading
License: GNU General Public License 3
Dependencies (1): weka-dev
There may be transitive dependencies.
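
A sketch of grading three base classifiers, assuming the class weka.classifiers.meta.Grading keeps the usual Stacking-style setters (setClassifiers, setMetaClassifier); the base learners, meta learner, and file name are illustrative choices.

    import java.util.Random;

    import weka.classifiers.Classifier;
    import weka.classifiers.Evaluation;
    import weka.classifiers.bayes.NaiveBayes;
    import weka.classifiers.lazy.IBk;
    import weka.classifiers.meta.Grading;
    import weka.classifiers.trees.J48;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class GradingDemo {
        public static void main(String[] args) throws Exception {
            Instances data = DataSource.read("dataset.arff");
            data.setClassIndex(data.numAttributes() - 1);

            Grading grading = new Grading();
            // Base classifiers whose predictions will be "graded".
            grading.setClassifiers(new Classifier[] { new NaiveBayes(), new J48(), new IBk() });
            // Meta classifier that learns whether each base prediction is correct.
            grading.setMetaClassifier(new J48());

            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(grading, data, 10, new Random(1));
            System.out.println(eval.toSummaryString());
        }
    }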

filteredAttributeSelection from group nz.ac.waikato.cms.weka (version 1.0.2)

This package provides two meta attribute selection evaluators that can apply an arbitrary filter to the input data before executing the actual attribute selection scheme. One filters data and then passes it to an attribute evaluator (FilteredAttributeEval), and the other filters data and then passes it to a subset evaluator (FilteredSubsetEval).

Group: nz.ac.waikato.cms.weka Artifact: filteredAttributeSelection
Artifact: filteredAttributeSelection
Group: nz.ac.waikato.cms.weka
Version: 1.0.2
Last update: 26 April 2012
Organization: University of Waikato, Hamilton, NZ
URL: http://weka.sourceforge.net/doc.packages/filteredAttributeSelection
License: GNU General Public License 3
Dependencies (1): weka-dev
There may be transitive dependencies.
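
A sketch of the FilteredAttributeEval variant, assuming it exposes setFilter and setAttributeEvaluator setters analogous to FilteredClassifier; the base evaluator, filter, and file name are illustrative choices.

    import weka.attributeSelection.AttributeSelection;
    import weka.attributeSelection.FilteredAttributeEval;
    import weka.attributeSelection.InfoGainAttributeEval;
    import weka.attributeSelection.Ranker;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;
    import weka.filters.unsupervised.attribute.ReplaceMissingValues;

    public class FilteredAttributeSelectionDemo {
        public static void main(String[] args) throws Exception {
            Instances data = DataSource.read("dataset.arff");
            data.setClassIndex(data.numAttributes() - 1);

            // Filter the data before handing it to the base attribute evaluator.
            FilteredAttributeEval evaluator = new FilteredAttributeEval();
            evaluator.setFilter(new ReplaceMissingValues());
            evaluator.setAttributeEvaluator(new InfoGainAttributeEval());

            AttributeSelection selector = new AttributeSelection();
            selector.setEvaluator(evaluator);
            selector.setSearch(new Ranker());  // rank all attributes by merit
            selector.SelectAttributes(data);
            System.out.println(selector.toResultsString());
        }
    }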


