All downloads are free. Search and download functionality uses the official Maven repository.

Download JAR files tagged by machine with all dependencies


consistencySubsetEval from group nz.ac.waikato.cms.weka (version 1.0.4)

Evaluates the worth of a subset of attributes by the level of consistency in the class values when the training instances are projected onto the subset of attributes. The consistency of any subset can never be lower than that of the full set of attributes, hence the usual practice is to use this subset evaluator in conjunction with a Random or Exhaustive search which looks for the smallest subset with consistency equal to that of the full set of attributes. See: H. Liu, R. Setiono: A probabilistic approach to feature selection - A filter solution. In: 13th International Conference on Machine Learning, 319-327, 1996.
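
A minimal usage sketch in Java is shown below. It assumes the class name weka.attributeSelection.ConsistencySubsetEval provided by this package, a hypothetical input file data.arff with the class as the last attribute, and that an ExhaustiveSearch search method is available (in Weka 3.7+ the exhaustive and random search methods ship as a separate package).

  import weka.attributeSelection.AttributeSelection;
  import weka.attributeSelection.ConsistencySubsetEval;  // from this package (assumed class name)
  import weka.attributeSelection.ExhaustiveSearch;       // may require a separate search-method package
  import weka.core.Instances;
  import weka.core.converters.ConverterUtils.DataSource;

  public class ConsistencyExample {
    public static void main(String[] args) throws Exception {
      Instances data = new DataSource("data.arff").getDataSet();  // hypothetical input file
      data.setClassIndex(data.numAttributes() - 1);
      AttributeSelection selector = new AttributeSelection();
      selector.setEvaluator(new ConsistencySubsetEval());
      selector.setSearch(new ExhaustiveSearch());  // hunts for the smallest fully consistent subset
      selector.SelectAttributes(data);             // note the capitalised method name in Weka's API
      System.out.println(java.util.Arrays.toString(selector.selectedAttributes()));
    }
  }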

Group: nz.ac.waikato.cms.weka Artifact: consistencySubsetEval

1 download
Artifact consistencySubsetEval
Group nz.ac.waikato.cms.weka
Version 1.0.4
Last update 16. October 2014
Organization University of Waikato, Hamilton, NZ
URL http://weka.sourceforge.net/doc.packages/consistencySubsetEval
License GNU General Public License 3
Dependencies amount 1
Dependencies weka-dev
There may be transitive dependencies.

dagging from group nz.ac.waikato.cms.weka (version 1.0.3)

This meta classifier creates a number of disjoint, stratified folds out of the data and feeds each chunk of data to a copy of the supplied base classifier. Predictions are made via majority vote, since all the generated base classifiers are put into the Vote meta classifier. Useful for base classifiers that are quadratic or worse in time behavior with respect to the number of instances in the training data. For more information, see: Ting, K. M., Witten, I. H.: Stacking Bagged and Dagged Models. In: Fourteenth International Conference on Machine Learning, San Francisco, CA, 367-375, 1997.
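
As a rough illustration, the Java sketch below wraps an expensive base learner (SMO) in Dagging and cross-validates the result. It assumes the class name weka.classifiers.meta.Dagging from this package, an assumed setNumFolds option setter, and a hypothetical data.arff input.

  import weka.classifiers.Evaluation;
  import weka.classifiers.functions.SMO;
  import weka.classifiers.meta.Dagging;  // from this package (assumed class name)
  import weka.core.Instances;
  import weka.core.converters.ConverterUtils.DataSource;

  public class DaggingExample {
    public static void main(String[] args) throws Exception {
      Instances data = new DataSource("data.arff").getDataSet();  // hypothetical input file
      data.setClassIndex(data.numAttributes() - 1);
      Dagging dagging = new Dagging();
      dagging.setClassifier(new SMO());  // expensive base learner; each copy sees only one fold
      dagging.setNumFolds(10);           // number of disjoint, stratified folds (assumed setter name)
      Evaluation eval = new Evaluation(data);
      eval.crossValidateModel(dagging, data, 10, new java.util.Random(1));
      System.out.println(eval.toSummaryString());
    }
  }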

Group: nz.ac.waikato.cms.weka Artifact: dagging

2 downloads
Artifact dagging
Group nz.ac.waikato.cms.weka
Version 1.0.3
Last update 29. April 2014
Organization University of Waikato, Hamilton, NZ
URL http://weka.sourceforge.net/doc.packages/dagging
License GNU General Public License 3
Dependencies amount 1
Dependencies weka-dev
There may be transitive dependencies.

userClassifier from group nz.ac.waikato.cms.weka (version 1.0.3)

Interactively classify through visual means. You are presented with a scatter graph of the data against two user-selectable attributes, as well as a view of the decision tree. You can create binary splits by drawing polygons around data plotted on the scatter graph, as well as by allowing another classifier to take over at points in the decision tree should you see fit. For more information see: Malcolm Ware, Eibe Frank, Geoffrey Holmes, Mark Hall, Ian H. Witten (2001). Interactive machine learning: letting users build classifiers. Int. J. Hum.-Comput. Stud. 55(3):281-292.
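
The classifier is driven by its GUI, but it can also be launched from code. The sketch below is only an assumption-laden illustration: it assumes the class name weka.classifiers.trees.UserClassifier from this package, a hypothetical data.arff input, and that buildClassifier opens the interactive tree/scatter-plot window.

  import weka.classifiers.trees.UserClassifier;  // from this package (assumed class name)
  import weka.core.Instances;
  import weka.core.converters.ConverterUtils.DataSource;

  public class UserClassifierExample {
    public static void main(String[] args) throws Exception {
      Instances data = new DataSource("data.arff").getDataSet();  // hypothetical input file
      data.setClassIndex(data.numAttributes() - 1);
      UserClassifier uc = new UserClassifier();
      uc.buildClassifier(data);  // assumed to open the interactive window for drawing splits
      System.out.println(uc);    // prints the manually built tree once the session is finished
    }
  }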

Group: nz.ac.waikato.cms.weka Artifact: userClassifier

2 downloads
Artifact userClassifier
Group nz.ac.waikato.cms.weka
Version 1.0.3
Last update 25. April 2014
Organization University of Waikato, Hamilton, NZ
URL http://weka.sourceforge.net/doc.packages/userClassifier
License GNU General Public License 3
Dependencies amount 1
Dependencies weka-dev
There may be transitive dependencies.

averagedOneDependenceEstimators from group nz.ac.waikato.cms.weka (version 1.2.1)

AODE achieves highly accurate classification by averaging over all of a small space of alternative naive-Bayes-like models that have weaker (and hence less detrimental) independence assumptions than naive Bayes. The resulting algorithm is computationally efficient while delivering highly accurate classification on many learning tasks. For more information, see G. Webb, J. Boughton, Z. Wang (2005). Not So Naive Bayes: Aggregating One-Dependence Estimators. Machine Learning. 58(1):5-24.
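
AODE expects nominal attributes, so the hedged Java sketch below discretizes numeric attributes on the fly via a FilteredClassifier. It assumes the class name weka.classifiers.bayes.AODE from this package and a hypothetical data.arff input.

  import weka.classifiers.Evaluation;
  import weka.classifiers.bayes.AODE;  // from this package (assumed class name)
  import weka.classifiers.meta.FilteredClassifier;
  import weka.core.Instances;
  import weka.core.converters.ConverterUtils.DataSource;
  import weka.filters.supervised.attribute.Discretize;

  public class AodeExample {
    public static void main(String[] args) throws Exception {
      Instances data = new DataSource("data.arff").getDataSet();  // hypothetical input file
      data.setClassIndex(data.numAttributes() - 1);
      FilteredClassifier fc = new FilteredClassifier();
      fc.setFilter(new Discretize());   // AODE needs nominal attributes
      fc.setClassifier(new AODE());
      Evaluation eval = new Evaluation(data);
      eval.crossValidateModel(fc, data, 10, new java.util.Random(1));
      System.out.println(eval.toSummaryString());
    }
  }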

Group: nz.ac.waikato.cms.weka Artifact: averagedOneDependenceEstimators

0 downloads
Artifact averagedOneDependenceEstimators
Group nz.ac.waikato.cms.weka
Version 1.2.1
Last update 20. July 2012
Organization University of Waikato, Hamilton, NZ
URL http://weka.sourceforge.net/doc.packages/averagedOneDependenceEstimators
License GNU General Public License 3
Dependencies amount 1
Dependencies weka-dev
There may be transitive dependencies.

multiBoostAB from group nz.ac.waikato.cms.weka (version 1.0.2)

Class for boosting a classifier using the MultiBoosting method. MultiBoosting is an extension to the highly successful AdaBoost technique for forming decision committees. MultiBoosting can be viewed as combining AdaBoost with wagging. It is able to harness both AdaBoost's high bias and variance reduction with wagging's superior variance reduction. Using C4.5 as the base learning algorithm, MultiBoosting is demonstrated to produce decision committees with lower error than either AdaBoost or wagging significantly more often than the reverse over a large representative cross-section of UCI data sets. It offers the further advantage over AdaBoost of suiting parallel execution. For more information, see Geoffrey I. Webb (2000). MultiBoosting: A Technique for Combining Boosting and Wagging. Machine Learning. 40(2).
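
A hedged Java sketch of MultiBoosting over Weka's C4.5 implementation (J48), as in the paper, is given below. It assumes the class name weka.classifiers.meta.MultiBoostAB from this package and a hypothetical data.arff input.

  import weka.classifiers.Evaluation;
  import weka.classifiers.meta.MultiBoostAB;  // from this package (assumed class name)
  import weka.classifiers.trees.J48;          // Weka's C4.5 implementation
  import weka.core.Instances;
  import weka.core.converters.ConverterUtils.DataSource;

  public class MultiBoostExample {
    public static void main(String[] args) throws Exception {
      Instances data = new DataSource("data.arff").getDataSet();  // hypothetical input file
      data.setClassIndex(data.numAttributes() - 1);
      MultiBoostAB committee = new MultiBoostAB();
      committee.setClassifier(new J48());  // base learner for the decision committee
      Evaluation eval = new Evaluation(data);
      eval.crossValidateModel(committee, data, 10, new java.util.Random(1));
      System.out.println(eval.toSummaryString());
    }
  }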

Group: nz.ac.waikato.cms.weka Artifact: multiBoostAB

0 downloads
Artifact multiBoostAB
Group nz.ac.waikato.cms.weka
Version 1.0.2
Last update 26. April 2012
Organization University of Waikato, Hamilton, NZ
URL http://weka.sourceforge.net/doc.packages/multiBoostAB
License GNU General Public License 3
Dependencies amount 1
Dependencies weka-dev
There may be transitive dependencies.

lazyBayesianRules from group nz.ac.waikato.cms.weka (version 1.0.2)

Lazy Bayesian Rules Classifier. The naive Bayesian classifier provides a simple and effective approach to classifier learning, but its attribute independence assumption is often violated in the real world. Lazy Bayesian Rules selectively relaxes the independence assumption, achieving lower error rates over a range of learning tasks. LBR defers processing to classification time, making it a highly efficient and accurate classification algorithm when small numbers of objects are to be classified. For more information, see: Zijian Zheng, G. Webb (2000). Lazy Learning of Bayesian Rules. Machine Learning. 41(1):53-84.
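
A minimal, hedged Java sketch is below. It assumes the class name weka.classifiers.lazy.LBR from this package and a hypothetical data.arff input whose attributes are nominal (numeric attributes would typically be discretized first).

  import weka.classifiers.Evaluation;
  import weka.classifiers.lazy.LBR;  // from this package (assumed class name)
  import weka.core.Instances;
  import weka.core.converters.ConverterUtils.DataSource;

  public class LbrExample {
    public static void main(String[] args) throws Exception {
      Instances data = new DataSource("data.arff").getDataSet();  // hypothetical nominal-attribute data
      data.setClassIndex(data.numAttributes() - 1);
      Evaluation eval = new Evaluation(data);
      eval.crossValidateModel(new LBR(), data, 10, new java.util.Random(1));  // work is deferred to prediction time
      System.out.println(eval.toSummaryString());
    }
  }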

Group: nz.ac.waikato.cms.weka Artifact: lazyBayesianRules

0 downloads
Artifact lazyBayesianRules
Group nz.ac.waikato.cms.weka
Version 1.0.2
Last update 26. April 2012
Organization University of Waikato, Hamilton, NZ
URL http://weka.sourceforge.net/doc.packages/lazyBayesianRules
License GNU General Public License 3
Dependencies amount 1
Dependencies weka-dev
There may be transitive dependencies.

fuzzyLaticeReasoning from group nz.ac.waikato.cms.weka (version 1.0.2)

The Fuzzy Lattice Reasoning Classifier uses the notion of Fuzzy Lattices for creating a Reasoning Environment. The current version can be used for classification using numeric predictors. For more information see: I. N. Athanasiadis, V. G. Kaburlasos, P. A. Mitkas, V. Petridis: Applying Machine Learning Techniques on Air Quality Data for Real-Time Decision Support. In: 1st Intl. NAISO Symposium on Information Technologies in Environmental Engineering (ITEE-2003), Gdansk, Poland, 2003; V. G. Kaburlasos, I. N. Athanasiadis, P. A. Mitkas, V. Petridis (2003). Fuzzy Lattice Reasoning (FLR) Classifier and its Application on Improved Estimation of Ambient Ozone Concentration.
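
A hedged Java sketch is shown below. It assumes the class name weka.classifiers.misc.FLR from this package and a hypothetical data.arff input whose predictors are numeric, as the description requires.

  import weka.classifiers.Evaluation;
  import weka.classifiers.misc.FLR;  // from this package (assumed class name)
  import weka.core.Instances;
  import weka.core.converters.ConverterUtils.DataSource;

  public class FlrExample {
    public static void main(String[] args) throws Exception {
      Instances data = new DataSource("data.arff").getDataSet();  // hypothetical data with numeric predictors
      data.setClassIndex(data.numAttributes() - 1);
      Evaluation eval = new Evaluation(data);
      eval.crossValidateModel(new FLR(), data, 10, new java.util.Random(1));
      System.out.println(eval.toSummaryString());
    }
  }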

Group: nz.ac.waikato.cms.weka Artifact: fuzzyLaticeReasoning

0 downloads
Artifact fuzzyLaticeReasoning
Group nz.ac.waikato.cms.weka
Version 1.0.2
Last update 26. April 2012
Organization University of Waikato, Hamilton, NZ
URL http://weka.sourceforge.net/doc.packages/fuzzyLaticeReasoning
License GNU General Public License 3
Dependencies amount 1
Dependencies weka-dev
There may be transitive dependencies.

plexus-compiler-javafxc from group net.sf.m2-javafxc (version 0.3)

This component may be plugged into the standard Maven compile plugin to compile JavaFX (http://javafx.com/) sources. The component assumes that JavaFX SDK 1.2+ is installed on the machine where the build process is run, and that the environment variable JFX_HOME points to the JavaFX installation directory (typically /usr/share/javafx-sdk1.2 on Linux machines). Version 0.3 is the current one and is stable. Version 0.2 has a defect and compiles only basic code samples; in all other cases it simply fails.

Group: net.sf.m2-javafxc Artifact: plexus-compiler-javafxc

0 downloads
Artifact plexus-compiler-javafxc
Group net.sf.m2-javafxc
Version 0.3
Last update 21. July 2009
Organization not specified
URL http://m2-javafxc.sourceforge.net/
License The Apache Software License, Version 2.0
Dependencies amount 2
Dependencies plexus-utils, plexus-compiler-api
There may be transitive dependencies.

jburg from group net.sourceforge.jburg (version 1.10.3)

A bottom-up rewrite machine is a compiler construction tool that is often used in the compiler's back end to convert a tree-structured representation of a program into machine code -- or, in Java's case, bytecode. JBurg can also be used as a general-purpose dynamic programming engine. JBurg is descended from iburg-class BURGs, described in Fraser, Hanson, and Proebsting's paper, "Engineering a Simple, Efficient Code Generator Generator." JBurg brings similar O(N) minimum-cost tree rewriting capabilities to Java, and also allows the programmer to specify transitions between non-terminal states that are significantly more powerful than iburg's transitive closures: JBurg transformation rules allow the transformation to inject additional program logic, which makes a JBurg specification more like a grammar than a list of pattern-matching rules.
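
The toy Java sketch below is not JBurg's API; it only illustrates, with made-up rules and costs, the bottom-up minimum-cost labeling idea behind a rewrite machine: every node is labeled after its children, so a single O(N) pass finds the cheapest rule covering each subtree.

  import java.util.*;

  // Illustrative only: a toy bottom-up cost-labeling pass, not JBurg's actual API or rule format.
  public class ToyBurm {
    static class Node {
      String op; List<Node> kids; int bestCost; String bestRule;
      Node(String op, Node... kids) { this.op = op; this.kids = Arrays.asList(kids); }
    }

    // Label each node bottom-up with the cheapest matching (hypothetical) rule.
    static void label(Node n) {
      for (Node k : n.kids) label(k);
      int kidCost = n.kids.stream().mapToInt(k -> k.bestCost).sum();
      switch (n.op) {
        case "CONST": n.bestCost = 1;           n.bestRule = "loadImmediate"; break;
        case "ADD":   n.bestCost = kidCost + 1; n.bestRule = "addRegisters";  break;
        case "MUL":   n.bestCost = kidCost + 3; n.bestRule = "mulRegisters";  break;
        default: throw new IllegalArgumentException("no rule matches " + n.op);
      }
    }

    public static void main(String[] args) {
      Node expr = new Node("ADD", new Node("CONST"),
                           new Node("MUL", new Node("CONST"), new Node("CONST")));
      label(expr);
      System.out.println("cheapest cover costs " + expr.bestCost + " via " + expr.bestRule);
    }
  }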

Group: net.sourceforge.jburg Artifact: jburg

0 downloads
Artifact jburg
Group net.sourceforge.jburg
Version 1.10.3
Last update 24. February 2016
Organization not specified
URL http://jburg.sourceforge.net/
License Common Public License Version 1.0
Dependencies amount 0
Dependencies none
There may be transitive dependencies.

oneClassClassifier from group nz.ac.waikato.cms.weka (version 1.0.4)

Performs one-class classification on a dataset. The classifier reduces the class being classified to just a single class and learns the data without using any information from other classes. The testing stage will classify instances as 'target' or 'outlier', so in order to calculate the outlier pass rate the dataset must contain information from more than one class. Also, the output varies depending on whether the label 'outlier' exists in the instances used to build the classifier. If so, then 'outlier' will be predicted; if not, then the label will be considered missing when the prediction does not favour the target class. The 'outlier' class will not be used to build the model if there are instances of this class in the dataset. It can simply be used as a flag; you do not need to relabel any classes. For more information, see: Kathryn Hempstalk, Eibe Frank, Ian H. Witten: One-Class Classification by Combining Density and Class Probability Estimation. In: Proceedings of the 12th European Conference on Principles and Practice of Knowledge Discovery in Databases and 19th European Conference on Machine Learning, ECMLPKDD2008, Berlin, 505-519, 2008.
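
The Java sketch below is a hedged illustration. It assumes the class name weka.classifiers.meta.OneClassClassifier from this package, an assumed setTargetClassLabel setter for naming the target class, and a hypothetical data.arff whose class attribute contains a 'target' label.

  import weka.classifiers.Evaluation;
  import weka.classifiers.meta.OneClassClassifier;  // from this package (assumed class name)
  import weka.core.Instances;
  import weka.core.converters.ConverterUtils.DataSource;

  public class OneClassExample {
    public static void main(String[] args) throws Exception {
      Instances data = new DataSource("data.arff").getDataSet();  // hypothetical input file
      data.setClassIndex(data.numAttributes() - 1);
      OneClassClassifier occ = new OneClassClassifier();
      occ.setTargetClassLabel("target");  // label of the class to learn (assumed setter name)
      Evaluation eval = new Evaluation(data);
      eval.crossValidateModel(occ, data, 10, new java.util.Random(1));
      System.out.println(eval.toSummaryString());
    }
  }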

Group: nz.ac.waikato.cms.weka Artifact: oneClassClassifier

3 downloads
Artifact oneClassClassifier
Group nz.ac.waikato.cms.weka
Version 1.0.4
Last update 14. May 2013
Organization University of Waikato, Hamilton, NZ
URL http://weka.sourceforge.net/doc.packages/oneClassClassifier
License GNU General Public License 3
Dependencies amount 1
Dependencies weka-dev
There may be transitive dependencies.


