com.pulumi.azurenative.machinelearningservices.enums.RegressionModels Maven / Gradle / Ivy
A native Pulumi package for creating and managing Azure resources.
// *** WARNING: this file was generated by pulumi-java-gen. ***
// *** Do not edit by hand unless you're certain you know what you are doing! ***
package com.pulumi.azurenative.machinelearningservices.enums;
import com.pulumi.core.annotations.EnumType;
import java.lang.String;
import java.util.Objects;
import java.util.StringJoiner;
/**
* Enum for all Regression models supported by AutoML.
*
*/
@EnumType
public enum RegressionModels {
/**
 * Elastic net is a popular type of regularized linear regression that combines two penalties, the L1 and L2 penalty functions.
*
*/
ElasticNet("ElasticNet"),
/**
 * The technique of combining weak learners into a strong learner is called boosting. The gradient boosting algorithm builds on this idea.
*
*/
GradientBoosting("GradientBoosting"),
/**
* Decision Trees are a non-parametric supervised learning method used for both classification and regression tasks.
* The goal is to create a model that predicts the value of a target variable by learning simple decision rules inferred from the data features.
*
*/
DecisionTree("DecisionTree"),
/**
 * The k-nearest neighbors (KNN) algorithm uses feature similarity to predict the values of new data points:
 * a new data point is assigned a value based on how closely it matches the points in the training set.
*
*/
KNN("KNN"),
/**
 * Lasso model fit with Least Angle Regression, a.k.a. LARS. It is a linear model trained with an L1 prior as regularizer.
*
*/
LassoLars("LassoLars"),
/**
 * SGD: Stochastic gradient descent is an optimization algorithm often used in machine learning
 * to find the model parameters that give the best fit between predicted and actual outputs.
 * It is an inexact but powerful technique.
*
*/
SGD("SGD"),
/**
 * Random forest is a supervised learning algorithm.
 * The "forest" it builds is an ensemble of decision trees, usually trained with the "bagging" method.
 * The general idea of bagging is that combining learning models improves the overall result.
*
*/
RandomForest("RandomForest"),
/**
 * Extremely randomized trees (extra-trees) is an ensemble machine learning algorithm that combines the predictions of many decision trees. It is closely related to the widely used random forest algorithm.
*
*/
ExtremeRandomTrees("ExtremeRandomTrees"),
/**
* LightGBM is a gradient boosting framework that uses tree based learning algorithms.
*
*/
LightGBM("LightGBM"),
/**
 * XGBoostRegressor: Extreme Gradient Boosting Regressor is a supervised machine learning model that uses an ensemble of base learners.
*
*/
XGBoostRegressor("XGBoostRegressor");
private final String value;
RegressionModels(String value) {
this.value = Objects.requireNonNull(value);
}
@EnumType.Converter
public String getValue() {
return this.value;
}
@Override
public java.lang.String toString() {
return new StringJoiner(", ", "RegressionModels[", "]")
.add("value='" + this.value + "'")
.toString();
}
}
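The pattern used by this generated enum, a string-backed Java enum whose constants wrap the exact string values the Azure API expects, can be sketched in plain Java without the Pulumi `@EnumType` machinery. The enum name `ModelName` and the two sample constants below are illustrative stand-ins, not part of the Pulumi SDK:

```java
import java.util.Objects;
import java.util.StringJoiner;

// Minimal stand-alone mirror of the generated pattern: each constant
// carries the literal string sent over the wire, and getValue() exposes it.
enum ModelName {
    ElasticNet("ElasticNet"),
    LightGBM("LightGBM");

    private final String value;

    ModelName(String value) {
        // Reject null at construction time, as the generated code does.
        this.value = Objects.requireNonNull(value);
    }

    public String getValue() {
        return this.value;
    }

    @Override
    public String toString() {
        // Same debug-friendly format as the generated toString().
        return new StringJoiner(", ", "ModelName[", "]")
                .add("value='" + this.value + "'")
                .toString();
    }
}

public class Demo {
    public static void main(String[] args) {
        System.out.println(ModelName.LightGBM.getValue()); // prints "LightGBM"
        System.out.println(ModelName.ElasticNet);          // prints "ModelName[value='ElasticNet']"
    }
}
```

In the real SDK, the `@EnumType.Converter` annotation on `getValue()` tells the Pulumi serializer which method yields the wire-format string, so the enum constant name itself never needs to match the API value.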