abacusai.api_class.deployment
Classes
| PredictionArguments | An abstract class for prediction arguments specific to problem type. |
| OptimizationPredictionArguments | Prediction arguments for the OPTIMIZATION problem type. |
| TimeseriesAnomalyPredictionArguments | Prediction arguments for the TS_ANOMALY problem type. |
| ChatLLMPredictionArguments | Prediction arguments for the CHAT_LLM problem type. |
| RegressionPredictionArguments | Prediction arguments for the PREDICTIVE_MODELING problem type. |
| ForecastingPredictionArguments | Prediction arguments for the FORECASTING problem type. |
| CumulativeForecastingPredictionArguments | Prediction arguments for the CUMULATIVE_FORECASTING problem type. |
| NaturalLanguageSearchPredictionArguments | Prediction arguments for the NATURAL_LANGUAGE_SEARCH problem type. |
| FeatureStorePredictionArguments | Prediction arguments for the FEATURE_STORE problem type. |
| _PredictionArgumentsFactory | Helper class that provides a standard way to create an ABC using inheritance. |
Module Contents
- class abacusai.api_class.deployment.PredictionArguments
Bases:
abacusai.api_class.abstract.ApiClass
An abstract class for prediction arguments specific to problem type.
- problem_type: abacusai.api_class.enums.ProblemType
- classmethod _get_builder()
- class abacusai.api_class.deployment.OptimizationPredictionArguments
Bases:
PredictionArguments
Prediction arguments for the OPTIMIZATION problem type
- Parameters:
forced_assignments (dict) – Set of assignments to force and resolve before returning query results.
solve_time_limit_seconds (float) – Maximum time in seconds to spend solving the query.
include_all_assignments (bool) – If True, will return all assignments, including assignments with value 0. Default is False.
- __post_init__()
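Taken together, the parameters above can be sketched as a plain dataclass. The stand-in below mirrors only the documented fields; it is illustrative, not the SDK class itself (the real class additionally inherits `ApiClass` machinery and sets `problem_type` in `__post_init__`):

```python
from dataclasses import dataclass, asdict
from typing import Optional

# Minimal stand-in mirroring the documented fields of
# OptimizationPredictionArguments; the real class lives in the
# abacusai SDK and carries extra ApiClass behavior.
@dataclass
class OptimizationArgsSketch:
    forced_assignments: Optional[dict] = None   # assignments to force before solving
    solve_time_limit_seconds: Optional[float] = None  # solver time budget
    include_all_assignments: bool = False       # also return zero-valued assignments

args = OptimizationArgsSketch(solve_time_limit_seconds=30.0,
                              include_all_assignments=True)
print(asdict(args))
# → {'forced_assignments': None, 'solve_time_limit_seconds': 30.0,
#    'include_all_assignments': True}
```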
- class abacusai.api_class.deployment.TimeseriesAnomalyPredictionArguments
Bases:
PredictionArguments
Prediction arguments for the TS_ANOMALY problem type
- Parameters:
- __post_init__()
- class abacusai.api_class.deployment.ChatLLMPredictionArguments
Bases:
PredictionArguments
Prediction arguments for the CHAT_LLM problem type
- Parameters:
llm_name (str) – Name of the specific LLM backend to use to power the chat experience.
num_completion_tokens (int) – Default for maximum number of tokens for chat answers.
system_message (str) – The generative LLM system message.
temperature (float) – The generative LLM temperature.
search_score_cutoff (float) – Cutoff for the document retriever score. Matching search results below this score will be ignored.
ignore_documents (bool) – If True, will ignore any documents and search results, and only use the messages to generate a response.
- __post_init__()
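A minimal sketch of the fields above, again as an illustrative dataclass rather than the SDK class. Dropping unset (`None`) fields before sending a request is an assumed serialization policy shown for clarity, not documented `ApiClass` behavior:

```python
from dataclasses import dataclass, asdict
from typing import Optional

# Stand-in for the documented ChatLLMPredictionArguments fields;
# the field names come from the reference above, the class itself
# is illustrative.
@dataclass
class ChatLLMArgsSketch:
    llm_name: Optional[str] = None
    num_completion_tokens: Optional[int] = None
    system_message: Optional[str] = None
    temperature: Optional[float] = None
    search_score_cutoff: Optional[float] = None
    ignore_documents: bool = False

args = ChatLLMArgsSketch(temperature=0.2, search_score_cutoff=0.5)
# Keep only fields that were explicitly set (assumed policy),
# so the request payload stays small.
payload = {k: v for k, v in asdict(args).items() if v is not None}
print(payload)
```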
- class abacusai.api_class.deployment.RegressionPredictionArguments
Bases:
PredictionArguments
Prediction arguments for the PREDICTIVE_MODELING problem type
- Parameters:
- __post_init__()
- class abacusai.api_class.deployment.ForecastingPredictionArguments
Bases:
PredictionArguments
Prediction arguments for the FORECASTING problem type
- Parameters:
num_predictions (int) – The number of timestamps to predict in the future.
prediction_start (str) – The start date for predictions (e.g., “2015-08-01T00:00:00” as input for midnight of 2015-08-01).
explain_predictions (bool) – If True, explain predictions for forecasting.
explainer_type (str) – Type of explainer to use for explanations.
get_item_data (bool) – If True, will return the data corresponding to items as well.
- __post_init__()
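A sketch of the fields above as a dataclass. The ISO-8601 check in `__post_init__` mirrors the “2015-08-01T00:00:00” example format; the validation itself is an assumption for illustration, not documented SDK behavior:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Illustrative stand-in for the documented
# ForecastingPredictionArguments fields.
@dataclass
class ForecastingArgsSketch:
    num_predictions: Optional[int] = None
    prediction_start: Optional[str] = None  # e.g. "2015-08-01T00:00:00"
    explain_predictions: bool = False
    explainer_type: Optional[str] = None
    get_item_data: bool = False

    def __post_init__(self):
        # Fail early on a malformed start date instead of at query time.
        if self.prediction_start is not None:
            datetime.fromisoformat(self.prediction_start)

args = ForecastingArgsSketch(num_predictions=14,
                             prediction_start="2015-08-01T00:00:00")
print(args.num_predictions)  # → 14
```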
- class abacusai.api_class.deployment.CumulativeForecastingPredictionArguments
Bases:
PredictionArguments
Prediction arguments for the CUMULATIVE_FORECASTING problem type
- Parameters:
num_predictions (int) – The number of timestamps to predict in the future.
prediction_start (str) – The start date for predictions (e.g., “2015-08-01T00:00:00” as input for midnight of 2015-08-01).
explain_predictions (bool) – If True, explain predictions for forecasting.
explainer_type (str) – Type of explainer to use for explanations.
get_item_data (bool) – If True, will return the data corresponding to items as well.
- __post_init__()
- class abacusai.api_class.deployment.NaturalLanguageSearchPredictionArguments
Bases:
PredictionArguments
Prediction arguments for the NATURAL_LANGUAGE_SEARCH problem type
- Parameters:
llm_name (str) – Name of the specific LLM backend to use to power the chat experience.
num_completion_tokens (int) – Default for maximum number of tokens for chat answers.
system_message (str) – The generative LLM system message.
temperature (float) – The generative LLM temperature.
search_score_cutoff (float) – Cutoff for the document retriever score. Matching search results below this score will be ignored.
ignore_documents (bool) – If True, will ignore any documents and search results, and only use the messages to generate a response.
- __post_init__()
- class abacusai.api_class.deployment.FeatureStorePredictionArguments
Bases:
PredictionArguments
Prediction arguments for the FEATURE_STORE problem type
- Parameters:
limit_results (int) – If provided, will limit the number of results to the value specified.
- __post_init__()
- class abacusai.api_class.deployment._PredictionArgumentsFactory
Bases:
abacusai.api_class.abstract._ApiClassFactory
Helper class that provides a standard way to create an ABC using inheritance.
- config_abstract_class
- config_class_key = 'problem_type'
- config_class_map
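The three attributes above suggest the usual config-factory pattern: `config_class_key` names the discriminator field and `config_class_map` routes each problem-type value to its arguments class. A hypothetical sketch of that dispatch follows; only the attribute names come from the reference, the dispatch logic and the stand-in classes are assumptions:

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative stand-in for one concrete arguments class.
@dataclass
class FeatureStoreArgsSketch:
    limit_results: Optional[int] = None

class PredictionArgumentsFactorySketch:
    # Attribute names mirror the documented factory; values are examples.
    config_class_key = "problem_type"
    config_class_map = {"FEATURE_STORE": FeatureStoreArgsSketch}

    @classmethod
    def from_dict(cls, config: dict):
        # Pop the discriminator, then build the mapped arguments class
        # from the remaining keys.
        kwargs = dict(config)
        key = kwargs.pop(cls.config_class_key)
        return cls.config_class_map[key](**kwargs)

args = PredictionArgumentsFactorySketch.from_dict(
    {"problem_type": "FEATURE_STORE", "limit_results": 10})
print(args.limit_results)  # → 10
```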