abacusai.llm_response

Classes

LlmResponse

The response returned by the LLM.

Module Contents

class abacusai.llm_response.LlmResponse(client, content=None, tokens=None, stopReason=None, llmName=None, inputTokens=None, outputTokens=None, totalTokens=None, codeBlocks={})

Bases: abacusai.return_class.AbstractApiClass

The response returned by the LLM.

Parameters:
  • client (ApiClient) – An authenticated API Client instance

  • content (str) – Full response from LLM.

  • tokens (int) – The number of tokens in the response.

  • stopReason (str) – The reason the response generation stopped.

  • llmName (str) – The name of the LLM model used to generate the response.

  • inputTokens (int) – The number of input tokens used in the LLM call.

  • outputTokens (int) – The number of output tokens generated in the LLM response.

  • totalTokens (int) – The total number of tokens (input + output) used in the LLM interaction.

  • codeBlocks (list[LlmCodeBlock]) – A list of code blocks parsed from the raw LLM response.

content
tokens
stop_reason
llm_name
input_tokens
output_tokens
total_tokens
code_blocks
deprecated_keys
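Note that the camelCase constructor parameters above surface as the snake_case attributes listed here. A minimal stand-in sketch of that mapping (this is not the real class, which also requires an authenticated ApiClient):

```python
# Hypothetical stand-in for LlmResponse, illustrating how camelCase
# constructor parameters map to snake_case attributes. Field names
# mirror the documented parameters; everything else is an assumption.
class LlmResponseSketch:
    def __init__(self, content=None, tokens=None, stopReason=None,
                 llmName=None, inputTokens=None, outputTokens=None,
                 totalTokens=None, codeBlocks=None):
        self.content = content
        self.tokens = tokens
        self.stop_reason = stopReason
        self.llm_name = llmName
        self.input_tokens = inputTokens
        self.output_tokens = outputTokens
        self.total_tokens = totalTokens
        self.code_blocks = codeBlocks or []

resp = LlmResponseSketch(content="Hello", tokens=12, stopReason="stop",
                         llmName="example-llm", inputTokens=5,
                         outputTokens=7, totalTokens=12)
print(resp.llm_name, resp.total_tokens)  # example-llm 12
```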
__repr__()
to_dict()

Get a dict representation of the parameters in this class

Returns:

A dict representation of the class parameters

Return type:

dict
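A sketch of how a to_dict-style conversion might behave, assuming it collects the instance's public attributes while skipping any listed in deprecated_keys (the real implementation lives in AbstractApiClass; the field set here is hypothetical):

```python
# Hypothetical sketch of a to_dict-style helper: gathers instance
# attributes into a plain dict, excluding deprecated keys.
class ResponseSketch:
    deprecated_keys = set()  # the real class tracks deprecated fields here

    def __init__(self, content, total_tokens):
        self.content = content
        self.total_tokens = total_tokens

    def to_dict(self):
        # vars() returns only instance attributes, so class-level
        # members like deprecated_keys are not included.
        return {k: v for k, v in vars(self).items()
                if k not in self.deprecated_keys}

r = ResponseSketch("Hi there", 9)
print(r.to_dict())  # {'content': 'Hi there', 'total_tokens': 9}
```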