abacusai.llm_response
=====================

.. py:module:: abacusai.llm_response


Classes
-------

.. autoapisummary::

   abacusai.llm_response.LlmResponse


Module Contents
---------------

.. py:class:: LlmResponse(client, content=None, tokens=None, stopReason=None, llmName=None, inputTokens=None, outputTokens=None, totalTokens=None, codeBlocks={})

   Bases: :py:obj:`abacusai.return_class.AbstractApiClass`

   The response returned by the LLM.

   :param client: An authenticated API Client instance
   :type client: ApiClient
   :param content: Full response from the LLM.
   :type content: str
   :param tokens: The number of tokens in the response.
   :type tokens: int
   :param stopReason: The reason the response generation stopped.
   :type stopReason: str
   :param llmName: The name of the LLM model used to generate the response.
   :type llmName: str
   :param inputTokens: The number of input tokens used in the LLM call.
   :type inputTokens: int
   :param outputTokens: The number of output tokens generated in the LLM response.
   :type outputTokens: int
   :param totalTokens: The total number of tokens (input + output) used in the LLM interaction.
   :type totalTokens: int
   :param codeBlocks: A list of code blocks parsed from the raw LLM response.
   :type codeBlocks: LlmCodeBlock


   .. py:attribute:: content
      :value: None


   .. py:attribute:: tokens
      :value: None


   .. py:attribute:: stop_reason
      :value: None


   .. py:attribute:: llm_name
      :value: None


   .. py:attribute:: input_tokens
      :value: None


   .. py:attribute:: output_tokens
      :value: None


   .. py:attribute:: total_tokens
      :value: None


   .. py:attribute:: code_blocks


   .. py:attribute:: deprecated_keys


   .. py:method:: __repr__()


   .. py:method:: to_dict()

      Get a dict representation of the parameters in this class

      :returns: The dict value representation of the class parameters
      :rtype: dict
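
The following is a minimal usage sketch, not part of the generated reference. It assumes an authenticated ``ApiClient`` and that ``evaluate_prompt`` returns an ``LlmResponse``; the method name, its keyword arguments, and the LLM name shown are assumptions to illustrate how the snake-case attributes and ``to_dict()`` might be used.

.. code-block:: python

   # Sketch: inspect an LlmResponse returned by the client.
   # Assumes `evaluate_prompt` is available on the client and returns an
   # LlmResponse; substitute whichever API call you actually use.
   from abacusai import ApiClient

   client = ApiClient(api_key="YOUR_API_KEY")  # placeholder key

   response = client.evaluate_prompt(
       prompt="Summarize this document in two sentences.",
       llm_name="OPENAI_GPT4O",  # assumed value; check the supported LLM names
   )

   # Snake-case attributes mirror the camelCase constructor parameters.
   print(response.content)
   print(response.llm_name, response.stop_reason)
   print(response.input_tokens, response.output_tokens, response.total_tokens)

   # to_dict() returns a plain dict of the class parameters, convenient for logging.
   print(response.to_dict())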