Default arguments to use when making requests to the LLM
Source: R/chattr-defaults.R
Usage
chattr_defaults(
  type = "default",
  prompt = NULL,
  max_data_files = NULL,
  max_data_frames = NULL,
  include_doc_contents = NULL,
  include_history = NULL,
  provider = NULL,
  path = NULL,
  model = NULL,
  model_arguments = NULL,
  system_msg = NULL,
  yaml_file = "chattr.yml",
  force = FALSE,
  label = NULL,
  ...
)
Arguments
- type
Entry point to interact with the model. Accepted values: 'notebook', 'chat'
- prompt
Request to send to the LLM. Defaults to NULL
- max_data_files
Sets the maximum number of data files to send to the model. It defaults to 20. To send all, set to NULL
- max_data_frames
Sets the maximum number of data frames loaded in the current R session to send to the model. It defaults to 20. To send all, set to NULL
- include_doc_contents
Indicates whether to send the current code in the document as part of the request
- include_history
Indicates whether to include the chat history each time a new prompt is submitted
- provider
The name of the LLM provider. Currently, only "openai" is available
- path
The location of the model. It can be a URL or a file path.
- model
The name or path to the model to use.
- model_arguments
Additional arguments to pass to the model as part of the request; it requires a list. Examples of arguments: temperature, top_p, max_tokens (see the sketch following this list)
- system_msg
For OpenAI GPT 3.5 or above, the system message to send as part of the request
- yaml_file
The path to a valid config YAML file that contains the defaults to use in a session
- force
Re-process the base and any workspace-level file defaults
- label
Label to display in the Shiny app, and other locations
- ...
Additional model arguments that are not standard for all models/backends
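For example, model_arguments expects a named list; a minimal sketch, using illustrative values only:

chattr_defaults(
  model_arguments = list(
    temperature = 0.2, # illustrative value
    max_tokens = 1000  # illustrative value
  )
)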
Value
A 'ch_model' object that contains the current defaults that will be used to communicate with the LLM.
Details
The idea is that, because the add-in shortcut will be used to execute the
request, all of the other arguments can be controlled via this function. By
default, it will try to load the defaults from a config YAML file; if none is
found, the defaults for GPT 3.5 will be used. The defaults can be modified by
calling this function, even after the interactive session has started.
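Examples

A minimal sketch of setting and later modifying the defaults within a session; the file path and the argument overridden here are illustrative only:

library(chattr)

# Load the defaults from a YAML file; if the file is not found,
# the GPT 3.5 defaults are used
chattr_defaults(yaml_file = "chattr.yml")

# Later in the same interactive session, override a single default
chattr_defaults(include_history = FALSE)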