
Sets the LLM model to use in your session


chattr_use(x = NULL, ...)



x: The label of the LLM model to use, or the path to a valid YAML defaults file. Valid values are 'copilot', 'gpt4', 'gpt35', 'llamagpt', 'databricks-dbrx', 'databricks-meta-llama3-70b', and 'databricks-mixtral8x7b'. The value 'test' is also acceptable, but it is meant for package examples and internal testing.


...: Default values to modify.
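As a sketch of the call above (assuming the chattr package is installed), the model can be selected by passing one of the labels listed under `x`; the 'test' label is the one reserved for examples, so it avoids contacting a real LLM back end:

```r
library(chattr)

# Select the model for the current session by label.
# 'test' is meant for package examples and internal testing,
# so it does not require any API credentials.
chattr_use("test")

# Calling with no arguments prints a console menu listing the
# models whose setup chattr was able to detect.
chattr_use()
```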


It returns console messages that allow the user to select the model to use.


If the error "No model setup found" is returned, it is because none of the expected setups for Copilot, OpenAI, or LLama was detected automatically. Here is how to set up a model:

  • OpenAI - The main thing chattr checks is the presence of the R user's OpenAI PAT (Personal Access Token). It looks for it in the 'OPENAI_API_KEY' environment variable. Get a PAT from the OpenAI website and save it to that environment variable, then restart R and try again.

  • GitHub Copilot - Set up GitHub Copilot in your RStudio IDE, and restart R. chattr will look for the default location where RStudio saves the Copilot authentication information.

  • Databricks - chattr checks for the presence of the R user's Databricks host and token (the 'DATABRICKS_HOST' and 'DATABRICKS_TOKEN' environment variables).
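The environment variables above can be set for the current session with `Sys.setenv()`; the values below are placeholders, not real credentials, and for persistence across sessions they would normally go in a `.Renviron` file instead:

```r
# Session-only setup; the values are placeholders, not real credentials.
Sys.setenv(OPENAI_API_KEY  = "sk-placeholder")                        # OpenAI PAT
Sys.setenv(DATABRICKS_HOST = "https://example.cloud.databricks.com")  # Databricks workspace URL
Sys.setenv(DATABRICKS_TOKEN = "dapi-placeholder")                     # Databricks token

# Confirm the variables are visible to the R session
nzchar(Sys.getenv("OPENAI_API_KEY"))  # TRUE
```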

To set the model for the R session, use the 'CHATTR_MODEL' environment variable. Alternatively, create a YAML file named 'chattr.yml' in your working directory to control both the model and the defaults chattr will use to communicate with it.
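The environment-variable route can be sketched as follows, using 'gpt4', one of the valid labels listed earlier:

```r
# Set the model for the current R session via the environment
# variable described above.
Sys.setenv(CHATTR_MODEL = "gpt4")

# Verify the value the session will see
Sys.getenv("CHATTR_MODEL")  # "gpt4"
```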