llm_config() builds a provider-agnostic configuration object that
call_llm() (and friends) understand. You can pass provider-specific
parameters via ...; LLMR forwards them as-is, with a few safe
conveniences.
Usage
llm_config(
provider,
model,
api_key = NULL,
troubleshooting = FALSE,
base_url = NULL,
embedding = NULL,
no_change = FALSE,
...
)
Arguments
- provider
Character scalar. One of: "openai", "anthropic", "gemini", "groq",
"together", "voyage" (embeddings only), "deepseek", "xai".
- model
Character scalar. Model name understood by the chosen provider
(e.g., "gpt-4o-mini", "o4-mini", "claude-3.7", "gemini-2.0-flash").
- api_key
Character scalar. Provider API key.
- troubleshooting
Logical. If TRUE, prints the full request payloads (including your
API key!) for debugging. Use with extreme caution.
- base_url
Optional character. Back-compat alias; if supplied it is stored as
api_url in model_params and overrides the default endpoint.
- embedding
NULL (default), TRUE, or FALSE. If TRUE, the call is routed to the
provider's embeddings API; if FALSE, to the chat API. If NULL, LLMR
infers embeddings when model contains "embedding".
- no_change
Logical. If TRUE, LLMR never auto-renames/adjusts provider
parameters. If FALSE (default), well-known compatibility shims may
apply (e.g., renaming OpenAI's max_tokens → max_completion_tokens
after a server hint; see call_llm() notes).
- ...
Additional provider-specific parameters (e.g., temperature, top_p,
max_tokens, top_k, repetition_penalty, reasoning_effort, api_url,
etc.). Values are forwarded verbatim unless documented shims apply.
For Anthropic extended thinking, supply thinking_budget (canonical;
mapped to thinking.budget_tokens) together with
include_thoughts = TRUE to request the thinking block in the
response.
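For instance, a minimal sketch of passing provider-specific extras through ... for Anthropic extended thinking, as described above (the model name and budget value are illustrative placeholders):

```r
# Extras given in `...` travel into model_params and on to the provider.
a_cfg <- llm_config(
  provider = "anthropic",
  model    = "claude-3.7",                      # example model name
  api_key  = Sys.getenv("ANTHROPIC_API_KEY"),
  thinking_budget  = 2048,                      # mapped to thinking.budget_tokens
  include_thoughts = TRUE                       # ask for the thinking block back
)
```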
Value
An object of class c("llm_config", provider). Fields: provider,
model, api_key, troubleshooting, embedding, no_change, and
model_params (a named list of extras).
Temperature range clamping
Anthropic temperatures must be in [0, 1]; others in [0, 2].
Out-of-range values are clamped with a warning.
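For example (a sketch; the warning text may differ):

```r
# temperature = 1.5 is outside Anthropic's [0, 1] range,
# so it is clamped to 1 and a warning is issued.
cfg <- llm_config("anthropic", "claude-3.7",
                  api_key = Sys.getenv("ANTHROPIC_API_KEY"),
                  temperature = 1.5)
```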
Endpoint overrides
You can pass api_url (or the base_url= alias) in ... to point to
gateways or compatible proxies.
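For example, routing requests through an OpenAI-compatible gateway (the URL below is a placeholder; use whatever endpoint your gateway exposes):

```r
cfg <- llm_config("openai", "gpt-4o-mini",
                  api_key = Sys.getenv("GATEWAY_API_KEY"),
                  api_url = "https://gateway.example.com/v1/chat/completions")
```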
Examples
if (FALSE) { # \dontrun{
# Basic OpenAI config
cfg <- llm_config("openai", "gpt-4o-mini",
temperature = 0.7, max_tokens = 300)
# Generative call returns an llmr_response object
r <- call_llm(cfg, "Say hello in Greek.")
print(r)
as.character(r)
# Embeddings (inferred from the model name)
e_cfg <- llm_config("gemini", "text-embedding-004")
# Force embeddings even if model name does not contain "embedding"
e_cfg2 <- llm_config("voyage", "voyage-large-2", embedding = TRUE)
} # }