Model Parameters

ModelFileParameters is an enum of the generation parameters that Ollama exposes through its Modelfile. These parameters are serialized under the options key in the toApi() output, which matches the payload format Ollama expects.

use ByJG\LlmApiObjects\Enum\ModelFileParameters;

$chat->setParameter(ModelFileParameters::temperature, 0.7);

setParameter() validates the value before accepting it. Passing an invalid value throws \InvalidArgumentException.

warning

These parameters are Ollama-specific. They are sent under the options key, which is not a standard OpenAI API field. If you use this with the OpenAI API directly, the extra options key will be ignored or may cause an error.
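If you are targeting the OpenAI API, one defensive approach is to drop the non-standard key from the serialized payload before sending it. A minimal sketch, assuming toApi() returns an associative array shaped like the one below (the payload shape here is an assumption for illustration):

```php
<?php
// Assumed shape of a toApi() payload; 'options' is the Ollama-specific key.
$payload = [
    'model' => 'llama3',
    'messages' => [
        ['role' => 'user', 'content' => 'Hello'],
    ],
    'options' => ['temperature' => 0.7],
];

// Strip the key the OpenAI API does not recognize before sending.
unset($payload['options']);

echo json_encode($payload), PHP_EOL;
```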

Parameter Reference

| Enum case | API key | Type | Valid values | Default | Description |
|-----------|---------|------|--------------|---------|-------------|
| mirostat | mirostat | int | 0, 1, 2 | 0 | Enable Mirostat sampling. 0 = disabled, 1 = Mirostat, 2 = Mirostat 2.0 |
| mirostatEta | mirostat_eta | float | ≥ 0 | 0.1 | Mirostat learning rate: how quickly the algorithm responds to feedback |
| mirostatTau | mirostat_tau | float | ≥ 0 | 5.0 | Controls the balance between coherence and diversity |
| numCtx | num_ctx | int | > 0 | 2048 | Context window size in tokens |
| repeatLastN | repeat_last_n | int | ≥ -1 | 64 | How far back to look to prevent repetition. 0 = disabled, -1 = num_ctx |
| repeatPenalty | repeat_penalty | float | > 0 | 1.1 | Strength of repetition penalty. Higher = penalizes more |
| temperature | temperature | float | > 0 | 0.8 | Creativity of responses. Higher = more creative |
| seed | seed | int | ≥ 0 | 0 | Random seed for reproducible generation |
| stop | stop | string | any string | | Stop sequence. Generation halts when this pattern is encountered |
| tfsZ | tfs_z | float | ≥ 1 | 1 | Tail free sampling. Higher = reduces impact of low-probability tokens. 1.0 = disabled |
| numPredict | num_predict | int | ≥ -1 | -1 | Max tokens to generate. -1 = infinite |
| topK | top_k | int | > 0 | 40 | Limits token selection to the top K tokens. Higher = more diverse |
| topP | top_p | float | > 0, ≤ 1 | 0.9 | Nucleus sampling threshold. Works with top_k |
| minP | min_p | float | > 0, ≤ 1 | | Minimum token probability relative to the most likely token |
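The range rules in the table can be expressed as plain predicates. The helper below is a hypothetical standalone sketch of that validation logic (not the library's actual implementation), keyed by the API names from the table:

```php
<?php
// Hypothetical re-implementation of the table's range rules.
function isValidParameter(string $key, int|float|string $value): bool
{
    return match ($key) {
        'mirostat'                      => is_int($value) && in_array($value, [0, 1, 2], true),
        'mirostat_eta', 'mirostat_tau'  => is_float($value) && $value >= 0,
        'num_ctx', 'top_k'              => is_int($value) && $value > 0,
        'repeat_last_n', 'num_predict'  => is_int($value) && $value >= -1,
        'temperature', 'repeat_penalty' => is_float($value) && $value > 0,
        'seed'                          => is_int($value) && $value >= 0,
        'stop'                          => is_string($value),
        'tfs_z'                         => is_float($value) && $value >= 1,
        'top_p', 'min_p'                => is_float($value) && $value > 0 && $value <= 1,
        default                         => false,
    };
}

var_dump(isValidParameter('mirostat', 1));       // true
var_dump(isValidParameter('mirostat', 3));       // false
var_dump(isValidParameter('temperature', -0.5)); // false
```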

Example

use ByJG\LlmApiObjects\Enum\ModelFileParameters;
use ByJG\LlmApiObjects\Model\Chat;
use ByJG\LlmApiObjects\Model\Message;
use ByJG\LlmApiObjects\Enum\Role;

$chat = new Chat(
    model: 'llama3',
    messages: [
        new Message(role: Role::user, message: 'Write a short poem about the sea.'),
    ],
);

$chat->setParameter(ModelFileParameters::temperature, 0.9);
$chat->setParameter(ModelFileParameters::numCtx, 4096);
$chat->setParameter(ModelFileParameters::seed, 42);
$chat->setParameter(ModelFileParameters::numPredict, 200);

print_r($chat->toApi());
// 'options' => ['temperature' => 0.9, 'num_ctx' => 4096, 'seed' => 42, 'num_predict' => 200]

Validation

Each parameter has strict validation. Invalid values throw \InvalidArgumentException:

// OK
$chat->setParameter(ModelFileParameters::mirostat, 1);

// Throws InvalidArgumentException — mirostat only accepts 0, 1, or 2
$chat->setParameter(ModelFileParameters::mirostat, 3);

// Throws InvalidArgumentException — temperature must be > 0
$chat->setParameter(ModelFileParameters::temperature, -0.5);
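Because invalid values throw rather than being silently clamped, callers that accept user-supplied settings may want a try/catch. The snippet below is a standalone sketch of that pattern using a hypothetical setter that mirrors the mirostat rule (it is not the library's setParameter()):

```php
<?php
// Hypothetical setter mirroring the throw-on-invalid behavior for mirostat.
function setMirostat(array &$options, int $value): void
{
    if (!in_array($value, [0, 1, 2], true)) {
        throw new InvalidArgumentException("mirostat only accepts 0, 1, or 2; got $value");
    }
    $options['mirostat'] = $value;
}

$options = [];

try {
    setMirostat($options, 3);
} catch (InvalidArgumentException $e) {
    // Fall back to the documented default instead of aborting.
    setMirostat($options, 0);
}

echo $options['mirostat'], PHP_EOL; // 0
```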