# LLM API Objects

A strongly-typed PHP model layer for building OpenAI Chat Completions API request payloads. Works with both OpenAI and Ollama (via the `/v1` OpenAI-compatible endpoint) using the `openai-php/client` package.
## Features

- Strongly-typed models for Chat Completions requests
- Full support for function calling (tools): `Tool`, `ToolFunction`, `ToolParameter`, `ToolChoice`
- Multi-turn conversations with `role=tool` messages and `tool_call_id`
- Automatic message splitting when content exceeds a token limit
- Delimiter-based message partitioning
- Compatible with OpenAI and Ollama (`/v1` endpoint)
- Ollama model tuning parameters via `ModelFileParameters`
- Jira XML/RSS importers to seed conversation context
## Documentation

- Getting Started — Installation and first request
- Chat Model — Building and sending chat requests
- Messages — Roles, content, `tool_call_id`, `tool_calls`
- Tools — Function calling: `Tool`, `ToolFunction`, `ToolParameter`, `ToolChoice`
- Model Parameters — Ollama model tuning parameters
- Importers — Jira XML/RSS importers
## Quick Examples

### Basic chat with OpenAI
```php
<?php

require 'vendor/autoload.php';

$client = OpenAI::factory()
    ->withApiKey($_ENV['OPENAI_API_KEY'])
    ->make();

$chat = new \ByJG\LlmApiObjects\Model\Chat(
    model: 'gpt-4o',
    messages: [
        new \ByJG\LlmApiObjects\Model\Message(
            role: \ByJG\LlmApiObjects\Enum\Role::user,
            message: 'What is the capital of France?',
        ),
    ],
    system: 'You are a helpful geography assistant.',
);

$result = $client->chat()->create($chat->toApi());

echo $result->choices[0]->message->content;
```
### Basic chat with Ollama

```php
<?php

require 'vendor/autoload.php';

// Ollama ignores the API key, but the client requires one to be set.
$client = OpenAI::factory()
    ->withApiKey('ollama')
    ->withBaseUri('http://localhost:11434/v1')
    ->make();

$chat = new \ByJG\LlmApiObjects\Model\Chat(
    model: 'llama3',
    messages: [
        new \ByJG\LlmApiObjects\Model\Message(
            role: \ByJG\LlmApiObjects\Enum\Role::user,
            message: 'Hello!',
        ),
    ],
);

$result = $client->chat()->create($chat->toApi());

echo $result->choices[0]->message->content;
```
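When targeting Ollama you can also attach model tuning options via `ModelFileParameters` (listed in the features above). The sketch below is illustrative only: the namespace, the constructor parameter names (`temperature`, `numCtx`), and the `setModelFileParameters()` method are assumptions, not the confirmed API — see the Model Parameters documentation for the real signatures.

```php
<?php

use ByJG\LlmApiObjects\Model\Chat;
use ByJG\LlmApiObjects\Model\Message;
use ByJG\LlmApiObjects\Enum\Role;

// Hypothetical sketch: parameter and method names are assumed.
$params = new \ByJG\LlmApiObjects\Model\ModelFileParameters(
    temperature: 0.2,   // lower temperature for more deterministic replies
    numCtx: 4096,       // context window size passed through to Ollama
);

$chat = new Chat(
    model: 'llama3',
    messages: [
        new Message(role: Role::user, message: 'Summarize this ticket.'),
    ],
);
$chat->setModelFileParameters($params);   // assumed setter

$result = $client->chat()->create($chat->toApi());
```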
### Function calling (tools)
```php
<?php

use ByJG\LlmApiObjects\Enum\ToolChoice;
use ByJG\LlmApiObjects\Model\Chat;
use ByJG\LlmApiObjects\Model\Message;
use ByJG\LlmApiObjects\Enum\Role;
use ByJG\LlmApiObjects\Tool\Tool;
use ByJG\LlmApiObjects\Tool\ToolFunction;
use ByJG\LlmApiObjects\Tool\ToolParameter;

// $client is created as in the examples above.

$weatherTool = new Tool(
    new ToolFunction(
        name: 'get_current_weather',
        description: 'Get the current weather in a given location',
        parameters: [
            new ToolParameter(name: 'location', type: 'string', description: 'City name', required: true),
            new ToolParameter(name: 'unit', type: 'string', enum: ['celsius', 'fahrenheit']),
        ],
    ),
);

$chat = new Chat(
    model: 'gpt-4o',
    messages: [
        new Message(role: Role::user, message: "What's the weather in Paris?"),
    ],
);
$chat->addTool($weatherTool);
$chat->setToolChoice(ToolChoice::auto);

$result = $client->chat()->create($chat->toApi());
```
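When the model decides to invoke the function, the response carries `tool_calls` instead of plain content, and you answer with a `role=tool` message tied to the call id (the multi-turn flow listed in the features). A minimal sketch of that second round trip — the `addMessage()` method and the `toolCallId` parameter name are assumptions here; check the Messages documentation for the exact signatures. The response accessors (`toolCalls`, `function->arguments`) follow `openai-php/client`'s response objects.

```php
<?php

use ByJG\LlmApiObjects\Model\Message;
use ByJG\LlmApiObjects\Enum\Role;

$toolCall = $result->choices[0]->message->toolCalls[0];

// Execute your own implementation of the requested function.
$args = json_decode($toolCall->function->arguments, true);
$weather = json_encode([
    'location'    => $args['location'],
    'temperature' => '18',
    'unit'        => $args['unit'] ?? 'celsius',
]);

// Feed the result back as a role=tool message bound to the call id,
// then ask the model for the final natural-language answer.
$chat->addMessage(new Message(          // assumed method name
    role: Role::tool,
    message: $weather,
    toolCallId: $toolCall->id,          // assumed parameter name
));

$final = $client->chat()->create($chat->toApi());
echo $final->choices[0]->message->content;
```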
## Install

```shell
composer require byjg/llm-api-objects
```
## Running Unit Tests

```shell
composer test
```