OpenAI Provider
OpenAI API provider implementing AiProvider with streaming, tool use, and vision support. Connects to GPT-4o, GPT-4, GPT-3.5 Turbo, and other OpenAI models.
Installation
$ flai add openai_provider
Import
import 'package:my_app/flai/providers/openai_provider.dart';
Setup
final provider = OpenAiProvider(
  apiKey: 'sk-...',
  model: 'gpt-4o',
  baseUrl: 'https://api.openai.com/v1', // default
);

// Use with ChatScreenController
final controller = ChatScreenController(
  provider: provider,
);
Never hardcode API keys in your source code. Use environment variables, a secrets manager, or a backend proxy.
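For example, one way to keep the key out of source control is a compile-time define. The `OPENAI_API_KEY` name below is an arbitrary choice for this sketch, not something the provider requires:

```dart
// Pass the key at build time instead of committing it:
//   flutter run --dart-define=OPENAI_API_KEY=sk-...
const apiKey = String.fromEnvironment('OPENAI_API_KEY');

final provider = OpenAiProvider(
  apiKey: apiKey,
  model: 'gpt-4o',
);
```

For production apps, routing requests through a backend proxy is still the safer option, since any key shipped in a client binary can be extracted.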
Configuration
| Parameter | Type | Default | Description |
|---|---|---|---|
| apiKey | String | required | Your OpenAI API key. |
| model | String | 'gpt-4o' | Model ID to use for chat completions. |
| baseUrl | String | 'https://api.openai.com/v1' | API base URL. Change for Azure OpenAI or proxies. |
| temperature | double? | null | Sampling temperature (0.0 to 2.0). |
| maxTokens | int? | null | Maximum tokens in the response. |
| systemPrompt | String? | null | System message prepended to all conversations. |
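Putting the optional parameters together, a fully configured provider might look like this (the specific values are illustrative, not recommendations):

```dart
final provider = OpenAiProvider(
  apiKey: apiKey,
  model: 'gpt-4o',
  temperature: 0.2,  // lower = more deterministic output
  maxTokens: 1024,   // cap the response length
  systemPrompt: 'You are a concise assistant.',
);
```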
Capabilities
| Capability | Supported | Notes |
|---|---|---|
| supportsStreaming | Yes | Server-sent events via stream: true |
| supportsToolUse | Yes | Function calling with JSON schema tools |
| supportsVision | Yes | Image inputs via base64 or URL (GPT-4o, GPT-4V) |
| supportsThinking | No | OpenAI does not expose reasoning tokens |
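If the capability names in the table are exposed as boolean getters on the provider (an assumption about the flai API, not confirmed here), UI code can branch on them:

```dart
// Assumes the capability flags above are boolean getters on the provider.
if (provider.supportsVision) {
  // Allow image attachments in the composer.
}
if (!provider.supportsThinking) {
  // Hide any "reasoning" UI for this provider.
}
```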
Streaming
The provider emits a stream of ChatEvent objects as the response is generated:
// Low-level streaming API
final stream = provider.streamChat(ChatRequest(
  messages: [
    Message(
      id: '1',
      role: MessageRole.user,
      content: 'Explain Flutter',
      timestamp: DateTime.now(),
    ),
  ],
));

await for (final event in stream) {
  switch (event) {
    case TextDelta(:final text):
      print(text); // Token-by-token text
    case ToolCallDelta(:final toolCall):
      print('Tool: ${toolCall.name}');
    case UsageEvent(:final usage):
      print('Tokens: ${usage.totalTokens}');
    case DoneEvent():
      print('Stream complete');
    case ErrorEvent(:final error):
      print('Error: $error');
  }
}
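To build the complete reply from the stream, you can accumulate the text deltas as they arrive. This sketch uses the same event types as above:

```dart
// Collect the streamed TextDelta events into the full response text.
final buffer = StringBuffer();
await for (final event in stream) {
  if (event is TextDelta) {
    buffer.write(event.text);
  }
}
final fullText = buffer.toString();
```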
Tool Use
Define tools using the ChatRequest.tools parameter:
final request = ChatRequest(
  messages: messages,
  tools: [
    Tool(
      name: 'search_web',
      description: 'Search the web for information',
      parameters: {
        'type': 'object',
        'properties': {
          'query': {
            'type': 'string',
            'description': 'Search query',
          },
        },
        'required': ['query'],
      },
    ),
  ],
);
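When the model decides to call the tool, it arrives as a ToolCallDelta event on the stream. The sketch below assumes the tool call carries its parsed JSON arguments in an `arguments` map; that field name is a guess about the flai API, as is the shape of the follow-up request:

```dart
// Hypothetical dispatch loop: execute the tool when the model requests it.
await for (final event in provider.streamChat(request)) {
  if (event is ToolCallDelta) {
    final call = event.toolCall;
    if (call.name == 'search_web') {
      // `arguments` is assumed to be the parsed JSON args from the model.
      final query = call.arguments['query'] as String;
      // Run your own search implementation here, then send the result
      // back to the model in a follow-up request. The exact message
      // shape for tool results depends on the flai Message API.
    }
  }
}
```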
Azure OpenAI
Point to your Azure OpenAI deployment by changing the baseUrl:
final provider = OpenAiProvider(
  apiKey: 'your-azure-key',
  model: 'your-deployment-name',
  baseUrl: 'https://your-resource.openai.azure.com/openai/deployments/your-deployment',
);