# Open WebUI
Open WebUI is the primary user interface for interacting with AI models in the Local AI Cyber Lab. It provides a modern, feature-rich chat interface for model interaction and management.
## Architecture Overview

```mermaid
graph TB
    subgraph User_Interface["User Interface"]
        chat["Chat Interface"]
        settings["Settings Panel"]
        templates["Prompt Templates"]
        history["Chat History"]
    end

    subgraph Backend["Backend Services"]
        api["API Server"]
        auth["Authentication"]
        storage["State Management"]

        subgraph Model_Integration["Model Integration"]
            ollama["Ollama Connection"]
            langfuse["Langfuse Analytics"]
            guardian["AI Guardian"]
        end
    end

    subgraph Storage["Persistent Storage"]
        db["Chat Database"]
        files["File Storage"]
        config["Configuration"]
    end

    User_Interface --> Backend
    Backend --> Storage
    Model_Integration --> ollama
    Model_Integration --> langfuse
    Model_Integration --> guardian

    classDef primary fill:#f9f,stroke:#333,stroke-width:2px
    classDef secondary fill:#bbf,stroke:#333,stroke-width:1px
    class chat,api primary
    class ollama,guardian secondary
```
## Features

### Chat Interface
```mermaid
graph LR
    subgraph Chat_Features["Chat Features"]
        A["Message Input"] --> B["Model Selection"]
        B --> C["Parameter Control"]
        C --> D["Response Generation"]
        D --> E["History Management"]
    end

    subgraph Advanced_Features["Advanced Features"]
        F["File Upload"] --> G["Code Highlighting"]
        G --> H["Markdown Support"]
        H --> I["Export Options"]
    end

    subgraph Integration["Integrations"]
        J["Model APIs"] --> K["Security Checks"]
        K --> L["Analytics"]
        L --> M["Storage"]
    end

    Chat_Features --> Advanced_Features
    Advanced_Features --> Integration
```
## Installation

Open WebUI is deployed automatically as part of the Local AI Cyber Lab stack. To update it to the latest image:

```bash
# Pull the latest Open WebUI image
docker-compose pull openwebui

# Start (or recreate) the service
docker-compose up -d openwebui
```
## Configuration

### Environment Variables

```bash
# .env file
OPENWEBUI_PORT=3000
WEBUI_AUTH_TOKEN=your-secure-token
OLLAMA_API_BASE_URL=http://ollama:11434
```
### Security Settings

```yaml
# docker-compose.yml
services:
  openwebui:
    environment:
      - WEBUI_AUTH_TOKEN=${WEBUI_AUTH_TOKEN}
      - SESSION_SECRET=${SESSION_SECRET}
      - ENABLE_SECURITY_HEADERS=true
```
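`WEBUI_AUTH_TOKEN` and `SESSION_SECRET` should be long random values rather than placeholders. A minimal sketch for generating them, using only Python's standard `secrets` module:

```python
# Generate strong values for WEBUI_AUTH_TOKEN and SESSION_SECRET.
# Any cryptographically secure random generator works equally well.
import secrets

print(f"WEBUI_AUTH_TOKEN={secrets.token_urlsafe(32)}")
print(f"SESSION_SECRET={secrets.token_urlsafe(32)}")
```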
## User Interface Features

### Chat Management

- Model Selection:
    - Choose from available models
    - Configure model parameters
    - Save custom presets (see the parameter preset sketch after this list)
- Chat Controls:
    - Message formatting
    - File attachments
    - Code blocks
    - Markdown support
- History Management:
    - Save conversations
    - Export chat logs
    - Search history
    - Tag conversations
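A custom preset can be thought of as a named set of generation options. A minimal sketch, assuming the `httpx` library and Ollama's `/api/chat` endpoint; the preset names and values here are hypothetical:

```python
# Hypothetical parameter presets; Open WebUI exposes similar controls in the UI.
# The "options" object follows Ollama's /api/chat request format.
import httpx

OLLAMA_API_BASE_URL = "http://ollama:11434"

PRESETS = {
    "precise": {"temperature": 0.2, "top_p": 0.9},
    "creative": {"temperature": 0.9, "top_p": 0.95},
}

async def chat_with_preset(prompt, model="llama2", preset="precise"):
    async with httpx.AsyncClient() as client:
        response = await client.post(
            f"{OLLAMA_API_BASE_URL}/api/chat",
            json={
                "model": model,
                "messages": [{"role": "user", "content": prompt}],
                "options": PRESETS[preset],
                "stream": False,
            },
            timeout=120,
        )
        response.raise_for_status()
        return response.json()
```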
### Advanced Features

- Prompt Templates: reusable prompts with placeholders (see the sketch below)
- Parameter Controls: per-chat generation settings such as temperature and max tokens
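A minimal sketch of how a prompt template might be stored and filled before being sent to a model. The template text and field names are hypothetical examples, not templates shipped with Open WebUI:

```python
# Hypothetical prompt template: a named string with placeholders that are
# filled in before the prompt is sent to the selected model.
TEMPLATES = {
    "code-review": (
        "You are a security-focused code reviewer.\n"
        "Review the following {language} code and list potential issues:\n\n{code}"
    ),
}

def render_template(name, **fields):
    return TEMPLATES[name].format(**fields)

prompt = render_template("code-review", language="Python", code="eval(user_input)")
```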
## Integration

### Ollama Integration
```python
# Example Ollama API integration (sketch using the httpx client library)
import httpx

OLLAMA_API_BASE_URL = "http://ollama:11434"

async def query_model(prompt, model="llama2"):
    """Send a single-turn chat request to Ollama and return the parsed JSON."""
    async with httpx.AsyncClient() as client:
        response = await client.post(
            f"{OLLAMA_API_BASE_URL}/api/chat",
            json={
                "model": model,
                "messages": [{"role": "user", "content": prompt}],
                "stream": False,
            },
            timeout=120,
        )
        response.raise_for_status()
        return response.json()
```
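A short usage example for the helper above, assuming the `llama2` model has already been pulled in Ollama:

```python
import asyncio

# Run the async helper from a synchronous script and print the assistant reply.
result = asyncio.run(query_model("Summarize the OWASP Top 10 in one sentence."))
print(result["message"]["content"])
```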
### Security Integration

```python
# AI Guardian integration (sketch; assumes httpx plus a GUARDIAN_BASE_URL and
# API_KEY configured for the validation endpoint)
import httpx

async def validate_prompt(prompt):
    """Ask AI Guardian whether a prompt is safe to forward to a model."""
    async with httpx.AsyncClient(base_url=GUARDIAN_BASE_URL) as client:
        response = await client.post(
            "/api/security/validate",
            headers={"Authorization": f"Bearer {API_KEY}"},
            json={"prompt": prompt},
        )
        response.raise_for_status()
        return response.json()
```
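The two helpers can be chained so that every prompt is validated before it reaches a model, mirroring the Model APIs → Security Checks flow in the diagram above. The `allowed` field is an assumption about the shape of the validation response:

```python
# Gate model access on the AI Guardian verdict; "allowed" is a hypothetical
# response field, adjust to match the actual validation payload.
async def secure_query(prompt, model="llama2"):
    verdict = await validate_prompt(prompt)
    if not verdict.get("allowed", False):
        raise ValueError(f"Prompt rejected by AI Guardian: {verdict}")
    return await query_model(prompt, model=model)
```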
## Monitoring

### Health Checks

```yaml
# docker-compose.yml
services:
  openwebui:
    healthcheck:
      test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:3000"]
      interval: 30s
      timeout: 10s
      retries: 3
```
### Analytics Integration

```javascript
// Langfuse integration (sketch; exact method names depend on the Langfuse
// SDK version in use)
const trackModelUsage = async (modelId, prompt, response) => {
  // Record one chat completion as a trace containing a single generation.
  const trace = langfuse.trace({ name: "open-webui-chat" });
  trace.generation({
    name: "chat-completion",
    model: modelId,
    input: prompt,
    output: response,
    modelParameters: {
      temperature: modelParams.temperature,
      maxTokens: modelParams.max_tokens,
    },
  });
};
```
## Performance Optimization

### Caching
```javascript
// Simple in-memory response cache keyed by model + prompt.
// Note: the Map is unbounded; add a size cap or TTL for long-running sessions.
const cache = new Map();

const getCachedResponse = async (prompt, model) => {
  const key = `${model}:${prompt}`;
  if (cache.has(key)) {
    return cache.get(key);
  }
  const response = await queryModel(prompt, model);
  cache.set(key, response);
  return response;
};
```
### Resource Management

```yaml
# docker-compose.yml
services:
  openwebui:
    mem_limit: ${OPENWEBUI_MEMORY_LIMIT:-1g}
    cpus: ${OPENWEBUI_CPU_LIMIT:-1.0}
```
## Troubleshooting

### Common Issues

- Connection Problems: verify that `OLLAMA_API_BASE_URL` points at the Ollama container (`http://ollama:11434` by default) and that the `OPENWEBUI_PORT` mapping (3000) is not blocked or already in use. A quick diagnostic sketch follows this list.
- Authentication Issues: confirm that `WEBUI_AUTH_TOKEN` and `SESSION_SECRET` are set in `.env` and match the values passed to the container, then restart the service with `docker-compose up -d openwebui`.
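A minimal connectivity check, assuming the default ports from the configuration above (adjust hostnames if the services are not published on localhost):

```python
# Probe the Ollama API and the Open WebUI frontend to narrow down
# connection problems. Hostnames and ports are the defaults from this guide.
import httpx

def check(url):
    try:
        r = httpx.get(url, timeout=5)
        print(f"{url} -> HTTP {r.status_code}")
    except httpx.HTTPError as exc:
        print(f"{url} -> FAILED ({exc})")

check("http://localhost:3000")            # Open WebUI frontend
check("http://localhost:11434/api/tags")  # Ollama API (lists installed models)
```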