API & Integration

Learn how to integrate Thox.ai with your applications and development tools.

REST API authentication

Default configuration

By default, Thox.ai does not require authentication for local network access. This makes it easy to get started quickly.
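For example, a request from a machine on the same local network succeeds without any key. A minimal sketch in Python, using the chat completions endpoint shown in the examples below:

# Minimal sketch: no API key is needed while authentication is disabled.
import requests

resp = requests.post(
    "http://thox.local:8080/v1/chat/completions",
    json={
        "model": "thox-coder",
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=60,
)
print(resp.json()["choices"][0]["message"]["content"])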

Enabling API keys

To enable authentication, run:

thox auth enable

Then generate an API key:

thox auth generate-key --name "my-app"

Store this key securely.

Using API keys

Include your API key in the Authorization header:

Authorization: Bearer sk-your-key-here

Example request with authentication

curl http://thox.local:8080/v1/chat/completions \
  -H "Authorization: Bearer sk-your-key" \
  -H "Content-Type: application/json" \
  -d '{"model": "thox-coder", "messages": [{"role": "user", "content": "Hello"}]}'

WebSocket streaming setup

Why use WebSocket?

WebSocket provides lower-latency streaming than HTTP streaming and is ideal for real-time applications that need immediate token-by-token output.

Connection URL

Connect to: ws://thox.local:8080/v1/stream

Sending requests

Send JSON messages with the same format as the REST API. The server will stream back response chunks.

Example code

const ws = new WebSocket('ws://thox.local:8080/v1/stream');

// Send the request once the connection is open.
ws.onopen = () => {
  ws.send(JSON.stringify({
    model: 'thox-coder',
    messages: [{ role: 'user', content: 'Hello!' }]
  }));
};

// Each message is a streamed chunk; print its content as it arrives.
ws.onmessage = (event) => {
  const data = JSON.parse(event.data);
  console.log(data.choices?.[0]?.delta?.content || '');
};
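The same request from Python, as a minimal sketch using the third-party websockets package (pip install websockets) and assuming the server streams OpenAI-style chunks, as in the JavaScript example above:

import asyncio
import json

import websockets

async def main():
    async with websockets.connect("ws://thox.local:8080/v1/stream") as ws:
        # Send the request in the same JSON format as the REST API.
        await ws.send(json.dumps({
            "model": "thox-coder",
            "messages": [{"role": "user", "content": "Hello!"}],
        }))
        # Print streamed content chunks until the server closes the connection.
        async for message in ws:
            data = json.loads(message)
            delta = data.get("choices", [{}])[0].get("delta", {})
            print(delta.get("content", ""), end="", flush=True)

asyncio.run(main())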

OpenAI SDK compatibility

Overview

Thox.ai provides an OpenAI-compatible API, allowing you to use existing OpenAI client libraries with minimal changes.

Python setup

from openai import OpenAI

client = OpenAI(
    base_url="http://thox.local:8080/v1",
    api_key="not-required"  # Set any value if auth is disabled
)

response = client.chat.completions.create(
    model="thox-coder",
    messages=[{"role": "user", "content": "Write a hello world"}]
)

print(response.choices[0].message.content)

Node.js setup

import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'http://thox.local:8080/v1',
  apiKey: 'not-required'
});

const response = await client.chat.completions.create({
  model: 'thox-coder',
  messages: [{ role: 'user', content: 'Write a hello world' }]
});

console.log(response.choices[0].message.content);

Supported features

Most OpenAI API features are supported, including chat completions, streaming, function calling, and JSON mode. Check the documentation for specific model capabilities.
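For example, streaming works through the SDK's usual stream=True flag. A minimal Python sketch, assuming the endpoint emits OpenAI-style chunks:

from openai import OpenAI

client = OpenAI(base_url="http://thox.local:8080/v1", api_key="not-required")

# Request a streamed response and print tokens as they arrive.
stream = client.chat.completions.create(
    model="thox-coder",
    messages=[{"role": "user", "content": "Write a hello world"}],
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta
    if delta.content:
        print(delta.content, end="", flush=True)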

Model Context Protocol (MCP)

What is MCP?

MCP (Model Context Protocol) is a standard for connecting AI models with development tools. Thox.ai supports MCP for seamless IDE integration.

Configuration

Add Thox.ai as an MCP server in your IDE or tool configuration:

MCP server config

{
  "mcpServers": {
    "thox": {
      "url": "http://thox.local:8080/mcp",
      "transport": "http"
    }
  }
}

Available tools

The Thox.ai MCP server exposes tools such as thox_complete (code completion), thox_explain (explain code), thox_refactor (suggest refactoring), and thox_test (generate tests).
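Your IDE's MCP client normally discovers and calls these tools for you, so the JSON config above is usually all you need. For illustration only, here is a rough sketch of listing the tools with a raw JSON-RPC request in Python; a real MCP client performs an initialize handshake first, and the exact transport details depend on the server:

# Rough sketch only: MCP messages are JSON-RPC 2.0. A real client performs an
# initialize handshake before calling tools/list; this skips it for brevity.
import requests

payload = {"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}
resp = requests.post(
    "http://thox.local:8080/mcp",
    json=payload,
    headers={"Accept": "application/json, text/event-stream"},
    timeout=30,
)
print(resp.text)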

More Resources