Endpoints
Chat, Responses and Messages
The public API covers the core LLM workflows that existing integrations already rely on:
- POST /v1/chat/completions
- POST /v1/responses
- POST /v1/messages
- POST /v1/embeddings
2342.ai provides its own platform endpoints. API compatibility is a migration feature: existing tools can connect with minimal changes, but requests still pass through 2342.ai policies, routing, billing, rate limits, logs and compliance rules.
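In practice, "minimal changes" usually means repointing an existing integration's base URL at the gateway while keeping the request path. A minimal sketch, assuming a hypothetical gateway host:

```python
from urllib.parse import urlsplit, urlunsplit

GATEWAY_HOST = "api.2342.ai"  # assumed host for illustration

def to_gateway(url: str) -> str:
    """Repoint an existing provider URL at the gateway, keeping path and query."""
    parts = urlsplit(url)
    return urlunsplit(("https", GATEWAY_HOST, parts.path, parts.query, ""))

print(to_gateway("https://api.openai.com/v1/chat/completions"))
```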
Not a raw pass-through
2342.ai does not resell direct provider accounts. Customers use 2342.ai endpoints. Depending on workspace policy, model configurations are selected, budgets are checked, usage is metered and provider-specific compliance rules are applied.
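The policy steps can be sketched as a small pipeline. Everything here (field names, the credit cost, the audit log shape) is a simplified assumption, not the actual gateway implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Workspace:
    approved_models: dict        # task -> approved model configuration
    budget_credits: float
    used_credits: float = 0.0
    log: list = field(default_factory=list)

def handle_request(ws: Workspace, task: str, cost: float) -> str:
    """Sketch of the gateway steps: select config, check budget, meter, log."""
    model = ws.approved_models.get(task)
    if model is None:
        raise PermissionError(f"no approved model for task {task!r}")
    if ws.used_credits + cost > ws.budget_credits:
        raise RuntimeError("budget exceeded")
    ws.used_credits += cost             # metering
    ws.log.append((task, model, cost))  # audit log
    return model

ws = Workspace(approved_models={"summarize": "model-a"}, budget_credits=10.0)
print(handle_request(ws, "summarize", cost=2.5))
```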
Governance
Teams can define which models and provider configurations are approved for which tasks.
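Such an approval matrix boils down to an allow-list lookup per task. Task names and model configuration IDs below are illustrative assumptions:

```python
# Hypothetical approval matrix: task -> allowed provider/model configurations.
APPROVED = {
    "support-chat": {"provider-a/model-small", "provider-b/model-medium"},
    "code-review": {"provider-a/model-large"},
}

def is_approved(task: str, config: str) -> bool:
    """True if the provider/model configuration is approved for the task."""
    return config in APPROVED.get(task, set())
```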
Billing
2342.ai usage credits cover routed model execution, gateway processing, policy enforcement, metering and provider costs.
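One way to picture bundled credits: provider token cost plus a gateway share for routing, policy enforcement and metering. The rates and markup below are illustrative assumptions, not published 2342.ai pricing.

```python
def credits_for_request(prompt_tokens: int, completion_tokens: int,
                        provider_rate: float, gateway_markup: float = 0.10) -> float:
    """Illustrative arithmetic only: provider cost per 1k tokens plus a gateway share."""
    provider_cost = (prompt_tokens + completion_tokens) / 1000 * provider_rate
    return round(provider_cost * (1 + gateway_markup), 6)

print(credits_for_request(800, 200, provider_rate=0.5))
```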
Rollout
- Define which teams, data classes, budgets and models are approved.
- Connect existing integrations to 2342.ai with minimal changes.
- Keep cost, model routing and data paths visible over time.
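The first rollout step can be captured in a workspace config and checked before routing. Team names, data classes and model IDs are illustrative assumptions:

```python
# Hypothetical rollout config: approved data classes, budget and models per team.
ROLLOUT = {
    "support": {
        "data_classes": {"public"},
        "budget_credits": 500,
        "models": {"provider-a/model-small"},
    },
    "legal": {
        "data_classes": {"public", "confidential"},
        "budget_credits": 200,
        "models": {"provider-b/model-eu"},
    },
}

def may_use(team: str, data_class: str, model: str) -> bool:
    """Check a request against the rollout config before routing it."""
    entry = ROLLOUT.get(team)
    return (entry is not None
            and data_class in entry["data_classes"]
            and model in entry["models"])
```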
FAQ

Do customers receive direct provider accounts?
No. Customers receive access to the 2342.ai platform. Third-party models may be used as technical infrastructure components and subprocessors for selected platform functions.
Does GDPR compliance mean blocking external models entirely?
No. The right approach is an approved GDPR mode: admins decide which provider and model configurations are allowed for sensitive data.
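Such an approved GDPR mode reduces to routing by data class. The data classes and model configuration IDs below are illustrative assumptions:

```python
# Hypothetical GDPR allow-list: data class -> approved provider/model configurations.
GDPR_APPROVED = {
    "personal_data": {"provider-b/model-eu"},  # e.g. an EU-hosted configuration
    "public": {"provider-a/model-small", "provider-b/model-eu"},
}

def route(data_class: str, requested: str) -> str:
    """Allow the requested configuration only if approved for the data class."""
    if requested in GDPR_APPROVED.get(data_class, set()):
        return requested
    raise PermissionError(f"{requested!r} not approved for {data_class!r} data")
```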
Can existing OpenAI-style or Anthropic-style integrations connect?
Yes, with minimal changes. The endpoints remain 2342.ai platform endpoints with policies, routing, billing, rate limits and logs.