Synchronous Requests

Synchronous Requests API

While our Queue system is the more reliable and recommended way to submit requests, we also support synchronous request endpoints.

Synchronous endpoints are useful when you know the request is quick and you want minimal latency. The drawbacks are:

  • You need to keep the connection open until the result is received (see the timeout sketch after this list)

  • The request cannot be interrupted

  • If the connection is interrupted, there is no way to obtain the result

  • You will be charged for the full request whether or not you were able to receive the result
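Because the connection must stay open until the result arrives and cannot be resumed if it drops, make sure your HTTP client's timeout exceeds the longest generation time you expect. Below is a minimal sketch using curl's --max-time flag; the 300-second value is only an illustration, not a documented limit:

curl https://llm.onerouter.pro/v1/chat/completions \
  --max-time 300 \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $API_KEY" \
  -d '{"model": "claude-3-5-sonnet@20240620", "messages": [{"role": "user", "content": "Say hello"}]}'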

The endpoint format and parameters are similar to the Queue ones:

Endpoint                                         Method   Description
https://llm.onerouter.pro/v1/chat/completions    POST     LLM Model endpoint
                                                 POST     Image Generative Model endpoint
                                                 POST     Video Generative Model endpoint

Submit a request

Here is an example of using the curl command to submit a synchronous request:

curl https://llm.onerouter.pro/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $API_KEY" \
  -d '{
  "model": "claude-3-5-sonnet@20240620",
  "messages": [
    {
      "role": "user",
      "content": "What is the meaning of life?"
    }
  ]
}'

The response will come directly from the model:
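As a rough illustration, assuming the endpoint follows the usual OpenAI-compatible chat completions shape, the response body might look like the following; the field values are placeholders, not actual output:

{
  "id": "chatcmpl-...",
  "object": "chat.completion",
  "model": "claude-3-5-sonnet@20240620",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "The meaning of life is a question philosophers have debated for centuries..."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 14,
    "completion_tokens": 42,
    "total_tokens": 56
  }
}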
