Frameworks
Using OneRouter with Frameworks
Using the OpenAI SDK
Using Python
pip install openai
Using TypeScript
npm i openai
from openai import OpenAI

# Point the OpenAI SDK at the OneRouter API
client = OpenAI(
    base_url="https://app.onerouter.pro/v1",
    api_key="<API_KEY>",
)

completion = client.chat.completions.create(
    model="{{MODEL}}",
    messages=[
        {
            "role": "user",
            "content": "What is the meaning of life?"
        }
    ]
)

print(completion.choices[0].message.content)
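The same client can also stream tokens as they are generated. This is a minimal sketch, assuming the OneRouter endpoint passes the OpenAI stream parameter through; the model name and API key are placeholders as above:

from openai import OpenAI

client = OpenAI(
    base_url="https://app.onerouter.pro/v1",
    api_key="<API_KEY>",
)

# Request a streamed response and print each token as it arrives
stream = client.chat.completions.create(
    model="{{MODEL}}",
    messages=[{"role": "user", "content": "What is the meaning of life?"}],
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)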
Using LangChain
Using LangChain for Python
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain
from os import getenv
from dotenv import load_dotenv

load_dotenv()

template = """Question: {question}
Answer: Let's think step by step."""

prompt = PromptTemplate(template=template, input_variables=["question"])

# Point LangChain's OpenAI chat model at the OneRouter API
llm = ChatOpenAI(
    openai_api_key=getenv("API_KEY"),
    openai_api_base="https://app.onerouter.pro/v1",
    model_name="<model_name>",
)

llm_chain = LLMChain(prompt=prompt, llm=llm)

question = "What NFL team won the Super Bowl in the year Justin Bieber was born?"

print(llm_chain.run(question))
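Using Streamlit
Because OneRouter exposes an OpenAI-compatible endpoint, the same client also drops into a Streamlit chat app. The snippet below is a minimal sketch rather than a full app; the model name and API key are placeholders, and the st.chat_input / st.chat_message helpers require a recent Streamlit release.

import streamlit as st
from openai import OpenAI

# Reuse the OpenAI SDK against the OneRouter endpoint
client = OpenAI(
    base_url="https://app.onerouter.pro/v1",
    api_key="<API_KEY>",
)

st.title("OneRouter chat")

prompt = st.chat_input("Ask something")
if prompt:
    st.chat_message("user").write(prompt)
    completion = client.chat.completions.create(
        model="{{MODEL}}",
        messages=[{"role": "user", "content": prompt}],
    )
    st.chat_message("assistant").write(completion.choices[0].message.content)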
Using PydanticAI
PydanticAI provides a high-level interface for working with various LLM providers, including OneRouter.
Installation
pip install 'pydantic-ai-slim[openai]'
Configuration
You can use OneRouter with PydanticAI through its OpenAI-compatible interface:
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel

model = OpenAIModel(
    "claude-3-5-sonnet@20240620",  # or any other OneRouter model
    base_url="https://app.onerouter.pro/v1",
    api_key="API_KEY",
)
agent = Agent(model)

# run_sync wraps the agent's async run() so it can be called outside an event loop
result = agent.run_sync("What is the meaning of life?")
print(result)
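Beyond plain text, PydanticAI can validate the model's answer against a Pydantic schema. The sketch below continues from the model defined above and assumes a pydantic-ai release where Agent accepts result_type and results expose .data (newer releases rename these to output_type and .output):

from pydantic import BaseModel

class CityLocation(BaseModel):
    city: str
    country: str

# The response is validated against CityLocation before being returned
structured_agent = Agent(model, result_type=CityLocation)
result = structured_agent.run_sync("Where were the 2012 Olympics held?")
print(result.data)  # e.g. CityLocation(city='London', country='United Kingdom')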