
# Cookbook — LangChain

LangChain works as a drop-in with siati.ai: only the `base_url` and the model name change; everything else is standard LangChain code.

```python
from langchain_openai import ChatOpenAI
# Message classes live in langchain_core (langchain.schema is deprecated)
from langchain_core.messages import HumanMessage, SystemMessage

llm = ChatOpenAI(
    base_url="https://api.siati.ai/v1",
    api_key="siati_...",           # your siati.ai API key
    model="siati/llama-3.1-405b",
    temperature=0.7,
)

resp = llm.invoke([
    SystemMessage(content="Answer in formal English."),
    HumanMessage(content="What is the nFADP in 2 lines?"),
])
print(resp.content)
```
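Remote API calls can fail transiently (rate limits, network hiccups), so it is common to wrap `llm.invoke` in a retry with exponential backoff. A minimal, library-agnostic sketch (the helper name and defaults are illustrative, not part of the siati.ai or LangChain APIs):

```python
import time

def retry_with_backoff(fn, attempts=4, base_delay=1.0, retry_on=(Exception,)):
    """Call fn(); on failure wait base_delay * 2**i seconds and retry.

    Re-raises the last exception once all attempts are exhausted.
    """
    for i in range(attempts):
        try:
            return fn()
        except retry_on:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** i)
```

Usage with the chat model above: `resp = retry_with_backoff(lambda: llm.invoke(messages))`. In production you may prefer a dedicated library such as `tenacity`, or the client's built-in `max_retries` setting.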

Embeddings:

```python
from langchain_openai import OpenAIEmbeddings

emb = OpenAIEmbeddings(
    base_url="https://api.siati.ai/v1",
    api_key="siati_...",
    model="siati/bge-m3",
    # Skip OpenAI-specific tiktoken preprocessing, which can break
    # non-OpenAI embedding models served behind a compatible API.
    check_embedding_ctx_length=False,
)
vecs = emb.embed_documents(["Sentence A", "Sentence B"])
```
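Embedding vectors like these are typically compared with cosine similarity. A minimal pure-Python helper for small experiments (in practice you would use numpy or a vector store, which do this in bulk):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)
```

For example, `cosine_similarity(vecs[0], vecs[1])` scores how semantically close "Sentence A" and "Sentence B" are; values near 1 indicate high similarity.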

See also the RAG cookbook for a full FAISS + retrieval chain example.