🗃️ OpenAI
3 items
📄️ OpenAI (Text Completion)
haimaker supports OpenAI text completion models
📄️ OpenAI-Compatible Endpoints
Selecting openai as the provider routes your request to an OpenAI-compatible endpoint.
🗃️ Azure OpenAI
3 items
🗃️ Azure AI
2 items
🗃️ Vertex AI
3 items
🗃️ Google AI Studio
4 items
📄️ Anthropic
haimaker supports all Anthropic models.
📄️ AWS SageMaker
haimaker supports all SageMaker Hugging Face JumpStart models
🗃️ Bedrock
3 items
📄️ haimaker Proxy (LLM Gateway)
| Property | Details |
📄️ Meta Llama
| Property | Details |
📄️ Mistral AI API
https://docs.mistral.ai/api/
📄️ Codestral API [Mistral AI]
Codestral is available in select code-completion plugins but can also be queried directly. See the documentation for more details.
📄️ Cohere
API KEYS
📄️ Anyscale
https://app.endpoints.anyscale.com/
🗃️ HuggingFace
2 items
📄️ Hyperbolic
Overview
📄️ Databricks
haimaker supports all models on Databricks
📄️ Deepgram
haimaker supports Deepgram's /listen endpoint.
📄️ IBM watsonx.ai
haimaker supports all IBM watsonx.ai foundational models and embeddings.
📄️ Predibase
haimaker supports all models on Predibase
📄️ Nvidia NIM
https://docs.api.nvidia.com/nim/reference/
📄️ Nscale (EU Sovereign)
https://docs.nscale.com/docs/inference/chat
📄️ xAI
https://docs.x.ai/docs
📄️ Moonshot AI
Overview
📄️ LM Studio
https://lmstudio.ai/docs/basics/server
📄️ Cerebras
https://inference-docs.cerebras.ai/api-reference/chat-completions
📄️ Volcano Engine (Volcengine)
https://www.volcengine.com/docs/82379/1263482
📄️ Triton Inference Server
haimaker supports embedding models on Triton Inference Server
📄️ Ollama
haimaker supports all models from Ollama
📄️ Perplexity AI (pplx-api)
https://www.perplexity.ai
📄️ FriendliAI
We support all FriendliAI models; just set friendliai/ as a prefix when sending completion requests
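Several providers in this list are selected by prefixing the model name (for example friendliai/ or fireworks_ai/). A minimal sketch of how such prefix routing could work is below; the function name and the provider set are illustrative assumptions, not haimaker's actual internals:

```python
def split_provider_prefix(model: str, default_provider: str = "openai"):
    """Split a prefixed model string like 'friendliai/meta-llama-3-8b'
    into (provider, model_name).

    Model strings without a recognized provider prefix fall back to the
    default provider. The set of providers here is a small example, not
    an exhaustive list.
    """
    known_providers = {"friendliai", "fireworks_ai", "openai", "anthropic", "ollama"}
    prefix, sep, rest = model.partition("/")
    if sep and prefix in known_providers:
        return prefix, rest
    return default_provider, model


# Prefixed model strings route to the named provider; bare names fall back.
print(split_provider_prefix("friendliai/meta-llama-3-8b-instruct"))
print(split_provider_prefix("gpt-4o"))
```

Running this prints ('friendliai', 'meta-llama-3-8b-instruct') for the prefixed string and ('openai', 'gpt-4o') for the bare one, showing how one string can carry both routing and model selection.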
📄️ Galadriel
https://docs.galadriel.com/api-reference/chat-completion-API
📄️ Topaz
| Property | Details |
📄️ Groq
https://groq.com/
📄️ DeepSeek
https://deepseek.com/
📄️ ElevenLabs
ElevenLabs provides high-quality AI voice technology, including speech-to-text capabilities through their transcription API.
📄️ Fireworks AI
We support all Fireworks AI models; just set fireworks_ai/ as a prefix when sending completion requests
📄️ Clarifai
Anthropic, OpenAI, Mistral, Llama, and Gemini LLMs are supported on Clarifai.
📄️ vLLM
haimaker supports all models on vLLM.
📄️ Llamafile
haimaker supports all models on Llamafile.
📄️ Infinity
| Property | Details |
📄️ Xinference [Xorbits Inference]
https://inference.readthedocs.io/en/latest/index.html
📄️ AI/ML API
Getting started with the AI/ML API is simple. Follow these steps to set up your integration:
📄️ Cloudflare Workers AI
https://developers.cloudflare.com/workers-ai/models/text-generation/
📄️ DeepInfra
https://deepinfra.com/
📄️ 🆕 GitHub
https://github.com/marketplace/models
📄️ GitHub Copilot
https://docs.github.com/en/copilot
📄️ AI21
haimaker supports the following AI21 models:
📄️ NLP Cloud
haimaker supports all LLMs on NLP Cloud.
📄️ Recraft
https://www.recraft.ai/
📄️ Replicate
haimaker supports all models on Replicate
📄️ Together AI
haimaker supports all models on Together AI.
📄️ v0
Overview
📄️ Morph
haimaker supports all models on Morph
📄️ Lambda AI
Overview
📄️ Novita AI
| Property | Details |
📄️ Voyage AI
https://docs.voyageai.com/embeddings/
📄️ Jina AI
https://jina.ai/embeddings/
📄️ Aleph Alpha
haimaker supports all models from Aleph Alpha.
📄️ Baseten
haimaker supports any Text Generation Inference (TGI) models on Baseten.
📄️ OpenRouter
haimaker supports all the text / chat / vision models from OpenRouter
📄️ SambaNova
https://cloud.sambanova.ai/
📄️ Custom API Server (Custom Format)
Call your custom TorchServe / internal LLM APIs via haimaker
📄️ Petals
https://github.com/bigscience-workshop/petals
📄️ Snowflake
| Property | Details |
📄️ GradientAI
https://digitalocean.com/products/gradientai
📄️ Featherless AI
https://featherless.ai/
📄️ Nebius AI Studio
https://docs.nebius.com/studio/inference/quickstart
📄️ Dashscope
https://dashscope.console.aliyun.com/
📄️ Bytez
haimaker supports all chat models on Bytez!
📄️ Oracle Cloud Infrastructure (OCI)
haimaker supports the following models for the OCI on-demand GenAI API.