# Web Search
Use web search with haimaker to get real-time information from the internet.
| Feature | Details |
|---|---|
| Supported Endpoints | /chat/completions |
| Supported Providers | openai, xai, vertex_ai, anthropic, gemini, perplexity |
## Which Search Engine is Used?
Each provider uses its own search backend:
| Provider | Search Engine | Notes |
|---|---|---|
| OpenAI (gpt-4o-search-preview) | OpenAI's internal search | Real-time web data |
| xAI (grok-3) | xAI's search + X/Twitter | Real-time social media data |
| Google AI/Vertex (gemini-2.0-flash) | Google Search | Uses actual Google search results |
| Anthropic (claude-3-5-sonnet) | Anthropic's web search | Real-time web data |
| Perplexity | Perplexity's search engine | AI-powered search and reasoning |
> **Info:** Anthropic web search models. Claude models that support web search: claude-3-5-sonnet-latest, claude-3-5-sonnet-20241022, claude-3-5-haiku-latest, claude-3-5-haiku-20241022, claude-3-7-sonnet-20250219.
## Quick Start
### Python
```python
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api.haimaker.ai/v1"
)

response = client.chat.completions.create(
    model="openai/gpt-4o-search-preview",
    messages=[
        {
            "role": "user",
            "content": "What was a positive news story from today?"
        }
    ],
    extra_body={
        "web_search_options": {
            "search_context_size": "medium"  # Options: "low", "medium", "high"
        }
    }
)

print(response.choices[0].message.content)
```
### cURL
```bash
curl https://api.haimaker.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "model": "openai/gpt-4o-search-preview",
    "messages": [
      {
        "role": "user",
        "content": "What was a positive news story from today?"
      }
    ],
    "web_search_options": {
      "search_context_size": "medium"
    }
  }'
```
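Some providers also attach the web sources they consulted to the assistant message. Continuing from the response object in the Python example above, the sketch below prints OpenAI-style url_citation annotations; it assumes those annotations are passed through unchanged, and other providers may return citations in a different shape or not at all.

```python
# Sketch: list any URL citations attached to the reply.
# Assumes OpenAI-style annotations of type "url_citation"; other providers
# may omit annotations entirely, in which case nothing is printed.
message = response.choices[0].message
for annotation in getattr(message, "annotations", None) or []:
    if getattr(annotation, "type", "") == "url_citation":
        citation = annotation.url_citation
        print(f"- {citation.title}: {citation.url}")
```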
## Using Different Providers
### xAI Grok
```python
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api.haimaker.ai/v1"
)

response = client.chat.completions.create(
    model="xai/grok-3",
    messages=[
        {
            "role": "user",
            "content": "What are people saying about AI on Twitter today?"
        }
    ],
    extra_body={
        "web_search_options": {
            "search_context_size": "high"
        }
    }
)

print(response.choices[0].message.content)
```
### Anthropic Claude
```python
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api.haimaker.ai/v1"
)

response = client.chat.completions.create(
    model="anthropic/claude-3-5-sonnet-latest",
    messages=[
        {
            "role": "user",
            "content": "What are the latest developments in AI?"
        }
    ],
    extra_body={
        "web_search_options": {
            "search_context_size": "medium",
            "user_location": {
                "type": "approximate",
                "approximate": {
                    "city": "San Francisco"
                }
            }
        }
    }
)

print(response.choices[0].message.content)
```
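The optional user_location block nudges search results toward a place. The sketch below fills in the full set of fields from OpenAI's web_search_options schema (city, region, two-letter country code, and IANA timezone); how much of it each provider honors varies, so treat it as a template rather than a guarantee.

```python
# Fuller user_location example (OpenAI-style schema; provider support may vary).
web_search_options = {
    "search_context_size": "medium",
    "user_location": {
        "type": "approximate",
        "approximate": {
            "city": "San Francisco",            # free-text city name
            "region": "California",             # state or region
            "country": "US",                    # two-letter ISO country code
            "timezone": "America/Los_Angeles",  # IANA timezone
        },
    },
}

# Pass it through extra_body as in the examples above:
# extra_body={"web_search_options": web_search_options}
```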
### Google Gemini
```python
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api.haimaker.ai/v1"
)

response = client.chat.completions.create(
    model="gemini/gemini-2.0-flash",
    messages=[
        {
            "role": "user",
            "content": "What are the top news stories today?"
        }
    ],
    extra_body={
        "web_search_options": {
            "search_context_size": "low"
        }
    }
)

print(response.choices[0].message.content)
```
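### Perplexity

Perplexity models are search-backed by default, so the request shape is the same as for the other providers. The model name below (perplexity/sonar) is an assumption for illustration; check the model list exposed by your deployment for the exact identifier.

```python
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api.haimaker.ai/v1"
)

# "perplexity/sonar" is an illustrative model name; confirm the identifier
# available on your account before using it.
response = client.chat.completions.create(
    model="perplexity/sonar",
    messages=[
        {
            "role": "user",
            "content": "Summarize today's most important AI news."
        }
    ],
    extra_body={
        "web_search_options": {
            "search_context_size": "medium"
        }
    }
)

print(response.choices[0].message.content)
```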
## Search Context Size
The search_context_size parameter controls how much search context is provided to the model:
| Value | Description |
|---|---|
"low" | Minimal search context, faster responses |
"medium" | Balanced search context (default) |
"high" | Maximum search context, more comprehensive |
```python
response = client.chat.completions.create(
    model="openai/gpt-4o-search-preview",
    messages=[{"role": "user", "content": "What's happening in tech today?"}],
    extra_body={
        "web_search_options": {
            "search_context_size": "high"
        }
    }
)
```