A practical step-by-step tutorial showing developers how to enhance their AI applications with private knowledge and real-time data using Context Kitten's OpenAI-compatible API, complete with Python and TypeScript code examples.
Building Your First Context-Aware AI App with Context Kitten: A Technical Tutorial

Last week, a developer reached out with a common challenge: their AI application was giving outdated responses about their product documentation. Despite having a great chatbot interface built with OpenAI's API, the responses weren't incorporating their latest product updates. Their frustration was palpable - they needed their AI to understand both their private documentation and current market data.

This is a challenge we've seen repeatedly. Organizations invest significant resources in building AI applications, only to find their AI responses lacking critical context. Today, we'll walk through how to solve this using Context Kitten's API, which seamlessly combines your private data with real-time internet information.

The Challenge of Context-Aware AI

Traditional AI implementations face several limitations:
- Limited access to private organizational knowledge
- Outdated information due to training cutoff dates
- Complex integration requirements across multiple data sources
- High costs of retraining models with new data
Let's transform a basic OpenAI-powered chatbot into a context-aware AI application. We'll use Python for the backend and TypeScript for the frontend.
First, replace your OpenAI client configuration with Context Kitten. Your existing setup probably looks something like this:

from openai import OpenAI

client = OpenAI(api_key="your-api-key")

Point the same client at Context Kitten by swapping in your Context Kitten key and base URL:

from openai import OpenAI

client = OpenAI(
    api_key="your-context-kitten-key",
    base_url="https://api.contextkitten.com/v1"
)
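Because the API is OpenAI-compatible, your existing completion calls keep working unchanged. Here's a minimal smoke test; the model name below is a placeholder of ours, not a documented Context Kitten model, so substitute one returned by the models endpoint covered later in this post:

# Quick check that the reconfigured client still answers.
# "ck-default" is a placeholder model name - pick one from /v1/models.
response = client.chat.completions.create(
    model="ck-default",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)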
Now, let's add document management to incorporate your private knowledge:
import httpx

async def upload_document(file_path: str):
    # Plain HTTP multipart upload (via httpx) to Context Kitten's document store;
    # the bearer-token header is our assumption about how this endpoint
    # authenticates, so check the API docs for the exact scheme.
    async with httpx.AsyncClient() as http:
        with open(file_path, 'rb') as f:
            response = await http.post(
                'https://api.contextkitten.com/v1/documents/upload',
                headers={'Authorization': 'Bearer your-context-kitten-key'},
                files={'file': f}
            )
    response.raise_for_status()
    return response.json()
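With the helper in place, seeding your knowledge base is a short loop. Here's a sketch assuming your documentation lives in a local ./docs folder of Markdown files (the folder name is just an example):

import asyncio
from pathlib import Path

async def upload_docs_folder(folder: str = "./docs"):
    # Upload every Markdown file found in the folder.
    for path in Path(folder).glob("*.md"):
        result = await upload_document(str(path))
        print(f"Uploaded {path.name}: {result}")

asyncio.run(upload_docs_folder())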
Here's how to upgrade your chat interface to include context from both your documents and the internet:
from typing import Dict, List

from fastapi import FastAPI, WebSocket
from openai import AsyncOpenAI
from pydantic import BaseModel

class Message(BaseModel):
    role: str
    content: str

class ChatService:
    def __init__(self, api_key: str):
        # AsyncOpenAI is used because send_message awaits the completion call.
        self.client = AsyncOpenAI(
            api_key=api_key,
            base_url="https://api.contextkitten.com/v1"
        )
        self.messages: List[Message] = []

    async def send_message(self, content: str) -> Dict:
        self.messages.append(Message(role="user", content=content))
        response = await self.client.chat.completions.create(
            model="ck-default",  # placeholder model name; pick one from /v1/models
            messages=[m.model_dump() for m in self.messages],
            # The search options are Context Kitten extensions rather than part of
            # the OpenAI schema, so they go through the SDK's extra_body pass-through.
            extra_body={
                "search_options": {
                    "enable_document_search": True,
                    "enable_web_search": True
                }
            }
        )
        assistant_message = {
            "role": "assistant",
            "content": response.choices[0].message.content
        }
        self.messages.append(Message(**assistant_message))
        return assistant_message
# FastAPI implementation
app = FastAPI()
chat_service = ChatService(api_key="your-context-kitten-key")

@app.websocket("/chat")
async def chat_endpoint(websocket: WebSocket):
    await websocket.accept()
    try:
        while True:
            content = await websocket.receive_text()
            response = await chat_service.send_message(content)
            await websocket.send_json(response)
    except Exception as e:
        print(f"Error in chat: {e}")
        await websocket.close()
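To exercise the endpoint, you can connect with any WebSocket client. Here's a small sketch assuming the app is running locally (for example via uvicorn main:app --port 8000) and that the websockets package is installed:

import asyncio
import websockets

async def try_chat():
    # Open a WebSocket to the local /chat endpoint and send one question.
    async with websockets.connect("ws://localhost:8000/chat") as ws:
        await ws.send("What changed in our latest product release?")
        print(await ws.recv())

asyncio.run(try_chat())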
Let's compare responses to the question "What are the current Context Kitten pricing tiers?":

A standard OpenAI-only setup answers from stale training data:

We offer Basic, Pro, and Enterprise tiers, but I don't have specific pricing information.

With document and web search enabled, Context Kitten answers from current sources:

We currently offer three tiers:
- Free Tier: 1GB storage and $1 in free API credits
- Scale ($30/month): 20GB storage with API credits at cost
- Enterprise: Custom solutions with flexible storage and advanced security features

Additional storage is available at $0.50/GB for the Scale tier.
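You can reproduce a comparison like this yourself by toggling the search options on an otherwise identical request. A rough sketch using the client configured earlier (again with our placeholder model name); turning the options off approximates the context-free behaviour, though the exact fallback depends on the model:

question = "What are the current Context Kitten pricing tiers?"

for search_enabled in (False, True):
    # Same request twice: once without and once with Context Kitten's search.
    response = client.chat.completions.create(
        model="ck-default",
        messages=[{"role": "user", "content": question}],
        extra_body={
            "search_options": {
                "enable_document_search": search_enabled,
                "enable_web_search": search_enabled
            }
        }
    )
    print(f"search={search_enabled}: {response.choices[0].message.content}")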
You can also query the search endpoint directly when you want raw results rather than a chat completion:

async def search_documents(query: str):
    # Combined search over your uploaded documents and the web. As with the
    # upload helper, the bearer-token header is our assumption about the
    # authentication scheme.
    async with httpx.AsyncClient() as http:
        response = await http.get(
            'https://api.contextkitten.com/v1/search',
            headers={'Authorization': 'Bearer your-context-kitten-key'},
            params={
                'query': query,
                'enable_document_search': True,
                'enable_web_search': True
            }
        )
    response.raise_for_status()
    return response.json()
And to list the models available through the API:

async def list_available_models():
    # The OpenAI-compatible /v1/models endpoint; with the OpenAI SDK,
    # client.models.list() should return the same listing.
    async with httpx.AsyncClient() as http:
        response = await http.get(
            'https://api.contextkitten.com/v1/models',
            headers={'Authorization': 'Bearer your-context-kitten-key'}
        )
    response.raise_for_status()
    return response.json()
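A quick sketch tying these helpers together (the printed fields depend on response shapes not shown in this post):

import asyncio

async def explore():
    models = await list_available_models()
    print("Available models:", models)

    results = await search_documents("latest product release notes")
    print("Search results:", results)

asyncio.run(explore())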
Building context-aware AI applications doesn't have to be complex. With Context Kitten, you can quickly upgrade your existing AI implementation to include both private knowledge and real-time information. The API's compatibility with OpenAI's interface makes migration straightforward, while adding powerful new capabilities to your applications. Want to see more? Check out our API documentation or join our growing developer community.