SDKs & Libraries
Use our official SDKs to integrate Fusion AI into your applications with just a few lines of code.
- Python SDK: pip install fusion-ai
- JavaScript SDK: npm install @fusion-ai/sdk
- Postman Collection: import and test instantly
- cURL Examples: command-line testing
Python SDK
Installation
pip install fusion-ai
Basic Usage
from fusion_ai import FusionAI

# Initialize the client
client = FusionAI(api_key="sk-fusion-your-key-here")

# Simple chat completion
response = client.chat(
    prompt="Explain machine learning in simple terms",
    provider="neuroswitch"  # Let NeuroSwitch choose the best model
)

print(response.text)
print(f"Used: {response.provider}/{response.model}")
print(f"Tokens: {response.tokens.total_tokens}")
Advanced Examples
Streaming Response
for chunk in client.chat_stream(
    prompt="Write a story about AI",
    provider="neuroswitch"
):
    print(chunk.text, end="", flush=True)
Specific Provider
response = client.chat( prompt="Code review this function", provider="claude", model="claude-3-sonnet" )
With BYOAPI Key
response = client.chat( prompt="Hello world", provider="openai", byoapi_key="your-openai-key" )
Error Handling
from fusion_ai import FusionAIError  # import path assumed; client setup as in Basic Usage

try:
    response = client.chat(prompt="Hello")
except FusionAIError as e:
    print(f"Error: {e.message}")
    print(f"Code: {e.status_code}")
JavaScript SDK
Installation
npm install @fusion-ai/sdk
Basic Usage
import FusionAI from '@fusion-ai/sdk';

// Initialize the client
const client = new FusionAI({ apiKey: 'sk-fusion-your-key-here' });

// Simple chat completion
const response = await client.chat({
  prompt: 'Explain machine learning in simple terms',
  provider: 'neuroswitch'
});

console.log(response.text);
console.log(`Used: ${response.provider}/${response.model}`);
console.log(`Tokens: ${response.tokens.total_tokens}`);
Advanced Examples
Streaming Response
for await (const chunk of client.chatStream({
  prompt: 'Write a story about AI',
  provider: 'neuroswitch'
})) {
  process.stdout.write(chunk.text);
}
React Hook
import { useFusionAI } from '@fusion-ai/react';

// Inside a React component
const { chat, loading, error } = useFusionAI();

const handleSubmit = async () => {
  const response = await chat({
    prompt: userInput,
    provider: 'neuroswitch'
  });
};
Express.js Route
app.post('/api/chat', async (req, res) => {
  try {
    const response = await client.chat({
      prompt: req.body.prompt,
      provider: 'neuroswitch'
    });
    res.json(response);
  } catch (error) {
    res.status(500).json({ error: error.message });
  }
});
TypeScript Types
interface ChatRequest {
  prompt: string;
  provider?: Provider;
  model?: string;
  temperature?: number;
  max_tokens?: number;
}
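The request type above has a natural counterpart on the response side. The examples earlier on this page read text, provider, model, and tokens.total_tokens off the response object; a sketch of that shape is below. The interface names and any fields beyond those four are assumptions, so check the type definitions shipped with @fusion-ai/sdk for the authoritative types.

// Hypothetical response types, inferred from the fields used in the examples above.
// Names (TokenUsage, ChatResponse) are illustrative; the SDK's exported names may differ.
interface TokenUsage {
  total_tokens: number;
}

interface ChatResponse {
  text: string;
  provider: string;
  model: string;
  tokens: TokenUsage;
}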
Postman Collection
Import our pre-configured Postman collection to start testing the Fusion AI API immediately.
What's Included
- Chat completion examples
- All provider options (NeuroSwitch, OpenAI, Claude, Gemini)
- Streaming requests
- BYOAPI examples
- Error handling scenarios
- Environment variables setup
cURL Examples
Basic Chat Request
curl -X POST https://api.mcp4.ai/chat \
  -H "Authorization: Bearer sk-fusion-your-key-here" \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "Hello, how are you?",
    "provider": "neuroswitch"
  }'
Streaming Request
curl -X POST https://api.mcp4.ai/chat \
  -H "Authorization: Bearer sk-fusion-..." \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "Write a story",
    "provider": "neuroswitch",
    "stream": true
  }'
Specific Provider
curl -X POST https://api.mcp4.ai/chat \
  -H "Authorization: Bearer sk-fusion-..." \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "Explain quantum physics",
    "provider": "claude",
    "model": "claude-3-sonnet"
  }'
With Parameters
curl -X POST https://api.mcp4.ai/chat \
  -H "Authorization: Bearer sk-fusion-..." \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "Be creative",
    "provider": "neuroswitch",
    "temperature": 0.9,
    "max_tokens": 500
  }'
Check Credits
curl -H "Authorization: Bearer sk-fusion-..." \ https://api.mcp4.ai/credits