---
title: "What is Portkey?"
description: "Teams use Portkey to improve the cost, performance, and accuracy of their Gen AI apps."
---

Portkey takes less than 2 minutes to integrate. From the first request onward, it monitors all of your LLM calls and makes your app more resilient, secure, performant, and accurate.

Here's a product walkthrough (3 mins):

<iframe width="100%" height="400px" src="https://www.youtube.com/embed/9aO340Hew2I?si=K988Sxs_A1qJg2ag" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>

## Integrate in 3 Lines of Code

```python
# pip install portkey-ai

from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
    api_key="OPENAI_API_KEY",  # your OpenAI key (required by the OpenAI SDK, here or via the OPENAI_API_KEY env var)
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(provider="openai", api_key="PORTKEY_API_KEY")
)

chat_complete = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Say this is a test"}],
)

print(chat_complete.choices[0].message.content)
```

```js
// npm i portkey-ai

import OpenAI from 'openai';
import { PORTKEY_GATEWAY_URL, createHeaders } from 'portkey-ai';

const openai = new OpenAI({
  apiKey: 'OPENAI_API_KEY', // your OpenAI key (required by the OpenAI SDK, here or via the OPENAI_API_KEY env var)
  baseURL: PORTKEY_GATEWAY_URL,
  defaultHeaders: createHeaders({ provider: "openai", apiKey: "PORTKEY_API_KEY" })
});

async function main() {
  const chatCompletion = await openai.chat.completions.create({
    messages: [{ role: 'user', content: 'Say this is a test' }],
    model: 'gpt-3.5-turbo',
  });

  console.log(chatCompletion.choices);
}

main();
```

Setting up Portkey takes less than 2 minutes. Find the integration that fits your stack: Portkey works with 200+ models across LLM providers and multiple frameworks. Jump to the product section to learn more about the Portkey modules and the use cases they solve, or head to the API reference and code samples for all Portkey functionality available through REST APIs and SDKs. While you're here, why not [give us a star](https://git.new/ai-gateway-docs)? It helps us a lot!
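
Because every request goes through the same OpenAI-compatible gateway, switching providers typically means changing only the headers, not the calling code. Below is a minimal sketch of routing the same call to Anthropic; the model id and key placeholders are illustrative, not taken from this page.

```python
# A minimal sketch: the same OpenAI-style call, routed to Anthropic via the Portkey gateway.
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
    api_key="ANTHROPIC_API_KEY",  # the provider key travels in the usual Authorization header
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(provider="anthropic", api_key="PORTKEY_API_KEY"),
)

response = client.chat.completions.create(
    model="claude-3-opus-20240229",  # example Anthropic model id
    messages=[{"role": "user", "content": "Say this is a test"}],
)
print(response.choices[0].message.content)
```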

## Languages Supported

| Language   | Supported Library                 |
| ---------- | --------------------------------- |
| Javascript | portkey-node-sdk, openai-node     |
| Python     | portkey-python-sdk, openai-python |
| Go         | go-openai                         |
| Java       | openai-java                       |
| Rust       | async-openai                      |
| Ruby       | ruby-openai                       |

## AI Providers Supported

Portkey is multimodal by default - along with chat and text models, we also support audio, vision, and image generation models.
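
For example, an image generation request can be sent through the same gateway client shown above. This is a minimal sketch assuming an OpenAI image model; the model name and prompt are illustrative.

```python
# A minimal sketch: image generation through the same Portkey gateway client.
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = OpenAI(
    api_key="OPENAI_API_KEY",
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(provider="openai", api_key="PORTKEY_API_KEY"),
)

# Same client, different modality: the gateway logs and routes this like any chat request.
image = client.images.generate(
    model="dall-e-3",              # example image model
    prompt="A lighthouse at dusk",
    n=1,
    size="1024x1024",
)
print(image.data[0].url)
```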

| AI Provider       | Status                      |
| ----------------- | --------------------------- |
| OpenAI            | fully supported, public     |
| Anthropic         | fully supported, public     |
| Azure OpenAI      | fully supported, public     |
| Cohere            | fully supported, public     |
| Anyscale          | fully supported, public     |
| Google Palm       | fully supported, public     |
| Google Gemini     | fully supported, public     |
| Together AI       | fully supported, public     |
| Perplexity        | fully supported, public     |
| Mistral           | fully supported, public     |
| Stability         | fully supported, public     |
| Nomic             | fully supported, public     |
| AI21              | fully supported, public     |
| AWS Bedrock       | fully supported, public     |
| Ollama            | fully supported, public     |
| AzureML           | partially supported         |
| BYOLLM            | fully supported, public     |
| Jina AI           | fully supported, public     |
| Fireworks AI      | fully supported, public     |
| LocalAI           | partially supported, public |
| Predibase         | fully supported, public     |
| ZhipuAI (ChatGLM) | fully supported, public     |
| Deepinfra         | fully supported, public     |
View all the supported integration guides.

## Frameworks Supported

| Framework  | Status                     |
| ---------- | -------------------------- |
| Langchain  | native, python, typescript |
| Llamaindex | native, python, typescript |
| Autogen    | native, python             |
| Vercel     | native, typescript         |
| Instructor | native, python, typescript |
| Promptfoo  | native, typescript         |
| CrewAI     | native, python             |
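
As a sketch of what a framework integration can look like, the snippet below points LangChain's `ChatOpenAI` at the Portkey gateway. It assumes the `langchain-openai` package; the parameter names come from that library's OpenAI-compatible client options and are not taken from this page.

```python
# A hedged sketch: LangChain's ChatOpenAI routed through the Portkey gateway.
from langchain_openai import ChatOpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

llm = ChatOpenAI(
    api_key="OPENAI_API_KEY",
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=createHeaders(provider="openai", api_key="PORTKEY_API_KEY"),
    model="gpt-4",
)

# Every LangChain call now flows through Portkey and shows up in its logs.
print(llm.invoke("Say this is a test").content)
```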