Poe

Overview

| Property | Details |
|----------|---------|
| Description | Poe is Quora's AI platform that provides access to more than 100 models across text, image, video, and voice modalities through a developer-friendly API. |
| Provider Route on LiteLLM | `poe/` |
| Link to Provider Doc | Poe Website ↗ |
| Base URL | `https://api.poe.com/v1` |
| Supported Operations | `/chat/completions` |

What is Poe?

Poe is Quora's comprehensive AI platform that offers:

  • 100+ Models: Access to a wide variety of AI models
  • Multiple Modalities: Text, image, video, and voice AI
  • Popular Models: Including OpenAI's GPT series and Anthropic's Claude
  • Developer API: Easy integration for applications
  • Extensive Reach: Benefits from Quora's 400M monthly unique visitors

Required Variables​

Environment Variables
```python
os.environ["POE_API_KEY"] = ""  # your Poe API key
```

Get your Poe API key from the Poe platform.
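Since an empty or missing key only surfaces as an error once a request is made, it can help to fail fast at startup. A minimal sketch, assuming only the `POE_API_KEY` variable name above (the explicit-mapping call is illustrative so the snippet runs anywhere):

```python
import os

def require_api_key(env=os.environ) -> str:
    """Return the Poe API key or raise with a clear message."""
    key = env.get("POE_API_KEY", "")
    if not key:
        raise RuntimeError("Set POE_API_KEY before calling Poe via LiteLLM")
    return key

# Example with an explicit mapping so the sketch runs anywhere:
print(require_api_key({"POE_API_KEY": "sk-example"}))  # -> sk-example
```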

Usage - LiteLLM Python SDK

Non-streaming

Poe Non-streaming Completion
```python
import os
from litellm import completion

os.environ["POE_API_KEY"] = ""  # your Poe API key

messages = [{"content": "What is the capital of France?", "role": "user"}]

# Poe call
response = completion(
    model="poe/model-name",  # Replace with an actual model name
    messages=messages,
)

print(response)
```
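The printed `response` follows the OpenAI chat-completions schema, so the assistant text lives at `choices[0].message.content`. A sketch using a hand-built dict in that shape (a live LiteLLM response exposes the same fields, also via attribute access):

```python
# Sketch: pulling the assistant text out of an OpenAI-format chat response.
# The dict below is hand-built to mirror the schema, not a live API result.
def extract_text(response: dict) -> str:
    """Return the assistant message from the first choice."""
    return response["choices"][0]["message"]["content"]

sample = {
    "choices": [
        {"index": 0, "message": {"role": "assistant", "content": "Paris"}}
    ]
}
print(extract_text(sample))  # -> Paris
```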

Streaming

Poe Streaming Completion
```python
import os
from litellm import completion

os.environ["POE_API_KEY"] = ""  # your Poe API key

messages = [{"content": "Write a short poem about AI", "role": "user"}]

# Poe call with streaming
response = completion(
    model="poe/model-name",  # Replace with an actual model name
    messages=messages,
    stream=True,
)

for chunk in response:
    print(chunk)
```
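Each streamed chunk carries an incremental `delta`; concatenating the non-empty `delta.content` pieces rebuilds the full reply. A sketch over hand-built chunks in the OpenAI streaming shape (a live stream yields objects with the same fields):

```python
# Sketch: reassembling a streamed reply from OpenAI-format chunks.
def collect_stream(chunks) -> str:
    """Join the incremental content pieces from a chat-completions stream."""
    parts = []
    for chunk in chunks:
        piece = chunk["choices"][0]["delta"].get("content")
        if piece:  # the final chunk often carries an empty delta
            parts.append(piece)
    return "".join(parts)

fake_stream = [
    {"choices": [{"delta": {"role": "assistant", "content": "Hello"}}]},
    {"choices": [{"delta": {"content": ", world"}}]},
    {"choices": [{"delta": {}}]},
]
print(collect_stream(fake_stream))  # -> Hello, world
```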

Usage - LiteLLM Proxy Server

1. Save key in your environment

```shell
export POE_API_KEY=""
```

2. Start the proxy

Create a config.yaml:

```yaml
model_list:
  - model_name: poe-model
    litellm_params:
      model: poe/model-name  # Replace with an actual model name
      api_key: os.environ/POE_API_KEY
```

Then start the proxy with that config:

```shell
litellm --config config.yaml
```
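Once running, the proxy exposes an OpenAI-compatible endpoint (assumed here at LiteLLM's default `http://localhost:4000`), so any OpenAI client can call the `poe-model` alias from the config. Sketched below as the raw request pieces rather than a live call:

```python
import json

# Assumed proxy address (LiteLLM's default port is 4000) and the
# "poe-model" alias defined in the config above.
base_url = "http://localhost:4000"
endpoint = f"{base_url}/v1/chat/completions"
body = {
    "model": "poe-model",
    "messages": [{"role": "user", "content": "Hello through the proxy"}],
}
print(endpoint)
print(json.dumps(body, indent=2))
```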

Supported OpenAI Parameters

Poe supports the standard OpenAI-compatible parameters:

| Parameter | Type | Description |
|-----------|------|-------------|
| `messages` | array | Required. Array of message objects with `role` and `content` |
| `model` | string | Required. Model ID from the 100+ available models |
| `stream` | boolean | Optional. Enable streaming responses |
| `temperature` | float | Optional. Sampling temperature |
| `top_p` | float | Optional. Nucleus sampling parameter |
| `max_tokens` | integer | Optional. Maximum tokens to generate |
| `frequency_penalty` | float | Optional. Penalize frequent tokens |
| `presence_penalty` | float | Optional. Penalize tokens based on presence |
| `stop` | string/array | Optional. Stop sequences |
| `tools` | array | Optional. List of available tools/functions |
| `tool_choice` | string/object | Optional. Control tool/function calling |
| `response_format` | object | Optional. Response format specification |
| `user` | string | Optional. User identifier |
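For illustration, here is a request body combining several of the optional parameters from the table; these map one-to-one onto keyword arguments of `litellm.completion` (`"poe/model-name"` remains a placeholder):

```python
# OpenAI-compatible request body using optional parameters from the
# table above; "poe/model-name" is a placeholder model ID.
payload = {
    "model": "poe/model-name",
    "messages": [{"role": "user", "content": "Summarize LiteLLM in one line."}],
    "temperature": 0.2,   # lower = more deterministic sampling
    "max_tokens": 128,    # cap the generated length
    "stop": ["\n\n"],     # stop at the first blank line
}
optional = sorted(k for k in payload if k not in ("model", "messages"))
print(optional)  # -> ['max_tokens', 'stop', 'temperature']
```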

Available Model Categories

Poe provides access to models across multiple providers:

  • OpenAI Models: Including GPT-4, GPT-4 Turbo, GPT-3.5 Turbo
  • Anthropic Models: Including Claude 3 Opus, Sonnet, Haiku
  • Other Popular Models: Various provider models available
  • Multi-Modal: Text, image, video, and voice models

Platform Benefits

Using Poe through LiteLLM offers several advantages:

  • Unified Access: A single API for many different models
  • Quora Integration: Access to Quora's large user base and content ecosystem
  • Content Sharing: Share model outputs with followers on the platform
  • Content Distribution: The best AI content reaches users across the platform
  • Model Discovery: An efficient way to explore new AI models

Developer Resources

Poe is actively building developer features and welcomes early access requests for API integration.
