Portkey: An open-source AI gateway for simple LLM orchestration

Learn more at:

Implementing load balancing:

from portkey_ai import Portkey
import os

# Route 10% of traffic to OpenAI and 90% to Groq's Llama 3 70B
lb_config = {
    "strategy": {"mode": "loadbalance"},
    "targets": [
        {
            "provider": "openai",
            "api_key": os.environ["OPENAI_API_KEY"],
            "weight": 0.1
        },
        {
            "provider": "groq",
            "api_key": os.environ["GROQ_API_KEY"],
            "weight": 0.9,
            "override_params": {
                "model": "llama3-70b-8192"
            }
        }
    ],
}

client = Portkey(config=lb_config)

response = client.chat.completions.create(
    messages=[{"role": "user", "content": "What's the meaning of life?"}],
    model="gpt-4o-mini"
)

print(response.choices[0].message.content)

Implementing conditional routing:


from portkey_ai import Portkey
import os

openai_api_key = os.environ["OPENAI_API_KEY"]
groq_api_key = os.environ["GROQ_API_KEY"]

pk_config = {
    "strategy": {
        "mode": "conditional",
        "conditions": [
            {
                "query": {"metadata.user_plan": {"$eq": "pro"}},
                "then": "openai"
            },
            {
                "query": {"metadata.user_plan": {"$eq": "basic"}},
                "then": "groq"
            }
        ],
        "default": "groq"
    },
    "targets": [
        {
            "name": "openai",
            "provider": "openai",
            "api_key": openai_api_key
        },
        {
            "name": "groq",
            "provider": "groq",
            "api_key": groq_api_key,
            "override_params": {
                "model": "llama3-70b-8192"
            }
        }
    ]
}

# The router matches this metadata against the conditions above
metadata = {
    "user_plan": "pro"
}

client = Portkey(config=pk_config, metadata=metadata)

response = client.chat.completions.create(
    messages=[{"role": "user", "content": "What's the meaning of life?"}]
)
print(response.choices[0].message.content)

The above example uses the metadata value user_plan to determine which model handles each query. This is useful for SaaS providers who offer AI features through a freemium plan.
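
In a freemium SaaS backend, the routing decision typically happens per request rather than once at startup. Below is a minimal sketch under stated assumptions: it reuses the pk_config from the example above, and get_user_plan and answer_for_user are hypothetical helpers standing in for your own user store and request handler.


from portkey_ai import Portkey

def get_user_plan(user_id: str) -> str:
    # Assumption: the plan ("pro" or "basic") is looked up in your own user database.
    return "basic"

def answer_for_user(user_id: str, question: str) -> str:
    # Attach the caller's plan as metadata so the conditional router sends
    # "pro" users to OpenAI and everyone else to Groq (per pk_config above).
    client = Portkey(
        config=pk_config,
        metadata={"user_plan": get_user_plan(user_id)}
    )
    response = client.chat.completions.create(
        messages=[{"role": "user", "content": question}]
    )
    return response.choices[0].message.content

print(answer_for_user("user-123", "What's the meaning of life?"))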

Harnessing Portkey AI Gateway for LLM integration

Portkey represents a significant innovation in LLM integration. It addresses key challenges in managing multiple providers and optimizing performance. By offering an open-source framework that enables seamless interaction with various LLM providers, the project fills a notable gap in current AI development workflows.

The project thrives on community collaboration, welcoming contributions from developers worldwide. With an active GitHub community and open issues, Portkey encourages developers to participate in expanding its capabilities. The project's transparent development approach and open-source licensing make it accessible to both individual developers and enterprise teams.

