1 change: 1 addition & 0 deletions docs.json
@@ -425,6 +425,7 @@
"integrations/llms/featherless",
"integrations/llms/jina-ai",
"integrations/llms/lambda",
"integrations/llms/latitude",
"integrations/llms/lemon-fox",
"integrations/llms/lepton",
"integrations/llms/lingyi-01.ai",
24 changes: 18 additions & 6 deletions integrations/guardrails/lasso.mdx
@@ -10,17 +10,17 @@ To get started with Lasso Security, visit their documentation:

## Using Lasso with Portkey

### 1. Add Lasso Credentials to Portkey
### 1. Add Lasso credentials to Portkey

* Navigate to the `Integrations` page under `Settings`
* Click on the edit button for the Lasso integration
* Add your Lasso API Key (obtain this from your Lasso Security account)
* Optionally, set a custom **API Endpoint** if you use a dedicated Lasso deployment (defaults to `https://server.lasso.security`)

### 2. Add Lasso's Guardrail Check
### 2. Add Lasso's guardrail check

* Navigate to the `Guardrails` page and click the `Create` button
* Search for "Scan Content" and click `Add`
* Set the timeout in milliseconds (default: 10000ms)
* Search for "Classifier" and click `Add`
* Set any `actions` you want on your check, and create the Guardrail!

<Note>
@@ -29,7 +29,7 @@ To get started with Lasso Security, visit their documentation:

| Check Name | Description | Parameters | Supported Hooks |
|------------|-------------|------------|-----------------|
| Scan Content | Lasso Security's Deputies analyze content for various security risks including jailbreak attempts, custom policy violations, sexual content, hate speech, illegal content, and more. | `Timeout` (number) | `beforeRequestHook` |
| Classifier | Classifies content for security risks using Lasso Security's Deputies v3 API. Returns detailed findings with action types (BLOCK, WARN, AUTO_MASKING) and severity levels. | `messages` (array), `conversationId` (string, optional), `userId` (string, optional) | `beforeRequestHook`, `afterRequestHook` |



@@ -117,7 +117,19 @@ For more, refer to the [Config documentation](/product/ai-gateway/configs).

Your requests are now guarded by Lasso Security's protective measures, and you can see the verdict and any actions taken directly in your Portkey logs!

## Key Security Features
## Verdict behavior

The Lasso plugin uses the Deputies v3 API and determines whether to block a request based on `violations_detected` and the `action` field in findings:

| Scenario | Verdict | Behavior |
|----------|---------|----------|
| No violations detected | Allow | Request passes through |
| Violations with `BLOCK` action | Block | Request is blocked |
| Violations with only `WARN` actions | Allow | Request passes through, findings included in response data |
| Violations with only `AUTO_MASKING` actions | Allow | Request passes through, findings included in response data |
| API error | Block | Request is blocked (fail-safe) |
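
Whatever the verdict, the guardrail runs as part of a Portkey config. As an illustration only, a config that runs a guardrail before the request reaches the model might look like the fragment below; the guardrail ID is a placeholder, and the exact shape is defined in the [Config documentation](/product/ai-gateway/configs):

```json
{
  "before_request_hooks": [
    { "id": "your-lasso-guardrail-id" }
  ]
}
```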

## Key security features

Lasso Security's Deputies analyze content for various security risks across multiple categories:

285 changes: 285 additions & 0 deletions integrations/llms/latitude.mdx
@@ -0,0 +1,285 @@
---
title: "Latitude AI"
description: Use Latitude AI's OpenAI-compatible inference for chat completions, tool calling, and structured output through Portkey.
---

## Quick start

Get started with Latitude AI in under 2 minutes:

<CodeGroup>

```python Python icon="python"
from portkey_ai import Portkey

# 1. Install: pip install portkey-ai
# 2. Add @latitude provider in model catalog
# 3. Use it:

portkey = Portkey(api_key="PORTKEY_API_KEY")

response = portkey.chat.completions.create(
    model="@latitude/qwen-2.5-7b",
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)
```

```js Javascript icon="square-js"
import Portkey from 'portkey-ai'

// 1. Install: npm install portkey-ai
// 2. Add @latitude provider in model catalog
// 3. Use it:

const portkey = new Portkey({
  apiKey: "PORTKEY_API_KEY"
})

const response = await portkey.chat.completions.create({
  model: "@latitude/qwen-2.5-7b",
  messages: [{ role: "user", content: "Hello!" }]
})

console.log(response.choices[0].message.content)
```

```python OpenAI Py icon="python"
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL

# 1. Install: pip install openai portkey-ai
# 2. Add @latitude provider in model catalog
# 3. Use it:

client = OpenAI(
    api_key="PORTKEY_API_KEY",  # Portkey API key
    base_url=PORTKEY_GATEWAY_URL
)

response = client.chat.completions.create(
    model="@latitude/qwen-2.5-7b",
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)
```

```js OpenAI JS icon="square-js"
import OpenAI from "openai"
import { PORTKEY_GATEWAY_URL } from "portkey-ai"

// 1. Install: npm install openai portkey-ai
// 2. Add @latitude provider in model catalog
// 3. Use it:

const client = new OpenAI({
  apiKey: "PORTKEY_API_KEY", // Portkey API key
  baseURL: PORTKEY_GATEWAY_URL
})

const response = await client.chat.completions.create({
  model: "@latitude/qwen-2.5-7b",
  messages: [{ role: "user", content: "Hello!" }]
})

console.log(response.choices[0].message.content)
```

```sh cURL icon="square-terminal"
# 1. Add @latitude provider in model catalog
# 2. Use it:

curl https://api.portkey.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "x-portkey-api-key: $PORTKEY_API_KEY" \
  -d '{
    "model": "@latitude/qwen-2.5-7b",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

</CodeGroup>

## Add provider in model catalog

Before making requests, add Latitude AI to your Model Catalog:

1. Go to [**Model Catalog → Add Provider**](https://app.portkey.ai/model-catalog/providers)
2. Select **Latitude**
3. Enter your [Latitude AI API key](https://ai.latitude.sh)
4. Name your provider (e.g., `latitude`)

<Card title="Complete Setup Guide" icon="book" href="/product/model-catalog">
See all setup options and detailed configuration instructions
</Card>

---

## Latitude AI capabilities

### Tool calling

Use Latitude AI's tool calling feature to trigger external functions:

<CodeGroup>

```python Python
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

tools = [{
    "type": "function",
    "function": {
        "name": "getWeather",
        "description": "Get the current weather",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City and state"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
            },
            "required": ["location"]
        }
    }
}]

response = portkey.chat.completions.create(
    model="@latitude/qwen-2.5-7b",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What's the weather like in Delhi?"}
    ],
    tools=tools,
    tool_choice="auto"
)

print(response.choices[0].finish_reason)
```

```javascript Node.js
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: 'PORTKEY_API_KEY'
});

const tools = [{
  type: "function",
  function: {
    name: "getWeather",
    description: "Get the current weather",
    parameters: {
      type: "object",
      properties: {
        location: { type: "string", description: "City and state" },
        unit: { type: "string", enum: ["celsius", "fahrenheit"] }
      },
      required: ["location"]
    }
  }
}];

const response = await portkey.chat.completions.create({
  model: "@latitude/qwen-2.5-7b",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "What's the weather like in Delhi?" }
  ],
  tools,
  tool_choice: "auto"
});

console.log(response.choices[0].finish_reason);
```

</CodeGroup>
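
When the model decides to call a tool, the response message carries `tool_calls` instead of plain text, in the standard OpenAI-compatible shape. A minimal dispatch sketch follows; the `get_weather` stub is hypothetical, and the sample message stands in for `response.choices[0].message`:

```python
import json

def get_weather(location, unit="celsius"):
    # Hypothetical stub; replace with a real weather lookup.
    return {"location": location, "temperature": 30, "unit": unit}

def handle_tool_calls(message):
    """Run each requested tool and build the follow-up 'tool' messages."""
    results = []
    for call in message.get("tool_calls", []):
        args = json.loads(call["function"]["arguments"])
        if call["function"]["name"] == "getWeather":
            result = get_weather(**args)
        else:
            raise ValueError(f"Unknown tool: {call['function']['name']}")
        results.append({
            "role": "tool",
            "tool_call_id": call["id"],
            "content": json.dumps(result),
        })
    return results

# Sample message shaped like an OpenAI-compatible tool-call response:
msg = {
    "tool_calls": [{
        "id": "call_1",
        "function": {"name": "getWeather",
                     "arguments": '{"location": "Delhi", "unit": "celsius"}'},
    }]
}
print(handle_tool_calls(msg)[0]["content"])
```

Append the returned `tool` messages to the conversation and make a second `chat.completions.create` call to get the model's final answer.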

### JSON output

Force structured JSON responses from Latitude AI models:

<CodeGroup>

```python Python
from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

response = portkey.chat.completions.create(
    model="@latitude/qwen-2.5-7b",
    messages=[
        {"role": "system", "content": "Respond in JSON format with keys: answer, confidence"},
        {"role": "user", "content": "What is the capital of France?"}
    ],
    response_format={"type": "json_object"}
)

print(response.choices[0].message.content)
```

```javascript Node.js
import Portkey from 'portkey-ai';

const portkey = new Portkey({
  apiKey: 'PORTKEY_API_KEY'
});

const response = await portkey.chat.completions.create({
  model: "@latitude/qwen-2.5-7b",
  messages: [
    { role: "system", content: "Respond in JSON format with keys: answer, confidence" },
    { role: "user", content: "What is the capital of France?" }
  ],
  response_format: { type: "json_object" }
});

console.log(response.choices[0].message.content);
```

</CodeGroup>
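
JSON mode guarantees syntactically valid JSON, but the content still arrives as a string, and the keys are only as reliable as your prompt, so parse and validate before use. A short sketch, where the sample string stands in for `response.choices[0].message.content`:

```python
import json

# Stand-in for response.choices[0].message.content
raw = '{"answer": "Paris", "confidence": 0.98}'

data = json.loads(raw)
if not {"answer", "confidence"} <= data.keys():
    raise ValueError(f"Unexpected keys: {sorted(data)}")

print(data["answer"], data["confidence"])
```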

---

## Supported models

| Model | Context | Features |
|-------|---------|----------|
| `qwen-2.5-7b` | 131K | Tools, JSON mode |
| `llama-3.1-8b` | 128K | Tools, JSON mode |
| `qwen3-32b` | 131K | Tools, JSON mode |
| `gemma-2-27b` | 8K | Tools, JSON mode |
| `deepseek-r1-distill-14b` | 64K | Tools, JSON mode, Reasoning |
| `qwen2.5-coder-32b` | 131K | Tools, JSON mode |
| `qwen-2.5-vl-7b` | 32K | Tools, JSON mode, Vision |

<Card title="Latitude AI Models" icon="list" href="https://ai.latitude.sh">
View the complete list of models available on Latitude AI
</Card>
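
For the vision-capable `qwen-2.5-vl-7b`, images are sent as OpenAI-style multimodal content parts. A sketch of the message payload, assuming that content-part format is supported end to end (the image URL is a placeholder):

```python
image_url = "https://example.com/photo.jpg"  # placeholder

# Text and image travel together as content parts in one user message.
messages = [{
    "role": "user",
    "content": [
        {"type": "text", "text": "What is in this image?"},
        {"type": "image_url", "image_url": {"url": image_url}},
    ],
}]

# Then pass as:
# portkey.chat.completions.create(
#     model="@latitude/qwen-2.5-vl-7b", messages=messages)
```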

---

## Next steps

<CardGroup cols={2}>
<Card title="Gateway Configs" icon="sliders" href="/product/ai-gateway">
Add fallbacks, load balancing, and more
</Card>
<Card title="Observability" icon="chart-line" href="/product/observability">
Monitor and trace your Latitude AI requests
</Card>
<Card title="Prompt Library" icon="book" href="/product/prompt-engineering-studio">
Manage and version your prompts
</Card>
<Card title="Metadata" icon="tag" href="/product/observability/metadata">
Add custom metadata to requests
</Card>
</CardGroup>

For complete SDK documentation:

<Card title="SDK Reference" icon="code" href="/api-reference/sdk/list">
Complete Portkey SDK documentation
</Card>