AWS Bedrock setup

AWS Bedrock requires IAM policy configuration and an authentication method before you can register it as an LLM provider in Port. Complete the steps below before proceeding to Step 2: Store API Keys in Secrets in the main setup guide.

Step 1: Configure IAM policy

Set up an IAM policy to grant permissions for invoking Bedrock models. Serverless models are automatically available, but you control access through IAM policies. Anthropic models require additional setup — see Anthropic models requirements below.

Option 1: Allow specific models

Restrict access to specific models (recommended). Example for Anthropic models in Europe:

Example IAM policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBedrockInference",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": [
        "arn:aws:bedrock:*:*:inference-profile/eu.anthropic.claude-sonnet-4-20250514-v1:0",
        "arn:aws:bedrock:*::foundation-model/anthropic.claude-sonnet-4-20250514-v1:0",
        "arn:aws:bedrock:*:*:inference-profile/eu.anthropic.claude-haiku-4-5-20251001-v1:0",
        "arn:aws:bedrock:*::foundation-model/anthropic.claude-haiku-4-5-20251001-v1:0",
        "arn:aws:bedrock:*:*:inference-profile/eu.anthropic.claude-sonnet-4-5-20250929-v1:0",
        "arn:aws:bedrock:*::foundation-model/anthropic.claude-sonnet-4-5-20250929-v1:0",
        "arn:aws:bedrock:*:*:inference-profile/eu.anthropic.claude-opus-4-5-20251101-v1:0",
        "arn:aws:bedrock:*::foundation-model/anthropic.claude-opus-4-5-20251101-v1:0",
        "arn:aws:bedrock:*:*:inference-profile/eu.anthropic.claude-opus-4-6-v1:0",
        "arn:aws:bedrock:*::foundation-model/anthropic.claude-opus-4-6-v1:0"
      ]
    }
  ]
}

Each model requires two ARN entries: inference-profile and foundation-model. Adjust the region and model as needed.
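The two Resource entries per model follow a fixed pattern: the inference-profile ARN carries the regional prefix (eu., us., etc.), while the foundation-model ARN uses the bare model ID. A small sketch (the helper name is illustrative, not part of any AWS SDK) that builds both entries:

```python
# Sketch: build the inference-profile / foundation-model ARN pair
# that an IAM policy needs for one Bedrock model.

def bedrock_arn_pair(model_id: str, region_prefix: str = "eu") -> list[str]:
    """Return the two Resource entries for a single model."""
    return [
        # Inference profile ARNs include the regional prefix before the model ID.
        f"arn:aws:bedrock:*:*:inference-profile/{region_prefix}.{model_id}",
        # Foundation model ARNs have an empty account field and no prefix.
        f"arn:aws:bedrock:*::foundation-model/{model_id}",
    ]

arns = bedrock_arn_pair("anthropic.claude-sonnet-4-20250514-v1:0")
print(arns[0])  # arn:aws:bedrock:*:*:inference-profile/eu.anthropic.claude-sonnet-4-20250514-v1:0
print(arns[1])  # arn:aws:bedrock:*::foundation-model/anthropic.claude-sonnet-4-20250514-v1:0
```

Swap the region_prefix for us. (or another prefix) to match the inference profiles available in your region.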

Option 2: Allow all models

Use a wildcard policy to allow all models. You can still disable specific models using the Create or connect an LLM provider API.

Example IAM policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBedrockInference",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": [
        "arn:aws:bedrock:*:*:inference-profile/*",
        "arn:aws:bedrock:*::foundation-model/*"
      ]
    }
  ]
}

Using guardrails

If you want to use guardrails with your Bedrock models, add the bedrock:ApplyGuardrail action to your IAM policy:

Example IAM policy with guardrails:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBedrockInference",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream",
        "bedrock:ApplyGuardrail"
      ],
      "Resource": [
        "arn:aws:bedrock:*:*:inference-profile/*",
        "arn:aws:bedrock:*::foundation-model/*"
      ]
    }
  ]
}

Anthropic models requirements

One-time usage form

  • Submit a one-time usage form through the Amazon Bedrock playground or the PutUseCaseForModelAccess API.
  • For AWS Organizations, complete the form at the management account level; the approval extends to child accounts.

AWS Marketplace subscription

  • Some Anthropic models require an AWS Marketplace subscription.
  • Subscriptions are created automatically on first invocation if the caller's IAM policy includes the aws-marketplace:Subscribe action; alternatively, an administrator can enable the models first through the console or API.

For details, see the AWS Security Blog post.
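If you want subscriptions to be created automatically, a statement along these lines (a sketch; some setups further restrict it with a condition on the marketplace product) can be appended to the IAM policy from Step 1:

```json
{
  "Sid": "AllowMarketplaceSubscribe",
  "Effect": "Allow",
  "Action": ["aws-marketplace:Subscribe"],
  "Resource": "*"
}
```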

Step 2: Choose authentication method

After configuring the IAM policy, choose how Port authenticates with AWS Bedrock:

  • Assume role (recommended) — Configure an IAM role that Port's LLM gateway can assume. This eliminates the need to store long-lived credentials.
  • Access keys — Store AWS access key ID and secret access key in Port secrets. Configure this in Step 2: Store API Keys in Secrets.

Trust relationship configuration

Create a trust relationship policy on your IAM role that allows Port's LLM gateway roles to assume it:

Example trust relationship policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Statement1",
      "Effect": "Allow",
      "Principal": {
        "AWS": [
          "arn:aws:iam::185657066287:role/port-ai-bring-your-own-llm-eu-west-1",
          "arn:aws:iam::185657066287:role/port-ai-bring-your-own-llm-us-east-1"
        ]
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": "<OPTIONAL_EXTERNAL_ID>"
        }
      }
    }
  ]
}

Include both gateway roles

The example above includes both the EU (app.getport.io) and US (app.us.port.io) gateway roles. This is the recommended approach — it ensures your trust policy works regardless of which Port region handles the request.

The sts:ExternalId condition is optional but recommended for additional security. If you use an external ID, create it as a secret in Port before configuring the provider. See Step 2: Store API Keys in Secrets for instructions.
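Before attaching the trust policy to your role, you can sanity-check it offline. A sketch using only the Python standard library (the inline policy string mirrors the example above with an illustrative external ID; in practice you might load the document from a file):

```python
import json

# Sketch: sanity-check a Bedrock trust policy before attaching it.
policy = json.loads("""
{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "Statement1",
    "Effect": "Allow",
    "Principal": {"AWS": [
      "arn:aws:iam::185657066287:role/port-ai-bring-your-own-llm-eu-west-1",
      "arn:aws:iam::185657066287:role/port-ai-bring-your-own-llm-us-east-1"
    ]},
    "Action": "sts:AssumeRole",
    "Condition": {"StringEquals": {"sts:ExternalId": "my-external-id"}}
  }]
}
""")

stmt = policy["Statement"][0]
principals = stmt["Principal"]["AWS"]
# Both Port gateway roles should be present so either region can assume the role.
assert any("eu-west-1" in arn for arn in principals), "EU gateway role missing"
assert any("us-east-1" in arn for arn in principals), "US gateway role missing"
assert stmt["Action"] == "sts:AssumeRole"
external_id = stmt["Condition"]["StringEquals"]["sts:ExternalId"]
print("Trust policy OK; external ID:", external_id)
```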

External ID must match exactly

The secret value stored in Port must match the sts:ExternalId in your trust policy character-for-character. Even a trailing space or newline will cause the assume-role call to fail. Copy-paste the value directly to avoid mismatches.
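A two-line check (with illustrative values) shows how an invisible trailing newline defeats the exact comparison AWS performs:

```python
# AWS compares sts:ExternalId character-for-character, so hidden whitespace
# picked up during a copy-paste makes otherwise identical values unequal.
expected = "my-external-id"        # value in the trust policy (illustrative)
stored = "my-external-id\n"        # value pasted into Port with a newline
print(stored == expected)          # False: the trailing "\n" differs
print(stored.strip() == expected)  # True once the whitespace is removed
```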

Using access keys

Store your AWS access key ID and secret access key in Port secrets. See Step 2: Store API Keys in Secrets for instructions.