Amazon Bedrock
Amazon Bedrock allows you to build and scale generative AI applications with foundation models.
```
https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/aws-bedrock
```

When making requests to Amazon Bedrock, you will need:

- AI Gateway account ID
- AI Gateway gateway name
- AWS credentials (`accessKeyId`, `secretAccessKey`, and `region`) with permissions for Amazon Bedrock
- The name of the Amazon Bedrock model you want to use
When making requests to Amazon Bedrock, replace https://bedrock-runtime.us-east-1.amazonaws.com/ in the URL you are currently using with https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/aws-bedrock/bedrock-runtime/us-east-1/, then append the model you want to use.
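The URL rewrite described above can be sketched as a small helper. Note that `toGatewayUrl` is a hypothetical name for illustration, not part of any SDK:

```typescript
// Hypothetical helper: rewrite a Bedrock runtime URL into its AI Gateway equivalent.
function toGatewayUrl(
  awsUrl: string,
  accountId: string,
  gatewayId: string,
): string {
  // Extract the region and path from e.g.
  // https://bedrock-runtime.us-east-1.amazonaws.com/model/.../invoke
  const match = awsUrl.match(
    /^https:\/\/bedrock-runtime\.([a-z0-9-]+)\.amazonaws\.com\/(.*)$/,
  );
  if (!match) throw new Error("Not a Bedrock runtime URL");
  const [, region, path] = match;
  return `https://gateway.ai.cloudflare.com/v1/${accountId}/${gatewayId}/aws-bedrock/bedrock-runtime/${region}/${path}`;
}

const gatewayUrl = toGatewayUrl(
  "https://bedrock-runtime.us-east-1.amazonaws.com/model/us.anthropic.claude-haiku-4-5-20251001-v1:0/invoke",
  "{account_id}",
  "{gateway_id}",
);
console.log(gatewayUrl);
```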
For example, to invoke the Anthropic Claude model in us-east-1:
```
https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/aws-bedrock/bedrock-runtime/us-east-1/model/us.anthropic.claude-haiku-4-5-20251001-v1:0/invoke
```

Amazon Bedrock uses AWS Signature Version 4 (SigV4) to authenticate API requests. Unlike providers such as OpenAI or Anthropic that use a simple API key, AWS requires each request to be cryptographically signed with your credentials.
AI Gateway handles this complexity for you. When you store your AWS credentials using BYOK, the gateway automatically signs each request before forwarding it to AWS.
| Method | `cf-aig-authorization` header | `Authorization` header | Signing |
|---|---|---|---|
| BYOK (Recommended) | `Bearer {CF_AIG_TOKEN}` | Not needed | Gateway signs automatically |
| Client-side signing | `Bearer {CF_AIG_TOKEN}` | Pre-signed AWS headers | You sign with `aws4fetch` or AWS SDK |
The recommended approach is to store your AWS credentials using AI Gateway's Bring Your Own Keys (BYOK) feature. This keeps your credentials secure and eliminates the need for client-side request signing.
1. In the Cloudflare dashboard, go to AI > AI Gateway > your gateway > Provider Keys.
2. Select Add API Key and choose Amazon Bedrock as the provider.
3. Enter your AWS credentials as a JSON object with the following structure:

   ```json
   {
     "accessKeyId": "AKIAIOSFODNN7EXAMPLE",
     "secretAccessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
     "region": "us-east-1"
   }
   ```

4. Select Save.
If you are using temporary credentials from AWS STS (for example, from assuming an IAM role), include the sessionToken field:
```json
{
  "accessKeyId": "ASIAIOSFODNN7EXAMPLE",
  "secretAccessKey": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
  "region": "us-east-1",
  "sessionToken": "FwoGZXIvYXdzEBY..."
}
```

With BYOK configured, you only need to include the `cf-aig-authorization` header in your requests. AI Gateway handles the AWS SigV4 signing automatically.
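As a sanity check before saving, the expected shape of the credentials JSON can be expressed as a TypeScript type guard. This is an illustrative sketch; the field names follow the examples above:

```typescript
// Shape of the credentials object AI Gateway expects for Amazon Bedrock BYOK,
// as shown in the JSON examples above. The guard itself is illustrative.
interface BedrockCredentials {
  accessKeyId: string;
  secretAccessKey: string;
  region: string;
  sessionToken?: string; // only needed for temporary STS credentials
}

function isBedrockCredentials(value: unknown): value is BedrockCredentials {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    typeof v.accessKeyId === "string" &&
    typeof v.secretAccessKey === "string" &&
    typeof v.region === "string" &&
    (v.sessionToken === undefined || typeof v.sessionToken === "string")
  );
}
```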
If you prefer to sign requests yourself, you can use the aws4fetch library or any AWS SDK to sign the request before sending it through AI Gateway. Refer to the client-side signing example below.
With your AWS credentials stored as a provider key, requests are simple and no AWS signing is required:
```sh
curl "https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/aws-bedrock/bedrock-runtime/us-east-1/model/us.anthropic.claude-haiku-4-5-20251001-v1:0/invoke" \
  -H "cf-aig-authorization: Bearer {CF_AIG_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [{ "role": "user", "content": "What is Cloudflare?" }],
    "max_tokens": 256,
    "anthropic_version": "bedrock-2023-05-31"
  }'
```

If you are not using BYOK, you must sign the request before sending it through AI Gateway. The following example uses the `aws4fetch` library in a Cloudflare Worker:
```typescript
import { AwsClient } from "aws4fetch";

interface Env {
  accessKey: string;
  secretAccessKey: string;
}

export default {
  async fetch(
    request: Request,
    env: Env,
    ctx: ExecutionContext,
  ): Promise<Response> {
    const cfAccountId = "{account_id}";
    const gatewayName = "{gateway_id}";
    const region = "us-east-1";

    const awsClient = new AwsClient({
      accessKeyId: env.accessKey,
      secretAccessKey: env.secretAccessKey,
      region: region,
      service: "bedrock",
    });

    const body = JSON.stringify({
      messages: [{ role: "user", content: "What does ethereal mean?" }],
      max_tokens: 256,
      anthropic_version: "bedrock-2023-05-31",
    });

    // Sign against the original AWS URL
    const awsUrl = `https://bedrock-runtime.${region}.amazonaws.com/model/us.anthropic.claude-haiku-4-5-20251001-v1:0/invoke`;

    const presignedRequest = await awsClient.sign(awsUrl, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: body,
    });

    // Send through AI Gateway
    const gatewayUrl = `https://gateway.ai.cloudflare.com/v1/${cfAccountId}/${gatewayName}/aws-bedrock/bedrock-runtime/${region}/model/us.anthropic.claude-haiku-4-5-20251001-v1:0/invoke`;

    const response = await fetch(gatewayUrl, {
      method: "POST",
      headers: presignedRequest.headers,
      body: body,
    });

    if (
      response.ok &&
      response.headers.get("content-type")?.includes("application/json")
    ) {
      const data = await response.json();
      return new Response(JSON.stringify(data));
    }

    return new Response("Invalid response", { status: 500 });
  },
};
```

AI Gateway provides a Unified API that lets you use the OpenAI chat completions format with Bedrock models. This is currently supported for the Anthropic Claude and Amazon Nova model families. You can use the OpenAI SDK to access these models running on Bedrock without changing your request format.
```
https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/compat/chat/completions
```

With your AWS credentials stored as a provider key, specify the model using the `aws-bedrock/{model}` format:
```sh
curl "https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/compat/chat/completions" \
  -H "cf-aig-authorization: Bearer {CF_AIG_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "aws-bedrock/us.anthropic.claude-haiku-4-5-20251001-v1:0",
    "messages": [{ "role": "user", "content": "What is Cloudflare?" }]
  }'
```

Using the OpenAI SDK:

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "{CF_AIG_TOKEN}",
  baseURL: "https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/compat",
});

const response = await client.chat.completions.create({
  model: "aws-bedrock/us.anthropic.claude-haiku-4-5-20251001-v1:0",
  messages: [{ role: "user", content: "What is Cloudflare?" }],
});

console.log(response.choices[0].message.content);
```
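Since the Unified API currently supports only the Claude and Nova families on Bedrock, a hypothetical client-side guard (not part of the OpenAI SDK or AI Gateway) could validate the model family before building the `aws-bedrock/{model}` string:

```typescript
// Hypothetical helper: build the `model` value for the compat endpoint,
// rejecting families the Unified API does not currently support on Bedrock.
function toCompatModel(bedrockModelId: string): string {
  const supported = ["anthropic.claude", "amazon.nova"];
  // Bedrock inference profile IDs may carry a region prefix such as "us." or "eu.".
  const bare = bedrockModelId.replace(/^[a-z]{2}\./, "");
  if (!supported.some((prefix) => bare.startsWith(prefix))) {
    throw new Error(
      `Model family not supported by the Unified API: ${bedrockModelId}`,
    );
  }
  return `aws-bedrock/${bedrockModelId}`;
}

console.log(toCompatModel("us.anthropic.claude-haiku-4-5-20251001-v1:0"));
```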