POST /v1/amazon-bedrock/chat/completions
Shell
curl --location 'https://api.altrum.ai/v1/amazon-bedrock/chat/completions' \
  --header 'Project-Api-Key: ALTRUMAI_PROJECT_API_KEY' \
  --header 'aws-access-key-id: AWS_ACCESS_KEY_ID' \
  --header 'aws-secret-access-key: AWS_SECRET_ACCESS_KEY' \
  --header 'aws-region-name: AWS_REGION_NAME' \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "meta.llama3-8b-instruct-v1:0",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello!"}
    ],
    "stream": false
}'
{
  "choices": [
    {
      "finish_reason": "stop",
      "index": 0,
      "message": {
        "content": "The General Data Protection Regulation (GDPR)!\n\nThe GDPR is a European Union (EU) regulation that went into effect on May 25, 2018, aiming to strengthen and unify data protection for individuals within the EU. It applies to any organization that processes personal data of EU residents, ",
        "role": "assistant"
      }
    }
  ],
  "created": 1734505842,
  "id": "chatcmpl-002ac263-5f89-49fb-a1c1-61a84caace1b",
  "model": "meta.llama3-8b-instruct-v1:0",
  "object": "chat.completion",
  "usage": {
    "completion_tokens": 60,
    "prompt_tokens": 25,
    "total_tokens": 85
  }
}
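The same request can be sketched in Python using only the standard library. The credential values below are placeholders (substitute your own), and the actual network call is left commented out so the snippet can be read and run without contacting the API.

```python
import json
import urllib.request

# Placeholder credentials -- replace with your own values.
API_KEY = "ALTRUMAI_PROJECT_API_KEY"
AWS_ACCESS_KEY_ID = "AWS_ACCESS_KEY_ID"
AWS_SECRET_ACCESS_KEY = "AWS_SECRET_ACCESS_KEY"
AWS_REGION_NAME = "AWS_REGION_NAME"

payload = {
    "model": "meta.llama3-8b-instruct-v1:0",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello!"},
    ],
    "stream": False,
}

# Build the POST request with the same headers as the curl example.
request = urllib.request.Request(
    "https://api.altrum.ai/v1/amazon-bedrock/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Project-Api-Key": API_KEY,
        "aws-access-key-id": AWS_ACCESS_KEY_ID,
        "aws-secret-access-key": AWS_SECRET_ACCESS_KEY,
        "aws-region-name": AWS_REGION_NAME,
        "Content-Type": "application/json",
    },
    method="POST",
)

# Uncomment to actually send the request:
# with urllib.request.urlopen(request) as response:
#     reply = json.load(response)
#     print(reply["choices"][0]["message"]["content"])
```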

Note: Only Chat Completion LLMs are compatible with this proxy.

Before you can begin using Amazon Bedrock, make sure you complete the following setup:

  1. Create an AWS Account
    If you do not already have an AWS account, you must create one.

  2. Set Up an IAM Role
    Create an IAM role with the necessary permissions to access and use Amazon Bedrock.

  3. Request Access to Foundation Models (FMs)
    Submit a request to gain access to the specific foundation models you intend to use.



Amazon Bedrock Models

🔗 Supported Models List

Authorizations

Project-Api-Key
string
header
required

Headers

aws-access-key-id
string
required

AWS ACCESS KEY

aws-secret-access-key
string
required

AWS SECRET ACCESS KEY

aws-region-name
string
required

AWS Region Name

Body

application/json

Schema for Amazon Bedrock API requests.

model
enum<string>
required

The model used for the chat completion

Available options:
amazon.titan-text-express-v1,
anthropic.claude-3-sonnet-20240229-v1:0,
anthropic.claude-3-haiku-20240307-v1:0,
anthropic.claude-3-5-sonnet-20240620-v1:0,
meta.llama3-8b-instruct-v1:0,
meta.llama3-70b-instruct-v1:0,
mistral.mistral-7b-instruct-v0:2,
mistral.mixtral-8x7b-instruct-v0:1,
mistral.mistral-large-2402-v1:0
messages
Message · object[]
required

A list of messages comprising the conversation

stream
boolean | null
default:false

If set, partial message deltas will be sent

timeout
integer | null
default:600

Timeout in seconds for completion requests

temperature
number | null
default:1

Sampling temperature

Required range: 0 <= x <= 2
top_p
number | null
default:1

Nucleus sampling parameter

Required range: 0 <= x <= 1
stop
string[] | null
max_tokens
integer | null

Maximum number of tokens to generate

Required range: x > 0
tools
Tools · object[] | null
tool_choice
string | null
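Putting the optional body parameters together, a request body might look like the following sketch. The model choice and parameter values are illustrative, and the assertions simply restate the documented ranges as client-side checks.

```python
# Example request body using the optional parameters documented above.
# Values are hypothetical; pick any model from the supported list.
body = {
    "model": "anthropic.claude-3-haiku-20240307-v1:0",
    "messages": [
        {"role": "user", "content": "Summarize GDPR in one sentence."},
    ],
    "stream": False,
    "timeout": 600,      # seconds; default 600
    "temperature": 0.7,  # documented range: 0 <= x <= 2
    "top_p": 0.9,        # documented range: 0 <= x <= 1
    "stop": ["\n\n"],
    "max_tokens": 256,   # documented range: x > 0
}

# Client-side validation mirroring the schema's documented ranges.
assert 0 <= body["temperature"] <= 2
assert 0 <= body["top_p"] <= 1
assert body["max_tokens"] > 0
```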

Response

200
application/json

Successful Response

Schema for LLM model responses.

id
string
required
object
string
required
created
integer
required
model
string
required
choices
ResponseChoice · object[]
required
system_fingerprint
string | null
usage
object | null

Token usage information.
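A minimal sketch of consuming the response shape documented above, using the sample 200 payload shown earlier on this page: pull the assistant message out of `choices` and read token counts from `usage`.

```python
import json

# The sample 200 response from earlier on this page.
raw = r"""
{
  "choices": [
    {
      "finish_reason": "stop",
      "index": 0,
      "message": {
        "content": "The General Data Protection Regulation (GDPR)!",
        "role": "assistant"
      }
    }
  ],
  "created": 1734505842,
  "id": "chatcmpl-002ac263-5f89-49fb-a1c1-61a84caace1b",
  "model": "meta.llama3-8b-instruct-v1:0",
  "object": "chat.completion",
  "usage": {
    "completion_tokens": 60,
    "prompt_tokens": 25,
    "total_tokens": 85
  }
}
"""

response = json.loads(raw)

# The assistant's reply lives in the first choice's message.
choice = response["choices"][0]
content = choice["message"]["content"]

# usage may be null per the schema, so guard before reading it.
usage = response.get("usage")
total = usage["prompt_tokens"] + usage["completion_tokens"] if usage else None
```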