Completion

The Completion endpoint allows you to give a prompt to an Anthropic LLM and get a response.

anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300
)

Usage

import os
from bedrock_anthropic import AnthropicBedrock
 
anthropic = AnthropicBedrock(
    access_key=os.getenv("AWS_ACCESS_KEY"),
    secret_key=os.getenv("AWS_SECRET_KEY"),
    region=os.getenv("AWS_REGION")
)
 
completion = anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300
)
 
print(completion["completion"])
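
The AWS credentials you supply must have permission to invoke Bedrock models in the chosen region (for example, the bedrock:InvokeModel IAM action).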

Configuration

model

The model that will complete your prompt. Refer to the models page.

anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300
)

prompt

The prompt you want the model to complete.

  • Type: str
anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300
)

max_tokens_to_sample (optional)

The maximum number of tokens to generate before stopping.

  • Default: 256
  • Range depends on the model; refer to the models page.
anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300
)
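
Note that the model may stop before reaching this limit, for example once it has finished its answer or produced a stop sequence.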

stop_sequences (optional)

Sequences that will cause the model to stop generating completion text.

  • Default: []
anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,
    stop_sequences=[
        "sequence"
    ]
)
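
A common choice when prompting Claude models in the Human/Assistant format is to stop at the start of the next human turn, e.g. stop_sequences=["\n\nHuman:"].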

temperature (optional)

The amount of randomness injected into the response.

  • Default: 1
  • Range: 0-1
anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,
    temperature=0.7
)

top_p (optional)

Use nucleus sampling: instead of sampling from the full distribution, the model considers only the smallest set of tokens whose cumulative probability exceeds top_p.

  • Default: 1
  • Range: 0-1
anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,
    top_p=0.7
)
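
It is generally recommended to adjust either temperature or top_p, but not both in the same request.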

top_k (optional)

Only sample from the top K options for each subsequent token.

  • Default: 250
  • Range: 0-500
anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,
    top_k=250
)
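
The optional parameters can be combined in a single request. A minimal sketch using the parameters documented above (the values are illustrative, and "\n\nHuman:" assumes the Human/Assistant prompt convention):

anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,          # cap on generated tokens
    stop_sequences=["\n\nHuman:"],     # stop at the next human turn
    temperature=0.7,                   # moderate randomness
    top_k=250                          # sample from the 250 most likely tokens
)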