Streaming

Streaming is an extension of the Completion endpoint: pass stream=True and the response is streamed back incrementally using Server-Sent Events (SSE).

anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,
    stream=True
)

Usage

import os
from bedrock_anthropic import AnthropicBedrock
 
anthropic = AnthropicBedrock(
    access_key=os.getenv("AWS_ACCESS_KEY"),
    secret_key=os.getenv("AWS_SECRET_KEY"),
    region=os.getenv("AWS_REGION")
)
 
stream = anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,
    stream=True
)
 
for completion in stream:
    print(completion, end="", flush=True)
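Each item yielded by the stream is a chunk of completion text, so the full response can be recovered by accumulating chunks as they arrive. A minimal sketch of that pattern, using a plain generator as a stand-in for the stream (the exact chunk object depends on the library version):

```python
def collect(stream):
    """Accumulate streamed text chunks into the full completion."""
    parts = []
    for chunk in stream:
        parts.append(chunk)
    return "".join(parts)

# Stand-in for anthropic.Completion.create(..., stream=True):
fake_stream = iter(["The sky ", "appears blue ", "because of scattering."])
print(collect(fake_stream))
```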

Configuration

The configuration parameters are exactly the same as for the Completion endpoint, except that stream=True must be passed to the completion function.

model

The model that will complete your prompt. Refer to the models page.

anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,
    stream=True
)

prompt

The prompt you want the model to complete.

  • Type: str
anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,
    stream=True
)

max_tokens_to_sample

The maximum number of tokens to generate before stopping.

  • Default: 256
  • Range depends on the model, refer to the models page.
anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,
    stream=True
)

stop_sequences (optional)

Sequences that will cause the model to stop generating completion text.

  • Default: []
anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,
    stop_sequences=[
        "sequence"
    ],
    stream=True
)
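Conceptually, generation halts as soon as any stop sequence appears in the output, and the matched sequence is not included in the returned text. An illustrative sketch of that truncation rule (not the library's internals):

```python
def apply_stop_sequences(text, stop_sequences):
    """Truncate text at the earliest occurrence of any stop sequence."""
    cut = len(text)
    for seq in stop_sequences:
        idx = text.find(seq)
        if idx != -1:
            cut = min(cut, idx)
    return text[:cut]

print(apply_stop_sequences("Hello\n\nHuman: next turn", ["\n\nHuman:"]))
```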

temperature (optional)

Amount of randomness injected into the response. Lower values make the output more deterministic; higher values make it more varied.

  • Default: 1
  • Range: 0-1
anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,
    temperature=0.7,
    stream=True
)
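Under the hood, temperature rescales the model's token distribution: values below 1 sharpen it toward the most likely tokens, while a value of 1 leaves it unchanged. A toy illustration of that scaling (not the model's actual implementation):

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert logits to probabilities, sharpened or flattened by temperature."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

cold = softmax_with_temperature([2.0, 1.0, 0.1], 0.3)
warm = softmax_with_temperature([2.0, 1.0, 0.1], 1.0)
# The low-temperature distribution puts more mass on the top token.
print(cold[0] > warm[0])
```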

top_p (optional)

Use nucleus sampling: the model samples only from the smallest set of tokens whose cumulative probability exceeds top_p.

  • Default: 1
  • Range: 0-1
anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,
    top_p=0.7,
    stream=True
)
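To make the nucleus cutoff concrete, here is an illustrative sketch of how top_p filters a token distribution (a simplified model, not the service's code):

```python
def nucleus_filter(probs, top_p):
    """Keep the smallest set of tokens whose cumulative probability reaches top_p."""
    ranked = sorted(enumerate(probs), key=lambda x: x[1], reverse=True)
    kept, cumulative = [], 0.0
    for idx, p in ranked:
        kept.append(idx)
        cumulative += p
        if cumulative >= top_p:
            break
    return kept

# With top_p=0.7, only the most likely tokens survive the cutoff.
print(nucleus_filter([0.5, 0.3, 0.15, 0.05], 0.7))
```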

top_k (optional)

Only sample from the top K options for each subsequent token.

  • Default: 250
  • Range: 0-500
anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,
    top_k=250,
    stream=True
)
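In contrast to top_p's probability-mass cutoff, top_k keeps a fixed number of candidates regardless of their probabilities. An illustrative sketch (again a simplified model, not the service's code):

```python
def top_k_filter(probs, k):
    """Keep the indices of the k most likely tokens."""
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    return ranked[:k]

# Only the two most likely tokens remain candidates for sampling.
print(top_k_filter([0.1, 0.4, 0.2, 0.3], 2))
```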