Async
The async client lets you run bedrock_anthropic calls concurrently. All other features are the same as with synchronous execution.
completion = await anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300
)
print(completion["completion"])
Usage
import asyncio
import os

from bedrock_anthropic import AsyncAnthropicBedrock

anthropic = AsyncAnthropicBedrock(
    access_key=os.getenv("AWS_ACCESS_KEY"),
    secret_key=os.getenv("AWS_SECRET_KEY"),
    region=os.getenv("AWS_REGION")
)

async def main():
    completion = await anthropic.Completion.create(
        model="anthropic.claude-v2",
        max_tokens_to_sample=300,
        prompt="Why is the sky blue?"
    )
    print(completion["completion"])

asyncio.run(main())
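The point of the async client is that several completions can be in flight at once via asyncio.gather. The sketch below substitutes a stand-in coroutine for the real anthropic.Completion.create call (which needs AWS credentials), but the fan-out pattern is identical:

```python
import asyncio

# Stand-in for `await anthropic.Completion.create(...)` — the real call
# requires AWS credentials, but the concurrency pattern is the same.
async def complete(prompt: str) -> str:
    await asyncio.sleep(0.1)  # simulate network latency
    return f"answer to: {prompt}"

async def main() -> list[str]:
    # All three requests run concurrently, so total wall-clock time
    # is roughly one request's latency, not three.
    return await asyncio.gather(
        complete("Why is the sky blue?"),
        complete("Why is grass green?"),
        complete("Why is the sea salty?"),
    )

answers = asyncio.run(main())
for answer in answers:
    print(answer)
```

To use the real client, replace the body of `complete` with the awaited `anthropic.Completion.create(...)` call from the Usage example and return `completion["completion"]`.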
Configuration
model
The model that will complete your prompt. Refer to the models page.
anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300
)
prompt
The prompt you want the model to complete.
- Type: str

anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300
)
max_tokens_to_sample
The maximum number of tokens to generate before stopping.
- Default: 256
- Range: depends on the model; refer to the models page.

anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300
)
stop_sequences (optional)
Sequences that will cause the model to stop generating completion text.
- Default: []

anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,
    stop_sequences=[
        "sequence"
    ]
)
temperature (optional)
Amount of randomness injected into the response.
- Default: 1
- Range: 0-1

anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,
    temperature=0.7
)
top_p (optional)
Use nucleus sampling.
- Default: 1
- Range: 0-1

anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,
    top_p=0.7
)
top_k (optional)
Only sample from the top K options for each subsequent token.
- Default: 250
- Range: 0-500

anthropic.Completion.create(
    model="anthropic.claude-v2",
    prompt="Why is the sky blue?",
    max_tokens_to_sample=300,
    top_k=250
)