Meta.Llama2 70b Chat
Meta.Llama2 70b Chat is available via AWS Bedrock with a 4K context window and up to 4,096 output tokens. Pricing: $1.95/1M input tokens, $2.56/1M output tokens.
Meta.Llama2 70b Chat Pricing & Specifications
What is Meta.Llama2 70b Chat?
Meta.Llama2 70b Chat is a large language model from Meta, available via AWS Bedrock with a 4K (4,096-token) context window and up to 4,096 output tokens. It costs $1.95 per 1M input tokens and $2.56 per 1M output tokens.
Capabilities
text
Meta.Llama2 70b Chat Cost Examples
Short prompt (500 tokens)
$0.000975
Medium prompt (2K tokens)
$0.00390
Long output (4K tokens)
$0.01024
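The cost examples above follow directly from the per-token rates. A minimal sketch of the arithmetic, assuming the long-output example is billed as 4,000 tokens at the output rate:

```python
# Worked cost check using the listed Bedrock rates for
# Meta.Llama2 70b Chat: $1.95 per 1M input tokens and
# $2.56 per 1M output tokens.
INPUT_RATE = 1.95 / 1_000_000   # USD per input token
OUTPUT_RATE = 2.56 / 1_000_000  # USD per output token

def request_cost(input_tokens: int, output_tokens: int = 0) -> float:
    """Estimated USD cost of a single API call."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

print(f"{request_cost(500):.6f}")       # short prompt  -> 0.000975
print(f"{request_cost(2_000):.5f}")     # medium prompt -> 0.00390
print(f"{request_cost(0, 4_000):.5f}")  # long output   -> 0.01024
```

The same function reproduces the FAQ figure below: 1,000 input tokens plus 500 output tokens comes to about $0.00323.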
Similar Models to Meta.Llama2 70b Chat
Frequently Asked Questions
How much does Meta.Llama2 70b Chat cost per token?
Meta.Llama2 70b Chat costs $1.95 per 1M input tokens and $2.56 per 1M output tokens. For a typical 1,000-token request with a 500-token response, that works out to roughly $0.00323 ($0.00195 for input plus $0.00128 for output).
What is the context window for Meta.Llama2 70b Chat?
Meta.Llama2 70b Chat supports a context window of 4,096 tokens (4K). This determines the maximum combined length of your prompt and conversation history in a single API call.
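On decoder-only models like Llama 2, the prompt and the generated tokens typically share the context window, so a planned call should budget for both. A minimal pre-flight check under that assumption (token counts are assumed to come from a tokenizer):

```python
# Budget check against the 4,096-token context window stated above.
# Assumes prompt tokens and requested output tokens share the window.
CONTEXT_WINDOW = 4_096

def fits_context(prompt_tokens: int, max_gen_len: int) -> bool:
    """True if the prompt plus the requested output length fits."""
    return prompt_tokens + max_gen_len <= CONTEXT_WINDOW

print(fits_context(3_000, 1_000))  # 4,000 total -> True
print(fits_context(3_000, 1_200))  # 4,200 total -> False
```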
What is the maximum output length for Meta.Llama2 70b Chat?
Meta.Llama2 70b Chat can generate up to 4,096 tokens in a single response. If you need longer outputs, you can make multiple API calls and concatenate the results.
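The multi-call pattern described above can be sketched as a simple loop: when a response hits the output cap, feed the text so far back and ask the model to continue, then concatenate the chunks. Here `generate` is a hypothetical stand-in for your actual Bedrock invocation, and the continuation prompt is an illustrative assumption, not a fixed API:

```python
# Sketch: build outputs longer than the 4,096-token per-response cap
# by chaining calls and concatenating the results.
from typing import Callable

def generate_long(prompt: str,
                  generate: Callable[[str], str],
                  max_calls: int = 3) -> str:
    """Call `generate` repeatedly, asking it to continue each time."""
    parts = []
    next_prompt = prompt
    for _ in range(max_calls):
        chunk = generate(next_prompt)
        parts.append(chunk)
        if not chunk:  # nothing more to say; stop early
            break
        # Ask the model to pick up where the last chunk left off.
        next_prompt = prompt + "".join(parts) + "\nContinue:"
    return "".join(parts)
```

In practice you would also check the response's stop reason to decide whether another continuation call is needed.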
Is Meta.Llama2 70b Chat good for coding tasks?
Meta.Llama2 70b Chat can handle basic coding tasks, but there are models specifically optimized for code generation that may perform better on complex programming problems.