Llama3.1
Llama3.1 is available via Ollama with an 8K (8,192-token) context window and up to 8,192 output tokens. Pricing: $0.00 per 1M input tokens and $0.00 per 1M output tokens — Ollama runs models locally, so there are no per-token API charges.
Llama3.1 Pricing & Specifications
What is Llama3.1?
Llama3.1 is a large language model served locally via Ollama, with an 8K context window and up to 8,192 output tokens. It costs $0.00 per 1M input tokens and $0.00 per 1M output tokens.
Capabilities
Text generation, function calling
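Since Llama3.1 supports function calling, a request can include tool definitions for the model to invoke. The sketch below shows an OpenAI-style tool schema as accepted by the Ollama Python client's `chat()` call; the `get_weather` tool itself is hypothetical.

```python
# Hypothetical tool definition for Llama3.1 function calling via Ollama.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Return the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# With a local Ollama server running (`ollama serve`), the call would look like:
# import ollama
# response = ollama.chat(
#     model="llama3.1",
#     messages=[{"role": "user", "content": "What's the weather in Paris?"}],
#     tools=[get_weather_tool],
# )
# Any tool invocations appear in response["message"]["tool_calls"].
```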
Llama3.1 Cost Examples
Short prompt (500 tokens): $0.00
Medium prompt (2K tokens): $0.00
Long output (4K tokens): $0.00
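The cost figures above follow from a simple rule: tokens divided by one million, times the per-1M-token price. A minimal estimator, with hypothetical non-zero prices shown for comparison (Llama3.1 via Ollama is $0.00 in both directions):

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_price_per_m: float, output_price_per_m: float) -> float:
    """Estimate request cost in dollars from per-1M-token prices."""
    return (input_tokens * input_price_per_m
            + output_tokens * output_price_per_m) / 1_000_000

# Llama3.1 via Ollama: $0.00/1M both ways, so every example is $0.00.
print(estimate_cost(500, 0, 0.0, 0.0))        # 0.0
# Hypothetical paid model at $0.50/1M input, $1.50/1M output:
print(estimate_cost(1000, 500, 0.50, 1.50))   # 0.00125
```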
Frequently Asked Questions
How much does Llama3.1 cost per token?
Llama3.1 costs $0.00 per 1M input tokens and $0.00 per 1M output tokens. A typical 1,000-token request with a 500-token response therefore costs $0.00 — Ollama runs the model locally, with no per-token charges.
What is the context window for Llama3.1?
Llama3.1 supports a context window of 8,192 tokens (8K). This determines the maximum combined length of your prompt and conversation history in a single API call.
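Because prompt plus history must fit in 8,192 tokens, long conversations need trimming before each call. A minimal sketch, assuming a rough 4-characters-per-token heuristic (not the model's real tokenizer), that keeps only the most recent messages within budget:

```python
CONTEXT_WINDOW = 8192  # Llama3.1's 8K context window
CHARS_PER_TOKEN = 4    # rough heuristic, not the actual tokenizer

def estimate_tokens(text: str) -> int:
    """Crude token estimate from character count."""
    return max(1, len(text) // CHARS_PER_TOKEN)

def trim_history(messages: list[dict], budget: int = CONTEXT_WINDOW) -> list[dict]:
    """Keep the most recent messages whose estimated total fits the budget."""
    kept, total = [], 0
    for msg in reversed(messages):  # newest first
        cost = estimate_tokens(msg["content"])
        if total + cost > budget:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))  # restore chronological order
```

A production version would use the model's real tokenizer counts rather than the character heuristic.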
What is the maximum output length for Llama3.1?
Llama3.1 can generate up to 8,192 tokens in a single response. If you need longer outputs, you can make multiple API calls and concatenate the results.
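The multiple-calls approach above can be sketched as a loop that feeds the accumulated output back as context and asks the model to continue. `generate` below is a hypothetical stub standing in for a real model call (e.g. via the Ollama API):

```python
def generate(prompt: str) -> str:
    """Hypothetical stand-in for a real model call."""
    return "[chunk]"

def generate_long(prompt: str, rounds: int = 3) -> str:
    """Stitch several generations together by asking the model to continue."""
    result = ""
    for _ in range(rounds):
        chunk = generate(prompt + result + "\nContinue from where you left off.")
        if not chunk:  # stop if the model has nothing more to add
            break
        result += chunk
    return result
```

In practice you would also watch for a natural stopping point (e.g. the model signaling completion) rather than a fixed round count.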
Is Llama3.1 good for coding tasks?
Yes. Llama3.1 is well-suited to coding tasks, including code generation, debugging, and refactoring, and its function-calling support enables tool-assisted workflows.