Mistral Nemo Instruct 2407
Mistral Nemo Instruct 2407 is available via OVHcloud with a 118K context window and up to 118,000 output tokens. Pricing: $0.13/1M input tokens, $0.13/1M output tokens.
Mistral Nemo Instruct 2407 Pricing & Specifications
What is Mistral Nemo Instruct 2407?
Mistral Nemo Instruct 2407 is a large language model developed by Mistral AI and available via OVHcloud, with a 118K context window and up to 118,000 output tokens. It costs $0.13 per 1M input tokens and $0.13 per 1M output tokens.
Capabilities
Text generation, function calling, JSON mode
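As a sketch of how the JSON-mode capability is typically invoked, the payload below targets an OpenAI-compatible chat completions endpoint. The model identifier and the `response_format` field are assumptions based on common API conventions, not details taken from this page; check OVHcloud's API documentation for the exact values.

```python
import json

# Hypothetical request payload for an OpenAI-compatible chat endpoint.
# The model identifier below is an assumption, not an official value.
payload = {
    "model": "Mistral-Nemo-Instruct-2407",  # assumed model identifier
    "messages": [
        {"role": "system", "content": "Reply only with valid JSON."},
        {"role": "user", "content": "List three primary colors."},
    ],
    # JSON mode: constrains the model to emit syntactically valid JSON.
    "response_format": {"type": "json_object"},
    "max_tokens": 256,
}

body = json.dumps(payload)  # serialized request body to POST
```

The same payload shape also carries function-calling requests, with a `tools` array added alongside `messages`.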
Mistral Nemo Instruct 2407 Cost Examples
Short prompt (500 tokens)
$0.000065
Medium prompt (2K tokens)
$0.00026
Long output (4K tokens)
$0.00052
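The examples above follow directly from the listed rates. A minimal helper to reproduce them, assuming the flat $0.13 per 1M tokens for both input and output quoted on this page:

```python
# Per-million-token rates listed above for Mistral Nemo Instruct 2407.
INPUT_PRICE_PER_M = 0.13
OUTPUT_PRICE_PER_M = 0.13

def estimate_cost(input_tokens: int, output_tokens: int = 0) -> float:
    """Return the USD cost of a single request at the listed rates."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

print(f"{estimate_cost(500):.6f}")        # short 500-token prompt
print(f"{estimate_cost(2_000):.5f}")      # medium 2K-token prompt
print(f"{estimate_cost(0, 4_000):.5f}")   # 4K tokens of output
```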
Frequently Asked Questions
How much does Mistral Nemo Instruct 2407 cost per token?
Mistral Nemo Instruct 2407 costs $0.13 per 1M input tokens and $0.13 per 1M output tokens. For a typical 1,000-token request with a 500-token response, that works out to roughly $0.000195.
What is the context window for Mistral Nemo Instruct 2407?
Mistral Nemo Instruct 2407 supports a context window of 118,000 tokens (118K). This determines the maximum combined length of your prompt and conversation history in a single API call.
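A quick way to check whether a prompt plus its planned response fits the window is to estimate token count before calling the API. The sketch below uses the rough ~4-characters-per-token heuristic, which is an approximation; for exact counts you would use the model's actual tokenizer.

```python
CONTEXT_WINDOW = 118_000  # tokens, per the specs above

def fits_context(prompt: str, reserved_output: int = 1_000) -> bool:
    """Rough check that prompt + planned output fit the context window.

    Uses the common ~4 characters per token heuristic, so this is an
    estimate only, not an exact token count.
    """
    estimated_tokens = len(prompt) // 4
    return estimated_tokens + reserved_output <= CONTEXT_WINDOW

fits_context("hello " * 1_000)  # ~1.5K estimated tokens: fits easily
```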
What is the maximum output length for Mistral Nemo Instruct 2407?
Mistral Nemo Instruct 2407 can generate up to 118,000 tokens in a single response. If you need longer outputs, you can make multiple API calls and concatenate the results.
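The multi-call concatenation approach mentioned above can be sketched as a loop that feeds the text produced so far back into each follow-up call. `call_api` here is a hypothetical stand-in for a real API call returning a `(chunk, finished)` pair; a real implementation would inspect the response's finish reason instead.

```python
def concatenate_long_output(prompt: str, call_api) -> str:
    """Call `call_api` until it reports completion, joining the chunks."""
    parts = []
    so_far = ""
    while True:
        chunk, finished = call_api(prompt, so_far)
        parts.append(chunk)
        so_far += chunk
        if finished:
            return "".join(parts)

# Stubbed API for illustration: emits three chunks, then signals done.
_chunks = iter([
    ("First part. ", False),
    ("Second part. ", False),
    ("Done.", True),
])

def fake_api(prompt, so_far):
    return next(_chunks)

result = concatenate_long_output("Write a long report.", fake_api)
# result == "First part. Second part. Done."
```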
Is Mistral Nemo Instruct 2407 good for coding tasks?
Yes. Mistral Nemo Instruct 2407 can handle coding tasks such as code generation, debugging, and refactoring, and its function-calling and JSON-mode support suit it to structured developer workflows.