Provider

Meta API Cost Calculator and Comparison

Every Meta model, side by side — current API rates, context window, benchmarks, and a live calculator that ranks them at your exact workload. 9 active models, 9 with public pricing, from $0.20/1M to $3.00/1M input. Prices refreshed daily from Meta’s official pricing page.

Models tracked

10

Active

9

With public pricing

9

Cheapest input

$0.20/1M

As of April 2026, Meta offers 9 active AI models via API, with input rates from $0.20/1M to $3.00/1M tokens. The most context-rich models handle up to 1M tokens. Three models support vision input. All prices are in USD per 1 million tokens; output tokens cost at least as much as input on every model.

Interactive

Calculate your Meta API cost at your workload.

Set your workload — every priced model ranks in real time.

Adjust the workload

Every model below updates in real time.

[Workload slider: 1,000 · 10,000 · 50,000 · 250,000 · 1M · 10M tokens]

Ranked by your monthly bill


Pricing at a glance

Blended $/1M tokens across the lineup.

Blended price uses a 3-to-1 input/output ratio — a common industry standard. Green bar = cheapest.
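The blended figure described above can be sketched in a few lines. This is an illustrative Python snippet, assuming the stated 3-to-1 weighting is a simple weighted average; the example prices are LLaMA 4 Scout's rates from the table below.

```python
def blended_price(input_per_1m: float, output_per_1m: float) -> float:
    """Blended $/1M tokens, weighting input 3x and output 1x (a 3:1 ratio)."""
    return (3 * input_per_1m + 1 * output_per_1m) / 4

# Example: LLaMA 4 Scout at $0.50/1M input, $1.50/1M output
print(blended_price(0.50, 1.50))  # 0.75
```

At a 3:1 ratio, a model with cheap input but expensive output (like Scout) still blends low, which is why input-heavy workloads favor it.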

Every model

Every Meta model — pricing, context & capabilities.

Model                      | Context | Input /1M | Output /1M
llama-3.2-1b-instruct      | 60K     | $0.20     | $0.20
llama-3.1-8b-instruct      | 16K     | $0.20     | $0.50
llama-3-8b-instruct        | 8K      | $0.30     | $0.40
llama-3.2-3b-instruct      | 80K     | $0.30     | $0.50
LLaMA 4 Scout              | 1M      | $0.50     | $1.50
LLaMA 3.2 90B              | 128K    | $0.60     | $1.80
LLaMA 3 70B                | 8K      | $0.90     | $0.90
LLaMA 4 Maverick           | 1M      | $1.00     | $3.00
LLaMA 3.1 405B             | 128K    | $3.00     | $3.00
LLaMA 4 Behemoth (preview) | 1M      | Self-host only
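The "ranked by your monthly bill" view above boils down to a sort over per-model costs. A minimal sketch, using the prices from the table; the workload numbers here (30M input, 10M output tokens per month) are an arbitrary example, not a recommendation.

```python
# (input $/1M, output $/1M) per priced model, copied from the table above
MODELS = {
    "llama-3.2-1b-instruct": (0.20, 0.20),
    "llama-3.1-8b-instruct": (0.20, 0.50),
    "llama-3-8b-instruct":   (0.30, 0.40),
    "llama-3.2-3b-instruct": (0.30, 0.50),
    "LLaMA 4 Scout":         (0.50, 1.50),
    "LLaMA 3.2 90B":         (0.60, 1.80),
    "LLaMA 3 70B":           (0.90, 0.90),
    "LLaMA 4 Maverick":      (1.00, 3.00),
    "LLaMA 3.1 405B":        (3.00, 3.00),
}

def monthly_cost(in_tok: int, out_tok: int, in_price: float, out_price: float) -> float:
    """Dollar cost for a month's tokens at per-1M-token rates."""
    return (in_tok * in_price + out_tok * out_price) / 1_000_000

def rank(in_tok: int, out_tok: int) -> list[tuple[float, str]]:
    """All priced models sorted cheapest-first at this workload."""
    return sorted(
        (monthly_cost(in_tok, out_tok, *prices), name)
        for name, prices in MODELS.items()
    )

# Example workload: 30M input + 10M output tokens per month
for cost, name in rank(30_000_000, 10_000_000)[:3]:
    print(f"${cost:,.2f}  {name}")
```

Note the ranking can change with the input/output mix: LLaMA 3 70B ($0.90/$0.90) overtakes LLaMA 3.2 90B ($0.60/$1.80) once output tokens dominate the workload.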

FAQ

Meta — questions we see most.

Pricing patterns, best-known use cases, and how this provider stacks up.


How much does the Meta API cost?

Meta API pricing ranges from $0.20 to $3.00 per 1M input tokens. Output tokens cost at least as much as input on every model. Prices are per 1 million tokens (1M ≈ 750,000 words). Use the calculator above to estimate your monthly spend at your actual workload.

What is the cheapest Meta model?

llama-3.2-1b-instruct is the lowest-priced Meta model with public pricing at $0.20/1M input tokens. It suits high-volume tasks where cost matters most — classification, extraction, summarization, and similar workloads that don't need frontier reasoning.

What is Meta's most capable model?

LLaMA 3.1 405B is Meta's highest-tier model at $3.00/1M input. It delivers the most sophisticated reasoning, instruction-following, and nuance. For workloads that don't require frontier performance, a mid-tier model typically cuts inference costs substantially.

Do any Meta models support vision?

Yes — LLaMA 4 Scout, LLaMA 3.2 90B, and LLaMA 4 Maverick accept image input alongside text. You can pass screenshots, photos, charts, and documents for analysis. Vision adds no separate line item on most Meta models — you're billed for the token equivalent of the image.

Does Meta offer cached or batch pricing?

Meta does not currently list cached or batch pricing in our database. Check Meta's official pricing page for the latest discount tiers — providers add these options regularly.

Does Meta change its prices?

Meta has historically adjusted prices when launching new model generations, often cutting rates to stay competitive. Buzzi.ai snapshots pricing daily — you can subscribe to price-drop alerts on any Meta model using the "Alert me" button on its detail page.

How does Meta compare with other providers?

Use the main comparison wizard to run the same calculator across Meta, Anthropic, Google, Mistral, and 20+ other providers. Set your exact workload and get a ranked cost chart in under a minute.

Look wider

Compare Meta against other providers.

Open the full wizard — pick a use case, set your usage, and cross-compare against OpenAI, Anthropic, Google, and 20+ more.