cf.llm.prompt.token_count

Type: Number

An estimated token count for the LLM prompt in the request.
The count is calculated using a general-purpose tokenizer and may not exactly match the count reported by your LLM provider.
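To illustrate why an estimate can diverge from a provider's count, here is a minimal sketch of a general-purpose heuristic (roughly 4 characters per token for English text). The heuristic, the function name, and the sample prompt are all illustrative assumptions; the documentation does not specify which tokenizer Cloudflare uses.

```python
def estimate_tokens(prompt: str) -> int:
    """Rough token estimate using a ~4-characters-per-token heuristic.

    This is an illustrative assumption, not Cloudflare's actual tokenizer;
    provider-specific tokenizers (byte-pair encodings, etc.) will report
    somewhat different counts for the same text.
    """
    return max(1, len(prompt) // 4)

prompt = "Summarize the quarterly report in three bullet points."
print(estimate_tokens(prompt))  # prints 13
```

A provider tokenizer might return 10 or 15 tokens for the same prompt, which is why rules comparing `cf.llm.prompt.token_count` against a threshold should treat the value as approximate.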
Requires a Cloudflare Enterprise plan. You must also enable AI Security for Apps.
Example usage:

# Matches requests where the estimated token count exceeds 4,000:
(cf.llm.prompt.token_count gt 4000)

Categories:
- Request