alach11
a year ago
It's great to see the arms race between OpenAI and Anthropic benefiting end users.
I have several use cases where I realize a ~75% cost reduction by using prompt caching. It's not clear to me from reading this: does the Batch API stack with prompt caching?
037
a year ago
> Yes, it is possible to use Prompt Caching with your Batches API requests. However, because asynchronous batch requests can be processed concurrently and in any order, we cannot guarantee that requests in a batch will benefit from caching.
https://docs.anthropic.com/en/docs/build-with-claude/prompt-...
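To make that concrete, here's a minimal sketch (mine, not from the docs) of what stacking the two looks like, assuming a recent Anthropic Python SDK where both Message Batches and prompt caching are generally available; the model id, custom_ids, and prompt text are placeholders:

    import anthropic

    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

    # A shared system prompt marked with cache_control. The prefix must meet the
    # model's minimum cacheable length (on the order of 1024 tokens for Sonnet),
    # so imagine a large block of shared context here.
    shared_system = [
        {
            "type": "text",
            "text": "You are a contract analyst. <several thousand tokens of shared context>",
            "cache_control": {"type": "ephemeral"},
        }
    ]

    # One batch, many requests that all reuse the same cached prefix. Per the
    # docs quoted above, requests may run concurrently and in any order, so
    # cache hits within the batch are likely but not guaranteed.
    batch = client.messages.batches.create(
        requests=[
            {
                "custom_id": f"doc-{i}",
                "params": {
                    "model": "claude-3-5-sonnet-20241022",
                    "max_tokens": 1024,
                    "system": shared_system,
                    "messages": [
                        {"role": "user", "content": f"Summarize document {i}."}
                    ],
                },
            }
            for i in range(3)
        ],
    )

    print(batch.id, batch.processing_status)

So the discounts do stack, but the caching half is probabilistic: the first few requests to execute pay to write the cache, and whatever runs after them within the cache's lifetime reads from it.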