alach11
7 hours ago
It's great to see the arms race between OpenAI and Anthropic benefiting end users.
I have several use cases where I see a ~75% cost reduction from prompt caching. One thing that's not clear to me from reading this: does the Batch API discount stack with prompt caching?
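To make the stacking question concrete, here's a rough sketch of what "stacking" would mean for the bill. All the rates and token counts below are hypothetical placeholders, not published pricing; the point is just that a batch discount applied multiplicatively on top of cache-read pricing compounds the savings.

```python
# Hypothetical illustration of a batch discount stacking with prompt caching.
# Every number here is an assumption for the sake of the arithmetic.
BASE_PRICE_PER_MTOK = 3.00    # assumed base input price per million tokens
CACHE_READ_MULTIPLIER = 0.1   # assumed: cache hits billed at 10% of base
BATCH_MULTIPLIER = 0.5        # assumed: Batch API halves the per-token price

def cost(cached_tokens, fresh_tokens, batch=False):
    """Input cost in dollars for one request, under the assumed rates."""
    m = BATCH_MULTIPLIER if batch else 1.0
    per_tok = BASE_PRICE_PER_MTOK / 1_000_000
    billable = cached_tokens * CACHE_READ_MULTIPLIER + fresh_tokens
    return m * per_tok * billable

# 900k tokens served from cache, 100k fresh tokens
print(f"caching only:    ${cost(900_000, 100_000):.4f}")
print(f"caching + batch: ${cost(900_000, 100_000, batch=True):.4f}")
```

If the discounts stack multiplicatively like this, the batch run costs half of the already-cache-discounted price; if they don't stack, only one of the two discounts would apply.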