Introducing The Message Batches API

3 points, posted 7 hours ago
by davidbarker

2 Comments

alach11

7 hours ago

It's great to see the arms race between OpenAI and Anthropic benefiting end users.

I have several use cases where I see a ~75% cost reduction from prompt caching. It's not clear to me from reading this: does the Batch API discount stack with prompt caching?
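(For context, the two features are requested independently: prompt caching is a `cache_control` marker on a content block, while batching wraps whole Messages requests in a `requests` list. A minimal sketch of one batch entry that carries a caching marker, assuming the documented request shapes; the model name, context, and prompts below are placeholders, and whether the discounts stack is exactly the open question:)

```python
def build_batch_request(custom_id: str, shared_context: str, question: str) -> dict:
    """Build one entry for the `requests` list of a Messages Batch.

    The `cache_control` marker on the system block asks the API to cache
    that prefix, so requests sharing it can reuse the cached tokens.
    """
    return {
        "custom_id": custom_id,
        "params": {
            "model": "claude-3-5-sonnet-latest",  # placeholder model name
            "max_tokens": 1024,
            "system": [
                {
                    "type": "text",
                    "text": shared_context,
                    "cache_control": {"type": "ephemeral"},  # prompt-caching marker
                }
            ],
            "messages": [{"role": "user", "content": question}],
        },
    }


# Many questions against the same long shared prefix — the caching case.
requests = [
    build_batch_request(f"req-{i}", "Long shared document...", q)
    for i, q in enumerate(["Summarize it.", "List key dates."])
]
# These would then be submitted in one call, e.g.
# client.messages.batches.create(requests=requests)
```

If the discounts do stack, a batch like this would pay the batch rate on top of cache-read pricing for the shared prefix after the first request warms the cache.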

user

7 hours ago

[deleted]