Introducing The Message Batches API

3 points, posted a year ago
by davidbarker

3 Comments

alach11

a year ago

It's great to see the arms race between OpenAI and Anthropic benefitting end users.

I have several use cases where I see a ~75% cost reduction from prompt caching. It's not clear to me from reading this: does the Batch API stack with prompt caching?
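
(For context, a minimal sketch of what combining the two might look like with Anthropic's Python SDK: each batch request reuses a shared system prompt marked with cache_control. The model name, prompt text, and request contents are placeholders, and whether the batch discount actually stacks with cache-read pricing isn't answered in this thread.)

```python
import anthropic

client = anthropic.Anthropic()  # expects ANTHROPIC_API_KEY in the environment

# A large shared system prompt, marked as cacheable via cache_control.
# The prompt text and model name are illustrative placeholders.
shared_system = [
    {
        "type": "text",
        "text": "You are a document analyst. <long shared reference material here>",
        "cache_control": {"type": "ephemeral"},
    }
]

# Submit several requests in one batch; each reuses the cacheable system prompt.
batch = client.messages.batches.create(
    requests=[
        {
            "custom_id": f"doc-{i}",
            "params": {
                "model": "claude-3-5-sonnet-20241022",
                "max_tokens": 1024,
                "system": shared_system,
                "messages": [{"role": "user", "content": f"Summarize document {i}."}],
            },
        }
        for i in range(3)
    ]
)

print(batch.id, batch.processing_status)  # poll until processing_status == "ended"
```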

user

a year ago

[deleted]