Send many prompts to one AI model in a single run and get all results back in a clean list. Great for content teams, research groups, and operations that need fast, repeatable output with clear tracking by custom_id.
The flow can start from another workflow or from a sample path. It builds a requests array, sets the required anthropic-version header, and posts the batch to the Anthropic Messages Batches endpoint. A wait loop polls the batch status until processing ends. When finished, it downloads the results URL, reads each line of the JSONL file, and turns it into structured items. Results are split and can be routed by custom_id. Example paths show a single-prompt build and a chat-history build using simple memory nodes, so you can create batches from prior messages.
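The JSONL-parsing step can be sketched in Python. This is a minimal sketch, not the workflow's actual node logic: it assumes each results line is a JSON object with a custom_id and a result, which follows the Message Batches results format, and the sample line is illustrative only.

```python
import json

def parse_batch_results(jsonl_text: str) -> dict:
    """Turn downloaded JSONL results into a dict keyed by custom_id."""
    items = {}
    for line in jsonl_text.splitlines():
        if not line.strip():
            continue  # skip blank lines
        obj = json.loads(line)
        items[obj["custom_id"]] = obj["result"]
    return items

# Illustrative sample of one succeeded results line (not real API output).
sample = ('{"custom_id": "req-1", "result": {"type": "succeeded", '
          '"message": {"content": [{"type": "text", "text": "Hello"}]}}}')
results = parse_batch_results(sample)
```

Keying by custom_id is what makes the input-to-output mapping reliable: batch results are not guaranteed to come back in submission order.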
You need an Anthropic account and API key. Set the anthropic-version value to 2023-06-01, pick your model, and adjust the poll interval to balance speed and cost. Expect faster turnaround for large sets and a consistent mapping between inputs and outputs. Useful for bulk copy creation, batch Q&A, dataset tagging, and model evaluation runs. Setup is simple, and you can call it from any parent flow that passes an array of requests.