The Batch API lets you submit multiple chat completion requests as a batch for asynchronous processing. This is ideal for bulk workloads like dataset evaluation, content generation pipelines, or any scenario where you need to process many prompts efficiently.
```csharp
using Xai;

using var client = new XaiClient(apiKey);

// Step 1: Create a batch
var batch = await client.Batches.CreateBatchAsync(name: "my-batch");
Console.WriteLine($"Created batch: {batch.BatchId}");

// Step 2: Add requests
await client.Batches.AddBatchRequestsAsync(
    batchId: batch.BatchId!,
    batchRequests:
    [
        new BatchRequestItem
        {
            BatchRequestId = "req-1",
            BatchRequest = new
            {
                chat_get_completion = new
                {
                    model = "grok-3-mini",
                    messages = new[]
                    {
                        new { role = "user", content = "What is 2+2? Answer with just the number." },
                    },
                },
            },
        },
        new BatchRequestItem
        {
            BatchRequestId = "req-2",
            BatchRequest = new
            {
                chat_get_completion = new
                {
                    model = "grok-3-mini",
                    messages = new[]
                    {
                        new { role = "user", content = "What is 3+3? Answer with just the number." },
                    },
                },
            },
        },
    ]);

// Step 3: Poll for completion
while (true)
{
    await Task.Delay(TimeSpan.FromSeconds(10));
    var status = await client.Batches.GetBatchAsync(batch.BatchId!);
    if (status.State is { NumPending: 0 })
    {
        // Step 4: Get results
        var results = await client.Batches.GetBatchResultsAsync(batch.BatchId!);
        foreach (var item in results.Succeeded!)
        {
            Console.WriteLine($"{item.BatchRequestId}: completed");
        }
        break;
    }
    Console.WriteLine($"Pending: {status.State?.NumPending}");
}
```
Add one or more requests to the batch. Each request needs a unique `BatchRequestId` so results can be correlated back to their requests:
```csharp
var updated = await client.Batches.AddBatchRequestsAsync(
    batchId: batch.BatchId!,
    batchRequests:
    [
        new BatchRequestItem
        {
            BatchRequestId = "prompt-001",
            BatchRequest = new
            {
                chat_get_completion = new
                {
                    model = "grok-3-mini",
                    messages = new[]
                    {
                        new { role = "user", content = "Translate 'hello' to French." },
                    },
                },
            },
        },
    ]);
```
3. Monitor Progress
Poll the batch status to check how many requests are still pending:
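Building on the calls shown in the full example above, a minimal polling loop might look like the following sketch. It assumes the `client` and `batch` variables from the earlier steps are in scope; the 10-second interval is an arbitrary choice, not an SDK requirement:

```csharp
// Poll until no requests remain pending, then exit the loop
while (true)
{
    var status = await client.Batches.GetBatchAsync(batch.BatchId!);

    if (status.State is { NumPending: 0 })
    {
        break; // all requests finished; fetch results next
    }

    Console.WriteLine($"Pending: {status.State?.NumPending}");
    await Task.Delay(TimeSpan.FromSeconds(10)); // wait before polling again
}
```

For production workloads, consider adding a timeout or maximum retry count so the loop cannot run indefinitely if the batch stalls.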