Batch processing

Submit and track multiple video jobs

By YT2Text Team • Published January 20, 2025 • Updated February 23, 2026

Batch API

Batch endpoints are grouped under /api/v1/batch.

Endpoints

  • POST /api/v1/batch/process – enqueue a batch
  • GET /api/v1/batch/status/{batch_id} – get aggregate status
  • GET /api/v1/batch/results/{batch_id} – fetch detailed results
  • GET /api/v1/batch/ – list batches

Plan requirements

Batch processing is available on the Pro plan.

POST
/api/v1/batch/process

Request body

  • jobs: array of job request objects (video_url, summary_mode, optional webhook_url, custom_instructions, priority)

Example:

{
  "jobs": [
    {
      "video_url": "https://www.youtube.com/watch?v=dQw4w9WgXcQ",
      "summary_mode": "tldr"
    },
    {
      "video_url": "https://www.youtube.com/watch?v=oHg5SJYRHA0",
      "summary_mode": "brief"
    }
  ]
}
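The request body above can be assembled and validated client-side before sending. A minimal sketch, assuming only the field names documented in this section (the helper name and the validation rules are illustrative, not part of the API):

```python
# Build and validate a batch request body for POST /api/v1/batch/process.
# Field names come from the docs above; everything else is an assumption.

REQUIRED = {"video_url", "summary_mode"}
OPTIONAL = {"webhook_url", "custom_instructions", "priority"}

def build_batch_payload(jobs):
    """Check each job dict client-side and wrap the list in a batch body."""
    for i, job in enumerate(jobs):
        missing = REQUIRED - job.keys()
        if missing:
            raise ValueError(f"job {i} missing fields: {sorted(missing)}")
        unknown = job.keys() - REQUIRED - OPTIONAL
        if unknown:
            raise ValueError(f"job {i} has unknown fields: {sorted(unknown)}")
    return {"jobs": list(jobs)}

payload = build_batch_payload([
    {"video_url": "https://www.youtube.com/watch?v=dQw4w9WgXcQ",
     "summary_mode": "tldr"},
    {"video_url": "https://www.youtube.com/watch?v=oHg5SJYRHA0",
     "summary_mode": "brief"},
])
```

Catching a missing video_url or a typoed field name before the POST avoids burning a batch submission on a request the server would reject anyway.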

GET
/api/v1/batch/status/{batch_id}

Returns aggregate batch progress, including individual job summaries.

{
  "success": true,
  "data": {
    "batch_id": "batch_01",
    "status": "processing",
    "individual_jobs": [
      { "job_id": "job_1", "status": "completed" },
      { "job_id": "job_2", "status": "pending" }
    ]
  }
}
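A typical client polls this endpoint until the batch leaves the processing state. A sketch of that loop, with the HTTP call abstracted behind a callable so the logic stands alone; the terminal-status set is taken from the status values listed later in this document and is an assumption about which states are final:

```python
import time

# Assumed terminal states, per the status values documented below.
TERMINAL = {"completed", "failed", "partial_failure"}

def wait_for_batch(fetch_status, batch_id, interval=0.0, max_polls=10):
    """Poll until the batch reaches a terminal state.

    `fetch_status` stands in for a GET on /api/v1/batch/status/{batch_id};
    it should return the parsed `data` object from the response.
    """
    for _ in range(max_polls):
        data = fetch_status(batch_id)
        if data["status"] in TERMINAL:
            return data
        time.sleep(interval)
    raise TimeoutError(f"batch {batch_id} not finished after {max_polls} polls")

# Simulated responses: one "processing" poll, then "completed".
responses = iter([
    {"batch_id": "batch_01", "status": "processing"},
    {"batch_id": "batch_01", "status": "completed"},
])
result = wait_for_batch(lambda _id: next(responses), "batch_01")
```

In real use, replace the lambda with an authenticated GET and a polling interval of several seconds to stay within rate limits.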

GET
/api/v1/batch/results/{batch_id}

Returns results only when jobs have completed. The optional query parameter include_failed controls whether failed jobs are included.

{
  "success": true,
  "data": {
    "status": "partial_failure",
    "successful_jobs": [
      {
        "job_id": "job_1",
        "summaries": ["..."]
      }
    ],
    "failed_jobs": [
      {
        "job_id": "job_2",
        "error": {
          "code": "JOB_FAILED",
          "message": "Could not extract transcript"
        }
      }
    ],
    "summary_stats": {
      "total": 2,
      "successful": 1,
      "failed": 1
    }
  }
}

status can be completed, processing, failed, or partial_failure.

GET
/api/v1/batch/

Returns the batches submitted with the current API key, paginated via limit and offset query parameters.

{
  "success": true,
  "data": {
    "batches": [],
    "total": 0,
    "limit": 20,
    "offset": 0,
    "has_more": false
  }
}
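The limit/offset/has_more fields in the response suggest the usual pagination loop. A sketch, again with the HTTP call injected as a callable so it is self-contained (the generator name and the fake pages are illustrative):

```python
def iter_batches(fetch_page, limit=20):
    """Yield every batch across pages.

    `fetch_page(limit, offset)` stands in for a GET on /api/v1/batch/ and
    should return the parsed `data` object (batches, has_more, ...).
    """
    offset = 0
    while True:
        page = fetch_page(limit=limit, offset=offset)
        yield from page["batches"]
        if not page["has_more"]:
            break
        offset += limit

# Two simulated pages of results.
pages = [
    {"batches": [{"batch_id": "batch_01"}, {"batch_id": "batch_02"}],
     "total": 3, "limit": 2, "offset": 0, "has_more": True},
    {"batches": [{"batch_id": "batch_03"}],
     "total": 3, "limit": 2, "offset": 2, "has_more": False},
]
ids = [b["batch_id"] for b in iter_batches(lambda limit, offset: pages[offset // limit],
                                           limit=2)]
```

Stopping on has_more rather than on an empty page avoids one extra request at the end of the listing.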