Asynchronous Bulk Export API

Introduction

The Bulk Export API is designed for exporting large datasets that are too big to return in a single synchronous response.

It follows an asynchronous pattern: you first initiate an export job, then poll for its status, and finally download the resulting file.


Export Workflow

Step 1: Initiate an Export Job

  • Endpoint: POST /api/v1/idm/{firm-id}/[objects]/export

  • Request Body Parameters:

    • cursor: string - Optional. The nextCursor value returned by a previous export job; omit it for the first job.
  • Example Request:

    {
      "cursor": "..."
    }
  • Response (200 OK):

    {
      "requestId": "export-8f7g6h5e-4d3c-2b1a-9098-f7e6d5c4b3a2"
    }

Step 2: Check Job Status

  • Endpoint: GET /api/v1/request/{request-id}

  • Possible Statuses:

    • Pending: the request has been received but execution has not started
    • InProgress: the request is executing
    • Completed: the request finished successfully
    • Failed: the request failed
  • Example InProgress response:

    {
      "requestId": "export-8f7g6h5e-4d3c-2b1a-9098-f7e6d5c4b3a2",
      "status": "InProgress",
      "progress": {
        "processed": 123
      }
    }

  • Example Completed response

    {
      "requestId": "export-8f7g6h5e-4d3c-2b1a-9098-f7e6d5c4b3a2",
      "status": "Completed",
      "result": {
        "total": 100000,
        "fileUrl": "...",
        "nextCursor": "..."
      }
    }
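Because the job runs asynchronously, clients typically poll the status endpoint until they see a terminal status. A minimal polling loop, assuming you supply a `fetch_status` callable (a thin wrapper over your HTTP client that GETs /api/v1/request/{request-id} and returns the parsed JSON):

```python
import time

# Pending and InProgress are transient; these two end the job.
TERMINAL_STATUSES = {"Completed", "Failed"}


def poll_until_done(fetch_status, request_id, interval_s=5.0, max_polls=120):
    """Poll the status endpoint until the job reaches a terminal status.

    fetch_status(request_id) must return the parsed JSON status response.
    Raises TimeoutError if the job is still running after max_polls attempts.
    """
    for _ in range(max_polls):
        status = fetch_status(request_id)
        if status["status"] in TERMINAL_STATUSES:
            return status
        time.sleep(interval_s)
    raise TimeoutError(f"export {request_id} did not finish after {max_polls} polls")
```

The interval and poll cap are illustrative defaults; tune them to your job sizes.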

Step 3: Download the Exported File

Use the fileUrl from the completed job status response to download the file.
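Downloading can be done with any HTTP client; a stdlib-only sketch that streams the file to disk (it assumes fileUrl is directly fetchable, e.g. a pre-signed URL, so no auth headers are added):

```python
from urllib.request import urlopen


def download_export(file_url: str, dest_path: str) -> None:
    """Stream the exported file to dest_path in 64 KiB chunks."""
    with urlopen(file_url) as resp, open(dest_path, "wb") as out:
        while chunk := resp.read(1 << 16):
            out.write(chunk)
```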


Handle Large Exports with Cursors

If your export contains more objects than the job size limit (100,000), the Completed job response will include a nextCursor.

To get the next batch of data, initiate a new export job, passing this cursor value in the request body. Repeat until the nextCursor field is null.
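The whole workflow, chained across cursors, can be sketched as a driver loop. The three callables (`start_job`, `poll_job`, `download`) are assumed to be thin wrappers over the three endpoints above; their names are illustrative, not part of the API.

```python
def export_all(start_job, poll_job, download):
    """Drain a large export by chaining jobs with nextCursor.

    start_job(cursor) -> requestId for a new export job (cursor=None first time).
    poll_job(request_id) -> the Completed status payload for that job.
    download(file_url) -> fetches one exported file.
    Returns the list of fileUrls that were downloaded.
    """
    cursor = None
    files = []
    while True:
        request_id = start_job(cursor)
        result = poll_job(request_id)["result"]
        download(result["fileUrl"])
        files.append(result["fileUrl"])
        # A missing or null nextCursor means the final batch was exported.
        cursor = result.get("nextCursor")
        if cursor is None:
            return files
```

Because only one job per user may be Pending or InProgress at a time, each job must finish before the next is started, which this sequential loop respects.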


Limits

  • Export Job Size: 100,000 objects per job. A single export job will process a maximum of 100,000 objects; use cursors to retrieve more.
  • Export Concurrency: 1 active job per user. A user can have only one export job in a Pending or InProgress state at a time.