Linkit

Bulk Operations

Processing up to 1,000 records per request with partial-failure handling


Bulk endpoints let you create, update, or upsert up to 1,000 records in a single request. They are available for products, branches, SKUs, brands, generics, and orders. All bulk operations support partial success — individual failures don't roll back the entire batch.


Available Endpoints

| Resource | Endpoint | Max Items |
|---|---|---|
| Products | `POST /api/v1/products/bulk` | 1,000 |
| Branches | `POST /api/v1/branches/bulk` | 1,000 |
| SKUs | `POST /api/v1/skus/bulk` | 1,000 |
| Brands | `POST /api/v1/brands/bulk` | 1,000 |
| Generics | `POST /api/v1/generics/bulk` | 1,000 |
| Orders (create) | `POST /api/v1/orders/bulk` | 1,000 |
| Orders (status) | `PATCH /api/v1/orders/bulk/status` | 1,000 |
| Orders (delete) | `DELETE /api/v1/orders/bulk` | 1,000 |

Operation Modes

| Mode | Behavior |
|---|---|
| `create` | Insert only; fails if the record already exists |
| `update` | Update only; fails if the record is not found |
| `upsert` | Create new records, update existing ones |

Use upsert for most sync workflows. Use create when you need to catch duplicates explicitly.
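As a sketch, a small client-side helper can assemble the request body and reject unknown modes before anything is sent (the helper name and validation are illustrative, not part of the API; the server only sees the JSON body):

```python
VALID_MODES = {"create", "update", "upsert"}

def build_bulk_request(mode, products):
    """Assemble a /products/bulk payload, rejecting unknown modes early.

    Hypothetical client-side helper; the API itself only validates the JSON body.
    """
    if mode not in VALID_MODES:
        raise ValueError(f"mode must be one of {sorted(VALID_MODES)}, got {mode!r}")
    return {"mode": mode, "products": list(products)}
```

Failing fast on a typo like `"upssert"` is cheaper than discovering it via a batch of 1,000 rejected records.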


Request Structure

Products

```json
{
  "mode": "upsert",
  "products": [
    {
      "iv_id": "PROD-001",
      "name_en": "Product One",
      "average_price": 49.99,
      "is_enabled": true
    }
  ]
}
```

Branches

```json
{
  "mode": "upsert",
  "branches": [
    {
      "iv_id": "STORE-001",
      "name_en": "Main Store",
      "active": true,
      "location": { "lat": 24.7136, "lon": 46.6753 }
    }
  ]
}
```

SKUs

```json
{
  "mode": "upsert",
  "skus": [
    {
      "iv_id": "SKU-001",
      "branch_iv_id": "STORE-001",
      "product_iv_id": "PROD-001",
      "qty": 100,
      "price": 49.99,
      "available": true
    }
  ]
}
```

Orders (Bulk Create)

```json
[
  {
    "source": "import",
    "customer_name": "John Doe",
    "products": [
      { "product_id": "prd_123", "name": "Widget", "quantity": 2, "price": 29.99 }
    ],
    "total_amount": 59.98,
    "currency": "SAR"
  }
]
```

Response Structure

All bulk operations return a consistent response:

```json
{
  "success": true,
  "message": "Processed 100 products successfully",
  "data": {
    "succeeded": 98,
    "failed": 2,
    "errors": {
      "PROD-050": "Invalid brand_id: brand_xyz does not exist",
      "PROD-075": "Duplicate barcode: 1234567890123"
    }
  },
  "timestamp": "2024-01-15T10:30:00Z"
}
```
| Field | Description |
|---|---|
| `succeeded` | Count of records processed successfully |
| `failed` | Count of records that failed |
| `errors` | Map of record identifier to error message |

Partial success is expected. 98/100 succeeding is a normal sync result. Review the errors map and retry the failures.
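One way to act on that map is to filter the original batch down to just the records whose identifiers appear in `errors`, fix them up, and resend. A minimal sketch (pure helper, no network calls; assumes each record carries the same `iv_id` used as the error key):

```python
def select_failed(products, errors):
    """Return the subset of a sent batch whose iv_id appears in the
    errors map from a bulk response, ready to be fixed and retried."""
    failed_ids = set(errors)
    return [p for p in products if p["iv_id"] in failed_ids]
```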


Example: Full Product Sync

```javascript
async function syncProducts(externalProducts) {
  const BATCH_SIZE = 1000;
  const results = { total: 0, succeeded: 0, failed: 0, errors: {} };

  for (let i = 0; i < externalProducts.length; i += BATCH_SIZE) {
    const batch = externalProducts.slice(i, i + BATCH_SIZE);

    // Map external fields onto the bulk-products schema
    const products = batch.map(p => ({
      iv_id: p.external_id,
      name_en: p.title,
      average_price: p.price,
      is_enabled: p.active
    }));

    const response = await fetch('https://linkit.works/api/v1/products/bulk', {
      method: 'POST',
      headers: {
        'Authorization': 'Bearer your_token',
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({ mode: 'upsert', products })
    });

    // Accumulate per-batch counts and merge the errors map
    const data = await response.json();
    results.total += batch.length;
    results.succeeded += data.data.succeeded;
    results.failed += data.data.failed;
    Object.assign(results.errors, data.data.errors);
  }

  return results;
}
```
```python
import requests

def sync_products(external_products, token):
    BATCH_SIZE = 1000
    results = {'total': 0, 'succeeded': 0, 'failed': 0, 'errors': {}}

    for i in range(0, len(external_products), BATCH_SIZE):
        batch = external_products[i:i + BATCH_SIZE]

        # Map external fields onto the bulk-products schema
        products = [{
            'iv_id': p['external_id'],
            'name_en': p['title'],
            'average_price': p['price'],
            'is_enabled': p.get('active', True)
        } for p in batch]

        response = requests.post(
            'https://linkit.works/api/v1/products/bulk',
            headers={'Authorization': f'Bearer {token}'},
            json={'mode': 'upsert', 'products': products},
            timeout=60
        )

        # Accumulate per-batch counts and merge the errors map
        data = response.json()
        results['total'] += len(batch)
        results['succeeded'] += data['data']['succeeded']
        results['failed'] += data['data']['failed']
        results['errors'].update(data['data'].get('errors', {}))

    return results
```

Best Practices

  • Use batches of 500 for reliability, even though the limit is 1,000.
  • Add delay between batches (e.g., 500ms) to avoid rate limiting.
  • Implement retry with exponential backoff for transient failures.
  • Validate locally before sending to catch obvious issues (missing iv_id, negative prices).
  • Always inspect the errors map — don't just check success: true.

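The batching, pacing, and backoff advice above can be sketched as follows. The helper names and the injected `send` callable are illustrative; `send` stands in for the real HTTP call so the sketch stays transport-agnostic:

```python
import time

def chunks(items, size=500):
    """Yield successive batches of at most `size` items (500 per the guidance above)."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def backoff_delays(base=0.5, retries=5):
    """Exponential backoff schedule in seconds: 0.5, 1, 2, 4, 8 for the defaults."""
    return [base * (2 ** attempt) for attempt in range(retries)]

def send_batches(batches, send, pause=0.5):
    """Send each batch via `send`, pausing between batches and retrying a
    failed send with exponential backoff before giving up."""
    results = []
    for batch in batches:
        for delay in backoff_delays():
            try:
                results.append(send(batch))
                break
            except Exception:
                time.sleep(delay)  # transient failure: back off, then retry
        else:
            raise RuntimeError("batch still failing after all retries")
        time.sleep(pause)  # spacing between batches to stay under rate limits
    return results
```

Injecting `send` also makes the loop trivially testable with a stub in place of the network.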
Rate Limits

| Operation | Limit |
|---|---|
| Bulk create/update | 100 requests/min |
| Bulk status update | 100 requests/min |
| Bulk delete | 60 requests/min |

Error Reference

| Error | Meaning | Fix |
|---|---|---|
| Duplicate `iv_id` | Record already exists (`create` mode) | Use `upsert` mode |
| Not found | Record missing (`update` mode) | Use `upsert` mode |
| Invalid reference | Foreign key doesn't exist | Create the referenced record first |
| Validation failed | Field constraints not met | Check field requirements |
| Payload too large | Too many items | Split into smaller batches |
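For "Invalid reference" in particular, syncing resources in dependency order avoids most failures. The ordering below is inferred from the request structures above (SKUs reference both `product_iv_id` and `branch_iv_id`, and orders reference products), so treat it as a sketch rather than an official sequence:

```python
# Parents before children: SKUs reference products and branches,
# and orders reference products, so sync in this order.
SYNC_ORDER = ["brands", "generics", "products", "branches", "skus", "orders"]

def ordered_resources(resources):
    """Return the given resource names sorted into a safe sync order."""
    rank = {name: i for i, name in enumerate(SYNC_ORDER)}
    return sorted(resources, key=lambda r: rank[r])
```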