Bulk Operations
Processing up to 1,000 records per request with partial-failure handling
Bulk endpoints let you create, update, or upsert up to 1,000 records in a single request. They are available for products, branches, SKUs, brands, generics, and orders. All bulk operations support partial success — individual failures don't roll back the entire batch.
Available Endpoints
| Resource | Endpoint | Max Items |
|---|---|---|
| Products | POST /api/v1/products/bulk | 1,000 |
| Branches | POST /api/v1/branches/bulk | 1,000 |
| SKUs | POST /api/v1/skus/bulk | 1,000 |
| Brands | POST /api/v1/brands/bulk | 1,000 |
| Generics | POST /api/v1/generics/bulk | 1,000 |
| Orders (create) | POST /api/v1/orders/bulk | 1,000 |
| Orders (status) | PATCH /api/v1/orders/bulk/status | 1,000 |
| Orders (delete) | DELETE /api/v1/orders/bulk | 1,000 |
Operation Modes
| Mode | Behavior |
|---|---|
| `create` | Insert only — fails if the record already exists |
| `update` | Update only — fails if the record is not found |
| `upsert` | Create new records, update existing ones |
Use `upsert` for most sync workflows. Use `create` when you need to catch duplicates explicitly.
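The three modes can be illustrated with a small in-memory sketch. This is not the server's actual logic, just the semantics from the table above applied to a dict keyed by `iv_id`:

```python
def apply_bulk(store, records, mode):
    """Apply bulk-mode semantics to a dict keyed by iv_id.

    Returns (succeeded, errors) shaped like a bulk response.
    Illustrative only -- not the API's implementation.
    """
    succeeded, errors = 0, {}
    for rec in records:
        key = rec["iv_id"]
        exists = key in store
        if mode == "create" and exists:
            errors[key] = "Duplicate iv_id"   # create: insert only
        elif mode == "update" and not exists:
            errors[key] = "Not found"         # update: existing records only
        else:
            store[key] = rec                  # upsert always writes
            succeeded += 1
    return succeeded, errors
```

Note how `upsert` never produces the "Duplicate iv_id" or "Not found" failures that the stricter modes exist to surface.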
Request Structure
Products

```json
{
  "mode": "upsert",
  "products": [
    {
      "iv_id": "PROD-001",
      "name_en": "Product One",
      "average_price": 49.99,
      "is_enabled": true
    }
  ]
}
```

Branches
```json
{
  "mode": "upsert",
  "branches": [
    {
      "iv_id": "STORE-001",
      "name_en": "Main Store",
      "active": true,
      "location": { "lat": 24.7136, "lon": 46.6753 }
    }
  ]
}
```

SKUs
```json
{
  "mode": "upsert",
  "skus": [
    {
      "iv_id": "SKU-001",
      "branch_iv_id": "STORE-001",
      "product_iv_id": "PROD-001",
      "qty": 100,
      "price": 49.99,
      "available": true
    }
  ]
}
```

Orders (Bulk Create)
```json
[
  {
    "source": "import",
    "customer_name": "John Doe",
    "products": [
      { "product_id": "prd_123", "name": "Widget", "quantity": 2, "price": 29.99 }
    ],
    "total_amount": 59.98,
    "currency": "SAR"
  }
]
```

Response Structure
All bulk operations return a consistent response:
```json
{
  "success": true,
  "message": "Processed 100 products successfully",
  "data": {
    "succeeded": 98,
    "failed": 2,
    "errors": {
      "PROD-050": "Invalid brand_id: brand_xyz does not exist",
      "PROD-075": "Duplicate barcode: 1234567890123"
    }
  },
  "timestamp": "2024-01-15T10:30:00Z"
}
```

| Field | Description |
|---|---|
| `succeeded` | Records processed successfully |
| `failed` | Records that failed |
| `errors` | Map of identifier to error message |
Partial success is expected. 98/100 succeeding is a normal sync result. Review the errors map and retry the failures.
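One way to act on the `errors` map is to pull the failed records out of the batch you just sent, ready for a follow-up attempt. A minimal sketch, assuming each record's `iv_id` matches the keys in the `errors` map:

```python
def failed_records(batch, response):
    """Return the subset of a sent batch whose iv_id appears in the
    bulk response's errors map, ready to be retried or logged."""
    errors = response.get("data", {}).get("errors", {}) or {}
    return [rec for rec in batch if rec["iv_id"] in errors]
```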
Example: Full Product Sync
```javascript
async function syncProducts(externalProducts) {
  const BATCH_SIZE = 1000;
  const results = { total: 0, succeeded: 0, failed: 0, errors: {} };

  for (let i = 0; i < externalProducts.length; i += BATCH_SIZE) {
    const batch = externalProducts.slice(i, i + BATCH_SIZE);

    // Map external fields onto the bulk product shape
    const products = batch.map(p => ({
      iv_id: p.external_id,
      name_en: p.title,
      average_price: p.price,
      is_enabled: p.active
    }));

    const response = await fetch('https://linkit.works/api/v1/products/bulk', {
      method: 'POST',
      headers: {
        'Authorization': 'Bearer your_token',
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({ mode: 'upsert', products })
    });

    const data = await response.json();
    results.total += batch.length;
    results.succeeded += data.data.succeeded;
    results.failed += data.data.failed;
    Object.assign(results.errors, data.data.errors || {});
  }

  return results;
}
```

```python
import requests

def sync_products(external_products, token):
    BATCH_SIZE = 1000
    results = {'total': 0, 'succeeded': 0, 'failed': 0, 'errors': {}}

    for i in range(0, len(external_products), BATCH_SIZE):
        batch = external_products[i:i + BATCH_SIZE]

        # Map external fields onto the bulk product shape
        products = [{
            'iv_id': p['external_id'],
            'name_en': p['title'],
            'average_price': p['price'],
            'is_enabled': p.get('active', True)
        } for p in batch]

        response = requests.post(
            'https://linkit.works/api/v1/products/bulk',
            headers={'Authorization': f'Bearer {token}'},
            json={'mode': 'upsert', 'products': products},
            timeout=60
        )
        data = response.json()

        results['total'] += len(batch)
        results['succeeded'] += data['data']['succeeded']
        results['failed'] += data['data']['failed']
        results['errors'].update(data['data'].get('errors', {}))

    return results
```

Best Practices
- Use batches of 500 for reliability, even though the limit is 1,000.
- Add a delay between batches (e.g., 500 ms) to avoid rate limiting.
- Implement retry with exponential backoff for transient failures.
- Validate locally before sending to catch obvious issues (missing `iv_id`, negative prices).
- Always inspect the `errors` map — don't just check `success: true`.
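The local-validation bullet might look like this in practice. A sketch only: the specific checks are assumptions for illustration, not the API's full validation rules:

```python
def validate_product(p):
    """Return a list of problems found locally before sending.
    Illustrative checks only -- not the API's full rule set."""
    problems = []
    if not p.get("iv_id"):
        problems.append("missing iv_id")
    if not p.get("name_en"):
        problems.append("missing name_en")
    if p.get("average_price") is not None and p["average_price"] < 0:
        problems.append("negative price")
    return problems
```

Filtering out records with a non-empty problem list before the request keeps obviously bad rows out of your `failed` count.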
Rate Limits
| Operation | Limit |
|---|---|
| Bulk create/update | 100 requests/min |
| Bulk status update | 100 requests/min |
| Bulk delete | 60 requests/min |
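To stay under these limits client-side, one simple approach is to pace requests to the per-minute budget. A sketch, assuming you only need a lower bound on spacing between calls:

```python
import time

def paced(requests_per_minute):
    """Return a wait() function that sleeps just enough to keep
    successive calls under the given per-minute budget."""
    min_interval = 60.0 / requests_per_minute
    last = [0.0]  # mutable cell so wait() can update it

    def wait():
        now = time.monotonic()
        remaining = min_interval - (now - last[0])
        if remaining > 0:
            time.sleep(remaining)
        last[0] = time.monotonic()

    return wait
```

Call `wait()` immediately before each bulk request; for the 100 requests/min endpoints, `paced(100)` spaces calls at least 600 ms apart.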
Error Reference
| Error | Meaning | Fix |
|---|---|---|
| `Duplicate iv_id` | Record exists (create mode) | Use upsert mode |
| `Not found` | Record missing (update mode) | Use upsert mode |
| `Invalid reference` | Foreign key doesn't exist | Create the referenced record first |
| `Validation failed` | Field constraints not met | Check field requirements |
| `Payload too large` | Too many items | Split into smaller batches |
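When automating retries, it can help to map each entry in the `errors` map to a suggested action from the table above. A sketch that matches on message substrings; the exact server wording may differ, so treat these patterns as assumptions:

```python
def classify_error(message):
    """Map a bulk error message to a suggested next action,
    following the error reference table. Substring matching is
    an assumption about message wording."""
    m = message.lower()
    if "duplicate" in m or "not found" in m:
        return "switch-to-upsert"
    if "invalid" in m and "exist" in m:
        return "create-dependency-first"
    if "too large" in m:
        return "split-batch"
    if "validation" in m:
        return "fix-fields"
    return "unknown"
```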