Oracle ERP Cloud Rate Limits Deep Dive: Concurrent Requests, UCM Uploads, ESS Job Limits

Type: ERP Integration | System: Oracle Fusion Cloud ERP (Release 24B / 25A) | Confidence: 0.86 | Sources: 7 | Verified: 2026-03-09 | Freshness: 2026-03-09

TL;DR

System Profile

This card covers Oracle Fusion Cloud ERP (also marketed as Oracle Cloud ERP, part of Oracle Fusion Cloud Applications Suite) across releases 24B and 25A. The rate limits documented here apply to all SaaS-hosted Oracle Fusion Cloud ERP instances including Financials, Procurement, Project Management, and Supply Chain modules. On-premise Oracle E-Business Suite has entirely different limits and is not covered.

| Property | Value |
|---|---|
| Vendor | Oracle |
| System | Oracle Fusion Cloud ERP (Release 24B / 25A) |
| API Surface | REST, SOAP, FBDI, BIP, ESS |
| Current API Version | Release 25A (quarterly release cycle) |
| Editions Covered | All SaaS editions |
| Deployment | Cloud (Oracle-managed SaaS) |
| API Docs | Oracle Fusion Cloud REST API Documentation |
| Status | GA |

API Surfaces & Capabilities

Oracle Fusion Cloud ERP exposes multiple integration surfaces, each with distinct rate limits and optimal use cases. Choosing the wrong surface is the most common integration design mistake.

| API Surface | Protocol | Best For | Max Records/Request | Rate Limit | Real-time? | Bulk? |
|---|---|---|---|---|---|---|
| REST API | HTTPS/JSON | Individual record CRUD, queries <500 records | 499 (GET), 500 (POST) | Identity domain tier | Yes | No |
| SOAP Web Services | HTTPS/XML | Legacy integrations, metadata operations | Varies by service | Shared with REST | Yes | No |
| FBDI | CSV via UCM | Bulk data loads >500 records, migrations | 250 MB / 500K records per file | ESS queue-based | No | Yes |
| BIP Reports | SOAP/REST | Data extraction, report generation | N/A (report-based) | ESS queue-based | No | Partial |
| ESS (Enterprise Scheduler) | REST/UI | Scheduled batch processes, import jobs | N/A (job-based) | Thread pool limited | No | Yes |
| Business Events | REST/Webhook | Event-driven outbound notifications | N/A (event-based) | Publisher throughput | Yes | N/A |

Rate Limits & Quotas

Per-Request Limits

| Limit Type | Value | Applies To | Notes |
|---|---|---|---|
| Max records per GET query | 499 | REST API | Use offset + limit parameters for pagination; hasMore attribute indicates remaining records |
| Max records per POST | 500 | REST API | Split larger payloads; use FBDI for bulk operations |
| Max request body size | 1 MB | REST API payload | Applies to single request body; attachments handled separately via UCM |
| Max bulk operation items | 50 | REST API bulk/batch | Per-call limit for composite operations |
| Max FBDI file size | 250 MB | FBDI upload to UCM | Split files exceeding this limit before upload |
| UCM upload timeout | 300 seconds | File transfer to UCM | Timeout applies to the upload operation; large files over slow connections will fail |
| Max records per FBDI file (low-volume) | 50,000 | Low-volume import mode | Per-file limit; use high-volume mode for more |
| Max records per FBDI file (high-volume) | 500,000 | High-volume import mode | Per-file limit; split larger datasets across files |

Rolling / Daily Limits

Oracle Fusion Cloud ERP rate limits are enforced at the identity domain level, not per-application. All REST and SOAP calls from all integrations targeting the same Fusion environment share a single rate limit pool.

| Limit Type | Value | Window | Tier Differences |
|---|---|---|---|
| Authentication requests | 150–4,500 req/min | Per minute | Free: 150, Oracle Apps: 1,000, Premium: 4,500 |
| Token management requests | 150–5,000 req/min | Per minute | Free: 150, Oracle Apps: 1,500, Premium: 5,000 |
| Other API requests | 150–5,000 req/min | Per minute | Free: 150, Oracle Apps: 1,500, Premium: 5,000 |
| Bulk API requests | 200+ req/min | Per minute | Free: 200; higher tiers scale accordingly |
| Max CSV files per import job | 20 | Per job | REST service importActivities limit |
| Max parallel import batches | 10 | Concurrent | Recommended maximum to avoid performance degradation |
| Max records per import submission | 10,000,000 | Per submission | 10 jobs × 20 files × 50K records theoretical max |
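Because every integration draws on the same per-minute pool, a client-side pacer that budgets each job a slice of the domain limit keeps one runaway loop from starving the others. A minimal sketch — the tier figure, client count, and safety factor below are illustrative assumptions, not Oracle-published values:

```python
def pacing_interval(domain_limit_per_min, n_clients, safety=0.8):
    """Seconds to sleep between requests so n_clients together stay
    under a shared per-minute identity-domain budget.

    safety < 1 leaves headroom for interactive users and retries.
    """
    per_client_per_min = (domain_limit_per_min * safety) / n_clients
    return 60.0 / per_client_per_min

# Example: 3 integrations sharing an Oracle Apps tier (1,500 req/min)
# with 20% headroom -> each client paces at one request per 0.15s.
interval = pacing_interval(1500, 3)
```

Calling `time.sleep(interval)` between requests in each client then bounds the aggregate rate below the shared limit even before any 429 handling kicks in.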

ESS (Enterprise Scheduler) Limits

ESS jobs run server-side and are controlled by thread pools and work assignments rather than traditional API rate limits. These limits determine how many scheduled processes can run concurrently.

| Limit Type | Default Value | Notes |
|---|---|---|
| Synchronous Java job threads | 5–25 per JMS Processor | Configurable; determines max concurrent RUNNING synchronous jobs |
| Default threads per JMS Processor | 5 | Can be increased to 15+ via Scheduler Configuration, but Oracle-managed in SaaS |
| Recommended jobs per job set | 15–20 | Oracle recommends limiting for troubleshooting simplicity |
| Asynchronous job concurrency | Work-assignment defined | Controlled via workshifts and work assignments |

Authentication

| Flow | Use When | Token Lifetime | Refresh? | Notes |
|---|---|---|---|---|
| OAuth 2.0 JWT Assertion | Server-to-server integrations | Configurable (default ~1h) | New JWT per request | Recommended for unattended integrations |
| SAML 2.0 Federation | SSO-based user-context operations | Session-based | Via IdP | Requires SAML IdP configuration |
| Basic Auth (Username/Password) | Development, testing, simple scripts | Session timeout | No | Not recommended for production; no MFA support |
| OAuth 2.0 Authorization Code | User-delegated access | Access: ~1h, Refresh: long-lived | Yes | For applications acting on behalf of a user |
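For the JWT-assertion flow, the assertion is a signed token whose claims identify the client to the identity domain. A hedged sketch of the claim set — exact iss/sub/aud values and lifetimes depend on your IDCS configuration, and the signing step (e.g. PyJWT with a registered RS256 key) is shown only as a comment:

```python
import time

def build_jwt_claims(client_id, token_endpoint_audience, lifetime_s=300):
    """Claims for an OAuth 2.0 JWT-bearer assertion.

    Values here are illustrative; your IDCS domain dictates the
    exact issuer, subject, and audience strings.
    """
    now = int(time.time())
    return {
        "iss": client_id,                # issuer: the client itself
        "sub": client_id,                # subject: the client itself
        "aud": token_endpoint_audience,  # e.g. the IDCS token endpoint URL
        "iat": now,
        "exp": now + lifetime_s,         # keep assertions short-lived
    }

# Signing (assumption: PyJWT installed, key pair registered with IDCS):
# assertion = jwt.encode(build_jwt_claims(cid, aud), private_key,
#                        algorithm="RS256")
```

The resulting assertion string is what step 1 of the integration guide passes as the `assertion` form field.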

Authentication Gotchas

Constraints

Integration Pattern Decision Tree

START - Need to integrate with Oracle Fusion Cloud ERP
|-- What's the integration pattern?
|   |-- Real-time (individual records, <1s)
|   |   |-- Data volume < 500 records/operation?
|   |   |   |-- YES -> REST API (JSON, straightforward CRUD)
|   |   |   +-- NO -> NOT supported real-time; switch to batch/FBDI
|   |   +-- Need event notifications?
|   |       |-- YES -> Business Events (outbound) + REST callback
|   |       +-- NO -> REST API polling with hasMore pagination
|   |-- Batch/Bulk (scheduled, high volume)
|   |   |-- Data volume < 500 records?
|   |   |   |-- YES -> REST API POST (simpler, no file overhead)
|   |   |   +-- NO (go down)
|   |   |-- Data volume < 50,000 records?
|   |   |   |-- YES -> FBDI low-volume mode (single file)
|   |   |   +-- NO (go down)
|   |   |-- Data volume < 500,000 records?
|   |   |   |-- YES -> FBDI high-volume mode (single file)
|   |   |   +-- NO -> FBDI with file splitting (max 20 files/job, 10 parallel jobs)
|   |   +-- Need data extraction (outbound)?
|   |       |-- Small dataset -> REST API with pagination
|   |       +-- Large dataset -> BIP report extraction (BI Publisher)
|   |-- Event-driven
|   |   |-- Oracle publishes events?
|   |   |   |-- YES -> Business Events subscription
|   |   |   +-- NO -> BIP report polling or REST polling (scheduled)
|   |   +-- Guaranteed delivery needed?
|   |       |-- YES -> Middleware (OIC) with error handling + retry
|   |       +-- NO -> Direct REST webhook receiver
|   +-- File-based (CSV/XML)
|       +-- FBDI (standard Oracle pattern) -> Upload CSV to UCM -> Submit ESS import job
|-- Which direction?
|   |-- Inbound (writing to ERP) -> REST (<500 records) or FBDI (>500 records)
|   |-- Outbound (reading from ERP) -> REST API (small) or BIP extraction (large)
|   +-- Bidirectional -> Design conflict resolution FIRST; use REST for real-time sync
+-- Error tolerance?
    |-- Zero-loss required -> Middleware (OIC/MuleSoft) + dead letter queue + FBDI error callbacks
    +-- Best-effort acceptable -> Direct REST with exponential backoff retry
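The FBDI branches of the tree reduce to simple arithmetic: how many files and ESS jobs a dataset needs under the 500K-records-per-file and 20-files-per-job limits documented above. A small planning helper (limits taken from this card's tables; defaults assume high-volume mode):

```python
import math

def fbdi_split_plan(total_records, per_file=500_000, files_per_job=20):
    """Return (files, jobs) needed to load total_records via FBDI,
    given per-file record and per-job file limits."""
    files = math.ceil(total_records / per_file)
    jobs = math.ceil(files / files_per_job)
    return files, jobs
```

For example, 12 million records split into 24 files across 2 import jobs; anything under 500K fits in a single file and job.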

Quick Reference

| Operation | API Surface | Method / Approach | Max Records | Timeout | Rate Limited By |
|---|---|---|---|---|---|
| Query records | REST | GET with offset/limit | 499 per page | Standard HTTP | Identity domain tier |
| Create single record | REST | POST | 1 | Standard HTTP | Identity domain tier |
| Create batch (small) | REST | POST with payload | 500 | Standard HTTP | Identity domain tier |
| Bulk import (inbound) | FBDI | CSV upload to UCM + ESS job | 500K per file | 300s upload + ESS queue | ESS thread pool |
| Bulk extract (outbound) | BIP | Report execution via ESS | Report-dependent | ESS queue | ESS thread pool |
| Schedule process | ESS | REST /scheduledProcesses endpoint | N/A | Job-dependent | Thread pool (5–25 threads) |
| Upload file | UCM | ERP Integration web service | 250 MB max | 300 seconds | File server concurrency |
| Real-time event | Business Events | Event subscription | N/A | Event-driven | Publisher throughput |
| SOAP operation | SOAP WS | Standard SOAP envelope | Service-dependent | Standard HTTP | Shared identity domain pool |
| Composite operation | REST | Batch endpoint | 50 items | Standard HTTP | Identity domain tier |

Step-by-Step Integration Guide

1. Authenticate and obtain access token

Obtain an OAuth 2.0 access token using JWT assertion for server-to-server integration. [src3]

# Exchange JWT for access token
curl -X POST "https://{your-idcs-domain}/oauth2/v1/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=urn:ietf:params:oauth:grant-type:jwt-bearer" \
  -d "assertion={your-jwt-assertion}" \
  -d "scope=urn:opc:idm:__myscopes__"

Verify: Response contains access_token field with token_type: "Bearer".

2. Query records with pagination (respecting 499-record limit)

Oracle REST API returns a maximum of 499 records per GET request. Use the offset and limit parameters to paginate. [src3]

# First page
curl -X GET "https://{host}/fscmRestApi/resources/v1/invoices?limit=499&offset=0" \
  -H "Authorization: Bearer {access_token}"

# Check response for hasMore: true, then increment offset
curl -X GET "https://{host}/fscmRestApi/resources/v1/invoices?limit=499&offset=499" \
  -H "Authorization: Bearer {access_token}"

Verify: Response JSON contains hasMore: false when all records retrieved.

3. Upload FBDI file to UCM for bulk import

For bulk loads exceeding 500 records, upload a CSV file to UCM via the ERP Integration web service. [src4]

curl -X POST "https://{host}/fscmRestApi/resources/v1/erpintegrations" \
  -H "Authorization: Bearer {access_token}" \
  -H "Content-Type: application/json" \
  -d '{
    "OperationName": "uploadFileToUCM",
    "DocumentContent": "{base64-encoded-zip-file}",
    "DocumentAccount": "fin$/journal$/import$",
    "ContentType": "zip",
    "FileName": "JournalImport.zip"
  }'

Verify: Response returns DocumentId confirming successful upload.
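The DocumentContent field expects the base64 encoding of the zipped CSV. A stdlib-only sketch of that packaging step — the inner file name is illustrative, since each FBDI template dictates its own required file name:

```python
import base64
import io
import zipfile

def csv_to_fbdi_payload(csv_text, inner_name="GlInterface.csv"):
    """Zip a CSV in memory and return the base64 string to place in
    the uploadFileToUCM operation's DocumentContent field."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(inner_name, csv_text)
    return base64.b64encode(buf.getvalue()).decode("ascii")
```

Building the zip in memory avoids temp-file cleanup and makes it easy to check the 250 MB file-size limit (`len(buf.getvalue())`) before uploading.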

4. Submit ESS import job and monitor status

After uploading the FBDI file, submit an ESS scheduled process to execute the import. [src7]

# Submit import job
curl -X POST "https://{host}/fscmRestApi/resources/v1/erpintegrations" \
  -H "Authorization: Bearer {access_token}" \
  -H "Content-Type: application/json" \
  -d '{
    "OperationName": "submitESSJobRequest",
    "JobPackageName": "/oracle/apps/ess/financials/generalLedger/programs/",
    "JobDefName": "JournalImportLauncher",
    "ESSParameters": "300000004976094,1,#NULL"
  }'

# Monitor job status
curl -X GET "https://{host}/fscmRestApi/resources/v1/erpintegrations?finder=ESSJobStatusRF;requestId={job_id}" \
  -H "Authorization: Bearer {access_token}"

Verify: Response RequestStatus transitions from RUNNING to SUCCEEDED.
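The status-monitoring loop is easiest to test when the HTTP call is injected as a callable. A sketch — terminal status names follow the Verify note above, and the poll interval and timeout are assumptions to tune per job:

```python
import time

def wait_for_ess_job(fetch_status, poll_interval=15, timeout=3600):
    """Poll fetch_status() -- a callable returning the job's
    RequestStatus string -- until it reaches a terminal state."""
    terminal = {"SUCCEEDED", "ERROR", "WARNING", "CANCELLED"}
    deadline = time.time() + timeout
    while time.time() < deadline:
        status = fetch_status()
        if status in terminal:
            return status
        time.sleep(poll_interval)
    raise TimeoutError("ESS job still not terminal after %ss" % timeout)
```

In production, `fetch_status` would wrap the GET on the ESSJobStatusRF finder shown above; in tests it can be any stub that yields RUNNING then SUCCEEDED.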

5. Implement rate limit handling with exponential backoff

When you hit the identity domain rate limit, Oracle returns HTTP 429. Implement exponential backoff starting at 2 seconds. [src1, src5]

import time
import requests

def call_oracle_api(url, headers, max_retries=5):
    for attempt in range(max_retries):
        response = requests.get(url, headers=headers)
        if response.status_code == 429:
            wait = min(2 ** attempt, 60)  # max 60s
            time.sleep(wait)
            continue
        elif response.status_code == 503:
            wait = min(2 ** (attempt + 1), 120)
            time.sleep(wait)
            continue
        return response
    raise Exception("Max retries exceeded for Oracle API")

Verify: Function returns valid response without 429 errors after retry sequence.

Code Examples

Python: Paginated data extraction with rate limit handling

# Input:  Oracle Fusion REST API endpoint, OAuth token
# Output: Complete list of all matching records across all pages

import requests
import time

def extract_all_records(base_url, endpoint, token, query_params=None):
    headers = {"Authorization": f"Bearer {token}",
               "Content-Type": "application/json"}
    all_records = []
    offset = 0
    limit = 499  # Oracle max per page
    has_more = True

    while has_more:
        url = f"{base_url}/fscmRestApi/resources/v1/{endpoint}"
        params = {"offset": offset, "limit": limit}
        if query_params:
            params.update(query_params)

        for attempt in range(5):
            resp = requests.get(url, headers=headers, params=params)
            if resp.status_code == 429:
                time.sleep(min(2 ** attempt, 60))
                continue
            resp.raise_for_status()
            break
        else:
            # every attempt was rate-limited; don't parse a 429 body
            raise RuntimeError("Still rate-limited after 5 retries")

        data = resp.json()
        items = data.get("items", [])
        all_records.extend(items)
        has_more = data.get("hasMore", False)
        offset += limit

    return all_records

JavaScript/Node.js: FBDI file upload with retry

// Input:  Base64-encoded zip file, Oracle REST endpoint, OAuth token
// Output: UCM Document ID confirming successful upload

const axios = require('axios'); // v1.6+

async function uploadFBDI(host, token, fileContent, fileName, docAccount) {
  const url = `${host}/fscmRestApi/resources/v1/erpintegrations`;
  const payload = {
    OperationName: 'uploadFileToUCM',
    DocumentContent: fileContent, // base64 encoded
    DocumentAccount: docAccount,
    ContentType: 'zip',
    FileName: fileName
  };

  for (let attempt = 0; attempt < 5; attempt++) {
    try {
      const resp = await axios.post(url, payload, {
        headers: {
          'Authorization': `Bearer ${token}`,
          'Content-Type': 'application/json'
        },
        timeout: 300000 // 300s to match UCM timeout
      });
      return resp.data.DocumentId;
    } catch (err) {
      if (err.response?.status === 429 || err.response?.status === 503) {
        await new Promise(r => setTimeout(r, Math.min(2 ** attempt * 1000, 60000)));
        continue;
      }
      throw err;
    }
  }
  throw new Error('FBDI upload failed after 5 retries');
}

cURL: Check ESS job status

# Input:  Oracle host, Bearer token, ESS job request ID
# Output: Job status (RUNNING, SUCCEEDED, ERROR, WARNING)

curl -s -X GET \
  "https://{host}/fscmRestApi/resources/v1/erpintegrations?finder=ESSJobStatusRF;requestId={REQUEST_ID}" \
  -H "Authorization: Bearer {token}" \
  -H "Content-Type: application/json" | jq '.items[0].RequestStatus'

Data Mapping

Rate Limit Tiers by Identity Domain Type

| Identity Domain Type | Auth Requests/min | Token Mgmt/min | Other API/min | Bulk/min | Typical Use |
|---|---|---|---|---|---|
| Free Tier | 150 | 150 | 150 | 200 | Development, sandbox |
| Oracle Apps | 1,000 | 1,500 | 1,500 | Scaled | Standard production |
| Premium | 4,500 | 5,000 | 5,000 | Scaled | High-volume enterprise |

Data Type Gotchas

Error Handling & Failure Points

Common Error Codes

| Code | Meaning | Cause | Resolution |
|---|---|---|---|
| 429 | Too Many Requests | Identity domain rate limit exceeded | Exponential backoff: wait 2^n seconds, starting at 2s, max 60s, max 5 retries |
| 503 | Service Unavailable | Oracle Fusion under maintenance or overloaded | Retry with longer backoff (start at 4s, max 120s); check Oracle Cloud Status page |
| 401 | Unauthorized | Token expired or invalid | Refresh OAuth token; check if integration user is locked or password expired |
| 400 | Bad Request | Invalid payload format or field values | Validate payload against REST API describe endpoint; check required fields |
| 404 | Not Found | Resource or endpoint does not exist | Verify endpoint URL matches your Fusion release version; check object accessibility |
| 500 | Internal Server Error | Server-side Fusion error | Log full response; retry once; if persistent, file Oracle SR with correlation ID |
| JBO-25013 | Too Many Objects | Excessive child records in single request | Reduce batch size; split parent-child creates across multiple requests |

Failure Points in Production

Anti-Patterns

Wrong: Polling REST API in a tight loop for bulk extraction

# BAD - Hammering API without backoff, will hit 429 within seconds
while True:
    response = requests.get(f"{base}/invoices?offset={offset}")
    process(response.json())
    offset += 499

Correct: Paginated extraction with rate-aware throttling

# GOOD - Respects rate limits, handles 429, uses appropriate delays
import time
offset = 0
while True:
    resp = requests.get(f"{base}/invoices?offset={offset}&limit=499",
                        headers=headers)
    if resp.status_code == 429:
        time.sleep(float(resp.headers.get('Retry-After', 5)))
        continue
    data = resp.json()
    process(data['items'])
    if not data.get('hasMore'):
        break
    offset += 499
    time.sleep(0.5)  # voluntary throttle: ~120 req/min

Wrong: Using REST API POST for bulk data loads

// BAD - REST POST limited to 500 records; 10,000 records = 20 serial calls
for (const batch of chunk(records, 500)) {
    await axios.post(`${base}/journals`, { items: batch });
}

Correct: Use FBDI for bulk loads, REST only for small batches

// GOOD - FBDI handles 500K records per file; single upload + ESS job
const csv = generateCSV(records); // all 10,000 records
const zip = await compressToZip(csv);
const base64 = zip.toString('base64');
const docId = await uploadFBDI(host, token, base64, 'import.zip', account);
const jobId = await submitESSJob(host, token, docId);
await pollJobStatus(host, token, jobId);

Wrong: Creating separate integration users to bypass rate limits

BAD - Rate limits are per identity domain, NOT per user
Creating 5 integration users does not give you 5x the rate limit.
All users in the same identity domain share the same pool.

Correct: Optimize request efficiency within the rate limit budget

GOOD - Maximize value per request:
1. Use query filters (q parameter) to reduce result set size
2. Use fields parameter to request only needed attributes
3. Cache reference data (LOVs, lookups) locally
4. Batch related operations into composite requests (max 50)
5. Schedule bulk operations via FBDI during off-peak hours
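The first two points above translate directly into query parameters. A sketch building a minimal-payload request — `q`, `fields`, `limit`, `offset`, and `onlyData` are standard Fusion REST framework parameters, but the filter expression syntax shown is illustrative and varies by resource:

```python
def lean_query_params(filters, fields, limit=499, offset=0):
    """Query params that fetch only the rows and attributes needed,
    stretching the shared identity-domain request budget."""
    return {
        "q": " and ".join(filters),   # combine filter clauses
        "fields": ",".join(fields),   # only the attributes you need
        "limit": limit,               # page size (max 499)
        "offset": offset,
        "onlyData": "true",           # omit link metadata from payload
    }

# Example: fetch just two attributes of large invoices
params = lean_query_params(["InvoiceAmount > 1000"],
                           ["InvoiceId", "InvoiceAmount"])
```

Passed as `params` to `requests.get`, this shrinks both response size and the number of pages needed, so each request in the shared pool does more work.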

Common Pitfalls

Diagnostic Commands

# Test authentication (obtain token)
curl -s -X POST "https://{idcs-domain}/oauth2/v1/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=client_credentials&scope=urn:opc:idm:__myscopes__" \
  -u "{client_id}:{client_secret}" | jq '.access_token'

# Test REST API connectivity
curl -s -o /dev/null -w "%{http_code}" -X GET \
  "https://{host}/fscmRestApi/resources/v1/" \
  -H "Authorization: Bearer {token}"

# Check ESS scheduled process status
curl -s -X GET \
  "https://{host}/fscmRestApi/resources/v1/erpintegrations?finder=ESSJobStatusRF;requestId={JOB_ID}" \
  -H "Authorization: Bearer {token}" | jq '.items[0] | {Status: .RequestStatus, StartTime: .StartTime, EndTime: .EndTime}'

# List recent ESS jobs (last 20)
curl -s -X GET \
  "https://{host}/fscmRestApi/resources/v1/scheduledProcesses?limit=20&orderBy=SubmittedDT:desc" \
  -H "Authorization: Bearer {token}" | jq '.items[] | {id: .RequestId, name: .Name, status: .Status}'

# Verify object accessibility (describe endpoint)
curl -s -X GET \
  "https://{host}/fscmRestApi/resources/v1/{resource}/describe" \
  -H "Authorization: Bearer {token}" | jq '.Resources[0].attributes | length'

Version History & Compatibility

| Release | Date | Status | Key Changes | Notes |
|---|---|---|---|---|
| 25A | 2025-01 | Current | Enhanced Scheduler REST API endpoints | Quarterly update |
| 24D | 2024-10 | Supported | Business Events improvements | |
| 24C | 2024-07 | Supported | REST API performance improvements | |
| 24B | 2024-04 | Supported | Revised identity domain rate limit tiers | Rate limit tier structure updated |
| 24A | 2024-01 | Supported | FBDI high-volume mode expanded | New high-volume import objects |

Deprecation Policy

Oracle Fusion Cloud ERP follows a quarterly release cadence with no formal API version numbering. REST endpoints are release-specific; breaking changes are documented in Release Readiness guides published before each quarterly update. Oracle generally provides at least one quarter of advance notice before removing or changing REST API behavior.

When to Use / When Not to Use

| Use When | Don't Use When | Use Instead |
|---|---|---|
| Designing a new Oracle ERP Cloud integration and need to understand all rate limit boundaries | Working with Oracle Cloud Infrastructure (OCI) platform APIs | OCI service limits documentation |
| Troubleshooting 429 errors or ESS job queue bottlenecks | Integrating with Oracle NetSuite (different API surface entirely) | NetSuite SuiteTalk/REST API limits documentation |
| Planning FBDI import strategy for high-volume data loads | Need Oracle Integration Cloud (OIC) message pack limits | OIC service limits documentation |
| Evaluating whether REST API or FBDI is appropriate for your data volume | Working with on-premise Oracle E-Business Suite | E-Business Suite integration guide |

Important Caveats

Related Units