Oracle HCM Cloud API Capabilities — REST, HDL, HCM Extracts, Payroll, Workers
Type: ERP Integration
System: Oracle Fusion Cloud HCM (24B)
Confidence: 0.85
Sources: 6
Verified: 2026-03-09
Freshness: 2026-03-09
TL;DR
- Bottom line: Oracle Fusion Cloud HCM exposes REST APIs for real-time worker management and HDL (HCM Data Loader) for bulk inbound operations. HCM Extracts and BIP handle outbound extraction. REST pages are capped at 500 records — anything above that must use HDL.
- Key limit: REST fair-use throttling at ~60 req/s (HTTP 429); HDL max 250 MB per file with 20 concurrent loader threads per pod; REST pagination capped at 500 records per page.
- Watch out for: Payroll REST APIs require a separate Oracle Payroll license — they return 403 or empty results without it, and the error message does not clearly indicate a licensing issue.
- Best for: REST for real-time individual worker operations (hire, terminate, query <500 records); HDL for bulk loads (new hire imports, compensation updates, benefits enrollment); HCM Extracts for scheduled outbound feeds.
- Authentication: OAuth 2.0 JWT Bearer flow recommended for server-to-server via Oracle IDCS (3600s token lifetime, no refresh — re-authenticate on expiry).
System Profile
Oracle Fusion Cloud HCM is Oracle's SaaS human capital management suite. The REST API covers Core HR (workers, assignments, employment), Workforce Management, Compensation, Benefits, Talent, Learning, and Payroll (separate license). HDL handles bulk inbound via structured .dat files. HCM Extracts provide configurable outbound extraction. This card covers Oracle Fusion Cloud HCM only — not EBS HRMS, Taleo, or PeopleSoft.
| Property | Value |
| Vendor | Oracle |
| System | Oracle Fusion Cloud HCM (Release 24B) |
| API Surface | REST (primary), HDL (bulk inbound), HCM Extracts (bulk outbound), BIP (reports), SOAP (legacy) |
| Current API Version | 11.13.18.05 (resource version, stable across releases) |
| Editions Covered | Enterprise (single edition for cloud) |
| Deployment | Cloud (Oracle Cloud Infrastructure) |
| API Docs | Oracle Fusion Cloud HCM REST API |
| Status | GA — quarterly feature releases |
API Surfaces & Capabilities
| API Surface | Protocol | Best For | Max Records/Request | Rate Limit | Real-time? | Bulk? |
| REST API | HTTPS/JSON | Individual worker CRUD, queries | 500/page | Fair-use ~60 req/s (429) | Yes | No |
| HDL (HCM Data Loader) | .dat files via REST or UCM | Bulk inbound: hires, mass updates | 250 MB/file | 20 threads/pod | No | Yes |
| HCM Extracts | XML/CSV via UCM or FTP | Bulk outbound: scheduled feeds | Extract-dependent | ESS scheduler | No | Yes |
| BI Publisher (BIP) | HTTPS/XML or CSV | Reporting, ad-hoc extraction | Report-dependent | ESS scheduler | No | Yes |
| SOAP Web Services | HTTPS/XML | Legacy integrations | Varies | Shared with REST | Yes | No |
| HCM Atom Feeds | HTTPS/Atom XML | Change tracking, event-driven | N/A | Configurable | Yes | N/A |
Rate Limits & Quotas
Per-Request Limits
| Limit Type | Value | Applies To | Notes |
| Max records per REST page | 500 | REST API (all HCM resources) | Default page size is 25; use limit param |
| Max HDL file size | 250 MB | .dat file upload | Split larger data sets into multiple files |
| Max concurrent HDL threads | 20 | Per pod | All HDL imports share the thread pool |
| Max attachment size | 2 GB | REST API attachments | Per-file limit |
Rolling / Daily Limits
| Limit Type | Value | Window | Edition Differences |
| REST API calls | No published hard limit | Fair-use | Dynamic throttling — ~60 req/s, returns 429 |
| HDL import submissions | No published hard limit | Per pod | Concurrent up to 20 threads |
| HCM Extract runs | No published hard limit | Per pod | Subject to ESS scheduler capacity |
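Given the 500-record page cap noted above, REST reads of any non-trivial collection must paginate. A minimal sketch of the limit/offset loop follows; Oracle ADF REST responses carry an "items" array and a "hasMore" flag, and `fetch_page` stands in for an authenticated HTTP GET (it is an assumption of this sketch, not an Oracle-provided helper).

```python
# Sketch: page through an Oracle HCM REST collection with limit/offset.
# fetch_page(offset, limit) is assumed to perform the authenticated GET
# and return the parsed JSON body ({"items": [...], "hasMore": bool}).
def fetch_all(fetch_page, limit=500):
    items, offset = [], 0
    while True:
        page = fetch_page(offset, limit)
        items.extend(page.get("items", []))
        if not page.get("hasMore"):
            return items
        offset += limit  # advance by the page size just requested
```

Driving the loop by `hasMore` rather than a precomputed count avoids an extra `totalResults` round trip.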
Authentication
| Flow | Use When | Token Lifetime | Refresh? | Notes |
| OAuth 2.0 JWT Bearer | Server-to-server (recommended) | 3600 seconds | No (re-authenticate) | Register in Oracle IDCS |
| OAuth 2.0 Client Credentials | Automated batch processes | 3600 seconds | No (re-authenticate) | Same IDCS registration |
| Basic Auth over SSL | Testing only | Session-based | N/A | Not recommended for production |
| SAML 2.0 Bearer Token | SSO-federated operations | Session timeout | N/A | SAML assertion in HTTP header |
Authentication Gotchas
- OAuth JWT Bearer flow does not provide refresh tokens — re-authenticate every 3600s. IDCS admins can change the timeout. [src6]
- Integration user must have correct HCM duty roles — missing roles produce 403, not 401. Role assignment determines which workers are visible (data security). [src6]
- IDCS token endpoint URL varies by data center region. Wrong region endpoint returns connection errors, not auth errors. [src6]
- HDL imports inherit the integration user's security context — user needs HCM Data Loader role AND object-level security roles. [src2]
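The no-refresh-token behaviour above is usually handled with a small cache that re-authenticates shortly before the 3600 s lifetime ends. A minimal sketch, assuming a `request_token` callable that wraps the IDCS token POST (the callable and its return shape are assumptions of this sketch):

```python
import time

# Sketch: cache an IDCS access token and re-authenticate before the
# 3600 s lifetime ends (the JWT Bearer flow issues no refresh token).
class TokenCache:
    def __init__(self, request_token, margin=100):
        self._request = request_token   # () -> {"access_token": str, "expires_in": int}
        self._margin = margin           # refresh this many seconds early
        self._token = None
        self._expires_at = 0.0

    def get(self):
        # Re-authenticate on first use or once the safety margin is hit.
        if self._token is None or time.time() >= self._expires_at:
            resp = self._request()
            self._token = resp["access_token"]
            self._expires_at = time.time() + resp["expires_in"] - self._margin
        return self._token
```

The margin mirrors the card's "never cache beyond 3500 s" advice.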
Constraints
- REST API is NOT suitable for bulk operations (>500 records) — use HDL. No bulk endpoint equivalent to Salesforce Bulk API.
- HDL .dat files must follow Oracle's exact Business Object structure with pipe-delimited fields and METADATA/MERGE/END blocks.
- Payroll REST APIs require separate Oracle Payroll license — endpoints return 403 or empty results without it.
- HCM Extracts are outbound-only — cannot be used for inbound data loading.
- SOAP APIs are maintenance-only — Oracle recommends REST + HDL for all new integrations.
- Worker records require the correct hierarchical order (Legal Employer > Work Relationship > Assignment).
- Effective-dated records require an explicit EffectiveStartDate — omitting it silently defaults to the system date.
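To make the .dat grammar concrete, here is an illustrative Worker fragment following the conventions above: pipe-delimited METADATA and MERGE lines, HDL-style YYYY/MM/DD dates, and person names on a separate business-object row. The field list is a simplified assumption — verify the exact attributes against Oracle's Business Object documentation before use.

```
METADATA|Worker|PersonNumber|EffectiveStartDate|DateOfBirth|ActionCode
MERGE|Worker|300000001|2026/04/01|1990/06/15|HIRE
METADATA|PersonName|PersonNumber|EffectiveStartDate|LegislationCode|LastName|FirstName
MERGE|PersonName|300000001|2026/04/01|US|Smith|Jane
```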
Integration Pattern Decision Tree
START — User needs to integrate with Oracle HCM Cloud
├── What's the data direction?
│ ├── Inbound (writing to HCM)
│ │ ├── Volume < 500 records? → REST API
│ │ │ ├── Hire → POST /workers
│ │ │ ├── Update assignment → PATCH /workers/{id}/child/assignments/{id}
│ │ │ └── Terminate → POST /workers/{id}/child/workRelationships/{id}/action/terminate
│ │ ├── Volume 500-50,000? → HDL (.dat file via hcmImportAndLoadData)
│ │ └── Volume > 50,000? → HDL — split into multiple files (<250 MB each)
│ ├── Outbound (reading from HCM)
│ │ ├── Real-time queries → REST API: GET /workers
│ │ ├── Scheduled bulk extraction → HCM Extracts
│ │ └── Change tracking → HCM Atom Feeds
│ └── Bidirectional → REST + HCM Extracts + HDL
├── Error tolerance?
│ ├── Zero-loss → HDL with error file download + reconciliation
│ └── Best-effort → REST with retry on 429/5xx
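The decision tree can be encoded as a small routing helper. The thresholds mirror the card (REST below 500 records, HDL above, Atom Feeds for change tracking); the function and its argument names are illustrative, not an Oracle API.

```python
# Sketch: route an integration to the right Oracle HCM surface,
# following the decision tree in this card.
def choose_surface(direction, records=0, change_tracking=False):
    if direction == "outbound":
        if change_tracking:
            return "HCM Atom Feeds"
        return "REST API" if records < 500 else "HCM Extracts"
    if direction == "inbound":
        return "REST API" if records < 500 else "HDL"
    raise ValueError("direction must be 'inbound' or 'outbound'")
```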
Quick Reference
| Module | Resource | Endpoint Path | Operations | Notes |
| Core HR — Workers | workers | /hcmRestApi/resources/11.13.18.05/workers | GET, POST | Hire, query; child: assignments, names, emails |
| Core HR — Assignments | assignments (child) | /hcmRestApi/.../workers/{id}/child/assignments | GET, PATCH | Effective-dated |
| Compensation — Salary | salaries | /hcmRestApi/resources/11.13.18.05/salaries | GET, POST, PATCH | Individual salary management |
| Absence | absences | /hcmRestApi/resources/11.13.18.05/absences | GET, POST, PATCH | Leave requests and balances |
| Payroll — Relationships | payrollRelationships | /hcmRestApi/resources/11.13.18.05/payrollRelationships | GET | Requires Payroll license |
| Payroll — Payslips | payslips | /hcmRestApi/resources/11.13.18.05/payslips | GET | Requires Payroll license |
| Talent — Goals | goals | /hcmRestApi/resources/11.13.18.05/goals | GET, POST, PATCH | Goal management |
| Learning | learningRecordEnrollments | /hcmRestApi/resources/11.13.18.05/learningRecordEnrollments | GET, POST | Training enrollment |
| HDL Import | hcmImportAndLoadData | /hcmRestApi/resources/11.13.18.05/hcmImportAndLoadData | POST | HDL file upload + import trigger |
| Atom Feeds | atomFeeds | /hcmRestApi/resources/11.13.18.05/atomFeeds/{feedId} | GET | Change tracking |
Step-by-Step Integration Guide
1. Authenticate via OAuth 2.0 JWT Bearer
Register a Confidential Application in Oracle IDCS and obtain its Client ID and Client Secret. [src6]
curl -X POST \
"https://{identity-domain}.identity.oraclecloud.com/oauth2/v1/token" \
-H "Content-Type: application/x-www-form-urlencoded" \
-u "{client_id}:{client_secret}" \
-d "grant_type=client_credentials&scope=urn:opc:resource:consumer::all"
Verify: Response contains "access_token" field and "expires_in": 3600
2. Query Workers via REST
Retrieve worker records with expanded child objects. [src1]
curl -X GET \
"https://{host}.fa.{dc}.oraclecloud.com/hcmRestApi/resources/11.13.18.05/workers?q=PersonNumber=12345&expand=assignments,names,emails&onlyData=true" \
-H "Authorization: Bearer {access_token}"
Verify: Response contains "items" array with worker details
3. Hire a New Worker via REST
POST with the required hierarchy: Person > WorkRelationship > Assignment. [src1]
curl -X POST \
"https://{host}.fa.{dc}.oraclecloud.com/hcmRestApi/resources/11.13.18.05/workers" \
-H "Authorization: Bearer {access_token}" \
-H "Content-Type: application/vnd.oracle.adf.resourceitem+json" \
-d '{ "names": [{"FirstName":"Jane","LastName":"Smith","LegislationCode":"US"}],
"workRelationships": [{"LegalEmployerName":"US Legal Entity","WorkerType":"E",
"StartDate":"2026-04-01","assignments":[{"BusinessUnitName":"US BU",
"JobName":"Software Engineer","ActionCode":"HIRE"}]}] }'
Verify: Response HTTP 201 with PersonId populated
4. Upload HDL File for Bulk Import
Prepare .dat file, ZIP, base64-encode, upload via hcmImportAndLoadData. [src2]
curl -X POST \
"https://{host}.fa.{dc}.oraclecloud.com/hcmRestApi/resources/11.13.18.05/hcmImportAndLoadData" \
-H "Authorization: Bearer {access_token}" \
-H "Content-Type: application/vnd.oracle.adf.resourceitem+json" \
-d '{"ContentType":"zip","FileName":"Worker.zip","DocumentContent":"{base64_encoded_zip}"}'
Verify: Poll GET /hcmImportAndLoadData/{FlowId} until status = COMPLETED
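The polling step above can be sketched as a small loop. `get_status` stands in for the authenticated GET on /hcmImportAndLoadData/{FlowId}; the terminal status names are assumptions drawn from this guide, so confirm them against the actual flow payload.

```python
import time

# Sketch: poll an HDL import flow until it reaches a terminal status.
def wait_for_import(get_status, flow_id, interval=30, timeout=3600):
    deadline = time.time() + timeout
    while time.time() < deadline:
        status = get_status(flow_id)  # assumed: (flow_id) -> status string
        if status in ("COMPLETED", "FAILED", "ERROR"):
            return status
        time.sleep(interval)
    raise TimeoutError(f"HDL import {flow_id} still running after {timeout}s")
```

Remember that a terminal status is not enough on its own — the error file must still be downloaded (see Failure Points below).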
Code Examples
Python: Hire Worker with Error Handling
# Input: Oracle HCM host, OAuth token, hire data dict
# Output: Created PersonId or error details
import requests # requests==2.31.0
def hire_worker(host, token, hire_data):
url = f"https://{host}/hcmRestApi/resources/11.13.18.05/workers"
headers = {
"Authorization": f"Bearer {token}",
"Content-Type": "application/vnd.oracle.adf.resourceitem+json"
}
    resp = requests.post(url, headers=headers, json=hire_data, timeout=30)
if resp.status_code == 201:
return {"success": True, "person_id": resp.json().get("PersonId")}
elif resp.status_code == 429:
return {"success": False, "error": "throttled",
"retry_after": resp.headers.get("Retry-After", "60")}
else:
return {"success": False, "status": resp.status_code, "detail": resp.text}
JavaScript/Node.js: Query Workers by Department
// Input: Oracle host URL, OAuth token, department name
// Output: Array of worker objects
import fetch from 'node-fetch'; // node-fetch 3.x (ESM-only)
async function getWorkersByDepartment(host, token, department) {
const url = new URL(`https://${host}/hcmRestApi/resources/11.13.18.05/workers`);
  url.searchParams.set('q', `assignments.DepartmentName='${department}'`);
url.searchParams.set('expand', 'names,assignments,emails');
url.searchParams.set('onlyData', 'true');
url.searchParams.set('limit', '500');
const resp = await fetch(url.toString(), {
headers: { 'Authorization': `Bearer ${token}` }
});
if (!resp.ok) throw new Error(`Query failed: ${resp.status}`);
return (await resp.json()).items || [];
}
cURL: Test Authentication
# Step 1: Get token
curl -s -X POST \
"https://{idcs}.identity.oraclecloud.com/oauth2/v1/token" \
-u "{client_id}:{client_secret}" \
-d "grant_type=client_credentials&scope=urn:opc:resource:consumer::all" \
| python -m json.tool
# Step 2: Test against workers endpoint
curl -s -X GET \
"https://{host}.fa.{dc}.oraclecloud.com/hcmRestApi/resources/11.13.18.05/workers?limit=5&onlyData=true" \
-H "Authorization: Bearer {access_token}" \
| python -m json.tool
Data Mapping
HDL Worker Business Object — Key Fields
| HDL Column | Field | Type | Required | Notes |
| PersonNumber | PERSON_NUMBER | String | Yes (MERGE) | Unique person identifier |
| DateOfBirth | DATE_OF_BIRTH | Date | No | Format: YYYY/MM/DD |
| LegislationCode | LEGISLATION_CODE | String | Yes | Country code (US, GB, DE) |
| LegalEmployerName | LEGAL_EMPLOYER_NAME | String | Yes | Must match configured legal employer |
| WorkerType | WORKER_TYPE | String | Yes | E=Employee, C=Contingent |
| ActionCode | ACTION_CODE | String | Yes | HIRE, REHIRE, TRANSFER, etc. |
Data Type Gotchas
- HDL uses YYYY/MM/DD dates; REST uses YYYY-MM-DD — mixing causes silent failures. [src1]
- HDL uses pipe-delimited METADATA/MERGE/END blocks — any formatting deviation fails the entire file. [src2]
- Person names in HDL require separate Name business object rows — NOT inline columns on the Worker row. [src2]
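Since the two surfaces disagree on date format, it helps to keep one canonical date object and render it per surface. A minimal sketch using only the standard library:

```python
from datetime import date

# Sketch: render one canonical date for each Oracle HCM surface.
# HDL expects YYYY/MM/DD; REST expects ISO YYYY-MM-DD. Mixing the two
# can fail silently, so centralise the formatting.
def to_hdl(d: date) -> str:
    return d.strftime("%Y/%m/%d")

def to_rest(d: date) -> str:
    return d.isoformat()
```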
Error Handling & Failure Points
Common Error Codes
| Code | Meaning | Cause | Resolution |
| 400 | Bad Request | Invalid payload, missing hierarchy | Check required fields and Person > WorkRelationship > Assignment structure |
| 401 | Unauthorized | Invalid/expired token | Re-authenticate via OAuth |
| 403 | Forbidden | Missing HCM roles or Payroll license | Assign HCM duty roles; verify Payroll license |
| 404 | Not Found | Invalid endpoint or PersonId | Verify resource version (11.13.18.05) |
| 409 | Conflict | Duplicate PersonNumber | Check for existing records; retry with jitter |
| 429 | Too Many Requests | Fair-use throttling (~60 req/s) | Exponential backoff with Retry-After header |
| 500 | Internal Server Error | Transient server issue | Retry after 30-60 seconds |
| HDL-001 | Validation Error | Invalid .dat format | Validate METADATA/MERGE/END structure |
Failure Points in Production
- HDL upload succeeds but import fails silently: the POST returns 200 but the import job fails, and errors appear only in the HDL error log. Fix: always poll import status AND download the error file. [src2]
- Payroll endpoints return 403 without a clear licensing error: without the Payroll license, endpoints simply return 403. Fix: verify the Payroll license before building payroll integrations. [src3]
- Effective-date confusion creates incorrect history: omitting EffectiveStartDate defaults to the system date. Fix: always set EffectiveStartDate explicitly. [src1]
- OAuth token expires during HDL processing: large imports can take hours. Fix: re-authenticate before each status poll; never cache tokens beyond 3500 seconds. [src6]
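Several of the failure points above (429 throttling, transient 5xx) call for the same retry pattern. A minimal sketch that honours Retry-After when present; the wrapper and its assumption that `call` returns a requests-style response (`.status_code`, `.headers`) are mine, not an Oracle-provided client.

```python
import random
import time

# Sketch: retry a callable on throttling/transient failures,
# honouring Retry-After (assumed to be in seconds) when sent.
def call_with_retry(call, max_attempts=5, base=2.0):
    for attempt in range(max_attempts):
        resp = call()
        if resp.status_code not in (429, 500, 502, 503):
            return resp
        retry_after = resp.headers.get("Retry-After")
        # Server hint wins; otherwise exponential backoff with jitter.
        delay = float(retry_after) if retry_after else base ** attempt + random.random()
        time.sleep(delay)
    return resp  # last response after exhausting attempts
```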
Anti-Patterns
Wrong: Querying all workers one-at-a-time for change detection
# BAD — O(N) API calls; hits throttling quickly
for pid in all_person_ids:
resp = requests.get(f"{base}/workers/{pid}", headers=headers)
if resp.json()["LastUpdateDate"] > last_sync:
process_change(resp.json())
Correct: Use HCM Atom Feeds for change detection
# GOOD — single feed request returns all changes
resp = requests.get(f"{base}/atomFeeds/workers?since={last_token}", headers=headers)
for entry in parse_atom(resp.text):
process_change(entry)
Wrong: Using REST for mass compensation update
# BAD — 10,000 individual PATCH calls
for emp in employees:
requests.patch(f"{base}/salaries/{emp['id']}", json={"Amount": emp["new"]}, headers=headers)
Correct: Use HDL for bulk compensation changes
# GOOD — single HDL file for any volume
buf = io.StringIO()
buf.write("METADATA|Salary|AssignmentNumber|SalaryAmount|DateFrom\n")
for emp in employees:
    buf.write(f"MERGE|Salary|{emp['asn']}|{emp['new']}|2026/04/01\n")
buf.write("END\n")
# ZIP, base64, upload via hcmImportAndLoadData
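The ZIP/base64 step in the final comment above can be sketched with the standard library. hcmImportAndLoadData expects a base64-encoded ZIP in DocumentContent; the function name and file names here are illustrative.

```python
import base64
import io
import zipfile

# Sketch: package an HDL .dat payload for hcmImportAndLoadData,
# which takes a base64-encoded ZIP in DocumentContent.
def package_hdl(dat_name: str, dat_text: str) -> str:
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(dat_name, dat_text)  # one .dat per business object
    return base64.b64encode(buf.getvalue()).decode("ascii")
```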
Common Pitfalls
- Assuming REST handles bulk: Oracle HCM REST has no bulk endpoint. Fix: design for HDL if volume exceeds 500 records. [src5]
- Building payroll integrations without verifying the license: payroll endpoints appear in the docs regardless of licensing. Fix: confirm the Payroll license with your Oracle admin first. [src3]
- Not downloading HDL error files: developers see "FAILED" but miss the actual errors. Fix: always download the error file after every import. [src2]
- Using INSERT mode in HDL: INSERT fails on retry if the record already exists. Fix: always use MERGE mode in production. [src2]
- Ignoring the hierarchical worker model: flat designs miss the required Person > WorkRelationship > Assignment structure. Fix: study the composite object before designing. [src1]
Diagnostic Commands
# Test OAuth authentication
curl -s -X POST "https://{idcs}.identity.oraclecloud.com/oauth2/v1/token" \
-u "{client_id}:{client_secret}" \
-d "grant_type=client_credentials&scope=urn:opc:resource:consumer::all" | python -m json.tool
# Query first 5 workers (verify connectivity + HCM access)
curl -s -X GET "https://{host}.fa.{dc}.oraclecloud.com/hcmRestApi/resources/11.13.18.05/workers?limit=5&onlyData=true" \
-H "Authorization: Bearer {token}" | python -m json.tool
# Check HDL import status
curl -s -X GET "https://{host}.fa.{dc}.oraclecloud.com/hcmRestApi/resources/11.13.18.05/hcmImportAndLoadData/{FlowId}" \
-H "Authorization: Bearer {token}" | python -m json.tool
# Describe workers resource (field metadata)
curl -s -X GET "https://{host}.fa.{dc}.oraclecloud.com/hcmRestApi/resources/11.13.18.05/workers/describe" \
-H "Authorization: Bearer {token}" | python -m json.tool
Version History & Compatibility
| Release | Release Date | Status | Key Changes | Migration Notes |
| 24B | 2024-04 | Current | Expanded workers resource; enhanced HDL error reporting | Evaluate new child objects |
| 24A | 2024-01 | Supported | HCM Atom Feeds GA; new absence REST endpoints | Atom Feeds replace polling |
| 23D | 2023-10 | Supported | Talent management REST endpoints | New goals/performance resources |
| 23B | 2023-04 | Supported | hcmImportAndLoadData REST endpoint | Replaces UCM-only HDL upload |
When to Use / When Not to Use
| Use When | Don't Use When | Use Instead |
| Real-time individual worker operations <500 records | Bulk imports >500 records | HDL via hcmImportAndLoadData |
| Querying worker details, absences | Mass compensation changes (annual review) | HDL with Salary business object |
| Individual absence requests | Scheduled outbound HR feeds | HCM Extracts |
| Change detection for real-time sync | Full HCM data migration | HDL with full business object set |
Important Caveats
- Oracle does not publish fixed API rate limits — fair-use throttling varies by pod and time of day. Production may hit 429s during payroll runs or open enrollment.
- Payroll REST APIs require a separate Oracle Payroll license. Documentation lists endpoints regardless of licensing.
- HDL is the only supported bulk inbound path — no REST bulk endpoint exists.
- The hierarchical worker data model is non-negotiable — flat integration patterns will fail.
- Effective-dated operations are pervasive — nearly every HCM update requires explicit effective dates.
- HCM Extracts are outbound-only despite the name suggesting bidirectional capability.
- API endpoint availability depends on which Oracle HCM modules are licensed and opted in.
Related Units