Oracle ERP Cloud FBDI Import Capabilities

Type: ERP Integration | System: Oracle ERP Cloud (Release 26A) | Confidence: 0.91 | Sources: 8 | Verified: 2026-03-01 | Freshness: 2026-03-01

TL;DR

System Profile

Oracle ERP Cloud FBDI is a platform-level capability available across all Oracle Fusion Cloud Applications editions. It provides a standardized file-based import mechanism using Excel/CSV templates that map directly to interface tables. FBDI covers all major Financials modules (GL, AP, AR, FA, CM) as well as Supply Chain Management, HCM, and Project Management. This card focuses on the Financials modules, but the FBDI architecture is identical across all pillars. [src1]

FBDI is not a standalone API — it is a file processing pipeline that can be triggered manually via the Scheduled Processes UI, or programmatically via the ErpIntegrationService SOAP/REST web service (importBulkData operation). For automated integrations, Oracle Integration Cloud (OIC) provides a native ERP Cloud Adapter with a dedicated "Import Bulk Data" action. [src2, src7]

Property | Value
Vendor | Oracle
System | Oracle Fusion Cloud ERP (Release 26A)
API Surface | File-Based (FBDI) + ErpIntegrationService (SOAP/REST)
Current Release | 26A (January 2026)
Editions Covered | All editions — FBDI is a core platform capability
Deployment | Cloud
API Docs | FBDI for Financials
Status | GA — actively maintained, templates updated quarterly

API Surfaces & Capabilities

FBDI is not a traditional API but a file-processing pipeline. It can be invoked through multiple channels, each suited to different integration patterns. [src2, src4]

Channel | Protocol | Best For | Max Records/Job | File Size Limit | Automated? | Bulk?
Manual UI Upload | HTTPS (browser) | Ad-hoc imports, testing | ~100K per CSV | 250 MB ZIP | No | Yes
ErpIntegrationService SOAP | SOAP/XML | Server-to-server automation | 2M (20 files x 100K) | 250 MB ZIP | Yes | Yes
ErpIntegrationService REST | REST/JSON | Modern server-to-server | 2M (20 files x 100K) | 250 MB ZIP | Yes | Yes
OIC ERP Cloud Adapter | Adapter-based | Oracle-to-Oracle, iPaaS orchestration | 2M (20 files x 100K) | 250 MB ZIP | Yes | Yes
External Data Loader Client (EDLC) | CLI tool | Large-volume migrations, auto-splits | Unlimited (auto-split) | 250 MB per split | Yes | Yes

Rate Limits & Quotas

Per-File / Per-Job Limits

Limit Type | Value | Applies To | Notes
Max ZIP file size | 250 MB | All channels | Hard limit — UCM rejects larger uploads [src3]
Recommended records per CSV | ~100,000 | Each CSV within ZIP | Performance degrades above this; Oracle recommends 10K for error triage [src3, src6]
Max CSV files per ZIP | 20 | Per import job | Oracle recommended limit for 2M total records [src3]
Max total records per job | ~2,000,000 | Per import job | 20 files x 100K records [src3]
Max parallel import batches | 10 | Per Oracle instance | Running more than 10 causes ESS queue contention [src3]
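An automated loader can respect the 10-parallel-batch ceiling by capping concurrency on the client side. A minimal sketch, where `submit_job` is a hypothetical stand-in for an importBulkData submission:

```python
from concurrent.futures import ThreadPoolExecutor

MAX_PARALLEL_IMPORTS = 10  # per-instance ceiling before ESS queue contention


def submit_job(zip_path):
    """Hypothetical placeholder for one importBulkData submission."""
    return zip_path  # a real implementation would return the ESS request ID


def submit_all(zip_paths):
    # The pool caps in-flight submissions at the Oracle-recommended limit;
    # additional ZIPs wait until a worker slot frees up.
    with ThreadPoolExecutor(max_workers=MAX_PARALLEL_IMPORTS) as pool:
        return list(pool.map(submit_job, zip_paths))


print(submit_all([f"batch_{i}.zip" for i in range(3)]))
```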

ESS Job Scheduling Limits

Limit Type | Value | Notes
Concurrent ESS jobs | Instance-dependent | Shared pool with all scheduled processes — not FBDI-specific [src4]
Job timeout | Configurable per job | Large imports can run 30+ minutes; monitor via ESS Console [src4]
Callback support | Yes | ErpIntegrationService supports web service callbacks on completion [src2]
Notification options | Email, Bell, Both, None | 2-digit code: first digit = channel, second = condition [src2]

UCM Upload Limits

Limit Type | Value | Notes
Max upload size (ErpIntegrationService) | 250 MB | Base64-encoded content in SOAP/REST payload [src2, src3]
Document security group | FAFusionImportExport | Required for FBDI uploads; Attachment group also works [src2]
Account path (Financials) | fin$/payables$/import$, fin$/generalLedger$/import$ | Module-specific UCM paths [src7]

Authentication

Flow | Use When | Token Lifetime | Refresh? | Notes
Basic Auth | Quick automation, testing | Session-based | N/A | Username:password base64-encoded; works for SOAP/REST [src2]
OAuth 2.0 (3-legged) | OIC, external iPaaS | Access: 1h typical | Yes | Requires Oracle Identity Cloud Service (IDCS) setup [src2]
OAuth 2.0 (JWT assertion) | Server-to-server, unattended | Configurable | New JWT per request | Recommended for production integrations [src2]
SAML assertion | Federated identity scenarios | Session-based | N/A | Complex setup; not recommended for FBDI automation [src2]
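The Basic Auth row corresponds to a standard HTTP Authorization header built from base64(username:password); a minimal sketch of constructing it:

```python
import base64


def basic_auth_header(username, password):
    """Build the Authorization header value for the Basic Auth flow."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"


# Pass the result as the Authorization header, or let your HTTP client
# build it for you (e.g. requests' auth=(user, password) tuple).
print(basic_auth_header("integration_user", "password"))
```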

Authentication Gotchas

Constraints

Integration Pattern Decision Tree

START — User needs to bulk-import data into Oracle ERP Cloud
|
+-- What module?
|   +-- GL (Journals) --> Template: JournalImportTemplate.xlsm
|   |   +-- ESS Phase 1: "Load Interface File for Import"
|   |   +-- ESS Phase 2: "Import Journals"
|   |   +-- Interface table: GL_INTERFACE
|   |
|   +-- AP (Invoices) --> Template: PayablesStandardInvoiceImportTemplate.xlsm
|   |   +-- ESS Phase 1: "Load Interface File for Import"
|   |   +-- ESS Phase 2: "Import Payables Invoices"
|   |   +-- Interface tables: AP_INVOICES_INTERFACE, AP_INVOICE_LINES_INTERFACE
|   |
|   +-- AR (Transactions) --> Template: AutoInvoiceImportTemplate.xlsm
|   |   +-- ESS Phase 1: "Load Interface File for Import"
|   |   +-- ESS Phase 2: "Import AutoInvoice"
|   |   +-- Interface table: RA_INTERFACE_LINES_ALL
|   |
|   +-- FA (Assets) --> Template: AssetImportTemplate.xlsm
|   |   +-- ESS Phase 1: "Load Interface File for Import"
|   |   +-- ESS Phase 2: "Post Mass Additions"
|   |   +-- Interface table: FA_MASS_ADDITIONS
|   |
|   +-- Other --> Download module-specific template from Oracle docs
|
+-- How many records?
|   +-- < 10,000 --> Single CSV, single job, manual or automated
|   +-- 10,000 - 100,000 --> Single CSV, single job, automated
|   +-- 100,000 - 2,000,000 --> Split into multiple CSVs (100K each), 1 ZIP
|   +-- > 2,000,000 --> Multiple jobs (max 2M per job), max 10 parallel
|
+-- Automation method?
    +-- One-time / ad-hoc --> Manual UI upload via Scheduled Processes
    +-- Scheduled recurring --> ErpIntegrationService (SOAP/REST)
    +-- Oracle-to-Oracle --> OIC with ERP Cloud Adapter
    +-- Large migration --> External Data Loader Client (EDLC)
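The record-count branch of the tree reduces to ceiling division against the per-CSV and per-ZIP limits; a sketch (function name is illustrative):

```python
def plan_batches(total_records, per_csv=100_000, csvs_per_zip=20):
    """Split a record count into CSVs and jobs per the FBDI limits.

    Returns (csv_count, job_count): each job carries one ZIP holding up
    to csvs_per_zip CSVs, i.e. 2M records with the default limits.
    """
    csv_count = -(-total_records // per_csv)   # ceiling division
    job_count = -(-csv_count // csvs_per_zip)  # one ZIP (and job) per 20 CSVs
    return csv_count, job_count


print(plan_batches(500_000))    # (5, 1): five 100K CSVs in one ZIP/job
print(plan_batches(3_000_000))  # (30, 2): two jobs, max 2M records each
```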

Quick Reference

FBDI Process Flow (All Modules)

Step | Action | System | Details
1 | Download XLSM template | Oracle Docs | Module-specific; re-download each release [src1]
2 | Populate data tab | Excel/ETL | Follow column order exactly; mandatory fields per module [src4]
3 | Generate CSV via macro | Excel | "Generate CSV File" button creates ZIP with CSV(s) + .properties [src5]
4 | Upload ZIP to UCM | UI or API | Manual: Tools > File Import and Export; Automated: importBulkData [src2]
5 | Run "Load Interface File for Import" | ESS | Spawns Transfer File + Load File to Interface child jobs [src5]
6 | Run module-specific import job | ESS | e.g., "Import Journals," "Import Payables Invoices" [src5]
7 | Verify import | ESS Console / UI | Check ESS log for errors; review interface table rejections [src4]
8 | Purge interface tables | ESS | Clean up processed/failed records to prevent reprocessing issues [src8]

Module Template Reference

Module | Template File (XLSM) | Interface Table(s) | ESS Import Job | UCM Account Path
GL Journals | JournalImportTemplate.xlsm | GL_INTERFACE | Import Journals | fin$/generalLedger$/import$
AP Invoices | PayablesStandardInvoiceImportTemplate.xlsm | AP_INVOICES_INTERFACE, AP_INVOICE_LINES_INTERFACE | Import Payables Invoices | fin$/payables$/import$
AR Transactions | AutoInvoiceImportTemplate.xlsm | RA_INTERFACE_LINES_ALL | Import AutoInvoice | fin$/receivables$/import$
FA Assets | AssetImportTemplate.xlsm | FA_MASS_ADDITIONS | Post Mass Additions | fin$/fixedAssets$/import$
AP Suppliers | SupplierImportTemplate.xlsm | AP_SUPPLIERS_INT, AP_SUPPLIER_SITES_INT | Import Suppliers | fin$/payables$/import$
CM Bank Statements | BankStatementImportTemplate.xlsm | CE_STATEMENT_HEADERS_INT | Import Bank Statement | fin$/cashManagement$/import$
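For scripted imports it can help to keep this reference in code. A sketch that transcribes a few of the rows above into a lookup dict (the dict and helper names are illustrative, not an Oracle API):

```python
# Module metadata transcribed from the reference table above.
FBDI_MODULES = {
    "GL": {"template": "JournalImportTemplate.xlsm",
           "ess_job": "Import Journals",
           "ucm_account": "fin$/generalLedger$/import$"},
    "AP": {"template": "PayablesStandardInvoiceImportTemplate.xlsm",
           "ess_job": "Import Payables Invoices",
           "ucm_account": "fin$/payables$/import$"},
    "AR": {"template": "AutoInvoiceImportTemplate.xlsm",
           "ess_job": "Import AutoInvoice",
           "ucm_account": "fin$/receivables$/import$"},
    "FA": {"template": "AssetImportTemplate.xlsm",
           "ess_job": "Post Mass Additions",
           "ucm_account": "fin$/fixedAssets$/import$"},
}


def ucm_account_for(module):
    """Look up the UCM account path for a module code, e.g. 'GL'."""
    return FBDI_MODULES[module]["ucm_account"]


print(ucm_account_for("GL"))  # fin$/generalLedger$/import$
```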

Step-by-Step Integration Guide

1. Download and prepare the FBDI template

Download the current-release XLSM template from Oracle documentation for your target module. Open in Excel with macros enabled. The template contains an "Instructions and CSV Generation" tab and one or more data tabs. [src1, src5]

# Download GL Journal Import template for Release 26A
# Navigate to: Oracle Help Center > Fusion Cloud Financials > FBDI for Financials
# Download: JournalImportTemplate.xlsm
# Data tab: GL_INTERFACE
# Required columns: Status, Ledger Name, Accounting Date, Journal Source,
#                   Category, Currency Code, Actual Flag, Entered Debit/Credit,
#                   Segment1-SegmentN (chart of accounts segments)

Verify: Open XLSM, check version matches your Oracle release (e.g., "26A" in Instructions tab).

2. Populate data and generate CSV

Fill in the data tab with your records. Run the "Generate CSV File" macro to produce a ZIP archive containing the CSV file(s) and a .properties file. [src5]

# Click "Generate CSV File" button on Instructions tab
# Output: JournalImportTemplate.zip containing GlInterface.csv + .properties

Verify: Unzip the generated archive and confirm CSV column count matches template definition.
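That verification can be scripted. A sketch that reports the column count of each CSV inside the generated ZIP, for comparison against the template definition:

```python
import csv
import io
import zipfile


def csv_column_counts(zip_source):
    """Return {csv_name: column_count} for each CSV in an FBDI ZIP.

    zip_source may be a path or any file-like object zipfile accepts.
    """
    counts = {}
    with zipfile.ZipFile(zip_source) as zf:
        for name in zf.namelist():
            if name.lower().endswith(".csv"):
                with zf.open(name) as f:
                    # Column count is taken from the first data row.
                    first_row = next(csv.reader(io.TextIOWrapper(f, "utf-8")))
                    counts[name] = len(first_row)
    return counts
```

Compare the reported counts against the column count on the template's data tab before uploading.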

3. Upload and submit via ErpIntegrationService REST API (automated)

For automated integrations, use the importBulkData operation on the ErpIntegrationService REST endpoint. [src2, src7]

# REST endpoint: POST /fscmRestApi/resources/11.13.18.05/erpintegrations
curl -X POST \
  "https://<instance>.fa.us2.oraclecloud.com/fscmRestApi/resources/11.13.18.05/erpintegrations" \
  -H "Content-Type: application/json" \
  -u "integration_user:password" \
  -d '{
    "OperationName": "importBulkData",
    "DocumentContent": "<base64-encoded-ZIP>",
    "ContentType": "zip",
    "FileName": "JournalImportTemplate.zip",
    "DocumentAccount": "fin$/generalLedger$/import$",
    "JobName": "/oracle/apps/ess/financials/generalLedger/programs/,JournalImportLauncher",
    "ParameterList": "<LEDGER_ID>#NULL#NULL#NULL#N",
    "JobOptions": "InterfaceDetails=2"
  }'
# Response: { "ReqstId": 12345678, "RequestStatus": "SUBMITTED" }

Verify: GET ...?finder=ESSJobStatusRF;requestId=12345678 returns "RequestStatus": "SUCCEEDED".

4. Monitor and handle errors

Poll ESS job status and download execution logs for error details. [src2, src4]

# Check job status
curl -s -X GET \
  "https://<instance>/fscmRestApi/resources/11.13.18.05/erpintegrations?finder=ESSJobStatusRF;requestId=12345678" \
  -u "user:pass"

# Download job log files
curl -s -X POST \
  "https://<instance>/fscmRestApi/resources/11.13.18.05/erpintegrations" \
  -H "Content-Type: application/json" \
  -u "user:pass" \
  -d '{"OperationName":"downloadESSJobExecutionDetails","ReqstId":"12345678"}'

Verify: Status is "SUCCEEDED" or "WARNING" (check logs for warning details).

Code Examples

Python: Automated FBDI Import via REST API

# Input:  CSV data file (e.g., GL journals), Oracle ERP Cloud credentials
# Output: ESS job request ID, final job status

import requests
import base64
import time

def fbdi_import(instance_url, username, password, zip_path,
                document_account, job_name, parameter_list,
                job_options="InterfaceDetails=2"):
    """Upload FBDI ZIP and submit import via ErpIntegrationService REST."""
    with open(zip_path, "rb") as f:
        doc_content = base64.b64encode(f.read()).decode("utf-8")

    endpoint = f"{instance_url}/fscmRestApi/resources/11.13.18.05/erpintegrations"
    payload = {
        "OperationName": "importBulkData",
        "DocumentContent": doc_content,
        "ContentType": "zip",
        "FileName": zip_path.split("/")[-1],
        "DocumentAccount": document_account,
        "JobName": job_name,
        "ParameterList": parameter_list,
        "JobOptions": job_options
    }
    resp = requests.post(endpoint, json=payload, auth=(username, password))
    resp.raise_for_status()
    request_id = resp.json().get("ReqstId")

    # Poll for completion (max 30 min, 30s intervals)
    for attempt in range(60):
        time.sleep(30)
        status_resp = requests.get(
            f"{endpoint}?finder=ESSJobStatusRF;requestId={request_id}",
            auth=(username, password))
        items = status_resp.json().get("items", [])
        if items:
            status = items[0].get("RequestStatus", "UNKNOWN")
            if status in ("SUCCEEDED", "ERROR", "WARNING"):
                return {"request_id": request_id, "status": status}
    return {"request_id": request_id, "status": "TIMEOUT"}

cURL: Quick FBDI status check

# Input:  ESS job request ID from importBulkData
# Output: Job status (SUBMITTED, RUNNING, SUCCEEDED, ERROR, WARNING)

curl -s -X GET \
  "https://<instance>/fscmRestApi/resources/11.13.18.05/erpintegrations?finder=ESSJobStatusRF;requestId=12345678" \
  -u "integration_user:password" | python3 -m json.tool

Data Mapping

GL Journal Import — Key Fields

CSV Column | Interface Table Column | Type | Required | Gotcha
Status | STATUS | String | Yes | Must be "NEW" for new records [src5]
Ledger Name | LEDGER_NAME | String | Yes | Must match exactly — case-sensitive [src5]
Accounting Date | ACCOUNTING_DATE | Date | Yes | Format: YYYY/MM/DD — not YYYY-MM-DD [src5]
Currency Code | CURRENCY_CODE | String | Yes | ISO 4217 — "USD", not "US Dollar" [src5]
Entered Debit | ENTERED_DR | Number | Conditional | Either debit or credit per line, not both [src5]
Entered Credit | ENTERED_CR | Number | Conditional | Either debit or credit per line, not both [src5]
Segment1-SegmentN | SEGMENT1-SEGMENTN | String | Yes | Must match chart of accounts structure [src5]
Group ID | GROUP_ID | Number | No | Enables selective batch import — highly recommended [src5]
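The gotchas above lend themselves to pre-flight validation before the CSV is generated. A sketch of a row checker (keys follow the CSV column names in the table; the function itself is illustrative, not an Oracle API):

```python
import datetime


def validate_journal_row(row):
    """Check a GL journal row (dict keyed by CSV column name) against
    the gotchas above; returns a list of problem descriptions."""
    errors = []
    if row.get("Status") != "NEW":
        errors.append("Status must be 'NEW' for new records")
    try:
        # Accounting Date must be YYYY/MM/DD, not ISO YYYY-MM-DD
        datetime.datetime.strptime(row.get("Accounting Date", ""), "%Y/%m/%d")
    except ValueError:
        errors.append("Accounting Date must use YYYY/MM/DD format")
    dr, cr = row.get("Entered Debit"), row.get("Entered Credit")
    if (dr and cr) or (not dr and not cr):
        errors.append("Provide either Entered Debit or Entered Credit, not both")
    return errors


print(validate_journal_row({
    "Status": "NEW", "Accounting Date": "2026/01/15", "Entered Debit": "100",
}))  # []
```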

Data Type Gotchas

Error Handling & Failure Points

Common Error Codes

Error | Meaning | Cause | Resolution
FBDI-001: Load File to Interface failed | CSV structure mismatch | Template version mismatch or wrong column count | Re-download template for current release; verify CSV column count [src4]
GL-IMPORT: Unbalanced journal | Debit/credit imbalance | Total debits do not equal total credits | Verify amounts; check for rounding errors in currency conversion [src5]
AP-IMPORT: Duplicate invoice | Duplicate detection | Same invoice number + supplier + BU exists | Use unique invoice numbers; check existing invoices first [src1]
ESS-TIMEOUT | Job exceeded time limit | Too many records or ESS queue congestion | Split into smaller batches; schedule during off-peak hours [src4]
UCM-UPLOAD: File too large | 250 MB limit exceeded | ZIP file exceeds UCM upload limit | Split data into multiple ZIP files; submit as separate jobs [src3]
INTERFACE-PURGE: Records exist | Stale interface records | Previous failed import left records | Run purge process for the module before resubmitting [src8]

Failure Points in Production

Anti-Patterns

Wrong: Uploading all records in a single massive CSV

# BAD — 500,000 records in one CSV file
# Performance degrades exponentially above 100K records
# Error triage is nearly impossible
# ESS job may time out or consume excessive memory

Correct: Split into batches of 10,000-100,000 records

# GOOD — split into manageable batches
# 500K records -> 5 CSV files of 100K each -> 1 ZIP -> 1 job
# Use Group ID to enable selective reprocessing of failed batches

Wrong: Skipping Phase 2 (module-specific import job)

# BAD — submitting "Load Interface File for Import" and assuming done
# Data only reaches interface tables after Phase 1
# Application tables remain unchanged until Phase 2 runs

Correct: Always chain Phase 1 and Phase 2

# GOOD — chain both phases with status checking
result1 = fbdi_import(...)  # Phase 1: Load Interface File for Import
if result1["status"] == "SUCCEEDED":
    result2 = submit_ess_job(  # Phase 2: Module-specific import
        job_name="ImportJournals",
        parameters={"group_id": batch_group_id}
    )

Wrong: Resubmitting failed batches without purging interface tables

# BAD — import fails, fix CSV, resubmit same batch
# Failed records from attempt 1 still in interface tables
# Result: duplicate processing, constraint violations

Correct: Purge interface tables before resubmission

# GOOD — clean slate before retry
# 1. Run "Purge Interface Tables" ESS job for the module
# 2. Wait for purge to complete
# 3. Resubmit corrected CSV with both Phase 1 and Phase 2

Common Pitfalls

Diagnostic Commands

# Check ESS job status for a specific request ID
curl -s -X GET \
  "https://<instance>/fscmRestApi/resources/11.13.18.05/erpintegrations?finder=ESSJobStatusRF;requestId=<REQUEST_ID>" \
  -u "user:pass" | python3 -m json.tool

# Download ESS job log files (base64-encoded ZIP)
curl -s -X POST \
  "https://<instance>/fscmRestApi/resources/11.13.18.05/erpintegrations" \
  -H "Content-Type: application/json" \
  -u "user:pass" \
  -d '{"OperationName":"downloadESSJobExecutionDetails","ReqstId":"<REQUEST_ID>"}'

# List documents in UCM by file prefix (verify upload)
curl -s -X POST \
  "https://<instance>/fscmRestApi/resources/11.13.18.05/erpintegrations" \
  -H "Content-Type: application/json" \
  -u "user:pass" \
  -d '{"OperationName":"getDocumentIdsForFilePrefix","DocumentPrefix":"JournalImport"}'

# ErpIntegrationService WSDL (19 operations available)
# https://<instance>/fscmService/ErpIntegrationService?WSDL

Version History & Compatibility

Release | Date | Status | Template Changes | Notes
26A | 2026-01 | Current | Updated macro structure in XLSM templates | Re-download all templates
25D | 2025-10 | Supported | Minor column additions in AP/AR templates | Check AP_INVOICE_LINES_INTERFACE for new optional fields
25C | 2025-07 | Supported | No breaking template changes | Stable release for FBDI
25B | 2025-04 | Supported | New FBDI templates for Project Management | No impact on Financials templates
25A | 2025-01 | Supported | ErpIntegrationService documentation refresh |

Oracle Fusion Cloud Applications follow a quarterly release cadence (A=Jan, B=Apr, C=Jul, D=Oct). FBDI templates are updated each release but Oracle maintains backward compatibility for at least 2 releases. Using templates more than 2 releases old is unsupported and may cause silent data mapping errors. [src1]
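Because the cadence is fixed (four releases per year, A through D), the "at most 2 releases old" rule can be checked mechanically; a sketch:

```python
def release_distance(current, template):
    """Number of quarterly releases between versions like '26A' and '25C'.

    Quarterly cadence: A=Jan, B=Apr, C=Jul, D=Oct, four releases per year.
    """
    def ordinal(rel):
        year, quarter = int(rel[:2]), rel[2].upper()
        return year * 4 + "ABCD".index(quarter)
    return ordinal(current) - ordinal(template)


def template_supported(current, template):
    """Backward compatibility is maintained for at least 2 releases."""
    return 0 <= release_distance(current, template) <= 2


print(template_supported("26A", "25C"))  # True  (2 releases back)
print(template_supported("26A", "25B"))  # False (3 releases back)
```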

When to Use / When Not to Use

Use When | Don't Use When | Use Instead
Bulk data migration (10K-2M records) | Real-time individual record CRUD | Oracle ERP Cloud REST API
Scheduled periodic batch imports (daily/weekly journals, invoices) | Outbound data extraction from Oracle | BICC or BIP Reports
Data corrections requiring mass updates via interface tables | Low-latency event-driven integrations | Oracle Business Events + OIC
Initial data load during implementation | Fewer than 100 records per transaction | REST API with JSON payload
Non-Oracle source systems pushing data to Oracle ERP | Oracle-to-Oracle cloud integrations | OIC pre-built integrations or ADM

Important Caveats

Related Units