Oracle BI Publisher (rebranded as Oracle Analytics Publisher in 24C) is the embedded reporting engine in Oracle Fusion Cloud Applications. It sits atop Oracle Transactional Business Intelligence (OTBI) data models and can generate output in PDF, CSV, XML, Excel, and HTML. While designed for formatted reporting, BIP is frequently used as a data extraction mechanism because it can reach Fusion data that BICC view objects do not expose. This card covers Oracle Fusion Cloud SaaS deployments (ERP, HCM, SCM, CX) running Release 25A/25B.
| Property | Value |
|---|---|
| Vendor | Oracle |
| System | Oracle Fusion Cloud Applications + BI Publisher |
| API Surface | SOAP (ReportService, ScheduleService) + REST (v1/reports) |
| Current API Version | v2 (SOAP), v1 (REST) |
| Editions Covered | Enterprise (all Fusion Cloud SKUs include BIP) |
| Deployment | Cloud (SaaS) |
| API Docs | BI Publisher Web Services |
| Status | GA — but data extraction pattern is officially unsupported |
| API Surface | Protocol | Best For | Max Records/Request | Timeout | Real-time? | Bulk? |
|---|---|---|---|---|---|---|
| SOAP ReportService (runReport) | HTTPS/XML, base64 response | Small reports <50K rows, synchronous | Memory-guarded | 240s | Yes | No |
| SOAP ExternalReportWSSService | HTTPS/XML, WS-Security headers | Same as ReportService, secure auth | Memory-guarded | 240s | Yes | No |
| SOAP ScheduleService | HTTPS/XML | Scheduling reports, retrieving completed output | N/A (async) | None (async) | No | Partial |
| REST v1/reports/{path}/run | HTTPS/JSON + multipart | Ad-hoc report execution | Memory-guarded | 240s | Yes | No |
| ESS Job (BIPJobType) | Oracle Enterprise Scheduler | Large extractions via async execution | No hard cap | None (async) | No | Yes |
| OCI Data Integration BIP Connector | REST + Object Storage staging | ETL pipelines, incremental loads | Report-dependent | Configurable | No | Yes |
| Limit Type | Value | Applies To | Notes |
|---|---|---|---|
| Synchronous timeout | 240 seconds | runReport (SOAP and REST) | Request fails with SOAP fault after timeout; no partial results returned |
| Memory guard | Server-enforced (no published limit) | All BIP report execution | Oracle enforces in-memory caps to protect shared BI domain |
| Base64 response overhead | ~33% size inflation | SOAP responses | 100MB CSV becomes ~133MB in SOAP envelope |
| Max concurrent BI threads | Shared pool (typically 10-25) | All BIP operations per pod | Heavy extraction competes with interactive reporting users |
| sizeOfDataChunkDownload | Configurable (integer, -1 = all) | ReportRequest parameter | Use chunking for large outputs |
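The ~33% base64 inflation in the table follows directly from the encoding itself: base64 maps every 3 raw bytes to 4 ASCII characters. A minimal demonstration (the payload is a synthetic stand-in, not real report output):

```python
import base64

# Base64 encodes 3 raw bytes as 4 ASCII characters, so SOAP-wrapped
# report output inflates by roughly one third before any XML envelope
# overhead is counted.
payload = b"x" * 100_000          # stand-in for a 100 KB CSV
encoded = base64.b64encode(payload)
inflation = len(encoded) / len(payload)
print(f"{inflation:.2f}x")        # ~1.33x
```

This is why a 100MB CSV arrives as roughly a 133MB SOAP response, and why response-size budgets should be set against the encoded size.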
| Limit Type | Value | Window | Notes |
|---|---|---|---|
| API call limit | No published per-user limit | N/A | Shared thread pool acts as natural throttle |
| ESS concurrent jobs | Pod-dependent (typically 16-32 threads) | Per ESS domain | Heavy BIP ESS jobs reduce available threads |
| Report scheduling | No hard limit on schedule count | N/A | Oracle recommends max 20 jobs per job set |
| UCM storage | Subject to UCM quotas | Per Fusion pod | Async report output has retention policies |
| Flow | Use When | Token Lifetime | Refresh? | Notes |
|---|---|---|---|---|
| SOAP WS-Security (username token) | Server-to-server, recommended | Session-based | Per-request credentials | Use ExternalReportWSSService |
| SOAP payload credentials | Legacy, simple integrations | Per-request | No | ReportService v2 — less secure |
| REST Basic Auth | Ad-hoc REST report execution | Per-request | No | Base64-encoded username:password |
| IDCS/OCI IAM federation | Enterprise SSO integrations | Token-based (configurable) | Yes | Requires IDCS configuration |
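For the REST Basic Auth row, the Authorization header is just the base64 of `username:password`. A sketch of building it by hand (host, user, and password below are placeholders; the endpoint path is the one listed in the API surface table):

```python
import base64

def basic_auth_header(username: str, password: str) -> dict:
    """Build the HTTP Basic Authorization header used by the BIP REST
    endpoints (per-request credentials, no refresh)."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

headers = basic_auth_header("INTEGRATION_USER", "PASSWORD")
# requests.get("https://fa-xxxx.fa.ocs.oraclecloud.com"
#              "/xmlpserver/services/rest/v1/reports", headers=headers)
# Expect 200 with a valid integration user, 401 otherwise.
```

In practice, libraries like `requests` build this header for you via `auth=(username, password)`; the manual form is shown only to make the per-request, non-refreshing nature of the flow explicit.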
START — Need to extract data from Oracle Fusion Cloud
|
+-- Is the data available via BICC view objects?
| +-- YES --> Use BICC (bulk extraction, incremental support, no timeout)
| +-- NO --> Continue
|
+-- Is the data available via REST API?
| +-- YES, volume < 1K records/day --> Use REST API (real-time)
| +-- YES, volume > 1K records/day --> REST API with pagination
| +-- NO --> Continue (BIP may be your only option)
|
+-- Does an existing BIP report cover the required data?
| +-- YES --> Continue with BIP extraction approach
| +-- NO --> Create custom BIP report (requires BI Author role)
|
+-- Expected data volume?
| +-- < 10K rows --> Synchronous runReport via SOAP/REST
| +-- 10K-50K rows --> ESS Job (BIPJobType) async
| +-- 50K-500K rows --> ESS Job with date-range chunking
| +-- > 500K rows --> STOP: BIP is wrong tool; escalate for BICC
|
+-- Scheduling requirements?
| +-- One-time/ad-hoc --> SOAP runReport or REST
| +-- Recurring --> ESS Job via Scheduled Processes
| +-- ETL pipeline --> OCI Data Integration BIP Connector
|
+-- Output format?
+-- Raw data for processing --> CSV or XML (data)
+-- Need formatted output --> PDF, Excel, HTML
+-- Need type-safe parsing --> XML (preserves data types)
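The volume branch of the tree above can be captured as a small routing helper. The thresholds are this card's guidance, not Oracle-published limits:

```python
def choose_extraction_path(row_estimate: int) -> str:
    """Map an expected row count to the extraction pattern from the
    decision tree (thresholds are this card's guidance, not hard limits)."""
    if row_estimate > 500_000:
        return "STOP: use BICC"
    if row_estimate > 50_000:
        return "ESS job with date-range chunking"
    if row_estimate > 10_000:
        return "ESS job (BIPJobType) async"
    return "synchronous runReport (SOAP/REST)"

print(choose_extraction_path(25_000))   # ESS job (BIPJobType) async
```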
| Criterion | BI Publisher | BICC | REST API |
|---|---|---|---|
| Primary purpose | Reporting | Bulk data extraction | Transactional CRUD |
| Oracle recommendation | Reporting only | Primary extraction tool | Real-time operations |
| Extraction support | Unsupported (but functional) | Fully supported | Supported (low volume) |
| Max practical volume | ~50K rows per report | Millions of rows | ~1K records per request |
| Incremental extraction | Manual (date parameter) | Built-in (CDC/delta) | Manual (filter by date) |
| Timeout risk | High (240s sync limit) | None (async) | Low (paginated) |
| Output formats | CSV, XML, Excel, PDF, HTML | CSV (Object Storage) | JSON |
| Async support | Via ESS job wrapper | Native | No (sync only) |
| Thread pool impact | Shares with interactive BI | Dedicated threads | Dedicated threads |
| Data coverage | Any data model with BIP report | Limited to published VOs | REST-enabled objects |
| Scheduling | ESS/ScheduleService | Built-in BICC scheduling | External orchestration |
| Third-party connectors | Fivetran, Celonis, OCI DI | OCI DI, most iPaaS | Most iPaaS |
| When to use | Data not in BICC, <50K rows | Bulk/warehouse loads | Real-time, <1K records/day |
Navigate to Tools > Reports and Analytics in Oracle Fusion, find the report, and note its absolute path (e.g., /Custom/Financials/AP_Invoice_Listing.xdo). [src4]
# Report paths follow this pattern:
# /shared/{module}/{report_name}.xdo (seeded Oracle reports)
# /Custom/{module}/{report_name}.xdo (custom reports)
Verify: Confirm the report runs manually in Fusion UI with expected parameters and output before automating.
Use ExternalReportWSSService for secure authentication with credentials in the SOAP header. [src5]
<!-- SOAP Envelope for ExternalReportWSSService runReport -->
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:pub="http://xmlns.oracle.com/oxp/service/PublicReportService">
  <soapenv:Header>
    <wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/...">
      <wsse:UsernameToken>
        <wsse:Username>INTEGRATION_USER</wsse:Username>
        <wsse:Password>PASSWORD</wsse:Password>
      </wsse:UsernameToken>
    </wsse:Security>
  </soapenv:Header>
  <soapenv:Body>
    <pub:runReport>
      <pub:reportRequest>
        <pub:attributeFormat>csv</pub:attributeFormat>
        <pub:reportAbsolutePath>/Custom/Financials/AP_Invoice_Listing.xdo</pub:reportAbsolutePath>
        <pub:parameterNameValues>
          <pub:listOfParamNameValues>
            <pub:item>
              <pub:name>p_from_date</pub:name>
              <pub:values><pub:item>2026-01-01</pub:item></pub:values>
            </pub:item>
          </pub:listOfParamNameValues>
        </pub:parameterNameValues>
      </pub:reportRequest>
    </pub:runReport>
  </soapenv:Body>
</soapenv:Envelope>
Verify: HTTP 200 response with <reportBytes> element containing base64-encoded report output.
The SOAP response wraps the report output in base64 encoding; decode it to obtain the actual file. [src4]
import base64
from lxml import etree

tree = etree.fromstring(soap_response_bytes)
ns = {'pub': 'http://xmlns.oracle.com/oxp/service/PublicReportService'}
report_bytes_b64 = tree.find('.//pub:reportBytes', ns).text
raw_output = base64.b64decode(report_bytes_b64)
with open('ap_invoices.csv', 'wb') as f:
    f.write(raw_output)
Verify: wc -l ap_invoices.csv returns expected row count.
For reports exceeding 10K rows, register the BIP report as an ESS job of type BIPJobType to eliminate the 240-second timeout. [src2, src7]
Steps to register BIP report as ESS job:
1. Navigate to Setup and Maintenance > Manage Enterprise Scheduler Jobs
2. Create new job definition (Job Type: BIPJobType)
3. Set Report ID to BIP report path
4. Define parameters matching the BIP report parameters
5. Submit via Scheduled Processes UI or ERP Integration Service SOAP
6. Poll job status; download output from UCM or Object Storage
Verify: Job status = "Succeeded" in Scheduled Processes monitor.
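Step 6 (polling) can be sketched as below. The terminal status strings are assumptions based on typical ESS states, and `fetch_status` is an injected callable so the real status call (e.g. the `ess-job-details` REST resource) can be swapped in or stubbed:

```python
import time

def wait_for_ess_job(job_id, fetch_status, poll_secs=30, max_polls=120):
    """Poll until the ESS job reaches a terminal state.

    fetch_status(job_id) -> status string; injected so the real
    REST/SOAP status lookup (an assumption here) is pluggable.
    Terminal-state names are assumed, not Oracle-documented constants.
    """
    terminal = {"SUCCEEDED", "ERROR", "WARNING", "CANCELLED"}
    for _ in range(max_polls):
        status = fetch_status(job_id).upper()
        if status in terminal:
            return status
        time.sleep(poll_secs)
    raise TimeoutError(f"ESS job {job_id} not terminal after "
                       f"{max_polls * poll_secs}s")
```

Exponential backoff is usually unnecessary here; a fixed 30-60s interval keeps load on the pod negligible while bounding detection latency.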
Since BIP has no native CDC, implement manual incremental extraction using date parameters. [src1]
from datetime import datetime, timedelta

def generate_date_chunks(start_date, end_date, chunk_days=7):
    chunks = []
    current = start_date
    while current < end_date:
        chunk_end = min(current + timedelta(days=chunk_days), end_date)
        chunks.append((current.strftime('%Y-%m-%d'),
                       chunk_end.strftime('%Y-%m-%d')))
        current = chunk_end
    return chunks

# Extract last 90 days in 7-day chunks
chunks = generate_date_chunks(datetime(2026, 1, 1), datetime(2026, 3, 9), chunk_days=7)
Verify: Each chunk returns within timeout; union of all chunks matches expected total row count.
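When unioning per-chunk CSV outputs, remember that each BIP run emits its own header row. A sketch of a header-deduplicating union (assumes every chunk carries an identical header, which holds when all chunks come from the same report template):

```python
def union_csv_chunks(csv_chunks):
    """Concatenate per-chunk CSV outputs, keeping the header row only
    from the first chunk (each report run repeats the header)."""
    lines = []
    for i, chunk in enumerate(csv_chunks):
        chunk_lines = chunk.strip().splitlines()
        lines.extend(chunk_lines if i == 0 else chunk_lines[1:])
    return "\n".join(lines) + "\n"
```

The row-count check in the Verify step is then `len(result.splitlines()) - 1` against the baseline total.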
# Input: BIP report path, parameters, Fusion Cloud host
# Output: Decoded CSV file content
import requests, base64, time
from lxml import etree

def extract_bip_report(host, username, password, report_path,
                       params, output_format='csv', max_retries=3):
    url = f"https://{host}/xmlpserver/services/ExternalReportWSSService"
    param_xml = ""
    for name, value in params.items():
        param_xml += f"""
            <pub:item>
                <pub:name>{name}</pub:name>
                <pub:values><pub:item>{value}</pub:item></pub:values>
            </pub:item>"""
    # Build SOAP envelope with WS-Security header
    # ... (full implementation in .md file)
    for attempt in range(max_retries):
        try:
            resp = requests.post(url, data=envelope,
                                 headers=headers, timeout=230)
            resp.raise_for_status()
            tree = etree.fromstring(resp.content)
            ns = {'pub': 'http://xmlns.oracle.com/oxp/service/PublicReportService'}
            b64_data = tree.find('.//pub:reportBytes', ns).text
            return base64.b64decode(b64_data)
        except requests.exceptions.Timeout:
            if attempt < max_retries - 1:
                time.sleep(2 ** attempt * 5)
                continue
            raise RuntimeError("BIP report timed out — switch to ESS async")
# Test BIP SOAP endpoint availability
curl -s -o /dev/null -w "%{http_code}" \
"https://fa-xxxx.fa.ocs.oraclecloud.com/xmlpserver/services/ExternalReportWSSService?wsdl"
# Expected: HTTP 200 with WSDL response
# Test REST API endpoint
curl -s -o /dev/null -w "%{http_code}" \
"https://fa-xxxx.fa.ocs.oraclecloud.com/xmlpserver/services/rest/v1/reports"
# Expected: HTTP 401 (confirms endpoint exists)
| Format | MIME Type | Best For | Parsing Complexity | Type Preservation |
|---|---|---|---|---|
| csv | text/plain | ETL pipelines, data warehouse loads | Low | None (all strings) |
| xml (data) | application/xml | Structured parsing, type-safe extraction | Medium | Full (typed elements) |
| excel | application/vnd.ms-excel | Human review + extraction hybrid | High | Partial |
| pdf | application/pdf | Compliance/audit output only | N/A (not parseable) | N/A |
| html | text/html | Web display, email delivery | High | None |
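Because XML data output preserves element structure, numeric conversion can be done deliberately at parse time instead of guessing from CSV strings. The element names below (`DATA_DS`, `ROW`, `INVOICE_AMOUNT`) are illustrative; actual tags depend on the report's data model:

```python
import xml.etree.ElementTree as ET

# Illustrative BIP XML data output; real tag names come from the
# report's data model, not from this sketch.
sample = """<DATA_DS>
  <ROW><INVOICE_NUM>1001</INVOICE_NUM><INVOICE_AMOUNT>250.75</INVOICE_AMOUNT></ROW>
  <ROW><INVOICE_NUM>1002</INVOICE_NUM><INVOICE_AMOUNT>99.00</INVOICE_AMOUNT></ROW>
</DATA_DS>"""

root = ET.fromstring(sample)
invoices = [
    {"num": row.findtext("INVOICE_NUM"),
     "amount": float(row.findtext("INVOICE_AMOUNT"))}
    for row in root.findall("ROW")
]
total = sum(inv["amount"] for inv in invoices)
print(total)  # 349.75
```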
| Error | Meaning | Cause | Resolution |
|---|---|---|---|
| SOAP Fault: Timeout | Report execution exceeded 240s | Dataset too large for sync execution | Switch to ESS async pattern; reduce date range |
| HTTP 401 | Authentication failure | Invalid credentials or missing WS-Security header | Verify credentials; confirm BI Consumer role |
| HTTP 500 + oracle.xdo stack trace | BIP internal error | Memory guard triggered or data model SQL error | Reduce data volume; check data model SQL |
| "Report not found" | Invalid report path | Wrong absolute path or missing permissions | Verify path in BI Catalog; check folder security |
| "Invalid parameter" | Parameter name mismatch | Name in request doesn't match data model | Use getReportParameters to discover exact names |
| ESS "Error" status | ESS job failed | Various — check ESS job log | Review ESS diagnostic log for root cause |
- Compare row counts against baseline query; implement checksum validation. [src2]
- Serialize extraction jobs; schedule during off-peak hours. [src2]
- Use the sizeOfDataChunkDownload parameter or switch to ESS + UCM download. [src4]
- Use getReportParameters to discover dateFormatString. [src4]
- Parse CSV by header name, never by position. [src1]

# BAD — synchronous SOAP call for large dataset extraction
# This will timeout at 240s and consume BI thread pool capacity
def extract_all_invoices():
    return call_bip_soap(
        report='/Custom/Financials/ALL_AP_Invoices.xdo',
        params={'p_from_date': '2020-01-01'},  # 6 years of data!
        format='csv'
    )  # Will timeout; may return partial data silently
# GOOD — register as ESS BIPJobType, chunk by date range
def extract_invoices_incremental():
    chunks = generate_date_chunks(datetime(2026, 1, 1),
                                  datetime(2026, 3, 9), chunk_days=7)
    for from_date, to_date in chunks:
        job_id = submit_ess_job('AP_INVOICE_EXTRACT_ESS',
                                params={'p_from_date': from_date,
                                        'p_to_date': to_date})
        wait_for_completion(job_id)
        download_output(job_id)
# BAD — extracting GL balances via BIP when BICC covers them
data = call_bip_soap(
    report='/Custom/GL/GL_Balances_Extract.xdo',
    params={'p_ledger': 'US Primary'}, format='csv'
)
# GOOD — BICC handles bulk extraction natively with incremental support
bicc_extract = submit_bicc_offering(
    offering='FinancialsTransactionalBusinessIntelligence',
    view_objects=['FscmTopModelAM.GlBalancesAM.GlBalancesVO'],
    mode='incremental', target='object_storage_bucket'
)
# BAD — column positions change when Oracle patches update templates
invoice_number = row[0]
vendor_name = row[3]
amount = row[7]
# GOOD — header-based parsing survives template column changes
import csv

with open('invoices.csv', newline='') as fh:
    reader = csv.DictReader(fh)
    for row in reader:
        invoice_number = row['Invoice Number']
        vendor_name = row['Vendor Name']
        amount = float(row['Invoice Amount'])
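Row-count validation on the decoded CSV can be sketched as below. `expected_count` would come from a baseline query or a COUNT(*) summary row, both outside this sketch; the returned hash supports change detection across runs:

```python
import csv
import hashlib
import io

def validate_extract(csv_text, expected_count):
    """Check a decoded CSV extract: the data row count must match the
    baseline, and a content hash is returned for change detection."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    if len(rows) != expected_count:
        raise ValueError(f"row count {len(rows)} != expected {expected_count}")
    return hashlib.sha256(csv_text.encode()).hexdigest()
```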
- Always verify if data is available via BICC view objects first. [src1]
- Set client timeout to 230s (below the 240s server limit). [src2]
- Budget 2x expected output size; use streaming for reports >20MB. [src4]
- Schedule all extraction jobs outside peak hours. [src2]
- Include a COUNT(*) summary row; validate every extraction. [src2]
- Use the BI Catalog SOAP service to discover current paths; store paths in config. [src1]

# Test BIP SOAP connectivity (expect HTTP 200 with WSDL)
curl -s -o /dev/null -w "%{http_code}" \
"https://fa-xxxx.fa.ocs.oraclecloud.com/xmlpserver/services/ExternalReportWSSService?wsdl"
# Test REST API connectivity (expect HTTP 401, confirms endpoint exists)
curl -s -o /dev/null -w "%{http_code}" \
"https://fa-xxxx.fa.ocs.oraclecloud.com/xmlpserver/services/rest/v1/reports"
# Discover report parameters via SOAP getReportParameters
# Returns parameter names, types, default values, and LOV options
# Check ESS job status via REST (Fusion Cloud 25A+)
curl -u "INTEGRATION_USER:PASSWORD" \
"https://fa-xxxx.fa.ocs.oraclecloud.com/hcmRestApi/resources/11.13.18.05/ess-job-details?q=RequestId=12345"
# Monitor BI server health (admin role required)
curl -u "ADMIN_USER:PASSWORD" \
"https://fa-xxxx.fa.ocs.oraclecloud.com/xmlpserver/services/rest/v1/server/health"
| Release | Date | Status | Breaking Changes | Migration Notes |
|---|---|---|---|---|
| 25B | 2026-02 | Current | None | -- |
| 25A | 2025-11 | Supported | None | ESS REST API enhanced |
| 24D | 2025-08 | Supported | None | -- |
| 24C | 2025-05 | Supported | Rebranded to Oracle Analytics Publisher | UI changed; API backward-compatible |
| 24B | 2025-02 | Supported | None | -- |
| 24A | 2024-11 | EOL | None | REST v1 enhanced |
| Use When | Don't Use When | Use Instead |
|---|---|---|
| Data is only available via BIP reports (no BICC VO coverage) | Data is available via BICC view objects | BICC bulk extraction |
| Need pre-formatted output (PDF, Excel) for compliance/audit | Need raw transactional data for warehouse loading | BICC or REST API |
| Dataset is <50K rows per extraction run | Dataset exceeds 100K rows | BICC (no row limits) |
| Existing BIP reports already model the required data | Need real-time individual record operations | REST API |
| One-time or low-frequency extractions (daily or less) | Need near-real-time data feeds (<5 min latency) | Business Events or REST API |
| Need complex report logic (calculated fields, cross-module joins) | Simple single-table extraction | REST API or BICC |
| Integration team has BIP/OTBI expertise | Team has no Oracle reporting expertise | REST API (simpler learning curve) |
| Capability | BI Publisher (BIP) | BICC | REST API | FBDI |
|---|---|---|---|---|
| Direction | Outbound only | Outbound only | Inbound + outbound | Inbound only |
| Volume sweet spot | 1K-50K rows | 50K-10M+ rows | 1-1K records | 1K-500K rows |
| Timeout risk | High (240s sync) | None (async) | Low | None (async) |
| CDC/Incremental | Manual (date params) | Built-in | Manual (filters) | N/A |
| Output formats | CSV, XML, Excel, PDF, HTML | CSV | JSON | CSV (template-based) |
| Scheduling | ESS jobs / ScheduleService | Built-in | External | ESS jobs |
| Auth methods | WS-Security, Basic | OAuth 2.0 | OAuth 2.0, Basic | N/A (Fusion internal) |
| Thread pool impact | Shared with reporting | Dedicated | Dedicated | ESS thread pool |
| Data coverage | Broadest (any data model) | Limited to published VOs | REST-enabled objects | Import templates only |
| Oracle support status | Unsupported for extraction | Fully supported | Fully supported | Fully supported |
| Third-party connectors | Fivetran, Celonis, OCI DI | OCI DI, most iPaaS | Most iPaaS | Limited |