Oracle BI Publisher as Data Source: When to Use BIP vs BICC vs REST
TL;DR
- Bottom line: Use BIP for targeted, small-to-medium data pulls where an existing report already covers your data model; use BICC for bulk extraction and REST for real-time transactional operations. BIP data extraction is an unsupported pattern per Oracle — wrap it in an ESS job if you must.
- Key limit: Synchronous runReport times out at 240 seconds; memory guards cap in-process data volume with no published thresholds.
- Watch out for: Using synchronous SOAP calls for extraction — they consume BI domain thread pool capacity, degrade reporting for all users, and fail silently on large datasets.
- Best for: Extracting structured data from pre-built Fusion reports (AP, AR, GL, HCM) that already exist and return under 50K rows, where BICC does not expose the required view objects.
- Authentication: Username/password in SOAP header (ExternalReportWSSService) or payload (ReportService); OAuth not natively supported for SOAP.
System Profile
Oracle BI Publisher (rebranded as Oracle Analytics Publisher in 24C) is the embedded reporting engine in Oracle Fusion Cloud Applications. It sits atop Oracle Transactional Business Intelligence (OTBI) data models and can generate output in PDF, CSV, XML, Excel, and HTML. While designed for formatted reporting, BIP is frequently used as a data extraction mechanism because it can reach Fusion data that BICC view objects do not expose. This card covers Oracle Fusion Cloud SaaS deployments (ERP, HCM, SCM, CX) running Release 25A/25B.
| Property | Value |
|---|---|
| Vendor | Oracle |
| System | Oracle Fusion Cloud Applications + BI Publisher |
| API Surface | SOAP (ReportService, ScheduleService) + REST (v1/reports) |
| Current API Version | v2 (SOAP), v1 (REST) |
| Editions Covered | Enterprise (all Fusion Cloud SKUs include BIP) |
| Deployment | Cloud (SaaS) |
| API Docs | BI Publisher Web Services |
| Status | GA — but data extraction pattern is officially unsupported |
API Surfaces & Capabilities
| API Surface | Protocol | Best For | Max Records/Request | Timeout | Real-time? | Bulk? |
|---|---|---|---|---|---|---|
| SOAP ReportService (runReport) | HTTPS/XML, base64 response | Small reports <50K rows, synchronous | Memory-guarded | 240s | Yes | No |
| SOAP ExternalReportWSSService | HTTPS/XML, WS-Security headers | Same as ReportService, secure auth | Memory-guarded | 240s | Yes | No |
| SOAP ScheduleService | HTTPS/XML | Scheduling reports, retrieving completed output | N/A (async) | None (async) | No | Partial |
| REST v1/reports/{path}/run | HTTPS/JSON + multipart | Ad-hoc report execution | Memory-guarded | 240s | Yes | No |
| ESS Job (BIPJobType) | Oracle Enterprise Scheduler | Large extractions via async execution | No hard cap | None (async) | No | Yes |
| OCI Data Integration BIP Connector | REST + Object Storage staging | ETL pipelines, incremental loads | Report-dependent | Configurable | No | Yes |
Rate Limits & Quotas
Per-Request Limits
| Limit Type | Value | Applies To | Notes |
|---|---|---|---|
| Synchronous timeout | 240 seconds | runReport (SOAP and REST) | Request fails with SOAP fault after timeout; no partial results returned |
| Memory guard | Server-enforced (no published limit) | All BIP report execution | Oracle enforces in-memory caps to protect shared BI domain |
| Base64 response overhead | ~33% size inflation | SOAP responses | 100MB CSV becomes ~133MB in SOAP envelope |
| Max concurrent BI threads | Shared pool (typically 10-25) | All BIP operations per pod | Heavy extraction competes with interactive reporting users |
| sizeOfDataChunkDownload | Configurable (integer, -1 = all) | ReportRequest parameter | Use chunking for large outputs |
Rolling / Daily Limits
| Limit Type | Value | Window | Notes |
|---|---|---|---|
| API call limit | No published per-user limit | N/A | Shared thread pool acts as natural throttle |
| ESS concurrent jobs | Pod-dependent (typically 16-32 threads) | Per ESS domain | Heavy BIP ESS jobs reduce available threads |
| Report scheduling | No hard limit on schedule count | N/A | Oracle recommends max 20 jobs per job set |
| UCM storage | Subject to UCM quotas | Per Fusion pod | Async report output has retention policies |
Authentication
| Flow | Use When | Token Lifetime | Refresh? | Notes |
|---|---|---|---|---|
| SOAP WS-Security (username token) | Server-to-server, recommended | Session-based | Per-request credentials | Use ExternalReportWSSService |
| SOAP payload credentials | Legacy, simple integrations | Per-request | No | ReportService v2 — less secure |
| REST Basic Auth | Ad-hoc REST report execution | Per-request | No | Base64-encoded username:password |
| IDCS/OCI IAM federation | Enterprise SSO integrations | Token-based (configurable) | Yes | Requires IDCS configuration |
Authentication Gotchas
- BIP SOAP services do NOT support OAuth 2.0 natively — if your middleware requires OAuth, you need an API gateway or OIC adapter as a proxy [src5]
- ExternalReportWSSService (WS-Security headers) is the recommended approach, but many platforms default to the less-secure ReportService with payload credentials — explicitly choose the WSS variant [src5]
- Integration user accounts should have BI Consumer and BI Author roles at minimum — missing roles cause cryptic access denied errors that look like connectivity failures [src1]
Constraints
- Using Oracle Analytics Publisher to extract data from Oracle Fusion Applications is an unsupported pattern per Oracle A-Team guidance — always check if BICC or REST can fulfill the requirement first
- Synchronous runReport has a hard 240-second timeout with no configuration override in Fusion Cloud SaaS
- Memory guards are enforced server-side — unlike BICC, BIP cannot process arbitrarily large datasets; exact thresholds are not published
- Report output is base64-encoded in SOAP responses (~33% size inflation); during decode the client briefly holds both the encoded and decoded copies, so budget roughly 2x the raw output size in client memory
- BIP shares the BI server domain thread pool with interactive reporting users — high-volume extraction degrades reporting for all users
- No built-in Change Data Capture — incremental extraction requires manually parameterizing reports with date range filters
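The base64 overhead is easy to quantify. This sketch shows the ~33% envelope inflation and why client memory budgets should be roughly 2x the raw output size (the client holds both copies during decode); the 3 MB payload is illustrative:

```python
import base64

raw = b"A" * 3_000_000            # simulate a 3 MB CSV payload
encoded = base64.b64encode(raw)   # what arrives inside <reportBytes>

inflation = len(encoded) / len(raw)
print(f"base64 inflation: {inflation:.2f}x")   # ~1.33x

# During decode the client briefly holds both copies:
peak = len(encoded) + len(raw)
print(f"peak client memory for this payload: ~{peak / 1e6:.1f} MB")
```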
Integration Pattern Decision Tree
START — Need to extract data from Oracle Fusion Cloud
|
+-- Is the data available via BICC view objects?
| +-- YES --> Use BICC (bulk extraction, incremental support, no timeout)
| +-- NO --> Continue
|
+-- Is the data available via REST API?
| +-- YES, volume < 1K records/day --> Use REST API (real-time)
| +-- YES, volume > 1K records/day --> REST API with pagination
| +-- NO --> Continue (BIP may be your only option)
|
+-- Does an existing BIP report cover the required data?
| +-- YES --> Continue with BIP extraction approach
| +-- NO --> Create custom BIP report (requires BI Author role)
|
+-- Expected data volume?
| +-- < 10K rows --> Synchronous runReport via SOAP/REST
| +-- 10K-50K rows --> ESS Job (BIPJobType) async
| +-- 50K-500K rows --> ESS Job with date-range chunking
| +-- > 500K rows --> STOP: BIP is wrong tool; escalate for BICC
|
+-- Scheduling requirements?
| +-- One-time/ad-hoc --> SOAP runReport or REST
| +-- Recurring --> ESS Job via Scheduled Processes
| +-- ETL pipeline --> OCI Data Integration BIP Connector
|
+-- Output format?
+-- Raw data for processing --> CSV or XML (data)
+-- Need formatted output --> PDF, Excel, HTML
+-- Need type-safe parsing --> XML (preserves data types)
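The volume branches of the tree above can be sketched as a helper function. The thresholds and labels come from the tree itself; the function is illustrative (it simplifies the REST pagination branch), not an official Oracle sizing rule:

```python
def choose_extraction_method(bicc_covers: bool, rest_covers: bool,
                             expected_rows: int) -> str:
    """Pick an extraction approach following the decision tree above."""
    if bicc_covers:
        return "BICC"                         # bulk, incremental, no timeout
    if rest_covers and expected_rows < 1_000:
        return "REST"                         # real-time, low volume
    if expected_rows < 10_000:
        return "BIP synchronous runReport"
    if expected_rows <= 500_000:
        return "BIP ESS job (BIPJobType), chunked above 50K rows"
    return "STOP: escalate for BICC coverage"

print(choose_extraction_method(False, False, 120_000))
```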
Quick Reference
BIP vs BICC vs REST — Decision Matrix
| Criterion | BI Publisher | BICC | REST API |
|---|---|---|---|
| Primary purpose | Reporting | Bulk data extraction | Transactional CRUD |
| Oracle recommendation | Reporting only | Primary extraction tool | Real-time operations |
| Extraction support | Unsupported (but functional) | Fully supported | Supported (low volume) |
| Max practical volume | ~50K rows per report | Millions of rows | ~1K records per request |
| Incremental extraction | Manual (date parameter) | Built-in (CDC/delta) | Manual (filter by date) |
| Timeout risk | High (240s sync limit) | None (async) | Low (paginated) |
| Output formats | CSV, XML, Excel, PDF, HTML | CSV (Object Storage) | JSON |
| Async support | Via ESS job wrapper | Native | No (sync only) |
| Thread pool impact | Shares with interactive BI | Dedicated threads | Dedicated threads |
| Data coverage | Any data model with BIP report | Limited to published VOs | REST-enabled objects |
| Scheduling | ESS/ScheduleService | Built-in BICC scheduling | External orchestration |
| Third-party connectors | Fivetran, Celonis, OCI DI | OCI DI, most iPaaS | Most iPaaS |
| When to use | Data not in BICC, <50K rows | Bulk/warehouse loads | Real-time, <1K records/day |
Step-by-Step Integration Guide
1. Identify the correct BIP report path
Navigate to Tools > Reports and Analytics in Oracle Fusion, find the report, and note its absolute path (e.g., /Custom/Financials/AP_Invoice_Listing.xdo). [src4]
# Report paths follow this pattern:
# /shared/{module}/{report_name}.xdo (seeded Oracle reports)
# /Custom/{module}/{report_name}.xdo (custom reports)
Verify: Confirm the report runs manually in Fusion UI with expected parameters and output before automating.
2. Authenticate and run report via SOAP (synchronous)
Use ExternalReportWSSService for secure authentication with credentials in the SOAP header. [src5]
<!-- SOAP Envelope for ExternalReportWSSService runReport -->
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
xmlns:pub="http://xmlns.oracle.com/oxp/service/PublicReportService">
<soapenv:Header>
<wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/...">
<wsse:UsernameToken>
<wsse:Username>INTEGRATION_USER</wsse:Username>
<wsse:Password>PASSWORD</wsse:Password>
</wsse:UsernameToken>
</wsse:Security>
</soapenv:Header>
<soapenv:Body>
<pub:runReport>
<pub:reportRequest>
<pub:attributeFormat>csv</pub:attributeFormat>
<pub:reportAbsolutePath>/Custom/Financials/AP_Invoice_Listing.xdo</pub:reportAbsolutePath>
<pub:parameterNameValues>
<pub:listOfParamNameValues>
<pub:item>
<pub:name>p_from_date</pub:name>
<pub:values><pub:item>2026-01-01</pub:item></pub:values>
</pub:item>
</pub:listOfParamNameValues>
</pub:parameterNameValues>
</pub:reportRequest>
</pub:runReport>
</soapenv:Body>
</soapenv:Envelope>
Verify: HTTP 200 response with <reportBytes> element containing base64-encoded report output.
3. Decode the base64 response
SOAP response wraps report output in base64 encoding. Decode to obtain the actual file. [src4]
import base64
from lxml import etree
tree = etree.fromstring(soap_response_bytes)
ns = {'pub': 'http://xmlns.oracle.com/oxp/service/PublicReportService'}
report_bytes_b64 = tree.find('.//pub:reportBytes', ns).text
raw_output = base64.b64decode(report_bytes_b64)
with open('ap_invoices.csv', 'wb') as f:
    f.write(raw_output)
Verify: wc -l ap_invoices.csv returns the expected row count (remember the count includes the header line).
4. Switch to async ESS job for large datasets
For reports exceeding 10K rows, register the BIP report as an ESS job of type BIPJobType to eliminate the 240-second timeout. [src2, src7]
Steps to register BIP report as ESS job:
1. Navigate to Setup and Maintenance > Manage Enterprise Scheduler Jobs
2. Create new job definition (Job Type: BIPJobType)
3. Set Report ID to BIP report path
4. Define parameters matching the BIP report parameters
5. Submit via Scheduled Processes UI or ERP Integration Service SOAP
6. Poll job status; download output from UCM or Object Storage
Verify: Job status = "Succeeded" in Scheduled Processes monitor.
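Step 6's polling can be scripted against the ess-job-details REST resource shown later under Diagnostic Commands. The resource path is taken from that section, but the items/RequestStatus payload field names here are assumptions to confirm against your pod's response:

```python
import base64
import json
import time
import urllib.request

def ess_status_url(host: str, request_id: int) -> str:
    """Build the ess-job-details query URL (path from Diagnostic Commands)."""
    return (f"https://{host}/hcmRestApi/resources/11.13.18.05/"
            f"ess-job-details?q=RequestId={request_id}")

def wait_for_ess_job(host, user, password, request_id,
                     poll_secs=30, timeout_secs=3600):
    """Poll until the job leaves a running state. The 'items' and
    'RequestStatus' field names are assumptions; verify against your pod."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    deadline = time.time() + timeout_secs
    while time.time() < deadline:
        req = urllib.request.Request(
            ess_status_url(host, request_id),
            headers={"Authorization": f"Basic {token}"})
        with urllib.request.urlopen(req, timeout=60) as resp:
            items = json.load(resp).get("items", [])
        status = items[0].get("RequestStatus") if items else None
        if status in ("SUCCEEDED", "ERROR", "WARNING", "CANCELLED"):
            return status
        time.sleep(poll_secs)
    raise TimeoutError(f"ESS job {request_id} did not finish in {timeout_secs}s")
```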
5. Implement date-range chunking for incremental loads
Since BIP has no native CDC, implement manual incremental extraction using date parameters. [src1]
from datetime import datetime, timedelta

def generate_date_chunks(start_date, end_date, chunk_days=7):
    """Split [start_date, end_date) into (from, to) ISO date string pairs."""
    chunks = []
    current = start_date
    while current < end_date:
        chunk_end = min(current + timedelta(days=chunk_days), end_date)
        chunks.append((current.strftime('%Y-%m-%d'),
                       chunk_end.strftime('%Y-%m-%d')))
        current = chunk_end
    return chunks

# Example: extract 2026-01-01 through 2026-03-09 in 7-day chunks
chunks = generate_date_chunks(datetime(2026, 1, 1), datetime(2026, 3, 9), chunk_days=7)
Verify: Each chunk returns within timeout; union of all chunks matches expected total row count.
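The union check in the verify step can be automated: concatenate the per-chunk CSV files while keeping a single header row, and return the data row count for comparison against the expected total (file names are hypothetical):

```python
import csv

def union_chunk_csvs(chunk_paths, out_path):
    """Concatenate chunk CSVs into one file, writing the header from the
    first file only; returns the number of data rows written."""
    writer = None
    rows_written = 0
    with open(out_path, "w", newline="") as out:
        for path in chunk_paths:
            with open(path, newline="") as f:
                reader = csv.reader(f)
                header = next(reader)          # skip per-chunk header
                if writer is None:
                    writer = csv.writer(out)
                    writer.writerow(header)    # keep it once
                for row in reader:
                    writer.writerow(row)
                    rows_written += 1
    return rows_written
```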
Code Examples
Python: Extract BIP report via SOAP with retry logic
# Input: BIP report path, parameters, Fusion Cloud host
# Output: Decoded report bytes (e.g. CSV content)
import base64
import time

import requests
from lxml import etree

def extract_bip_report(host, username, password, report_path,
                       params, output_format='csv', max_retries=3):
    url = f"https://{host}/xmlpserver/services/ExternalReportWSSService"
    param_xml = "".join(
        f"<pub:item><pub:name>{name}</pub:name>"
        f"<pub:values><pub:item>{value}</pub:item></pub:values></pub:item>"
        for name, value in params.items())
    # WS-Security UsernameToken header + runReport body (same shape as step 2)
    envelope = f"""<soapenv:Envelope
        xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
        xmlns:pub="http://xmlns.oracle.com/oxp/service/PublicReportService">
      <soapenv:Header>
        <wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
          <wsse:UsernameToken>
            <wsse:Username>{username}</wsse:Username>
            <wsse:Password>{password}</wsse:Password>
          </wsse:UsernameToken>
        </wsse:Security>
      </soapenv:Header>
      <soapenv:Body>
        <pub:runReport>
          <pub:reportRequest>
            <pub:attributeFormat>{output_format}</pub:attributeFormat>
            <pub:reportAbsolutePath>{report_path}</pub:reportAbsolutePath>
            <pub:parameterNameValues>
              <pub:listOfParamNameValues>{param_xml}</pub:listOfParamNameValues>
            </pub:parameterNameValues>
          </pub:reportRequest>
        </pub:runReport>
      </soapenv:Body>
    </soapenv:Envelope>"""
    # SOAPAction value may vary by pod; confirm against the WSDL
    headers = {'Content-Type': 'text/xml; charset=utf-8', 'SOAPAction': 'runReport'}
    for attempt in range(max_retries):
        try:
            # Client timeout below the 240s server limit so the client fails first
            resp = requests.post(url, data=envelope.encode(),
                                 headers=headers, timeout=230)
            resp.raise_for_status()
            tree = etree.fromstring(resp.content)
            ns = {'pub': 'http://xmlns.oracle.com/oxp/service/PublicReportService'}
            b64_data = tree.find('.//pub:reportBytes', ns).text
            return base64.b64decode(b64_data)
        except requests.exceptions.Timeout:
            if attempt < max_retries - 1:
                time.sleep(2 ** attempt * 5)  # exponential backoff: 5s, 10s, ...
                continue
            raise RuntimeError("BIP report timed out; switch to ESS async")
cURL: Quick test of BIP SOAP connectivity
# Test BIP SOAP endpoint availability
curl -s -o /dev/null -w "%{http_code}" \
"https://fa-xxxx.fa.ocs.oraclecloud.com/xmlpserver/services/ExternalReportWSSService?wsdl"
# Expected: HTTP 200 with WSDL response
# Test REST API endpoint
curl -s -o /dev/null -w "%{http_code}" \
"https://fa-xxxx.fa.ocs.oraclecloud.com/xmlpserver/services/rest/v1/reports"
# Expected: HTTP 401 (confirms endpoint exists)
Data Mapping
Output Format Selection Guide
| Format | MIME Type | Best For | Parsing Complexity | Type Preservation |
|---|---|---|---|---|
| csv | text/plain | ETL pipelines, data warehouse loads | Low | None (all strings) |
| xml (data) | application/xml | Structured parsing, type-safe extraction | Medium | Full (typed elements) |
| excel | application/vnd.ms-excel | Human review + extraction hybrid | High | Partial |
| pdf | application/pdf | Compliance/audit output only | N/A | N/A (not parseable) |
| html | text/html | Web display, email delivery | High | None |
Data Type Gotchas
- BIP CSV output treats all fields as strings — dates come in the report's display format, not ISO 8601. Parse with explicit date format matching the BIP template's date mask [src4]
- XML (data) output preserves Oracle date types as xsd:dateTime — this is the most reliable format for downstream type mapping [src4]
- Numeric fields in CSV lose precision for large numbers (>15 digits) — use XML or set NUMBER columns as VARCHAR in the data model [src6]
- BIP report column names in CSV output use template display labels, not data model column names — if the template changes, your parser breaks [src1]
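The gotchas above translate into concrete parsing rules: match the template's date mask explicitly and route large numerics through Decimal instead of float. The date mask and column labels below are examples, not the seeded report's actual values:

```python
import csv
import io
from datetime import datetime
from decimal import Decimal

def parse_invoice_row(row: dict, date_mask: str = "%d-%b-%Y") -> dict:
    """Convert one DictReader row; date_mask must match the BIP template."""
    return {
        "invoice_number": row["Invoice Number"],  # keep as string
        "invoice_date": datetime.strptime(row["Invoice Date"], date_mask).date(),
        "amount": Decimal(row["Invoice Amount"]),  # no float precision loss
    }

sample = ("Invoice Number,Invoice Date,Invoice Amount\n"
          "INV-1,05-Mar-2026,12345678901234567.89\n")
rows = [parse_invoice_row(r) for r in csv.DictReader(io.StringIO(sample))]
print(rows[0]["amount"])  # Decimal keeps all 19 digits
```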
Error Handling & Failure Points
Common Error Codes
| Error | Meaning | Cause | Resolution |
|---|---|---|---|
| SOAP Fault: Timeout | Report execution exceeded 240s | Dataset too large for sync execution | Switch to ESS async pattern; reduce date range |
| HTTP 401 | Authentication failure | Invalid credentials or missing WS-Security header | Verify credentials; confirm BI Consumer role |
| HTTP 500 + oracle.xdo stack trace | BIP internal error | Memory guard triggered or data model SQL error | Reduce data volume; check data model SQL |
| "Report not found" | Invalid report path | Wrong absolute path or missing permissions | Verify path in BI Catalog; check folder security |
| "Invalid parameter" | Parameter name mismatch | Name in request doesn't match data model | Use getReportParameters to discover exact names |
| ESS "Error" status | ESS job failed | Various — check ESS job log | Review ESS diagnostic log for root cause |
Failure Points in Production
- Silent truncation on memory guard: BIP may return a partial dataset without error. Fix: Compare row counts against a baseline query; implement checksum validation. [src2]
- Thread pool exhaustion: 5+ concurrent BIP extractions can exhaust the BI domain thread pool. Fix: Serialize extraction jobs; schedule during off-peak hours. [src2]
- Base64 out-of-memory on client: Large reports (>50MB) cause client-side OOM during decode. Fix: Use the sizeOfDataChunkDownload parameter or switch to ESS + UCM download. [src4]
- Date format mismatch: The data model expects locale-specific dates while the integration sends ISO 8601. Fix: Use getReportParameters to discover dateFormatString. [src4]
- Template version drift: Oracle patches can change seeded report template columns. Fix: Parse CSV by header name, never by position. [src1]
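The silent-truncation check can be scripted: count data rows and checksum the file, then compare against an expected count (which would come from a COUNT(*) summary row or a baseline query; the function names here are illustrative):

```python
import hashlib

def csv_row_count_and_digest(path: str) -> tuple:
    """Return (data row count excluding header, sha256 hex digest of the file)."""
    digest = hashlib.sha256()
    rows = -1  # first non-empty line is the header
    with open(path, "rb") as f:
        for line in f:
            digest.update(line)
            if line.strip():
                rows += 1
    return rows, digest.hexdigest()

def validate_extraction(path: str, expected_rows: int) -> None:
    """Raise if the extract does not match the expected row count."""
    rows, digest = csv_row_count_and_digest(path)
    if rows != expected_rows:
        raise ValueError(f"expected {expected_rows} rows, got {rows} "
                         f"(sha256={digest}); possible memory-guard truncation")
```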
Anti-Patterns
Wrong: Synchronous BIP extraction for large datasets
# BAD — synchronous SOAP call for large dataset extraction
# This will timeout at 240s and consume BI thread pool capacity
def extract_all_invoices():
    return call_bip_soap(
        report='/Custom/Financials/ALL_AP_Invoices.xdo',
        params={'p_from_date': '2020-01-01'},  # 6 years of data!
        format='csv'
    )  # Will timeout; may return partial data silently
Correct: ESS async job with date chunking
# GOOD — register as ESS BIPJobType, chunk by date range
def extract_invoices_incremental():
    chunks = generate_date_chunks(datetime(2026, 1, 1),
                                  datetime(2026, 3, 9), chunk_days=7)
    for from_date, to_date in chunks:
        job_id = submit_ess_job('AP_INVOICE_EXTRACT_ESS',
                                params={'p_from_date': from_date,
                                        'p_to_date': to_date})
        wait_for_completion(job_id)
        download_output(job_id)
Wrong: Using BIP when BICC view objects exist
# BAD — extracting GL balances via BIP when BICC covers them
data = call_bip_soap(
    report='/Custom/GL/GL_Balances_Extract.xdo',
    params={'p_ledger': 'US Primary'}, format='csv'
)
Correct: Use BICC for any data it covers
# GOOD — BICC handles bulk extraction natively with incremental support
bicc_extract = submit_bicc_offering(
    offering='FinancialsTransactionalBusinessIntelligence',
    view_objects=['FscmTopModelAM.GlBalancesAM.GlBalancesVO'],
    mode='incremental', target='object_storage_bucket'
)
Wrong: Parsing BIP CSV by column position
# BAD — column positions change when Oracle patches update templates
invoice_number = row[0]
vendor_name = row[3]
amount = row[7]
Correct: Parse by header name
# GOOD — header-based parsing survives template column changes
import csv
reader = csv.DictReader(open('invoices.csv'))
for row in reader:
    invoice_number = row['Invoice Number']
    vendor_name = row['Vendor Name']
    amount = float(row['Invoice Amount'])
Common Pitfalls
- Using runReport without checking BICC first: Many teams default to BIP because they already have reports. Fix: Always verify whether the data is available via BICC view objects first. [src1]
- Client-side timeout not set below 240s: If the client timeout exceeds the 240s server limit, the client hangs on a stalled connection instead of failing predictably. Fix: Set the client timeout to 230s so the client fails first and can fall back to the ESS async pattern. [src2]
- Ignoring base64 memory overhead: A 100MB report becomes ~133MB in the SOAP response, and the client briefly holds both copies during decode. Fix: Budget 2x the expected output size; use streaming for reports >20MB. [src4]
- Running BIP extractions during business hours: BIP shares the BI thread pool with interactive users. Fix: Schedule all extraction jobs outside peak hours. [src2]
- Not validating output completeness: BIP can return partial data when memory guards trigger, with no error. Fix: Include a COUNT(*) summary row; validate every extraction. [src2]
- Hardcoding report paths for seeded Oracle reports: Oracle can change paths between releases. Fix: Use the BI Catalog SOAP service to discover current paths; store paths in config. [src1]
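The hardcoded-path pitfall suggests keeping report paths in configuration. A minimal sketch (the config file name and keys are hypothetical):

```python
import json
from pathlib import Path

# Defaults ship with the integration; overrides live in config
DEFAULT_PATHS = {
    "ap_invoice_listing": "/Custom/Financials/AP_Invoice_Listing.xdo",
}

def load_report_paths(config_file: str = "bip_reports.json") -> dict:
    """Overlay configured paths on defaults so a patched seeded path
    needs only a config change, not a code release."""
    paths = dict(DEFAULT_PATHS)
    cfg = Path(config_file)
    if cfg.exists():
        paths.update(json.loads(cfg.read_text()))
    return paths
```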
Diagnostic Commands
# Test BIP SOAP connectivity (expect HTTP 200 with WSDL)
curl -s -o /dev/null -w "%{http_code}" \
"https://fa-xxxx.fa.ocs.oraclecloud.com/xmlpserver/services/ExternalReportWSSService?wsdl"
# Test REST API connectivity (expect HTTP 401, confirms endpoint exists)
curl -s -o /dev/null -w "%{http_code}" \
"https://fa-xxxx.fa.ocs.oraclecloud.com/xmlpserver/services/rest/v1/reports"
# Discover report parameters via SOAP getReportParameters
# Returns parameter names, types, default values, and LOV options
# Check ESS job status via REST (Fusion Cloud 25A+)
curl -u "INTEGRATION_USER:PASSWORD" \
"https://fa-xxxx.fa.ocs.oraclecloud.com/hcmRestApi/resources/11.13.18.05/ess-job-details?q=RequestId=12345"
# Monitor BI server health (admin role required)
curl -u "ADMIN_USER:PASSWORD" \
"https://fa-xxxx.fa.ocs.oraclecloud.com/xmlpserver/services/rest/v1/server/health"
Version History & Compatibility
| Release | Date | Status | Breaking Changes | Migration Notes |
|---|---|---|---|---|
| 25B | 2026-02 | Current | None | -- |
| 25A | 2025-11 | Supported | None | ESS REST API enhanced |
| 24D | 2025-08 | Supported | None | -- |
| 24C | 2025-05 | Supported | Rebranded to Oracle Analytics Publisher | UI changed; API backward-compatible |
| 24B | 2025-02 | Supported | None | -- |
| 24A | 2024-11 | EOL | None | REST v1 enhanced |
When to Use / When Not to Use
| Use When | Don't Use When | Use Instead |
|---|---|---|
| Data is only available via BIP reports (no BICC VO coverage) | Data is available via BICC view objects | BICC bulk extraction |
| Need pre-formatted output (PDF, Excel) for compliance/audit | Need raw transactional data for warehouse loading | BICC or REST API |
| Dataset is <50K rows per extraction run | Dataset exceeds 100K rows | BICC (no row limits) |
| Existing BIP reports already model the required data | Need real-time individual record operations | REST API |
| One-time or low-frequency extractions (daily or less) | Need near-real-time data feeds (<5 min latency) | Business Events or REST API |
| Need complex report logic (calculated fields, cross-module joins) | Simple single-table extraction | REST API or BICC |
| Integration team has BIP/OTBI expertise | Team has no Oracle reporting expertise | REST API (simpler learning curve) |
Cross-System Comparison
| Capability | BI Publisher (BIP) | BICC | REST API | FBDI |
|---|---|---|---|---|
| Direction | Outbound only | Outbound only | Inbound + outbound | Inbound only |
| Volume sweet spot | 1K-50K rows | 50K-10M+ rows | 1-1K records | 1K-500K rows |
| Timeout risk | High (240s sync) | None (async) | Low | None (async) |
| CDC/Incremental | Manual (date params) | Built-in | Manual (filters) | N/A |
| Output formats | CSV, XML, Excel, PDF, HTML | CSV | JSON | CSV (template-based) |
| Scheduling | ESS jobs / ScheduleService | Built-in | External | ESS jobs |
| Auth methods | WS-Security, Basic | OAuth 2.0 | OAuth 2.0, Basic | N/A (Fusion internal) |
| Thread pool impact | Shared with reporting | Dedicated | Dedicated | ESS thread pool |
| Data coverage | Broadest (any data model) | Limited to published VOs | REST-enabled objects | Import templates only |
| Oracle support status | Unsupported for extraction | Fully supported | Fully supported | Fully supported |
| Third-party connectors | Fivetran, Celonis, OCI DI | OCI DI, most iPaaS | Most iPaaS | Limited |
Important Caveats
- Oracle's A-Team explicitly states that using BI Publisher for data extraction is an unsupported pattern — Oracle Support may decline to assist if extraction causes performance issues
- Memory guards and timeout thresholds are pod-level configurations managed by Oracle — customers cannot adjust them in Fusion Cloud SaaS
- The 24C rebranding from "BI Publisher" to "Oracle Analytics Publisher" changed UI labels but not API endpoints — existing integrations continue to work
- Third-party connectors (Fivetran, Celonis) that use BIP as a data source are subject to the same memory and timeout limitations
- BIP report performance varies significantly based on data model SQL complexity and pod load — always performance-test with production-representative data volumes