Oracle BICC Deep Dive — PVO Mapping, Extraction Internals & UCM Staging
TL;DR
- Bottom line: Oracle BICC extracts data through a three-layer architecture: Offerings (functional areas) contain Data Stores, which reference Public View Objects (PVOs). Each PVO maps to denormalized database views built for bulk extraction. Configure data stores by selecting specific PVO attributes, set extraction type (full or incremental), choose a storage target (UCM or OCI Object Storage), and schedule via the BICC Console or SOAP API. [src1, src4]
- Key limit: PVOs are offering-scoped and immutable — you cannot reassign a PVO to a different offering, and OTBI reporting PVOs cause severe performance degradation if used for bulk extraction. [src1, src4]
- Watch out for: Custom fields require BI-enablement AND BI-publishing before they appear in BICC PVOs. Missing this step is the #1 reason teams report "missing columns" in their extracts. [src6]
- Best for: Understanding the internal mechanics of BICC — PVO-to-table lineage, data store configuration, incremental extraction logic, UCM file layout, and REST API automation — to build production-grade extraction pipelines. [src1, src2]
- Authentication: ESS Administrator + BIA_ADMINISTRATOR_DUTY + OBIA_EXTRACTTRANSFORMLOAD_RWD roles required. REST metadata API uses the same Fusion authentication (Basic Auth or JWT). [src1, src2]
System Profile
Oracle BICC is a native component of Oracle Fusion Cloud Applications that implements a structured extraction pipeline: Offerings map to functional modules (Financials, HCM, SCM, Procurement, CX), each containing curated Data Stores that reference one or more Public View Objects (PVOs). PVOs are pre-built, denormalized views over the Fusion transactional schema, designed specifically for efficient bulk data extraction. This card covers the internal architecture and configuration mechanics. For rate limits and scheduling constraints, see the companion BICC Data Extraction card.
| Property | Value |
|---|---|
| Vendor | Oracle |
| System | Oracle Fusion Cloud Applications (Release 25C) |
| API Surface | BICC Console (UI) + SOAP scheduling API + REST metadata API |
| Current Version | 25C (Update 25.06) |
| Editions Covered | All Oracle Fusion Cloud editions (ERP, HCM, SCM, CX, Procurement) |
| Deployment | Cloud |
| API Docs | BICC Documentation |
| Status | GA — actively maintained |
API Surfaces & Capabilities
BICC exposes three distinct interfaces: the Console UI for interactive configuration, a SOAP API for programmatic scheduling and triggering, and a REST API for metadata and PVO management.
| API Surface | Protocol | Best For | Endpoint Pattern | Auth Required | Notes |
|---|---|---|---|---|---|
| BICC Console (UI) | Web UI | Interactive PVO selection, data store config | Navigator > Tools > BI Cloud Connector | Fusion SSO | Primary configuration interface |
| BICC SOAP API | SOAP/XML | Programmatic extract triggering, scheduling | /biacm/ws/BIACMService | Basic Auth / JWT | Submit, cancel, monitor jobs |
| BICC REST Metadata API | HTTPS/JSON | PVO attribute discovery, filter management | /biacm/rest/meta/offerings | Basic Auth / JWT | Read and update PVO metadata |
| UCM WebDAV | WebDAV/HTTPS | Download extracted files from UCM staging | /cs/idcplg | Basic Auth | File retrieval only |
| OCI Object Storage API | HTTPS/JSON | Download extracted files from OCI storage | Standard OCI API | OCI IAM / API Key | S3-compatible access available |
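The UCM WebDAV surface retrieves staged files through the Content Server's idcplg endpoint. A minimal sketch of building the download URL, assuming the standard UCM `GET_FILE` service and `dDocName` parameter conventions (the host and document name below are placeholders):

```python
# Sketch: build a UCM download URL for one staged extract file.
# GET_FILE / dDocName follow standard Oracle Content Server conventions;
# the instance host and document name are placeholders.
from urllib.parse import urlencode

def build_ucm_download_url(fusion_url: str, doc_name: str) -> str:
    """Build a GET_FILE request URL against the /cs/idcplg endpoint."""
    params = {
        "IdcService": "GET_FILE",
        "dDocName": doc_name,
        "RevisionSelectionMethod": "LatestReleased",
    }
    return f"{fusion_url}/cs/idcplg?{urlencode(params)}"

url = build_ucm_download_url(
    "https://your-instance.fa.us2.oraclecloud.com", "MANIFEST_DATA_40001")
```

Fetch the resulting URL with Basic Auth (for example, `requests.get(url, auth=(user, pw), stream=True)`); remember that UCM download also requires the OBIA_EXTRACTTRANSFORMLOAD_RWD role.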
Rate Limits & Quotas
REST Metadata API Limits
| Limit Type | Value | Applies To | Notes |
|---|---|---|---|
| Offerings list retrieval | No explicit limit | GET /biacm/rest/meta/offerings | Returns all offerings in one call |
| Data store metadata read | No explicit limit | GET /biacm/rest/meta/datastores/{id} | Full PVO attribute list per request |
| Data store metadata update | No explicit limit | PUT /biacm/rest/meta/datastores/ | Updates filters and column selections |
| Concurrent API sessions | Subject to Fusion connection pool | All REST/SOAP endpoints | Shared with other Fusion API consumers |
PVO and Data Store Limits
| Limit Type | Value | Window | Notes |
|---|---|---|---|
| PVOs per offering | Varies (50-500+ per offering) | Static | Seeded by Oracle per functional module |
| Total PVOs available | 5,000+ across all offerings | Static | Grows with each quarterly release |
| Custom PVO attributes | Requires BI-enablement + BI-publish | Per custom field | Not automatic — manual admin action required |
| Filter expressions per data store | Simple WHERE predicates only | Per data store | No joins, subqueries, or complex SQL |
| Concurrent extract jobs | 1 per offering | Per offering | Multiple offerings can run concurrently |
Authentication
| Flow | Use When | Token Lifetime | Refresh? | Notes |
|---|---|---|---|---|
| Oracle Fusion SSO | BICC Console interactive access | Session-based | Yes | Standard Fusion login |
| Basic Auth | REST/SOAP API in dev/test | Session-based | No | Blocked if SSO enforced; not for production |
| JWT Token Authentication | Production REST/SOAP API automation | Configurable | New token per request | Requires X.509 certificate; recommended |
Authentication Gotchas
- Three roles must be combined: ESS Administrator (scheduling), BIA_ADMINISTRATOR_DUTY (BICC operations and REST metadata API), and OBIA_EXTRACTTRANSFORMLOAD_RWD (UCM file download). Assigning only BIA_ADMINISTRATOR_DUTY grants metadata API access but not file retrieval or job scheduling. [src1]
- The REST metadata API does not have a read-only role — anyone with BIA_ADMINISTRATOR_DUTY can both read AND modify PVO configurations. Implement change management controls externally. [src2]
- For OCI Object Storage targets, a separate OCI IAM policy is required on the BICC service account's API key — independent of Fusion role assignments. [src3]
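For production automation with JWT, the token itself is issued against the X.509 certificate registered in Fusion, which is environment-specific. A minimal sketch of the client side only, attaching a pre-generated token to a reusable session (the token value is a placeholder):

```python
# Sketch: a requests session for the BICC REST metadata API using a
# pre-generated JWT. Token issuance (signing against the registered
# X.509 certificate) is environment-specific and not shown here.
import requests  # requests>=2.31.0

def bicc_session(jwt_token: str) -> requests.Session:
    """Return a session that sends the JWT as a Bearer credential."""
    session = requests.Session()
    session.headers.update({
        "Authorization": f"Bearer {jwt_token}",
        "Accept": "application/json",
    })
    return session

# Usage (placeholder token):
# s = bicc_session(token)
# s.get("https://your-instance.fa.us2.oraclecloud.com/biacm/rest/meta/offerings")
```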
Constraints
- PVO offering lock-in: Each PVO belongs to exactly one offering. You cannot extract a Finance PVO through the HCM offering.
- ExtractPVO requirement: Only PVOs following the *ExtractPVO or *BICVO naming patterns are designed for BICC bulk extraction. OTBI reporting PVOs cause locks, timeouts, and performance degradation.
- Custom field BI-enablement: Custom fields require explicit BI-enablement in Application Composer AND running the BI-publish ESS job before appearing in BICC PVOs.
- Filter expression limitations: Data store filters support only simple WHERE clause predicates. No joins, subqueries, or aggregate functions.
- No PVO customization: You cannot create custom PVOs or modify seeded PVO SQL definitions. Only attribute selection/deselection is supported.
- Incremental extraction dependency: Requires LastUpdateDate column selected in PVO attribute list. PVOs without this column only support full extraction.
- BICC log removal (25C): As of Release 25C, Oracle removed BICC extraction logs from the UI, reducing diagnostic visibility.
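Several of these constraints can be enforced mechanically before a data store is enabled. A small sketch of the ExtractPVO/BICVO naming check (the PVO names below come from this card's own examples):

```python
# Sketch: reject OTBI reporting PVOs before they reach bulk extraction
# by enforcing the *ExtractPVO / *BICVO naming rule.
import re

EXTRACT_PVO_PATTERN = re.compile(r"(ExtractPVO|BICVO)$")

def is_extract_pvo(pvo_name: str) -> bool:
    """True only for PVOs following the bulk-extraction naming patterns."""
    return bool(EXTRACT_PVO_PATTERN.search(pvo_name))

assert is_extract_pvo("FscmTopModelAM.FinExtractAM.ApInvoicesExtractPVO")
assert not is_extract_pvo("FscmTopModelAM.FinancialAnalysisAM.TransactionHeaderPVO")
```

Running this check in a CI gate over your data store inventory catches the wrong-PVO-type anti-pattern described later in this card.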
Integration Pattern Decision Tree
START — Need to understand or configure BICC extraction internals
|
|-- What do you need to configure?
| |
| |-- PVO selection and attribute mapping
| | |-- 1. Navigate to BICC Console > Manage Extracts > Select Offerings
| | |-- 2. Choose offering matching your functional area
| | |-- 3. Select ONLY ExtractPVOs (pattern: *ExtractPVO, *BICVO)
| | |-- 4. Choose attributes — MUST include PK + LastUpdateDate
| |
| |-- Data store filters (subset extraction)
| | |-- Option A: BICC Console > Data Store > Edit Data Store Details
| | |-- Option B: REST API > PUT /biacm/rest/meta/datastores/
| | |-- Limitation: Simple WHERE predicates only
| |
| |-- Storage target
| | |-- UCM (default, built-in, shared storage)
| | |-- OCI Object Storage (recommended for new deployments)
| |
| |-- Incremental vs Full extraction
| | |-- Full: ALL records — required for initial baseline
| | |-- Incremental: Changed records only — requires LastUpdateDate
| | |-- NOTE: Incremental does NOT capture hard deletes
| |
| |-- Automation via REST API
| |-- GET /biacm/rest/meta/offerings
| |-- GET /biacm/rest/meta/datastores/{id}
| |-- PUT /biacm/rest/meta/datastores/
Quick Reference
| Component | Description | Configuration Path | Key Detail |
|---|---|---|---|
| Offering | Functional module grouping | BICC Console > Select Offerings | Each offering has its own PVO set |
| Data Store | Wrapper around a single PVO | Offering > Data Stores list | Configure attributes, filters here |
| PVO | Denormalized view over transactional tables | Data Store > View Object details | Use ExtractPVOs only |
| ExtractPVO | PVO optimized for bulk extraction | Pattern: *ExtractPVO, *BICVO | Reads from denormalized views |
| PVO Attributes | Selectable columns within a PVO | Data Store > Edit > Select Columns | Must include PK + LastUpdateDate |
| Data Store Filter | WHERE clause predicate | Edit Data Store Details > Query Filter | Simple predicates only |
| Manifest File | JSON metadata per extract run | Root of extract output directory | VO name, row count, file list |
| UCM Staging | Default file storage target | No setup required | Shared with all Fusion consumers |
| OCI Object Storage | Recommended file storage target | Configure External Storage dialog | Requires OCI tenancy, bucket, API key |
| Full Extract | All records from PVO | Extract creation dialog | Required for initial baseline |
| Incremental Extract | Changed records since last extract | Extract creation dialog | Requires LastUpdateDate |
| Prune Time | Lookback hours for incremental | Per offering schedule settings | Set >= extract duration |
| BI-Enablement | Making custom fields visible in PVOs | Application Composer | Must also run BI-publish ESS job |
| PVO Lineage | PVO-to-table column mapping | Extract output lineage file | Maps attributes to source tables |
| REST API Base | Metadata API for PVO management | /biacm/rest/meta/ | Requires BIA_ADMINISTRATOR_DUTY |
Step-by-Step Integration Guide
1. Discover available offerings and PVOs via REST API
Query the BICC metadata API to programmatically list available offerings and their PVOs. [src2, src5]
# List all offerings
curl -s -u "bicc_admin:password" \
"https://your-instance.fa.us2.oraclecloud.com/biacm/rest/meta/offerings" \
-H "Accept: application/json" | python3 -m json.tool
# Get data store details for a specific offering
curl -s -u "bicc_admin:password" \
"https://your-instance.fa.us2.oraclecloud.com/biacm/rest/meta/datastores/FinancialAnalytics" \
-H "Accept: application/json" | python3 -m json.tool
Verify: Response contains a JSON array of offerings with data stores, PVO names, and attribute lists.
2. Select PVO attributes and configure data store
Choose the right PVO (ExtractPVO pattern) and select only needed attributes. [src4, src5]
1. BICC Console > Manage Extracts > Select Offerings
2. Choose offering (e.g., "Financial Analytics")
3. For each data store:
a. Verify PVO name follows ExtractPVO or BICVO pattern
b. Select: Primary key columns + LastUpdateDate + business columns
c. DESELECT unnecessary columns
4. Enable for Extract > Save
Verify: Data store shows "Enabled" status with selected attribute count.
3. Apply data store filters for targeted extraction
Restrict extraction to specific business units, ledgers, or date ranges. [src2]
# Update filter via REST API
curl -s -u "bicc_admin:password" \
-X PUT \
"https://your-instance.fa.us2.oraclecloud.com/biacm/rest/meta/datastores/" \
-H "Content-Type: application/json" \
-d '{"datastoreId":"GL_BALANCES_EXTRACT","filter":"LEDGER_ID = 300000001234"}'
Verify: GET the data store and confirm filter field reflects new predicate.
4. Configure external storage target
Set up OCI Object Storage as the extraction target. [src3]
1. BICC Console > Manage Extracts > Configure External Storage
2. Select "Oracle Cloud Infrastructure Object Storage" tab
3. Enter: Tenancy OCID, User OCID, Fingerprint, Private Key, Region, Namespace, Bucket
4. Test Connection > Save
Verify: Status shows "Connected". Test extract confirms files in OCI bucket.
5. Run initial full extraction
Execute full extraction to establish baseline for incremental. [src1]
1. BICC Console > Manage Extracts > Create Extract
2. Select offering, extract type: Full, storage target: OCI Object Storage
3. Set timeout: 20 hours > Submit
4. Monitor via BICC Console > Monitor Extracts
Verify: Job status "Completed". Validate manifest row counts against OTBI report.
6. Configure incremental extraction with prune time
Set up recurring incremental extracts. [src1]
1. BICC Console > Manage Extracts > Schedule
2. Extract type: Incremental
3. Prune time: 4 hours (>= typical extract duration)
4. Recurrence: Daily at 02:00
5. NOTE: Incremental does NOT capture hard deletes
Verify: First incremental run contains only changed records.
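The prune-time mechanics above can be illustrated with a short sketch. BICC applies the lookback internally; the helper only shows why a prune time shorter than the extract duration loses rows (the timestamps are illustrative):

```python
# Sketch: how the prune-time lookback widens an incremental window.
# BICC applies this internally; the helper illustrates why prune time
# must cover the extract duration to avoid missed rows.
from datetime import datetime, timedelta, timezone

def incremental_window_start(last_run_utc: datetime, prune_hours: int) -> datetime:
    """Start of the next incremental window: last run minus prune lookback."""
    return last_run_utc - timedelta(hours=prune_hours)

last_run = datetime(2025, 6, 1, 2, 0, tzinfo=timezone.utc)
start = incremental_window_start(last_run, prune_hours=4)
# With a 4-hour prune time, rows updated from 2025-05-31T22:00Z onward
# are re-scanned, covering updates committed while the last run was active.
```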
7. Retrieve PVO lineage mapping
Download PVO-to-database-table lineage for schema design. [src7]
# List lineage files in OCI bucket
oci os object list --bucket-name bicc-extracts \
  --prefix "data/" --output json | \
python3 -c "
import json, sys
data = json.load(sys.stdin)
for obj in data['data']:
    if 'lineage' in obj['name'].lower():
        print(obj['name'])
"
Verify: Lineage file maps each PVO attribute to its source database table and column.
Code Examples
Python: Discover and inventory all BICC PVOs via REST API
# Input: Oracle Fusion URL, credentials
# Output: Complete inventory of offerings, data stores, and PVO attributes
import requests # requests>=2.31.0
import json
def inventory_bicc_pvos(fusion_url, username, password):
    """Build a complete inventory of all BICC offerings and PVOs."""
    session = requests.Session()
    session.auth = (username, password)
    session.headers.update({"Accept": "application/json"})
    offerings_resp = session.get(f"{fusion_url}/biacm/rest/meta/offerings")
    offerings_resp.raise_for_status()
    offerings = offerings_resp.json()
    inventory = []
    for offering in offerings:
        offering_name = offering.get("name", "unknown")
        ds_resp = session.get(
            f"{fusion_url}/biacm/rest/meta/datastores/{offering_name}"
        )
        if ds_resp.status_code != 200:
            continue
        datastores = ds_resp.json()
        # Normalize: the endpoint may return a single object or a list
        for ds in (datastores if isinstance(datastores, list) else [datastores]):
            attributes = ds.get("attributes", [])
            has_lud = any("LastUpdateDate" in a.get("name", "") for a in attributes)
            inventory.append({
                "offering": offering_name,
                "datastore_id": ds.get("datastoreId", "unknown"),
                "attribute_count": len(attributes),
                "supports_incremental": has_lud,
                "filter": ds.get("filter", None),
            })
    return inventory
Python: Update BICC data store filter via REST API
# Input: Fusion URL, credentials, data store ID, filter expression
# Output: Updated data store configuration
import requests # requests>=2.31.0
def update_bicc_filter(fusion_url, username, password, datastore_id, filter_expr):
    """Update a BICC data store filter via REST API."""
    session = requests.Session()
    session.auth = (username, password)
    put_resp = session.put(
        f"{fusion_url}/biacm/rest/meta/datastores/",
        json={"datastoreId": datastore_id, "filter": filter_expr},
        headers={"Content-Type": "application/json"},
    )
    put_resp.raise_for_status()
    return put_resp.json()
cURL: Common BICC REST API operations
# List all offerings
curl -s -u "admin:password" \
"https://instance.fa.us2.oraclecloud.com/biacm/rest/meta/offerings" \
-H "Accept: application/json"
# Get PVO attributes for a data store
curl -s -u "admin:password" \
"https://instance.fa.us2.oraclecloud.com/biacm/rest/meta/datastores/AP_INVOICES_EXTRACT" \
-H "Accept: application/json"
# Update data store filter
curl -s -u "admin:password" -X PUT \
"https://instance.fa.us2.oraclecloud.com/biacm/rest/meta/datastores/" \
-H "Content-Type: application/json" \
-d '{"datastoreId":"AP_INVOICES_EXTRACT","filter":"ORG_ID = 300000001"}'
Data Mapping
BICC Architecture: Offering-to-PVO Hierarchy
| Layer | Name | Description | Example | Cardinality |
|---|---|---|---|---|
| Offering | Functional module grouping | Top-level organizational unit | Financial Analytics, HCM Analytics | 1 offering : N data stores |
| Data Store | PVO wrapper within an offering | Configurable extraction unit | GlBalancesExtract | 1 data store : 1 PVO |
| PVO | Denormalized database view | Read-only view over transactional tables | GlBalancesExtractPVO | 1 PVO : N attributes |
| PVO Attribute | Individual column in the PVO | Selectable for extraction | LEDGER_ID, AMOUNT | Mapped to 1+ source columns |
| Source Table | Fusion transactional table | Underlying data source | GL_BALANCES, AP_INVOICES_ALL | PVO joins multiple tables |
PVO Naming Convention
| Pattern | Type | Use For | Performance | Example |
|---|---|---|---|---|
| *ExtractPVO | Extract PVO | BICC bulk extraction | Optimized (denormalized) | GlBalancesExtractPVO |
| *BICVO | BIC View Object | BICC bulk extraction | Optimized (denormalized) | ApInvoicesBICVO |
| *PVO (no Extract/BIC) | General PVO | OTBI reporting (NOT for BICC) | Poor for bulk | GlBalancesPVO |
| *VO | View Object | OTBI reporting (NOT for BICC) | Poor for bulk | TransactionHeaderVO |
Data Type Gotchas
- PVO attribute names use UPPER_CASE_UNDERSCORE convention in CSV output, which may differ from the camelCase shown in the BICC Console. Use CSV header row as authoritative. [src5]
- LastUpdateDate in BICC output is always UTC, regardless of Fusion user timezone preference. [src1]
- Custom flexfield attributes appear with generated names (e.g., _FLEX_VALUE1). Cross-reference lineage file for functional names. [src6]
- NULL values are empty fields in CSV — indistinguishable from actual empty strings without PVO metadata. [src5]
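A short sketch of reading an extract CSV with these gotchas in mind. The column names and sample row are hypothetical; the points illustrated are the UTC timestamp handling and the NULL-vs-empty ambiguity:

```python
# Sketch: reading a BICC extract CSV with the gotchas above in mind.
# Column names and the sample row are hypothetical.
import csv
import io
from datetime import datetime, timezone

sample = "LEDGER_ID,AMOUNT,LAST_UPDATE_DATE\n300000001234,,2025-06-01T02:15:00\n"

for row in csv.DictReader(io.StringIO(sample)):
    # Empty field: could be NULL or an actual empty string. The CSV alone
    # cannot distinguish; consult PVO metadata if the difference matters.
    amount = row["AMOUNT"] or None
    # LastUpdateDate is always UTC, regardless of user timezone preference.
    lud = datetime.fromisoformat(row["LAST_UPDATE_DATE"]).replace(tzinfo=timezone.utc)
```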
Error Handling & Failure Points
Common Error Conditions
| Condition | Symptom | Cause | Resolution |
|---|---|---|---|
| Missing columns in extract | Expected PVO attribute absent | Custom field not BI-enabled or BI-publish job not run | BI-enable in App Composer, run BI-publish ESS job |
| Incremental returning full dataset | Unexpectedly large output | LastUpdateDate not selected, or baseline reset | Verify LastUpdateDate in PVO selection; check baseline |
| REST API 401 Unauthorized | Metadata API calls rejected | Missing BIA_ADMINISTRATOR_DUTY or expired JWT | Verify role; regenerate JWT |
| Filter expression error | Extract fails at submission | Invalid filter syntax | Simplify to column = value predicates |
| OTBI PVO degradation | Extract runs 10x slower | Selected OTBI PVO instead of ExtractPVO | Switch to ExtractPVO variant |
| OCI write failure | Files not in bucket | Expired API key or IAM policy missing | Rotate key; verify IAM policy |
| PVO not visible | Expected PVO missing from list | PVO belongs to different offering | Check other offerings; verify release notes |
Failure Points in Production
- Custom field invisibility: Requires two steps — BI-enablement in Application Composer AND running the BI-publish ESS job. Fix: create a deployment checklist: BI-enable, run ESS job, verify in BICC Console. [src6]
- Incremental baseline corruption: Resetting an offering's last extract date destroys the baseline; the next "incremental" extracts everything. Fix: monitor row counts; alert if an incremental exceeds 2x normal; re-run a full extract. [src1]
- Wrong PVO type causing slowdown: OTBI PVOs lock transactional tables during extraction. Fix: audit PVO names — ExtractPVO or BICVO patterns only. [src1, src4]
- UCM storage exhaustion cascade: Shared storage fills up, breaking all file operations. Fix: automated cleanup, or migrate to OCI Object Storage. [src3]
- REST API filter overwrite during promotion: Environment promotion may overwrite filters. Fix: store filters externally; re-apply via REST API post-promotion. [src2]
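The row-count guard suggested above for baseline corruption is simple to implement. A sketch, assuming you track a rolling average of recent incremental row counts (how you obtain the counts, e.g. from manifest files, is deployment-specific):

```python
# Sketch: flag a suspiciously large incremental run -- the typical
# symptom of a reset baseline. The 2x threshold mirrors the guidance
# above; the row counts are assumed to come from manifest files.
def incremental_looks_corrupted(row_count: int, baseline_avg: float,
                                threshold: float = 2.0) -> bool:
    """True when a run's row count exceeds `threshold` x the rolling average."""
    if baseline_avg <= 0:
        return False  # no history yet; nothing to compare against
    return row_count > threshold * baseline_avg

assert not incremental_looks_corrupted(12_000, baseline_avg=10_000)
assert incremental_looks_corrupted(250_000, baseline_avg=10_000)
```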
Anti-Patterns
Wrong: Using OTBI reporting PVOs for BICC bulk extraction
-- BAD: TransactionHeaderPVO joins transactional tables, acquires row locks
Selected: FscmTopModelAM.FinancialAnalysisAM.TransactionHeaderPVO
Result: 10x slower extraction, Fusion UI sluggish for all users
Correct: Use designated ExtractPVOs
-- GOOD: ExtractPVOs read from pre-joined denormalized views
Selected: FscmTopModelAM.FinExtractAM.ApInvoicesExtractPVO
Result: Fast extraction, minimal Fusion impact
Naming rule: *ExtractPVO or *BICVO pattern
Wrong: Expecting custom fields automatically in BICC
-- BAD: Add field in Application Composer, immediately check BICC
Step 1: Add "Custom_Score__c" in Application Composer
Step 2: Look for it in BICC PVO
Result: Column not found — support ticket filed
Correct: BI-enable, BI-publish, then verify
-- GOOD: Follow the two-step BI-enablement process
Step 1: Application Composer > BI Enable custom field
Step 2: Run ESS Job: "Import Oracle Fusion Data Extensions for Transactional BI"
Step 3: Wait for ESS job completion (30-60 minutes)
Step 4: Verify field in BICC Console PVO attribute list
Wrong: Hardcoding filters in BICC Console per environment
-- BAD: Manual Console configuration — error-prone, not version-controlled
Dev: Filter = "ORG_ID = 100" (manual)
Test: Filter = "ORG_ID = 200" (copy-paste errors)
Prod: Filter = "ORG_ID = 300" (forgot to update)
Correct: Manage filters via REST API with external metadata
# GOOD: Version-controlled, auditable, repeatable
ENVIRONMENTS = {
"dev": {"org_id": 100},
"test": {"org_id": 200},
"prod": {"org_id": 300},
}
def apply_bicc_filters(env, fusion_url, username, password):
    config = ENVIRONMENTS[env]
    update_bicc_filter(fusion_url, username, password,
                       "AP_INVOICES_EXTRACT", f"ORG_ID = {config['org_id']}")
Common Pitfalls
- Not running BI-publish ESS job: BI-enablement is only step 1. The ESS job must run (30-60 min) to materialize fields. Fix: add the ESS job to the deployment checklist; verify in BICC Console. [src6]
- Prune time too narrow: If prune time < extraction duration, changed records are missed. Fix: set prune time >= 2x expected extraction duration. [src1]
- Ignoring PVO lineage: PVO attribute names can be misleading. Fix: download the lineage file; use it as the source of truth for target schema design. [src7]
- Using the Console for environment promotion: Configurations are not exportable. Fix: use REST API GET+PUT to export/import as JSON; store in version control. [src2]
- Not monitoring UCM utilization: Storage fills silently. Fix: weekly monitoring; automated cleanup; or migrate to OCI Object Storage. [src3]
Diagnostic Commands
# List all BICC offerings via REST API
curl -s -u "admin:password" \
"https://instance.fa.us2.oraclecloud.com/biacm/rest/meta/offerings" \
-H "Accept: application/json" | python3 -m json.tool
# Get PVO attributes for a specific data store
curl -s -u "admin:password" \
"https://instance.fa.us2.oraclecloud.com/biacm/rest/meta/datastores/GL_BALANCES_EXTRACT" \
-H "Accept: application/json" | python3 -c "
import json, sys
ds = json.load(sys.stdin)
attrs = ds.get('attributes', [])
print(f'Total attributes: {len(attrs)}')
for a in attrs:
    print(f'  {a.get(\"name\", \"?\")} [{a.get(\"type\", \"?\")}]')
"
# Check if LastUpdateDate is selected (for incremental)
curl -s -u "admin:password" \
"https://instance.fa.us2.oraclecloud.com/biacm/rest/meta/datastores/AP_INVOICES_EXTRACT" \
-H "Accept: application/json" | python3 -c "
import json, sys
ds = json.load(sys.stdin)
lud = [a for a in ds.get('attributes', []) if 'LastUpdate' in a.get('name', '')]
print('Incremental supported' if lud else 'WARNING: Only full extraction possible')
"
# List extracted files in OCI Object Storage
oci os object list --bucket-name bicc-extracts --prefix "data/" --output table
Version History & Compatibility
| Release | Date | Status | Key BICC Changes | Notes |
|---|---|---|---|---|
| 25C (25.06) | 2025-06 | Current | BICC logs removed from UI; OCI OS enhancements | Reduced diagnostics; use manifest files |
| 25B (25.03) | 2025-03 | Supported | OCI Object Storage recommended over UCM | Official guidance shift |
| 24D (24.12) | 2024-12 | Supported | Custom object extraction; enhanced lineage | ExtractPVOs for custom objects |
| 24C (24.09) | 2024-09 | Supported | REST metadata API enhancements | Improved PVO discovery and filters |
| 24B (24.06) | 2024-06 | Supported | Split file size configurability | 1-5GB configurable per VO |
When to Use / When Not to Use
| Use When | Don't Use When | Use Instead |
|---|---|---|
| Need to understand PVO architecture and attribute mapping | Need BICC rate limits and file size constraints | Oracle BICC Data Extraction |
| Automating BICC configuration across environments via REST API | Need real-time individual record operations | Oracle ERP Cloud REST API |
| Debugging missing custom fields in BICC extracts | Need to import data INTO Oracle Fusion | Oracle FBDI Import |
| Setting up incremental extraction with proper prune time | Need middleware orchestration around BICC | Oracle Integration Cloud |
| Choosing between UCM and OCI Object Storage | Need Oracle Fusion authentication setup | Oracle ERP Cloud Authentication |
| Understanding PVO lineage for target schema design | Need simple BICC quick-start guide | Oracle BICC Data Extraction |
Important Caveats
- The BICC REST metadata API has no read-only access role. Any user with BIA_ADMINISTRATOR_DUTY can modify PVO selections, filters, and data store configurations. Implement external change management controls.
- As of Release 25C, Oracle removed BICC extraction logs from the UI. Diagnostic capabilities are now limited to manifest files and file inspection.
- PVO coverage is not uniform across Oracle Fusion modules. Always check the BICC Console for available PVOs before committing to a BICC architecture.
- Incremental extraction does NOT capture hard deletes — only inserts and updates. Implement periodic full extract + diff for delete detection.
- Data store filter expressions are not validated at save time — invalid expressions fail only at runtime. Always test with a small extract first.
- BICC configuration is not included in standard Fusion environment migration tools. Use REST API for environment promotion.
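The full-extract-plus-diff approach to hard-delete detection reduces to a set difference over primary keys. A minimal sketch (the key values are illustrative; in practice the sets come from your warehouse and the latest full extract):

```python
# Sketch: hard-delete detection by diffing primary keys between the
# warehouse's current key set and a periodic full extract. Key values
# are illustrative placeholders.
def detect_hard_deletes(warehouse_keys: set, full_extract_keys: set) -> set:
    """Keys present in the warehouse but absent from the latest full
    extract were hard-deleted in Fusion."""
    return warehouse_keys - full_extract_keys

deleted = detect_hard_deletes({"INV-1", "INV-2", "INV-3"}, {"INV-1", "INV-3"})
# deleted == {"INV-2"}
```

For large PVOs, the same diff can be run per partition (for example, per ledger or business unit) to bound memory use.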