Oracle BICC Deep Dive — PVO Mapping, Extraction Internals & UCM Staging

Type: ERP Integration · System: Oracle Fusion Cloud Applications (Release 25C) · Confidence: 0.87 · Sources: 7 · Verified: 2026-03-09 · Freshness: 2026-03-09

TL;DR

System Profile

Oracle BICC is a native component of Oracle Fusion Cloud Applications that implements a structured extraction pipeline: Offerings map to functional modules (Financials, HCM, SCM, Procurement, CX), each containing curated Data Stores that reference one or more Public View Objects (PVOs). PVOs are pre-built, denormalized views over the Fusion transactional schema, designed specifically for efficient bulk data extraction. This card covers the internal architecture and configuration mechanics. For rate limits and scheduling constraints, see the companion BICC Data Extraction card.

| Property | Value |
| --- | --- |
| Vendor | Oracle |
| System | Oracle Fusion Cloud Applications (Release 25C) |
| API Surface | BICC Console (UI) + SOAP scheduling API + REST metadata API |
| Current Version | 25C (Update 25.06) |
| Editions Covered | All Oracle Fusion Cloud editions (ERP, HCM, SCM, CX, Procurement) |
| Deployment | Cloud |
| API Docs | BICC Documentation |
| Status | GA — actively maintained |

API Surfaces & Capabilities

BICC itself exposes three distinct interfaces: the Console UI for interactive configuration, a SOAP API for programmatic scheduling and job triggering, and a REST API for metadata and PVO management. Extracted files are retrieved separately, via UCM WebDAV or the OCI Object Storage API.

| API Surface | Protocol | Best For | Endpoint Pattern | Auth Required | Notes |
| --- | --- | --- | --- | --- | --- |
| BICC Console (UI) | Web UI | Interactive PVO selection, data store config | Navigator > Tools > BI Cloud Connector | Fusion SSO | Primary configuration interface |
| BICC SOAP API | SOAP/XML | Programmatic extract triggering, scheduling | /biacm/ws/BIACMService | Basic Auth / JWT | Submit, cancel, monitor jobs |
| BICC REST Metadata API | HTTPS/JSON | PVO attribute discovery, filter management | /biacm/rest/meta/offerings | Basic Auth / JWT | Read and update PVO metadata |
| UCM WebDAV | WebDAV/HTTPS | Download extracted files from UCM staging | /cs/idcplg | Basic Auth | File retrieval only |
| OCI Object Storage API | HTTPS/JSON | Download extracted files from OCI storage | Standard OCI API | OCI IAM / API Key | S3-compatible access available |

Rate Limits & Quotas

REST Metadata API Limits

| Limit Type | Value | Applies To | Notes |
| --- | --- | --- | --- |
| Offerings list retrieval | No explicit limit | GET /biacm/rest/meta/offerings | Returns all offerings in one call |
| Data store metadata read | No explicit limit | GET /biacm/rest/meta/datastores/{id} | Full PVO attribute list per request |
| Data store metadata update | No explicit limit | PUT /biacm/rest/meta/datastores/ | Updates filters and column selections |
| Concurrent API sessions | Subject to Fusion connection pool | All REST/SOAP endpoints | Shared with other Fusion API consumers |

PVO and Data Store Limits

| Limit Type | Value | Window | Notes |
| --- | --- | --- | --- |
| PVOs per offering | Varies (50-500+ per offering) | Static | Seeded by Oracle per functional module |
| Total PVOs available | 5,000+ across all offerings | Static | Grows with each quarterly release |
| Custom PVO attributes | Requires BI-enablement + BI-publish | Per custom field | Not automatic — manual admin action required |
| Filter expressions per data store | Simple WHERE predicates only | Per data store | No joins, subqueries, or complex SQL |
| Concurrent extract jobs | 1 per offering | Per offering | Multiple offerings can run concurrently |

Authentication

| Flow | Use When | Token Lifetime | Refresh? | Notes |
| --- | --- | --- | --- | --- |
| Oracle Fusion SSO | BICC Console interactive access | Session-based | Yes | Standard Fusion login |
| Basic Auth | REST/SOAP API in dev/test | Session-based | No | Blocked if SSO enforced; not for production |
| JWT Token Authentication | Production REST/SOAP API automation | Configurable | New token per request | Requires X.509 certificate; recommended |
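
For production automation, the JWT flow can be sketched as below. This is a minimal sketch, not Oracle's reference implementation: the claim names shown (`iss`, `prn`) follow Oracle's JWT user-token convention, but verify them against your tenant's trust configuration, and the signing step assumes PyJWT plus an X.509 key pair already registered with Fusion.

```python
import time

def build_fusion_jwt_claims(issuer, username, lifetime_seconds=300):
    """Return the claim set for a Fusion JWT user token (stdlib only)."""
    now = int(time.time())
    return {
        "iss": issuer,        # trusted issuer registered in Fusion
        "prn": username,      # principal: the Fusion user to act as
        "iat": now,
        "exp": now + lifetime_seconds,
    }

def sign_fusion_jwt(claims, private_key_pem, cert_b64):
    """Sign the claims with the X.509 private key (requires PyJWT>=2.0)."""
    import jwt  # imported lazily so the claims builder stays stdlib-only
    return jwt.encode(
        claims, private_key_pem, algorithm="RS256",
        headers={"x5c": [cert_b64]},  # embed the certificate chain
    )
```

A short token lifetime (minutes, not hours) limits replay exposure, which is why the table above notes a new token per request.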

Authentication Gotchas

Constraints

Integration Pattern Decision Tree

START — Need to understand or configure BICC extraction internals
|
|-- What do you need to configure?
|   |
|   |-- PVO selection and attribute mapping
|   |   |-- 1. Navigate to BICC Console > Manage Extracts > Select Offerings
|   |   |-- 2. Choose offering matching your functional area
|   |   |-- 3. Select ONLY ExtractPVOs (pattern: *ExtractPVO, *BICVO)
|   |   |-- 4. Choose attributes — MUST include PK + LastUpdateDate
|   |
|   |-- Data store filters (subset extraction)
|   |   |-- Option A: BICC Console > Data Store > Edit Data Store Details
|   |   |-- Option B: REST API > PUT /biacm/rest/meta/datastores/
|   |   |-- Limitation: Simple WHERE predicates only
|   |
|   |-- Storage target
|   |   |-- UCM (default, built-in, shared storage)
|   |   |-- OCI Object Storage (recommended for new deployments)
|   |
|   |-- Incremental vs Full extraction
|   |   |-- Full: ALL records — required for initial baseline
|   |   |-- Incremental: Changed records only — requires LastUpdateDate
|   |   |-- NOTE: Incremental does NOT capture hard deletes
|   |
|   |-- Automation via REST API
|       |-- GET /biacm/rest/meta/offerings
|       |-- GET /biacm/rest/meta/datastores/{id}
|       |-- PUT /biacm/rest/meta/datastores/
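
Because incremental extraction never reports hard deletes, a common compensating pattern is a periodic PK-only full extract diffed against the warehouse. The helper below is a hypothetical sketch of that diff, not a BICC feature:

```python
def detect_hard_deletes(source_pks, warehouse_pks):
    """Return primary keys present in the warehouse but gone from Fusion.

    source_pks:    PKs from a periodic PK-only full extract
    warehouse_pks: PKs currently loaded in the target warehouse
    """
    return set(warehouse_pks) - set(source_pks)
```

Rows in the returned set can then be soft-deleted or purged downstream.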

Quick Reference

| Component | Description | Configuration Path | Key Detail |
| --- | --- | --- | --- |
| Offering | Functional module grouping | BICC Console > Select Offerings | Each offering has its own PVO set |
| Data Store | Wrapper around a single PVO | Offering > Data Stores list | Configure attributes, filters here |
| PVO | Denormalized view over transactional tables | Data Store > View Object details | Use ExtractPVOs only |
| ExtractPVO | PVO optimized for bulk extraction | Pattern: *ExtractPVO, *BICVO | Reads from denormalized views |
| PVO Attributes | Selectable columns within a PVO | Data Store > Edit > Select Columns | Must include PK + LastUpdateDate |
| Data Store Filter | WHERE clause predicate | Edit Data Store Details > Query Filter | Simple predicates only |
| Manifest File | JSON metadata per extract run | Root of extract output directory | VO name, row count, file list |
| UCM Staging | Default file storage target | No setup required | Shared with all Fusion consumers |
| OCI Object Storage | Recommended file storage target | Configure External Storage dialog | Requires OCI tenancy, bucket, API key |
| Full Extract | All records from PVO | Extract creation dialog | Required for initial baseline |
| Incremental Extract | Changed records since last extract | Extract creation dialog | Requires LastUpdateDate |
| Prune Time | Lookback hours for incremental | Per offering schedule settings | Set >= extract duration |
| BI-Enablement | Making custom fields visible in PVOs | Application Composer | Must also run BI-publish ESS job |
| PVO Lineage | PVO-to-table column mapping | Extract output lineage file | Maps attributes to source tables |
| REST API Base | Metadata API for PVO management | /biacm/rest/meta/ | Requires BIA_ADMINISTRATOR_DUTY |

Step-by-Step Integration Guide

1. Discover available offerings and PVOs via REST API

Query the BICC metadata API to programmatically list available offerings and their PVOs. [src2, src5]

# List all offerings
curl -s -u "bicc_admin:password" \
  "https://your-instance.fa.us2.oraclecloud.com/biacm/rest/meta/offerings" \
  -H "Accept: application/json" | python3 -m json.tool

# Get data store details for a specific offering
curl -s -u "bicc_admin:password" \
  "https://your-instance.fa.us2.oraclecloud.com/biacm/rest/meta/datastores/FinancialAnalytics" \
  -H "Accept: application/json" | python3 -m json.tool

Verify: Response contains a JSON array of offerings with data stores, PVO names, and attribute lists.

2. Select PVO attributes and configure data store

Choose the right PVO (ExtractPVO pattern) and select only needed attributes. [src4, src5]

1. BICC Console > Manage Extracts > Select Offerings
2. Choose offering (e.g., "Financial Analytics")
3. For each data store:
   a. Verify PVO name follows ExtractPVO or BICVO pattern
   b. Select: Primary key columns + LastUpdateDate + business columns
   c. DESELECT unnecessary columns
4. Enable for Extract > Save

Verify: Data store shows "Enabled" status with selected attribute count.

3. Apply data store filters for targeted extraction

Restrict extraction to specific business units, ledgers, or date ranges. [src2]

# Update filter via REST API
curl -s -u "bicc_admin:password" \
  -X PUT \
  "https://your-instance.fa.us2.oraclecloud.com/biacm/rest/meta/datastores/" \
  -H "Content-Type: application/json" \
  -d '{"datastoreId":"GL_BALANCES_EXTRACT","filter":"LEDGER_ID = 300000001234"}'

Verify: GET the data store and confirm filter field reflects new predicate.

4. Configure external storage target

Set up OCI Object Storage as the extraction target. [src3]

1. BICC Console > Manage Extracts > Configure External Storage
2. Select "Oracle Cloud Infrastructure Object Storage" tab
3. Enter: Tenancy OCID, User OCID, Fingerprint, Private Key, Region, Namespace, Bucket
4. Test Connection > Save

Verify: Status shows "Connected". Test extract confirms files in OCI bucket.
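
The same credentials entered in the Configure External Storage dialog can also live in a standard `~/.oci/config` profile, which lets you verify bucket access with the OCI CLI before wiring up BICC. All OCIDs and values below are placeholders:

```
[DEFAULT]
user=ocid1.user.oc1..<user-ocid>
fingerprint=<api-key-fingerprint>
key_file=~/.oci/bicc_api_key.pem
tenancy=ocid1.tenancy.oc1..<tenancy-ocid>
region=<your-region>
```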

5. Run initial full extraction

Execute full extraction to establish baseline for incremental. [src1]

1. BICC Console > Manage Extracts > Create Extract
2. Select offering, extract type: Full, storage target: OCI Object Storage
3. Set timeout: 20 hours > Submit
4. Monitor via BICC Console > Monitor Extracts

Verify: Job status "Completed". Validate manifest row counts against OTBI report.

6. Configure incremental extraction with prune time

Set up recurring incremental extracts. [src1]

1. BICC Console > Manage Extracts > Schedule
2. Extract type: Incremental
3. Prune time: 4 hours (>= typical extract duration)
4. Recurrence: Daily at 02:00
5. NOTE: Incremental does NOT capture hard deletes

Verify: First incremental run contains only changed records.
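
Because prune time is a lookback window, consecutive incremental files deliberately overlap, so the same primary key can appear more than once across runs. A downstream load should therefore upsert by PK, keeping the latest LastUpdateDate. The sketch below assumes hypothetical field names (`InvoiceId`, ISO-formatted `LastUpdateDate` strings):

```python
def dedupe_incremental(records, pk_field="InvoiceId", lud_field="LastUpdateDate"):
    """Keep the newest version of each row across overlapping incremental files.

    Prune-time lookback makes consecutive extracts overlap, so the same PK
    can arrive twice; the copy with the greatest LastUpdateDate wins.
    """
    latest = {}
    for rec in records:
        pk = rec[pk_field]
        if pk not in latest or rec[lud_field] > latest[pk][lud_field]:
            latest[pk] = rec
    return list(latest.values())
```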

7. Retrieve PVO lineage mapping

Download PVO-to-database-table lineage for schema design. [src7]

# List lineage files in OCI bucket
oci os object list --bucket-name bicc-extracts \
  --prefix "data/" --output json | \
  python3 -c "
import json, sys
data = json.load(sys.stdin)
for obj in data['data']:
    if 'lineage' in obj['name'].lower():
        print(obj['name'])
"

Verify: Lineage file maps each PVO attribute to its source database table and column.

Code Examples

Python: Discover and inventory all BICC PVOs via REST API

# Input:  Oracle Fusion URL, credentials
# Output: Complete inventory of offerings, data stores, and PVO attributes

import requests  # requests>=2.31.0
import json

def inventory_bicc_pvos(fusion_url, username, password):
    """Build a complete inventory of all BICC offerings and PVOs."""
    session = requests.Session()
    session.auth = (username, password)
    session.headers.update({"Accept": "application/json"})

    offerings_resp = session.get(f"{fusion_url}/biacm/rest/meta/offerings")
    offerings_resp.raise_for_status()
    offerings = offerings_resp.json()

    inventory = []
    for offering in offerings:
        offering_name = offering.get("name", "unknown")
        ds_resp = session.get(
            f"{fusion_url}/biacm/rest/meta/datastores/{offering_name}"
        )
        if ds_resp.status_code != 200:
            continue

        datastores = ds_resp.json()
        for ds in (datastores if isinstance(datastores, list) else [datastores]):
            attributes = ds.get("attributes", [])
            has_lud = any("LastUpdateDate" in a.get("name", "") for a in attributes)
            inventory.append({
                "offering": offering_name,
                "datastore_id": ds.get("datastoreId", "unknown"),
                "attribute_count": len(attributes),
                "supports_incremental": has_lud,
                "filter": ds.get("filter", None),
            })
    return inventory
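
Once the inventory is built, it can be filtered locally without further API calls, for example to flag data stores that cannot run incrementally. This helper assumes the dictionary shape produced by `inventory_bicc_pvos` above:

```python
def incremental_gaps(inventory):
    """Return datastore IDs lacking LastUpdateDate (full-extract only)."""
    return [item["datastore_id"] for item in inventory
            if not item["supports_incremental"]]
```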

Python: Update BICC data store filter via REST API

# Input:  Fusion URL, credentials, data store ID, filter expression
# Output: Updated data store configuration

import requests  # requests>=2.31.0

def update_bicc_filter(fusion_url, username, password, datastore_id, filter_expr):
    """Update a BICC data store filter via REST API."""
    session = requests.Session()
    session.auth = (username, password)

    put_resp = session.put(
        f"{fusion_url}/biacm/rest/meta/datastores/",
        json={"datastoreId": datastore_id, "filter": filter_expr},
        headers={"Content-Type": "application/json"}
    )
    put_resp.raise_for_status()
    return put_resp.json()
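
Since BICC accepts only simple WHERE predicates, it is worth rejecting obviously invalid expressions client-side before the PUT. The regex below is a heuristic assumption, not Oracle's published grammar — it catches joins and subqueries but does not guarantee server-side acceptance:

```python
import re

_SIMPLE_PREDICATE = re.compile(
    r"^\s*\w+\s*(=|!=|<>|<=|>=|<|>|IN|LIKE)\s*.+$", re.IGNORECASE
)
_FORBIDDEN = re.compile(r"\b(SELECT|JOIN|UNION)\b", re.IGNORECASE)

def looks_like_simple_predicate(expr):
    """Heuristic pre-check that expr resembles `column <op> value`."""
    return bool(_SIMPLE_PREDICATE.match(expr)) and not _FORBIDDEN.search(expr)
```

Calling this before `update_bicc_filter` turns a server-side submission failure into an immediate local error.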

cURL: Common BICC REST API operations

# List all offerings
curl -s -u "admin:password" \
  "https://instance.fa.us2.oraclecloud.com/biacm/rest/meta/offerings" \
  -H "Accept: application/json"

# Get PVO attributes for a data store
curl -s -u "admin:password" \
  "https://instance.fa.us2.oraclecloud.com/biacm/rest/meta/datastores/AP_INVOICES_EXTRACT" \
  -H "Accept: application/json"

# Update data store filter
curl -s -u "admin:password" -X PUT \
  "https://instance.fa.us2.oraclecloud.com/biacm/rest/meta/datastores/" \
  -H "Content-Type: application/json" \
  -d '{"datastoreId":"AP_INVOICES_EXTRACT","filter":"ORG_ID = 300000001"}'

Data Mapping

BICC Architecture: Offering-to-PVO Hierarchy

| Layer | Name | Description | Example | Cardinality |
| --- | --- | --- | --- | --- |
| Offering | Functional module grouping | Top-level organizational unit | Financial Analytics, HCM Analytics | 1 offering : N data stores |
| Data Store | PVO wrapper within an offering | Configurable extraction unit | GlBalancesExtract | 1 data store : 1 PVO |
| PVO | Denormalized database view | Read-only view over transactional tables | GlBalancesExtractPVO | 1 PVO : N attributes |
| PVO Attribute | Individual column in the PVO | Selectable for extraction | LEDGER_ID, AMOUNT | Mapped to 1+ source columns |
| Source Table | Fusion transactional table | Underlying data source | GL_BALANCES, AP_INVOICES_ALL | PVO joins multiple tables |

PVO Naming Convention

| Pattern | Type | Use For | Performance | Example |
| --- | --- | --- | --- | --- |
| *ExtractPVO | Extract PVO | BICC bulk extraction | Optimized (denormalized) | GlBalancesExtractPVO |
| *BICVO | BIC View Object | BICC bulk extraction | Optimized (denormalized) | ApInvoicesBICVO |
| *PVO (no Extract/BIC) | General PVO | OTBI reporting (NOT for BICC) | Poor for bulk | GlBalancesPVO |
| *VO | View Object | OTBI reporting (NOT for BICC) | Poor for bulk | TransactionHeaderVO |
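
The naming convention above can be enforced mechanically before a data store is enabled. A minimal suffix check, assuming the two extraction patterns listed in the table:

```python
def is_extract_pvo(pvo_name):
    """True when the PVO name follows a BICC bulk-extraction pattern."""
    return pvo_name.endswith("ExtractPVO") or pvo_name.endswith("BICVO")
```

Running this over a selected PVO list catches accidental OTBI-reporting PVO selections before they degrade extraction.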

Data Type Gotchas

Error Handling & Failure Points

Common Error Conditions

| Condition | Symptom | Cause | Resolution |
| --- | --- | --- | --- |
| Missing columns in extract | Expected PVO attribute absent | Custom field not BI-enabled or BI-publish job not run | BI-enable in App Composer, run BI-publish ESS job |
| Incremental returning full dataset | Unexpectedly large output | LastUpdateDate not selected, or baseline reset | Verify LastUpdateDate in PVO selection; check baseline |
| REST API 401 Unauthorized | Metadata API calls rejected | Missing BIA_ADMINISTRATOR_DUTY or expired JWT | Verify role; regenerate JWT |
| Filter expression error | Extract fails at submission | Invalid filter syntax | Simplify to column = value predicates |
| OTBI PVO degradation | Extract runs 10x slower | Selected OTBI PVO instead of ExtractPVO | Switch to ExtractPVO variant |
| OCI write failure | Files not in bucket | Expired API key or IAM policy missing | Rotate key; verify IAM policy |
| PVO not visible | Expected PVO missing from list | PVO belongs to different offering | Check other offerings; verify release notes |
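
Transient conditions in this table (connection-pool contention, expired JWTs) are commonly handled with a retry wrapper around the REST calls. The sketch below is an assumption about a reasonable client-side pattern, not a BICC feature; it reads `status_code` off the exception's `response` attribute the way `requests.HTTPError` exposes it:

```python
import time

def call_with_retry(fn, retries=3, backoff_seconds=2, on_auth_error=None):
    """Retry a BICC REST call, invoking on_auth_error (e.g. mint a new JWT) on 401."""
    last_exc = None
    for attempt in range(retries):
        try:
            return fn()
        except Exception as exc:
            last_exc = exc
            status = getattr(getattr(exc, "response", None), "status_code", None)
            if status == 401 and on_auth_error:
                on_auth_error()  # e.g. regenerate the JWT before retrying
            time.sleep(backoff_seconds * attempt)  # linear backoff
    raise last_exc
```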

Failure Points in Production

Anti-Patterns

Wrong: Using OTBI reporting PVOs for BICC bulk extraction

-- BAD: TransactionHeaderPVO joins transactional tables, acquires row locks
Selected: FscmTopModelAM.FinancialAnalysisAM.TransactionHeaderPVO
Result: 10x slower extraction, Fusion UI sluggish for all users

Correct: Use designated ExtractPVOs

-- GOOD: ExtractPVOs read from pre-joined denormalized views
Selected: FscmTopModelAM.FinExtractAM.ApInvoicesExtractPVO
Result: Fast extraction, minimal Fusion impact
Naming rule: *ExtractPVO or *BICVO pattern

Wrong: Expecting custom fields automatically in BICC

-- BAD: Add field in Application Composer, immediately check BICC
Step 1: Add "Custom_Score__c" in Application Composer
Step 2: Look for it in BICC PVO
Result: Column not found — support ticket filed

Correct: BI-enable, BI-publish, then verify

-- GOOD: Follow the two-step BI-enablement process
Step 1: Application Composer > BI Enable custom field
Step 2: Run ESS Job: "Import Oracle Fusion Data Extensions for Transactional BI"
Step 3: Wait for ESS job completion (30-60 minutes)
Step 4: Verify field in BICC Console PVO attribute list
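
Step 4 can be automated by polling the metadata API until the field appears. In this sketch, `fetch_attributes` is a hypothetical callable you supply (e.g. a wrapper over GET /biacm/rest/meta/datastores/{id} that returns attribute names):

```python
import time

def wait_for_custom_field(fetch_attributes, field_name,
                          poll_seconds=300, max_polls=12):
    """Poll until field_name appears in the PVO attribute list.

    fetch_attributes: callable returning the current attribute-name list.
    Defaults give up after ~1 hour, matching typical ESS job completion.
    """
    for _ in range(max_polls):
        if field_name in fetch_attributes():
            return True
        time.sleep(poll_seconds)
    return False
```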

Wrong: Hardcoding filters in BICC Console per environment

-- BAD: Manual Console configuration — error-prone, not version-controlled
Dev:  Filter = "ORG_ID = 100" (manual)
Test: Filter = "ORG_ID = 200" (copy-paste errors)
Prod: Filter = "ORG_ID = 300" (forgot to update)

Correct: Manage filters via REST API with external metadata

# GOOD: Version-controlled, auditable, repeatable
ENVIRONMENTS = {
    "dev":  {"org_id": 100},
    "test": {"org_id": 200},
    "prod": {"org_id": 300},
}
def apply_bicc_filters(env, fusion_url, username, password):
    config = ENVIRONMENTS[env]
    update_bicc_filter(fusion_url, username, password,
                       "AP_INVOICES_EXTRACT", f"ORG_ID = {config['org_id']}")

Common Pitfalls

Diagnostic Commands

# List all BICC offerings via REST API
curl -s -u "admin:password" \
  "https://instance.fa.us2.oraclecloud.com/biacm/rest/meta/offerings" \
  -H "Accept: application/json" | python3 -m json.tool

# Get PVO attributes for a specific data store
curl -s -u "admin:password" \
  "https://instance.fa.us2.oraclecloud.com/biacm/rest/meta/datastores/GL_BALANCES_EXTRACT" \
  -H "Accept: application/json" | python3 -c "
import json, sys
ds = json.load(sys.stdin)
attrs = ds.get('attributes', [])
print(f'Total attributes: {len(attrs)}')
for a in attrs:
    print(f'  {a.get(\"name\", \"?\")} [{a.get(\"type\", \"?\")}]')
"

# Check if LastUpdateDate is selected (for incremental)
curl -s -u "admin:password" \
  "https://instance.fa.us2.oraclecloud.com/biacm/rest/meta/datastores/AP_INVOICES_EXTRACT" \
  -H "Accept: application/json" | python3 -c "
import json, sys
ds = json.load(sys.stdin)
lud = [a for a in ds.get('attributes', []) if 'LastUpdate' in a.get('name', '')]
print('Incremental supported' if lud else 'WARNING: Only full extraction possible')
"

# List extracted files in OCI Object Storage
oci os object list --bucket-name bicc-extracts --prefix "data/" --output table

Version History & Compatibility

| Release | Date | Status | Key BICC Changes | Notes |
| --- | --- | --- | --- | --- |
| 25C (25.06) | 2025-06 | Current | BICC logs removed from UI; OCI OS enhancements | Reduced diagnostics; use manifest files |
| 25B (25.03) | 2025-03 | Supported | OCI Object Storage recommended over UCM | Official guidance shift |
| 24D (24.12) | 2024-12 | Supported | Custom object extraction; enhanced lineage | ExtractPVOs for custom objects |
| 24C (24.09) | 2024-09 | Supported | REST metadata API enhancements | Improved PVO discovery and filters |
| 24B (24.06) | 2024-06 | Supported | Split file size configurability | 1-5GB configurable per VO |

When to Use / When Not to Use

| Use When | Don't Use When | Use Instead |
| --- | --- | --- |
| Need to understand PVO architecture and attribute mapping | Need BICC rate limits and file size constraints | Oracle BICC Data Extraction |
| Automating BICC configuration across environments via REST API | Need real-time individual record operations | Oracle ERP Cloud REST API |
| Debugging missing custom fields in BICC extracts | Need to import data INTO Oracle Fusion | Oracle FBDI Import |
| Setting up incremental extraction with proper prune time | Need middleware orchestration around BICC | Oracle Integration Cloud |
| Choosing between UCM and OCI Object Storage | Need Oracle Fusion authentication setup | Oracle ERP Cloud Authentication |
| Understanding PVO lineage for target schema design | Need simple BICC quick-start guide | Oracle BICC Data Extraction |

Important Caveats

Related Units