Informatica IICS Capabilities for SAP and Oracle Enterprise Integration

Type: ERP Integration · System: Informatica IICS (Winter 2026) · Confidence: 0.85 · Sources: 8 · Verified: 2026-03-03 · Freshness: 2026-03-03

TL;DR

System Profile

Informatica Intelligent Cloud Services (IICS), branded as Intelligent Data Management Cloud (IDMC), is Informatica's cloud-native iPaaS platform. It consolidates multiple services — Cloud Data Integration (CDI), Cloud Data Integration Elastic (CDI-E), Cloud Application Integration (CAI), Cloud Data Ingestion and Replication (CDIR), Data Quality, Master Data Management, and API Center — under a single multi-tenant control plane. The Secure Agent architecture enables hybrid connectivity to on-premise SAP and Oracle systems without exposing data to Informatica's cloud servers.

This card covers IICS capabilities for SAP and Oracle integration specifically. It does not cover Informatica PowerCenter (the legacy on-premise ETL tool), which has a different connector architecture and licensing model. [src1]

| Property | Value |
| --- | --- |
| Vendor | Informatica |
| System | IICS / IDMC (Winter 2026 release) |
| API Surface | REST API (platform management), CDI (ETL/ELT), CDIR (mass ingestion + CDC), CAI (real-time process orchestration), API Center (API lifecycle) |
| Current Version | Winter 2026 |
| Editions Covered | All — CDI, CDI-E (elastic), CAI, CDIR, MDM, Data Quality |
| Deployment | Cloud (multi-tenant SaaS) + Secure Agent (on-premise bridge) |
| API Docs | Informatica Documentation |
| Status | GA |

API Surfaces & Capabilities

IICS provides integration capabilities through multiple service layers rather than traditional API endpoints. Each service addresses a different integration pattern. [src1, src2]

| Service / Surface | Protocol | Best For | Max Volume | Metering | Real-time? | Bulk? |
| --- | --- | --- | --- | --- | --- | --- |
| Cloud Data Integration (CDI) | REST/SOAP connectors | ETL/ELT transformations, data pipelines | Millions of rows per job | IPU per compute unit | No (batch) | Yes |
| CDI Elastic (CDI-E) | Spark-based | Large-scale data processing | Petabyte-scale | IPU per elastic compute | No (batch) | Yes |
| Cloud Data Ingestion & Replication (CDIR) | Database CDC, file | Mass ingestion, CDC replication | Billions of rows | IPU per volume ingested | Yes (CDC) | Yes |
| Cloud Application Integration (CAI) | REST/SOAP/events | Real-time process orchestration | Event-driven | IPU per process execution | Yes | No |
| API Center | REST/SOAP | API lifecycle management | Per-API policy | IPU per API call | Yes | N/A |
| SAP Connector (ODP) | SAP ODP framework | SAP extraction via ODP | Full/delta loads | IPU per extraction volume | Yes (delta) | Yes |
| SAP Connector (BAPI/RFC) | SAP RFC protocol | Real-time SAP function calls | Per-call | IPU per invocation | Yes | No |
| Oracle CDC V2 | Oracle redo logs | Change data capture from Oracle | Continuous streaming | IPU per CDC volume | Yes | Yes |

Rate Limits & Quotas

IICS does not use traditional per-API-call rate limits. Instead, all processing is metered via Informatica Processing Units (IPUs). [src5]

IPU Consumption Model

| Meter | What It Measures | Consumption Factor | Notes |
| --- | --- | --- | --- |
| CDI Compute Units | CPU time for mapping execution | 1 IPU per compute-unit-hour | Measured on Secure Agent or serverless runtime |
| CDI-E Elastic Compute | Spark cluster compute | IPU per elastic compute-unit-hour | Auto-scales; higher burst = more IPU |
| CDIR Volume | Data volume ingested/replicated | IPU per GB ingested | Initial load + CDC delta both metered |
| CAI Process Execution | Process instances run | IPU per 1,000 process executions | High-frequency APIs can consume rapidly |
| API Center API Calls | Managed API invocations | IPU per 10,000 API calls | Includes rate-limited and cached calls |
| Data Quality Rows | Rows processed through DQ rules | IPU per million rows | Profiling, standardization, matching |
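
The meters above can be combined into a rough budget estimate for a billing period. A minimal sketch — the per-meter factors mirror the table, but actual IPU rates are contract-specific, so treat every number here as an illustrative assumption:

```python
# Rough IPU estimator based on the meter table above.
# Rates are illustrative assumptions; actual IPU factors are contract-specific.

IPU_RATES = {
    "cdi_compute_hours": 1.0,           # 1 IPU per compute-unit-hour
    "cdir_gb_ingested": 1.0,            # IPU per GB ingested (assumed factor)
    "cai_process_runs": 1.0 / 1_000,    # IPU per 1,000 process executions
    "api_center_calls": 1.0 / 10_000,   # IPU per 10,000 API calls
    "dq_rows": 1.0 / 1_000_000,         # IPU per million rows
}

def estimate_ipus(usage: dict) -> float:
    """Sum IPU consumption across meters for one billing period."""
    return sum(IPU_RATES[meter] * amount for meter, amount in usage.items())

monthly = estimate_ipus({
    "cdi_compute_hours": 120,       # nightly batch mappings
    "cdir_gb_ingested": 500,        # Oracle CDC volume
    "cai_process_runs": 2_000_000,  # high-frequency API orchestration
})
print(f"Estimated IPUs: {monthly:,.1f}")  # Estimated IPUs: 2,620.0
```

A sketch like this is mainly useful for comparing designs (e.g., 1-minute CDI schedules versus CDIR streaming) before committing to an architecture.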

Monitoring & Thresholds

| Feature | Details |
| --- | --- |
| Dashboard | Real-time IPU consumption visible under Organization Administration > Metering |
| Alerts | Automatic email at 25%, 50%, 75%, 95%, 100% of IPU allocation |
| Overage | Configurable — can hard-stop or allow overage at increased per-IPU rate |
| Billing Cycle | Monthly or annual, depending on contract |
| Detailed Reports | Per-job, per-service, per-user breakdown available |
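
A small watchdog against the thresholds above can be scripted with the REST API. A sketch — the `/api/v2/meteringInfo` endpoint and its `consumed`/`purchased` fields mirror the Diagnostic Commands section later in this card, but treat the exact response shape as an assumption:

```python
import requests

def utilization(consumed: float, purchased: float) -> float:
    """Fraction of the IPU allocation consumed this billing period."""
    return consumed / purchased

def breached(consumed: float, purchased: float, threshold: float = 0.75) -> bool:
    """True once consumption crosses a warning threshold (default 75%)."""
    return utilization(consumed, purchased) >= threshold

def fetch_utilization(server_url: str, session_id: str) -> float:
    """Query the metering endpoint (field names assumed, see lead-in)."""
    resp = requests.get(f"{server_url}/api/v2/meteringInfo",
                        headers={"icSessionId": session_id})
    resp.raise_for_status()
    data = resp.json()
    return utilization(data["consumed"], data["purchased"])
```

Running such a check on your own schedule lets you react before the automatic 95% email arrives.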

Secure Agent Resource Limits

| Limit Type | Default Value | Notes |
| --- | --- | --- |
| Max concurrent jobs per agent | 20 (CDI), configurable | Increase requires agent tuning + host resources |
| Max DTM memory per job | 512 MB default | Configurable in Secure Agent properties |
| Max file size for file-based sources | 10 GB per file | Split larger files or use CDIR |
| Secure Agent heap memory | 512 MB default | Increase for large-scale operations |
| Connection pool size | 50 per connection type | Configurable per connector |

Authentication

| Flow | Use When | Token Lifetime | Refresh? | Notes |
| --- | --- | --- | --- | --- |
| Session ID (username/password) | IICS REST API management calls | 30 min default | Re-login required | Used for programmatic job orchestration |
| SAML 2.0 SSO | UI access with enterprise IdP | IdP session duration | Via IdP | Standard enterprise SSO |
| OAuth 2.0 | API Center managed APIs | Configurable per policy | Yes | For APIs published through API Center |
| SAP RFC credentials | SAP connector (BAPI/RFC/IDoc) | Persistent connection | N/A | Uses SAP logon parameters |
| Oracle DB credentials | Oracle connector (JDBC/CDC) | Persistent connection | N/A | Service account with redo log access |
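
Because REST API sessions expire after roughly 30 minutes with no refresh flow, long-running orchestration scripts need to re-login on expiry. A sketch that refreshes proactively — the login endpoint and `icSessionId`/`serverUrl` fields follow the code examples later in this card; the 25-minute safety margin is an arbitrary assumption:

```python
import time
import requests

LOGIN_URL = "https://dm-us.informaticacloud.com/ma/api/v2/user/login"

class IICSSession:
    """Re-login automatically before the ~30-minute icSessionId expires."""

    def __init__(self, username, password, ttl=25 * 60):
        self.username, self.password, self.ttl = username, password, ttl
        self.acquired = 0.0
        self.session_id = None
        self.base_url = None

    def _login(self):
        resp = requests.post(LOGIN_URL, json={
            "username": self.username, "password": self.password})
        resp.raise_for_status()
        data = resp.json()
        self.session_id = data["icSessionId"]
        self.base_url = data["serverUrl"]   # never hardcode the pod URL
        self.acquired = time.monotonic()

    def headers(self):
        # Proactively re-login before the default lifetime elapses.
        if self.session_id is None or time.monotonic() - self.acquired > self.ttl:
            self._login()
        return {"icSessionId": self.session_id,
                "Content-Type": "application/json"}
```

Every request then calls `session.headers()` instead of caching the header dict, so a mid-run expiry never surfaces as a 4xx failure halfway through a polling loop.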

Authentication Gotchas

Constraints

Integration Pattern Decision Tree

START — Need to integrate SAP or Oracle with IICS
├── What's the data pattern?
│   ├── High-volume batch ETL/ELT
│   │   ├── < 1M rows per job? → CDI with Secure Agent (standard compute)
│   │   └── 1M rows or more per job? → CDI Elastic (CDI-E) on Spark
│   ├── Real-time CDC / streaming
│   │   ├── Source is Oracle?
│   │   │   ├── Direct redo log access? → CDIR with Oracle CDC (best perf)
│   │   │   └── No direct log access → CDIR with LogMiner (easier, slower)
│   │   └── Source is SAP?
│   │       ├── Need delta extraction → SAP ODP connector (delta queue)
│   │       └── Need real-time triggers → SAP IDoc listener or CAI
│   ├── Real-time API orchestration
│   │   ├── Expose SAP data as REST API? → CAI + API Center
│   │   └── Need BAPI/RFC calls? → SAP BAPI/RFC connector
│   └── File-based (IDoc, CSV, XML)
│       ├── SAP IDoc → SAP IDoc Reader/Writer connector
│       └── Flat file → CDI with file connector
├── Where is the target?
│   ├── Cloud DW (Snowflake, BigQuery, Redshift) → CDIR or CDI-E
│   ├── Another ERP (Oracle ↔ SAP) → CDI with both connectors
│   ├── Data lake (S3, ADLS, GCS) → CDIR mass ingestion
│   └── API consumers → CAI + API Center
└── Need data quality?
    ├── YES → Add Data Quality service
    └── NO → Proceed with integration service only
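
For runbooks or internal tooling, the tree above can be encoded as a small helper. A sketch — the service labels follow the tree; this is illustrative, not an Informatica API:

```python
def recommend_service(pattern: str, rows_per_job: int = 0,
                      source: str = "", delta: bool = False) -> str:
    """Map an integration requirement to an IICS service, per the tree above."""
    if pattern == "batch":
        # 1M rows per job is the rule-of-thumb cutover to elastic compute
        return "CDI-E (Spark)" if rows_per_job >= 1_000_000 else "CDI (Secure Agent)"
    if pattern == "cdc":
        if source == "oracle":
            return "CDIR with Oracle CDC V2 (redo logs)"
        if source == "sap":
            return ("SAP ODP connector (delta queue)" if delta
                    else "SAP IDoc listener or CAI")
    if pattern == "api":
        return "CAI + API Center"
    if pattern == "file":
        return "SAP IDoc connector" if source == "sap" else "CDI file connector"
    raise ValueError(f"unknown pattern: {pattern}")

print(recommend_service("batch", rows_per_job=5_000_000))  # CDI-E (Spark)
```

Encoding the tree this way also makes the cutover thresholds (e.g., the 1M-row boundary) explicit and reviewable.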

Quick Reference

IICS Service Selection Matrix

| Requirement | Service | SAP Support | Oracle Support | IPU Impact |
| --- | --- | --- | --- | --- |
| Batch ETL with complex transforms | CDI | ODP, BAPI/RFC, IDoc, table | JDBC, OCI | Moderate (compute) |
| Large-scale elastic processing | CDI-E | Same as CDI | Same as CDI | High (Spark cluster) |
| Mass ingestion + CDC replication | CDIR | ODP delta, SLT | Oracle CDC (redo logs, LogMiner) | Volume-based |
| Real-time process orchestration | CAI | BAPI/RFC, IDoc events | JDBC, REST | Per-execution |
| API lifecycle management | API Center | Via CAI processes | Via CAI processes | Per-API-call |
| Data profiling and cleansing | Data Quality | All SAP sources | All Oracle sources | Row-based |
| Master data management | MDM | SAP master data sync | Oracle master data sync | Record-based |

SAP Connector Capabilities

| SAP Interface | IICS Service | Direction | Use Case | Notes |
| --- | --- | --- | --- | --- |
| ODP | CDI, CDIR | Outbound | Full/delta extraction from SAP DataSources, extractors, CDS views | Requires ODP framework activation |
| BAPI/RFC | CDI, CAI | Bidirectional | Real-time function calls, create/change/delete SAP records | Requires SAP JCo libraries |
| IDoc (ALE) | CDI | Bidirectional | Message-based async integration | SAP must recognize IICS as logical system |
| OData v2/v4 | CDI | Bidirectional | SAP Gateway services, S/4HANA APIs | Standard HTTP, no JCo needed |
| SAP Table (direct) | CDI | Outbound | Direct table reads | Use ODP instead when available |
| SAP BW | CDI | Outbound | BW InfoProvider extraction | Via ODP or direct BAPI |

Oracle Connector Capabilities

| Oracle Interface | IICS Service | Direction | Use Case | Notes |
| --- | --- | --- | --- | --- |
| JDBC | CDI | Bidirectional | Standard SQL operations, batch ETL | Universal Oracle connectivity |
| Oracle CDC V2 (redo logs) | CDIR | Outbound | Real-time change capture, continuous replication | Best performance; requires redo log access |
| Oracle CDC (LogMiner) | CDIR | Outbound | CDC without direct log access | 3-5x overhead on source DB |
| Oracle Cloud Object Storage | CDI | Bidirectional | File-based integration with OCI | Standard connector |
| Oracle Autonomous DB | CDI | Bidirectional | Cloud database integration | Via JDBC with wallet |

Step-by-Step Integration Guide

1. Install and configure the Secure Agent

Deploy a Secure Agent on a host with network access to both your SAP/Oracle systems and outbound HTTPS to Informatica Cloud. [src1, src2]

# Download Secure Agent installer from IICS Admin Console
# On Linux:
chmod +x agent64_install_ng_ext.bin
./agent64_install_ng_ext.bin

# Register agent with IICS org
./consoleAgentManager.sh configureToken <your-registration-token>
./consoleAgentManager.sh configure <your-iics-username> <your-iics-password>

# Start the agent
./infaagent.sh startup

Verify: In IICS Admin Console, navigate to Runtime Environments > Secure Agents. The agent should show status Up and Running with green indicator.

2. Configure SAP connectivity (ODP or BAPI/RFC)

Create a connection using the SAP connector. Ensure SAP JCo libraries are deployed on the Secure Agent host. [src3]

# Copy SAP JCo libraries to Secure Agent
cp sapjco3.jar /opt/informatica/agent/ext/
cp libsapjco3.so /opt/informatica/agent/ext/

# Restart Secure Agent to load libraries
./infaagent.sh shutdown
./infaagent.sh startup

Verify: In IICS > Connections, create a new SAP connection and click Test Connection — should return Connection successful.

3. Configure Oracle CDC

Configure the Oracle source database to enable log mining or direct redo log access. [src4]

-- On Oracle source database (as SYSDBA):
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;

-- Grant required privileges to the CDC user
GRANT SELECT ANY TRANSACTION TO iics_cdc_user;
GRANT LOGMINING TO iics_cdc_user;
GRANT CREATE SESSION TO iics_cdc_user;
GRANT SELECT ON V_$LOG TO iics_cdc_user;
GRANT SELECT ON V_$LOGFILE TO iics_cdc_user;
GRANT SELECT ON V_$ARCHIVED_LOG TO iics_cdc_user;

Verify: In IICS > Connections, create an Oracle CDC V2 connection and test — should return Connection successful.

4. Create and run a data integration mapping

Build a CDI mapping to extract from SAP/Oracle and load to target. Trigger programmatically via IICS REST API. [src1]

import requests, time

# Login and get session
login_url = "https://dm-us.informaticacloud.com/ma/api/v2/user/login"
resp = requests.post(login_url, json={"username": "[email protected]", "password": "pass"})
session_id = resp.json()["icSessionId"]
base_url = resp.json()["serverUrl"]  # Use returned URL, not hardcoded
headers = {"icSessionId": session_id, "Content-Type": "application/json"}

# Start a mapping task (taskType "MTT"; synchronization tasks use "DSS")
run_resp = requests.post(f"{base_url}/api/v2/job",
    json={"taskId": "<task-id>", "taskType": "MTT"}, headers=headers)
run_id = run_resp.json()["runId"]

# Poll for completion
while True:
    status = requests.get(
        f"{base_url}/api/v2/activity/activityMonitor/{run_id}", headers=headers
    ).json()["executionState"]
    if status in ("SUCCESS", "FAILED", "STOPPED"):
        print(f"Job completed: {status}")
        break
    time.sleep(10)

Verify: Check Activity Monitor in IICS UI — job status should be SUCCESS with row counts matching expectations.

Code Examples

Python: Trigger IICS job and monitor via REST API

# Input:  IICS credentials, mapping task ID
# Output: Job run status and row counts

import requests, time

class IICSClient:
    def __init__(self, username, password,
                 login_url="https://dm-us.informaticacloud.com/ma/api/v2/user/login"):
        resp = requests.post(login_url, json={"username": username, "password": password})
        resp.raise_for_status()
        data = resp.json()
        self.session_id = data["icSessionId"]
        self.base_url = data["serverUrl"]
        self.headers = {"icSessionId": self.session_id, "Content-Type": "application/json"}

    def run_task(self, task_id, task_type="MTT"):  # "MTT" = mapping task
        resp = requests.post(f"{self.base_url}/api/v2/job",
            json={"taskId": task_id, "taskType": task_type}, headers=self.headers)
        resp.raise_for_status()
        return resp.json()["runId"]

    def wait_for_completion(self, run_id, poll_interval=15, timeout=3600):
        elapsed = 0
        while elapsed < timeout:
            resp = requests.get(
                f"{self.base_url}/api/v2/activity/activityMonitor/{run_id}",
                headers=self.headers)
            state = resp.json().get("executionState", "UNKNOWN")
            if state in ("SUCCESS", "FAILED", "STOPPED"):
                return resp.json()
            time.sleep(poll_interval)
            elapsed += poll_interval
        raise TimeoutError(f"Job {run_id} did not complete within {timeout}s")

client = IICSClient("[email protected]", "password")
run_id = client.run_task("0012ABC000000000DEFG")
result = client.wait_for_completion(run_id)
print(f"Status: {result['executionState']}")

cURL: Login and list tasks

# Input:  IICS username/password
# Output: Session ID, task list

# Login
curl -s -X POST "https://dm-us.informaticacloud.com/ma/api/v2/user/login" \
  -H "Content-Type: application/json" \
  -d '{"username":"[email protected]","password":"your_pass"}' \
  | jq '{sessionId: .icSessionId, serverUrl: .serverUrl}'

# List tasks (use serverUrl from login response)
curl -s -X GET "${SERVER_URL}/api/v2/task" \
  -H "icSessionId: ${SESSION_ID}" \
  | jq '.[] | {id: .id, name: .name, type: .type}'

# Start a job
curl -s -X POST "${SERVER_URL}/api/v2/job" \
  -H "icSessionId: ${SESSION_ID}" \
  -H "Content-Type: application/json" \
  -d '{"taskId":"<task-id>","taskType":"DSS"}'

Data Mapping

SAP-to-Target Field Mapping Patterns

| SAP Source | Common Target | Type | Transform | Gotcha |
| --- | --- | --- | --- | --- |
| SAP date (DATS, YYYYMMDD) | Target date (YYYY-MM-DD) | Date | TO_DATE(src, 'YYYYMMDD') | SAP uses '00000000' for null dates |
| SAP amount (CURR) | Target decimal | Currency | Divide by 100 if smallest unit | Currency-dependent: JPY has no decimals |
| SAP language key (1-char) | ISO language code | String | Lookup table (E→en, D→de) | SAP uses single-char codes |
| SAP material number (MATNR) | Target product ID | String (18→40) | Strip leading zeros | Internal format has leading zeros |
| SAP text fields (CHAR) | Target VARCHAR | String | Trim trailing spaces | SAP pads CHAR fields to full length |
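
The patterns above translate directly into small transform functions. A sketch in Python — in a real CDI mapping these would normally be written in the expression language, and the language lookup here covers only a few illustrative codes (the full mapping lives in SAP table T002):

```python
def sap_date(dats: str):
    """DATS 'YYYYMMDD' -> ISO 'YYYY-MM-DD'; SAP's '00000000' means null."""
    if not dats or dats == "00000000":
        return None
    return f"{dats[0:4]}-{dats[4:6]}-{dats[6:8]}"

def sap_matnr(matnr: str) -> str:
    """Strip the leading zeros of internal-format material numbers."""
    return matnr.lstrip("0") or "0"

def sap_char(value: str) -> str:
    """SAP CHAR fields are padded to full length; trim trailing spaces."""
    return value.rstrip(" ")

# Partial lookup only -- extend from SAP language table T002.
SAP_LANG_TO_ISO = {"E": "en", "D": "de", "F": "fr"}

print(sap_date("20260303"))             # 2026-03-03
print(sap_matnr("000000000000012345"))  # 12345
```

Centralizing these as reusable expressions avoids re-deriving the '00000000' and padding quirks in every mapping.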

Oracle-to-Target Field Mapping Patterns

| Oracle Source | Common Target | Type | Transform | Gotcha |
| --- | --- | --- | --- | --- |
| Oracle DATE | Target timestamp | Datetime | Direct | Oracle DATE always has a time component |
| Oracle NUMBER(p,s) | Target decimal | Numeric | Direct | Precision loss if target has lower precision |
| Oracle CLOB | Target text | String | Stream in chunks | IICS CDI default CLOB limit: 4000 chars |
| Oracle RAW/BLOB | Target binary | Binary | Base64 encode for REST | Large BLOBs can exceed mapping memory |
| Oracle TIMESTAMP WITH TZ | Target timestamp | Datetime | Normalize to UTC | CDC captures in source TZ |
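
For the timezone gotcha in the last row, normalizing CDC-captured timestamps to UTC before loading avoids a target with mixed zones. A standard-library sketch — the source zone name is an assumption that must match the actual DB session zone:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

SOURCE_TZ = ZoneInfo("America/Chicago")  # assumption: zone of the source DB

def to_utc(ts: datetime) -> datetime:
    """Attach the source zone to naive CDC timestamps, then convert to UTC."""
    if ts.tzinfo is None:
        ts = ts.replace(tzinfo=SOURCE_TZ)
    return ts.astimezone(timezone.utc)

print(to_utc(datetime(2026, 3, 3, 9, 30)))  # 2026-03-03 15:30:00+00:00
```

Doing the normalization in the pipeline (rather than in each consumer) keeps downstream joins on timestamps correct across DST transitions.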

Data Type Gotchas

Error Handling & Failure Points

Common Error Codes

| Code | Meaning | Cause | Resolution |
| --- | --- | --- | --- |
| INFA_CONN_001 | Connection failed | Agent cannot reach source/target | Verify network path, firewall, agent status |
| SAP JCo Exception | SAP library error | Missing/incompatible JCo libraries | Reinstall correct JCo version for agent OS/JVM |
| ORA-01291 | Missing log file | Redo log archived/deleted before CDC read | Increase log retention; reduce CDC poll interval |
| IPU_EXCEEDED | IPU allocation exhausted | Monthly IPU budget consumed | Wait for billing cycle reset or purchase more |
| INFA_AGENT_DOWN | Secure Agent offline | Agent service crashed or host unreachable | Restart agent; check host resources |
| SAP_RFC_AUTH | RFC authorization failure | Missing S_RFC authorization | Grant S_RFC auth object in SAP role |
| ORA-04031 | Shared pool memory | LogMiner consuming excessive shared pool | Switch to direct redo log or increase SGA |
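
The table above suggests which failures are worth retrying (transient connectivity/agent issues) versus which need an operator (authorization, exhausted IPUs). A classification sketch — the code strings mirror the table, but how each error surfaces in the activity log message is deployment-specific, so the substring matching here is an assumption:

```python
# Which-of-the-table's-codes-is-retryable, as a triage helper.
RETRYABLE = {"INFA_CONN_001", "INFA_AGENT_DOWN"}
TERMINAL = {"IPU_EXCEEDED", "SAP_RFC_AUTH", "ORA-01291", "ORA-04031"}

def classify(error_msg: str) -> str:
    """Classify a job failure message against the error codes above."""
    for code in RETRYABLE:
        if code in error_msg:
            return "retry"
    for code in TERMINAL:
        if code in error_msg:
            return "alert"   # operator action needed; retrying won't help
    return "investigate"

print(classify("Job failed: INFA_AGENT_DOWN on agent host01"))  # retry
print(classify("IPU_EXCEEDED: monthly allocation consumed"))    # alert
```

Wiring this into the polling loop from the code examples section lets an orchestrator re-run transient failures automatically while paging on terminal ones.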

Failure Points in Production

Anti-Patterns

Wrong: Using CDI for high-volume real-time CDC

# BAD — CDI batch mapping scheduled every 1 minute for "real-time" CDC
# Creates 1,440 jobs/day, each consuming IPUs for startup overhead
schedule.every(1).minutes.do(run_cdi_mapping, "SAP_Orders_Extract")

Correct: Use CDIR for continuous CDC streaming

# GOOD — CDIR provides continuous CDC with minimal overhead
# Runs as a long-lived streaming process; IPU based on volume, not job count
cdir_task = create_cdir_task(
    source="Oracle_Production", mode="CDC",
    target="Snowflake_DW",
    tables=["ORDERS", "ORDER_LINES", "CUSTOMERS"])

Wrong: Hardcoding the IICS base URL

# BAD — IICS base URL varies by pod and can change
base_url = "https://usw3.dm-us.informaticacloud.com"
resp = requests.get(f"{base_url}/api/v2/task", headers=headers)

Correct: Always use the base URL returned from login

# GOOD — use serverUrl from login response
login_resp = requests.post(
    "https://dm-us.informaticacloud.com/ma/api/v2/user/login",
    json={"username": user, "password": pwd})
base_url = login_resp.json()["serverUrl"]  # Dynamic — correct for your pod

Wrong: Running SAP ODP full extraction on every schedule

# BAD — full extraction every time wastes IPUs and loads SAP
task_config = {"extractionMode": "FULL", "schedule": "DAILY"}

Correct: Use ODP delta extraction after initial full load

# GOOD — initial full load, then delta extractions only
# ODP delta queue tracks changes; only new/modified records extracted
initial_config = {"extractionMode": "FULL"}    # Run once
delta_config = {"extractionMode": "DELTA", "schedule": "EVERY_15_MIN"}

Common Pitfalls

Diagnostic Commands

# Check Secure Agent status
curl -s -X GET "${SERVER_URL}/api/v2/runtimeEnvironment" \
  -H "icSessionId: ${SESSION_ID}" \
  | jq '.[] | {name: .name, agents: [.agents[] | {name: .name, active: .active}]}'

# Check IPU consumption for current billing period
curl -s -X GET "${SERVER_URL}/api/v2/meteringInfo" \
  -H "icSessionId: ${SESSION_ID}" \
  | jq '{consumed: .consumed, purchased: .purchased, remaining: .remaining}'

# List recent job failures
curl -s -X GET "${SERVER_URL}/api/v2/activity/activityLog?rowLimit=20&type=MAPPING_TASK&executionState=FAILED" \
  -H "icSessionId: ${SESSION_ID}" \
  | jq '.[] | {taskName: .taskName, startTime: .startTime, errorMsg: .errorMsg}'

# Test SAP connection
curl -s -X POST "${SERVER_URL}/api/v2/connection/test" \
  -H "icSessionId: ${SESSION_ID}" -H "Content-Type: application/json" \
  -d '{"id":"<sap-connection-id>"}' | jq '.status'

# Test Oracle connection
curl -s -X POST "${SERVER_URL}/api/v2/connection/test" \
  -H "icSessionId: ${SESSION_ID}" -H "Content-Type: application/json" \
  -d '{"id":"<oracle-connection-id>"}' | jq '.status'

Version History & Compatibility

| Release | Date | Status | Key Changes | Notes |
| --- | --- | --- | --- | --- |
| Winter 2026 | 2026-01 | Current | CLAIRE AI copilot GA, MCP Server announced | MCP Server enables AI agent integration |
| Fall 2025 | 2025-10 | Supported | New OCI integrations, Oracle CDC improvements | Enhanced Oracle Autonomous DB support |
| Summer 2025 | 2025-07 | Supported | CDI-E performance, new marketplace connectors | Elastic compute auto-scaling enhancements |
| Spring 2025 | 2025-04 | Supported | CDIR CDC for SAP (GA), SAP ODP v2 | Major SAP CDC milestone |
| Winter 2025 | 2025-01 | Supported | API Center rate limiting, enhanced monitoring | API management maturity |

When to Use / When Not to Use

| Use When | Don't Use When | Use Instead |
| --- | --- | --- |
| Need deep SAP extraction (ODP, BAPI/RFC, IDoc) at scale | Simple REST-to-REST integration | Boomi or Workato |
| Need Oracle CDC replication with minimal source impact | API-first architecture as primary focus | MuleSoft Anypoint |
| Already a PowerCenter customer migrating to cloud | SAP-to-SAP only integration | SAP Integration Suite |
| Need embedded data quality alongside integration | Budget is strictly per-connection flat rate | Boomi (connection-based pricing) |
| Data volume exceeds 100M rows/day | Need citizen integrator / low-code-first platform | Workato or Boomi |
| Need unified ETL + CDC + API + DQ + MDM platform | Need event-driven microservices architecture | MuleSoft or Apache Kafka |

Cross-System Comparison

| Capability | Informatica IICS | MuleSoft Anypoint | Boomi AtomSphere | Notes |
| --- | --- | --- | --- | --- |
| Primary strength | Data integration + quality | API-led connectivity | Low-code iPaaS | IICS data-first; MuleSoft API-first |
| Pricing model | IPU consumption-based | vCore allocation | Connection + flow count | IICS most granular metering |
| Starting price | ~$10,000+/month | ~$15,000+/month | ~$3,000+/month | Boomi cheapest entry |
| SAP connector depth | Deep (ODP, BAPI/RFC, IDoc, OData, table) | Moderate (JCo, IDoc, OData) | Moderate (BAPI, IDoc) | IICS most SAP surfaces |
| Oracle CDC | Native (redo log + LogMiner) | Via partner connectors | Via partner connectors | IICS only native Oracle CDC |
| Data quality | Built-in (profiling, standardization) | None (requires partner) | None (requires partner) | IICS unique advantage |
| API management | API Center (maturing) | Full lifecycle (market leader) | API Management add-on | MuleSoft dominant in APIs |
| Low-code UX | Moderate (drag-and-drop) | Low (developer-focused) | High (citizen-friendly) | Boomi best for non-technical |
| Elastic compute | CDI-E (Spark auto-scaling) | CloudHub auto-scaling | Atom auto-scaling | IICS best for massive volumes |
| Connector count | 400+ (marketplace) | 350+ (Exchange) | 200+ (Flow) | IICS largest library |
| Legacy migration | PowerCenter conversion tool | N/A | N/A | Unique for Informatica customers |
| Gartner positioning | Leader (data integration + iPaaS) | Leader (iPaaS) | Leader (iPaaS) | IICS leads in data integration |

Important Caveats

Related Units