Informatica IICS Capabilities for SAP and Oracle Enterprise Integration
Type: ERP Integration
System: Informatica IICS (Winter 2026)
Confidence: 0.85
Sources: 8
Verified: 2026-03-03
Freshness: 2026-03-03
TL;DR
- Bottom line: Informatica IICS is a data-integration-first iPaaS with the deepest SAP and Oracle connectors of any third-party platform, excelling at high-volume ETL/ELT, CDC replication, and data quality — but weaker than MuleSoft for API-led connectivity.
- Key limit: IPU (Informatica Processing Unit) consumption-based metering — no hard per-API-call rate limits, but uncontrolled workloads can exhaust IPU allocations rapidly.
- Watch out for: Conflating IICS (cloud-native) with PowerCenter (legacy on-premise) — migration is non-trivial and connectors are not 1:1 compatible.
- Best for: Enterprises needing deep SAP extraction (ODP, BAPI/RFC, IDoc), Oracle CDC replication, mass data ingestion at scale, and embedded data quality/governance.
- Authentication: IICS REST API uses username/password login for session ID; SAML 2.0 SSO for UI; OAuth 2.0 for API Center managed APIs.
System Profile
Informatica Intelligent Cloud Services (IICS), branded as Intelligent Data Management Cloud (IDMC), is Informatica's cloud-native iPaaS platform. It consolidates multiple services — Cloud Data Integration (CDI), Cloud Data Integration Elastic (CDI-E), Cloud Application Integration (CAI), Cloud Data Ingestion and Replication (CDIR), Data Quality, Master Data Management, and API Center — under a single multi-tenant control plane. The Secure Agent architecture enables hybrid connectivity to on-premise SAP and Oracle systems without exposing data to Informatica's cloud servers.
This card covers IICS capabilities for SAP and Oracle integration specifically. It does not cover Informatica PowerCenter (the legacy on-premise ETL tool), which has a different connector architecture and licensing model. [src1]
| Property | Value |
| Vendor | Informatica |
| System | IICS / IDMC (Winter 2026 release) |
| API Surface | REST API (platform management), CDI (ETL/ELT), CDIR (mass ingestion + CDC), CAI (real-time process orchestration), API Center (API lifecycle) |
| Current Version | Winter 2026 |
| Editions Covered | All — CDI, CDI-E (elastic), CAI, CDIR, MDM, Data Quality |
| Deployment | Cloud (multi-tenant SaaS) + Secure Agent (on-premise bridge) |
| API Docs | Informatica Documentation |
| Status | GA |
API Surfaces & Capabilities
IICS provides integration capabilities through multiple service layers rather than traditional API endpoints. Each service addresses a different integration pattern. [src1, src2]
| Service / Surface | Protocol | Best For | Max Volume | Metering | Real-time? | Bulk? |
| Cloud Data Integration (CDI) | REST/SOAP connectors | ETL/ELT transformations, data pipelines | Millions of rows per job | IPU per compute unit | No (batch) | Yes |
| CDI Elastic (CDI-E) | Spark-based | Large-scale data processing | Petabyte-scale | IPU per elastic compute | No (batch) | Yes |
| Cloud Data Ingestion & Replication (CDIR) | Database CDC, file | Mass ingestion, CDC replication | Billions of rows | IPU per volume ingested | Yes (CDC) | Yes |
| Cloud Application Integration (CAI) | REST/SOAP/events | Real-time process orchestration | Event-driven | IPU per process execution | Yes | No |
| API Center | REST/SOAP | API lifecycle management | Per-API policy | IPU per API call | Yes | N/A |
| SAP Connector (ODP) | SAP ODP framework | SAP extraction via ODP | Full/delta loads | IPU per extraction volume | Yes (delta) | Yes |
| SAP Connector (BAPI/RFC) | SAP RFC protocol | Real-time SAP function calls | Per-call | IPU per invocation | Yes | No |
| Oracle CDC V2 | Oracle redo logs | Change data capture from Oracle | Continuous streaming | IPU per CDC volume | Yes | Yes |
Rate Limits & Quotas
IICS does not use traditional per-API-call rate limits. Instead, all processing is metered via Informatica Processing Units (IPUs). [src5]
IPU Consumption Model
| Meter | What It Measures | Consumption Factor | Notes |
| CDI Compute Units | CPU time for mapping execution | 1 IPU per compute-unit-hour | Measured on Secure Agent or serverless runtime |
| CDI-E Elastic Compute | Spark cluster compute | IPU per elastic compute-unit-hour | Auto-scales; higher burst = more IPU |
| CDIR Volume | Data volume ingested/replicated | IPU per GB ingested | Initial load + CDC delta both metered |
| CAI Process Execution | Process instances run | IPU per 1,000 process executions | High-frequency APIs can consume rapidly |
| API Center API Calls | Managed API invocations | IPU per 10,000 API calls | Includes rate-limited and cached calls |
| Data Quality Rows | Rows processed through DQ rules | IPU per million rows | Profiling, standardization, matching |
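The meter table above can be turned into a rough budgeting sketch. The per-unit rates below simply mirror the table's consumption factors; actual contracted rates are account-specific, so treat `METER_RATES` as an illustrative assumption, not billing truth:

```python
# Per-unit rates mirror the meter table's consumption factors; actual
# contracted rates differ, so this is a budgeting sketch only.
METER_RATES = {
    "cdi_compute_unit_hours": 1.0,        # 1 IPU per compute-unit-hour
    "cdir_gb_ingested": 1.0,              # 1 IPU per GB ingested
    "cai_process_executions": 1.0 / 1_000,
    "api_center_calls": 1.0 / 10_000,
    "dq_rows": 1.0 / 1_000_000,
}

def estimate_ipus(usage):
    """Estimate monthly IPU draw from {meter: volume} figures."""
    unknown = set(usage) - set(METER_RATES)
    if unknown:
        raise ValueError(f"unknown meters: {unknown}")
    return sum(METER_RATES[m] * v for m, v in usage.items())

monthly = {
    "cdi_compute_unit_hours": 120,   # mapping runs
    "cdir_gb_ingested": 500,         # CDC + initial loads
    "api_center_calls": 2_000_000,   # managed API traffic
}
print(f"Estimated IPUs: {estimate_ipus(monthly):.1f}")  # Estimated IPUs: 820.0
```

A sketch like this is most useful for spotting which single meter dominates a workload before committing to a pipeline design.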
Monitoring & Thresholds
| Feature | Details |
| Dashboard | Real-time IPU consumption visible under Organization Administration > Metering |
| Alerts | Automatic email at 25%, 50%, 75%, 95%, 100% of IPU allocation |
| Overage | Configurable — can hard-stop or allow overage at increased per-IPU rate |
| Billing Cycle | Monthly or annual, depending on contract |
| Detailed Reports | Per-job, per-service, per-user breakdown available |
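The alert thresholds in the table can also be checked client-side, for example when polling metering data from automation rather than waiting for the email alerts. A minimal sketch (threshold percentages taken from the table above):

```python
ALERT_THRESHOLDS = (25, 50, 75, 95, 100)  # mirrors the IICS email alert levels

def crossed_thresholds(consumed, purchased):
    """Return the alert thresholds (in %) that consumption has met or passed."""
    if purchased <= 0:
        raise ValueError("purchased IPUs must be positive")
    pct = 100.0 * consumed / purchased
    return [t for t in ALERT_THRESHOLDS if pct >= t]

print(crossed_thresholds(780, 1000))  # [25, 50, 75]
```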
Secure Agent Resource Limits
| Limit Type | Default Value | Notes |
| Max concurrent jobs per agent | 20 (CDI), configurable | Increase requires agent tuning + host resources |
| Max DTM memory per job | 512 MB default | Configurable in Secure Agent properties |
| Max file size for file-based sources | 10 GB per file | Split larger files or use CDIR |
| Secure Agent heap memory | 512 MB default | Increase for large-scale operations |
| Connection pool size | 50 per connection type | Configurable per connector |
Authentication
| Flow | Use When | Token Lifetime | Refresh? | Notes |
| Session ID (username/password) | IICS REST API management calls | 30 min default | Re-login required | Used for programmatic job orchestration |
| SAML 2.0 SSO | UI access with enterprise IdP | IdP session duration | Via IdP | Standard enterprise SSO |
| OAuth 2.0 | API Center managed APIs | Configurable per policy | Yes | For APIs published through API Center |
| SAP RFC credentials | SAP connector (BAPI/RFC/IDoc) | Persistent connection | N/A | Uses SAP logon parameters |
| Oracle DB credentials | Oracle connector (JDBC/CDC) | Persistent connection | N/A | Service account with redo log access |
Authentication Gotchas
- IICS REST API login returns a session ID and a base URL that varies by pod — you must use the returned serverUrl for subsequent calls, not a hardcoded URL. [src2]
- SAP BAPI/RFC connections require SAP JCo libraries installed on the Secure Agent host — these are not bundled with IICS due to SAP licensing. [src3]
- Oracle CDC requires explicit grants: ALTER SESSION, SELECT ANY TRANSACTION, LOGMINING, and read access to redo/archive logs. Insufficient permissions cause silent failures. [src4]
Constraints
- Secure Agent is mandatory for on-premise access: All connectivity to behind-firewall SAP or Oracle systems requires a Secure Agent installed on a host within the customer's network.
- IPU metering applies to all processing: Every row processed, every API call served, every CDC change captured consumes IPUs. Unpredictable workloads can cause budget overruns.
- SAP ODP requires SAP-side activation: The ODP framework must be activated in SAP, ODP providers configured, and RFC authorizations granted. Without these, ODP extraction silently returns empty results.
- Oracle CDC method determines performance: Direct redo log access provides best performance but requires OS-level file access. LogMiner has 3-5x performance overhead on the source database.
- PowerCenter mappings are not directly portable: Migration requires the conversion utility; complex PowerCenter workflows with custom Java transformations require manual refactoring.
- Marketplace connector licensing varies: Not all 400+ connectors are included in base IPU subscriptions. Premium connectors may require separate licensing.
Integration Pattern Decision Tree
START — Need to integrate SAP or Oracle with IICS
├── What's the data pattern?
│ ├── High-volume batch ETL/ELT
│ │ ├── < 1M rows per job? → CDI with Secure Agent (standard compute)
│ │ └── > 1M rows per job? → CDI Elastic (CDI-E) on Spark
│ ├── Real-time CDC / streaming
│ │ ├── Source is Oracle?
│ │ │ ├── Direct redo log access? → CDIR with Oracle CDC (best perf)
│ │ │ └── No direct log access → CDIR with LogMiner (easier, slower)
│ │ └── Source is SAP?
│ │ ├── Need delta extraction → SAP ODP connector (delta queue)
│ │ └── Need real-time triggers → SAP IDoc listener or CAI
│ ├── Real-time API orchestration
│ │ ├── Expose SAP data as REST API? → CAI + API Center
│ │ └── Need BAPI/RFC calls? → SAP BAPI/RFC connector
│ └── File-based (IDoc, CSV, XML)
│ ├── SAP IDoc → SAP IDoc Reader/Writer connector
│ └── Flat file → CDI with file connector
├── Where is the target?
│ ├── Cloud DW (Snowflake, BigQuery, Redshift) → CDIR or CDI-E
│ ├── Another ERP (Oracle ↔ SAP) → CDI with both connectors
│ ├── Data lake (S3, ADLS, GCS) → CDIR mass ingestion
│ └── API consumers → CAI + API Center
└── Need data quality?
├── YES → Add Data Quality service
└── NO → Proceed with integration service only
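The tree above can be encoded as a small helper for automation or runbook tests. This is a toy sketch of the main branches only; the returned labels are shorthand for this card, not IICS API values:

```python
def pick_iics_service(pattern, rows=0, source=""):
    """Tiny encoding of the decision tree; labels are shorthand, not API values."""
    if pattern == "batch":
        # The 1M-row threshold comes from the tree above
        return "CDI-E (elastic)" if rows > 1_000_000 else "CDI (standard)"
    if pattern == "cdc":
        return "CDIR Oracle CDC" if source == "oracle" else "CDIR SAP ODP delta"
    if pattern == "api":
        return "CAI + API Center"
    if pattern == "file":
        return "CDI file connector"
    raise ValueError(f"unknown pattern: {pattern!r}")

print(pick_iics_service("batch", rows=5_000_000))  # CDI-E (elastic)
print(pick_iics_service("cdc", source="oracle"))   # CDIR Oracle CDC
```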
Quick Reference
IICS Service Selection Matrix
| Requirement | Service | SAP Support | Oracle Support | IPU Impact |
| Batch ETL with complex transforms | CDI | ODP, BAPI/RFC, IDoc, table | JDBC, OCI | Moderate (compute) |
| Large-scale elastic processing | CDI-E | Same as CDI | Same as CDI | High (Spark cluster) |
| Mass ingestion + CDC replication | CDIR | ODP delta, SLT | Oracle CDC (redo logs, LogMiner) | Volume-based |
| Real-time process orchestration | CAI | BAPI/RFC, IDoc events | JDBC, REST | Per-execution |
| API lifecycle management | API Center | Via CAI processes | Via CAI processes | Per-API-call |
| Data profiling and cleansing | Data Quality | All SAP sources | All Oracle sources | Row-based |
| Master data management | MDM | SAP master data sync | Oracle master data sync | Record-based |
SAP Connector Capabilities
| SAP Interface | IICS Service | Direction | Use Case | Notes |
| ODP | CDI, CDIR | Outbound | Full/delta extraction from SAP DataSources, extractors, CDS views | Requires ODP framework activation |
| BAPI/RFC | CDI, CAI | Bidirectional | Real-time function calls, create/change/delete SAP records | Requires SAP JCo libraries |
| IDoc (ALE) | CDI | Bidirectional | Message-based async integration | SAP must recognize IICS as logical system |
| OData v2/v4 | CDI | Bidirectional | SAP Gateway services, S/4HANA APIs | Standard HTTP, no JCo needed |
| SAP Table (direct) | CDI | Outbound | Direct table reads | Use ODP instead when available |
| SAP BW | CDI | Outbound | BW InfoProvider extraction | Via ODP or direct BAPI |
Oracle Connector Capabilities
| Oracle Interface | IICS Service | Direction | Use Case | Notes |
| JDBC | CDI | Bidirectional | Standard SQL operations, batch ETL | Universal Oracle connectivity |
| Oracle CDC V2 (redo logs) | CDIR | Outbound | Real-time change capture, continuous replication | Best performance; requires redo log access |
| Oracle CDC (LogMiner) | CDIR | Outbound | CDC without direct log access | 3-5x overhead on source DB |
| Oracle Cloud Object Storage | CDI | Bidirectional | File-based integration with OCI | Standard connector |
| Oracle Autonomous DB | CDI | Bidirectional | Cloud database integration | Via JDBC with wallet |
Step-by-Step Integration Guide
1. Install and configure the Secure Agent
Deploy a Secure Agent on a host with network access to both your SAP/Oracle systems and outbound HTTPS to Informatica Cloud. [src1, src2]
# Download Secure Agent installer from IICS Admin Console
# On Linux:
chmod +x agent64_install_ng_ext.bin
./agent64_install_ng_ext.bin
# Register agent with IICS org
./consoleagentmanager.sh configureToken <your-registration-token>
./consoleagentmanager.sh configure <your-iics-username> <your-iics-password>
# Start the agent
./infaagent.sh startup
Verify: In IICS Admin Console, navigate to Runtime Environments > Secure Agents. The agent should show status Up and Running with green indicator.
2. Configure SAP connectivity (ODP or BAPI/RFC)
Create a connection using the SAP connector. Ensure SAP JCo libraries are deployed on the Secure Agent host. [src3]
# Copy SAP JCo libraries to Secure Agent
cp sapjco3.jar /opt/informatica/agent/ext/
cp libsapjco3.so /opt/informatica/agent/ext/
# Restart Secure Agent to load libraries
./infaagent.sh shutdown
./infaagent.sh startup
Verify: In IICS > Connections, create a new SAP connection and click Test Connection — should return Connection successful.
3. Configure Oracle CDC
Configure the Oracle source database to enable log mining or direct redo log access. [src4]
-- On Oracle source database (as SYSDBA):
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;
ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS;
-- Grant required privileges to the CDC user
GRANT SELECT ANY TRANSACTION TO iics_cdc_user;
GRANT LOGMINING TO iics_cdc_user;
GRANT CREATE SESSION TO iics_cdc_user;
GRANT SELECT ON V_$LOG TO iics_cdc_user;
GRANT SELECT ON V_$LOGFILE TO iics_cdc_user;
GRANT SELECT ON V_$ARCHIVED_LOG TO iics_cdc_user;
Verify: In IICS > Connections, create an Oracle CDC V2 connection and test — should return Connection successful.
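For repeatable environment setup, the grant list from the SQL above can be generated programmatically per CDC user. A sketch that emits the same statements for any username; execute them via SQL*Plus or a driver such as python-oracledb (actually running them is out of scope here):

```python
# The grant list mirrors the SQL shown above for this step.
CDC_SYSTEM_GRANTS = ["CREATE SESSION", "SELECT ANY TRANSACTION", "LOGMINING"]
CDC_VIEW_GRANTS = ["V_$LOG", "V_$LOGFILE", "V_$ARCHIVED_LOG"]

def cdc_grant_statements(user):
    """Emit the privilege grants a CDC user needs, ready for SQL*Plus."""
    stmts = [f"GRANT {priv} TO {user}" for priv in CDC_SYSTEM_GRANTS]
    stmts += [f"GRANT SELECT ON {view} TO {user}" for view in CDC_VIEW_GRANTS]
    return stmts

# Dictionary query to confirm supplemental logging is actually on
# (these V$DATABASE columns exist on supported Oracle versions):
SUPPLEMENTAL_CHECK = (
    "SELECT supplemental_log_data_min, supplemental_log_data_all FROM v$database"
)

for stmt in cdc_grant_statements("iics_cdc_user"):
    print(stmt + ";")
```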
4. Create and run a data integration mapping
Build a CDI mapping to extract from SAP/Oracle and load to target. Trigger programmatically via IICS REST API. [src1]
import requests, time

# Login and get session
login_url = "https://dm-us.informaticacloud.com/ma/api/v2/user/login"
resp = requests.post(login_url, json={"username": "[email protected]", "password": "pass"})
session_id = resp.json()["icSessionId"]
base_url = resp.json()["serverUrl"]  # Use returned URL, not hardcoded
headers = {"icSessionId": session_id, "Content-Type": "application/json"}

# Start a mapping task
run_resp = requests.post(f"{base_url}/api/v2/job",
                         json={"taskId": "<task-id>", "taskType": "DSS"}, headers=headers)
run_id = run_resp.json()["runId"]

# Poll for completion
while True:
    status = requests.get(
        f"{base_url}/api/v2/activity/activityMonitor/{run_id}", headers=headers
    ).json()["executionState"]
    if status in ("SUCCESS", "FAILED", "STOPPED"):
        print(f"Job completed: {status}")
        break
    time.sleep(10)
Verify: Check Activity Monitor in IICS UI — job status should be SUCCESS with row counts matching expectations.
Code Examples
Python: Trigger IICS job and monitor via REST API
# Input: IICS credentials, mapping task ID
# Output: Job run status and row counts
import requests, time

class IICSClient:
    def __init__(self, username, password,
                 login_url="https://dm-us.informaticacloud.com/ma/api/v2/user/login"):
        resp = requests.post(login_url, json={"username": username, "password": password})
        resp.raise_for_status()
        data = resp.json()
        self.session_id = data["icSessionId"]
        self.base_url = data["serverUrl"]
        self.headers = {"icSessionId": self.session_id, "Content-Type": "application/json"}

    def run_task(self, task_id, task_type="DSS"):
        resp = requests.post(f"{self.base_url}/api/v2/job",
                             json={"taskId": task_id, "taskType": task_type},
                             headers=self.headers)
        resp.raise_for_status()
        return resp.json()["runId"]

    def wait_for_completion(self, run_id, poll_interval=15, timeout=3600):
        elapsed = 0
        while elapsed < timeout:
            resp = requests.get(
                f"{self.base_url}/api/v2/activity/activityMonitor/{run_id}",
                headers=self.headers)
            state = resp.json().get("executionState", "UNKNOWN")
            if state in ("SUCCESS", "FAILED", "STOPPED"):
                return resp.json()
            time.sleep(poll_interval)
            elapsed += poll_interval
        raise TimeoutError(f"Job {run_id} did not complete within {timeout}s")

client = IICSClient("[email protected]", "password")
run_id = client.run_task("0012ABC000000000DEFG")
result = client.wait_for_completion(run_id)
print(f"Status: {result['executionState']}")
cURL: Login and list tasks
# Input: IICS username/password
# Output: Session ID, task list
# Login
curl -s -X POST "https://dm-us.informaticacloud.com/ma/api/v2/user/login" \
-H "Content-Type: application/json" \
-d '{"username":"[email protected]","password":"your_pass"}' \
| jq '{sessionId: .icSessionId, serverUrl: .serverUrl}'
# List tasks (use serverUrl from login response)
curl -s -X GET "${SERVER_URL}/api/v2/task" \
-H "icSessionId: ${SESSION_ID}" \
| jq '.[] | {id: .id, name: .name, type: .type}'
# Start a job
curl -s -X POST "${SERVER_URL}/api/v2/job" \
-H "icSessionId: ${SESSION_ID}" \
-H "Content-Type: application/json" \
-d '{"taskId":"<task-id>","taskType":"DSS"}'
Data Mapping
SAP-to-Target Field Mapping Patterns
| SAP Source | Common Target | Type | Transform | Gotcha |
| SAP date (DATS, YYYYMMDD) | Target date (YYYY-MM-DD) | Date | TO_DATE(src, 'YYYYMMDD') | SAP uses '00000000' for null dates |
| SAP amount (CURR) | Target decimal | Currency | Divide by 100 if smallest unit | Currency-dependent: JPY has no decimals |
| SAP language key (1-char) | ISO language code | String | Lookup table (E→en, D→de) | SAP uses single-char codes |
| SAP material number (MATNR) | Target product ID | String(18→40) | Strip leading zeros | Internal format has leading zeros |
| SAP text fields (CHAR) | Target VARCHAR | String | Trim trailing spaces | SAP pads CHAR fields to full length |
Oracle-to-Target Field Mapping Patterns
| Oracle Source | Common Target | Type | Transform | Gotcha |
| Oracle DATE | Target timestamp | Datetime | Direct | Oracle DATE always has time component |
| Oracle NUMBER(p,s) | Target decimal | Numeric | Direct | Precision loss if target has lower precision |
| Oracle CLOB | Target text | String | Stream in chunks | IICS CDI default CLOB limit: 4000 chars |
| Oracle RAW/BLOB | Target binary | Binary | Base64 encode for REST | Large BLOBs can exceed mapping memory |
| Oracle TIMESTAMP WITH TZ | Target timestamp | Datetime | Normalize to UTC | CDC captures in source TZ |
Data Type Gotchas
- SAP date format '00000000' is a valid SAP null date — mapping to a database date column causes errors. Use conditional transformation to convert to NULL. [src3]
- Oracle CDC captures changes in source database character set. If target uses different character set, conversion must be handled in the mapping. [src4]
- SAP amounts in BAPI/RFC returns may be in display format (with locale-specific commas/periods) — always use internal format fields. [src3]
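The SAP-side conversions above are easy to get wrong in hand-written mapping expressions. A minimal sketch of the three most common ones; the function names are illustrative helpers, not IICS built-in transformations:

```python
from datetime import date

def sap_dats_to_date(dats):
    """SAP DATS is 'YYYYMMDD'; '00000000' is SAP's null date, so map it to None."""
    if not dats or dats == "00000000":
        return None
    return date(int(dats[:4]), int(dats[4:6]), int(dats[6:8]))

def sap_matnr_external(matnr):
    """Strip the leading zeros of the internal 18-char MATNR format."""
    return matnr.lstrip("0") or "0"

def sap_char_trim(value):
    """SAP CHAR fields are space-padded to their full declared length."""
    return value.rstrip(" ")

print(sap_dats_to_date("00000000"))              # None
print(sap_dats_to_date("20260303"))              # 2026-03-03
print(sap_matnr_external("000000000000012345"))  # 12345
```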
Error Handling & Failure Points
Common Error Codes
| Code | Meaning | Cause | Resolution |
| INFA_CONN_001 | Connection failed | Agent cannot reach source/target | Verify network path, firewall, agent status |
| SAP JCo Exception | SAP library error | Missing/incompatible JCo libraries | Reinstall correct JCo version for agent OS/JVM |
| ORA-01291 | Missing log file | Redo log archived/deleted before CDC read | Increase log retention; reduce CDC poll interval |
| IPU_EXCEEDED | IPU allocation exhausted | Monthly IPU budget consumed | Wait for billing cycle reset or purchase more |
| INFA_AGENT_DOWN | Secure Agent offline | Agent service crashed or host unreachable | Restart agent; check host resources |
| SAP_RFC_AUTH | RFC authorization failure | Missing S_RFC authorization | Grant S_RFC auth object in SAP role |
| ORA-04031 | Shared pool memory | LogMiner consuming excessive shared pool | Switch to direct redo log or increase SGA |
Failure Points in Production
- Secure Agent memory exhaustion: High-concurrency CDI jobs exhaust agent heap memory, causing silent failures with partial loads. Fix: Monitor agent JVM heap; increase -Xmx; limit concurrent jobs. [src2]
- SAP ODP delta queue corruption: Failed mid-stream extraction causes inconsistent delta queue, resulting in duplicate/missing records. Fix: Reset delta queue in SAP (ODQMON), re-initialize with full extraction. [src3]
- Oracle CDC lag during peak writes: CDIR CDC falls behind during high database write activity, especially with LogMiner. Fix: Switch to direct redo log access; increase CDIR parallelism. [src4]
- Session timeout during long-running automation: REST API session expires after 30 min of inactivity, breaking scripts. Fix: Implement session keepalive pings every 20 minutes. [src2]
- Marketplace connector version mismatch: Connector updates can change field mappings silently. Fix: Pin connector versions; test updates in sandbox org first. [src1]
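The session-timeout fix can be sketched as a background keepalive for long-running orchestration scripts. This sketch assumes any cheap authenticated GET resets the idle clock; `/api/v2/org` is used as a plausible endpoint, so verify the choice against your pod's API reference:

```python
import threading
import requests

class KeepAliveSession:
    """Ping the IICS REST API before the ~30-minute idle timeout expires.

    Assumption: any authenticated call resets the idle clock; /api/v2/org
    is illustrative -- substitute a call you know is cheap on your pod.
    """

    def __init__(self, base_url, session_id, interval_s=20 * 60):
        self.base_url = base_url
        self.headers = {"icSessionId": session_id}
        self.interval_s = interval_s
        self._timer = None

    def _ping(self):
        try:
            requests.get(f"{self.base_url}/api/v2/org",
                         headers=self.headers, timeout=30)
        finally:
            self.start()  # reschedule regardless of ping outcome

    def start(self):
        self._timer = threading.Timer(self.interval_s, self._ping)
        self._timer.daemon = True  # don't block interpreter exit
        self._timer.start()

    def stop(self):
        if self._timer is not None:
            self._timer.cancel()

# Usage in a long-running orchestration script:
# keeper = KeepAliveSession(base_url, session_id)
# keeper.start()
# ... long-running work ...
# keeper.stop()
```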
Anti-Patterns
Wrong: Using CDI for high-volume real-time CDC
# BAD — CDI batch mapping scheduled every 1 minute for "real-time" CDC
# Creates 1,440 jobs/day, each consuming IPUs for startup overhead
schedule.every(1).minutes.do(run_cdi_mapping, "SAP_Orders_Extract")
Correct: Use CDIR for continuous CDC streaming
# GOOD — CDIR provides continuous CDC with minimal overhead
# Runs as a long-lived streaming process; IPU based on volume, not job count
cdir_task = create_cdir_task(
source="Oracle_Production", mode="CDC",
target="Snowflake_DW",
tables=["ORDERS", "ORDER_LINES", "CUSTOMERS"])
Wrong: Hardcoding the IICS base URL
# BAD — IICS base URL varies by pod and can change
base_url = "https://usw3.dm-us.informaticacloud.com"
resp = requests.get(f"{base_url}/api/v2/task", headers=headers)
Correct: Always use the base URL returned from login
# GOOD — use serverUrl from login response
login_resp = requests.post(
"https://dm-us.informaticacloud.com/ma/api/v2/user/login",
json={"username": user, "password": pwd})
base_url = login_resp.json()["serverUrl"] # Dynamic — correct for your pod
Wrong: Running SAP ODP full extraction on every schedule
# BAD — full extraction every time wastes IPUs and loads SAP
task_config = {"extractionMode": "FULL", "schedule": "DAILY"}
Correct: Use ODP delta extraction after initial full load
# GOOD — initial full load, then delta extractions only
# ODP delta queue tracks changes; only new/modified records extracted
initial_config = {"extractionMode": "FULL"} # Run once
delta_config = {"extractionMode": "DELTA", "schedule": "EVERY_15_MIN"}
Common Pitfalls
- PowerCenter-to-IICS migration underestimation: Complex mappings with Java transformations require 30-50% manual refactoring. Fix: Run the conversion assessment tool first; budget for manual remediation. [src1]
- IPU budget blindness: One poorly designed elastic mapping can consume a month's IPU allocation in a day. Fix: Set IPU threshold alerts at 50% and 75%; implement per-project budgets via sub-orgs. [src5]
- Secure Agent single point of failure: A single agent creates availability risk. Fix: Deploy at minimum 2 Secure Agents in the same Runtime Environment for failover. [src2]
- SAP JCo library version mismatch: Wrong JCo version causes cryptic UnsatisfiedLinkError. Fix: Match JCo to agent JVM architecture (64-bit JVM = 64-bit JCo). [src3]
- Oracle CDC without supplemental logging: Without ALL COLUMNS supplemental logging, CDC captures only changed columns. Fix: Enable ALL COLUMNS supplemental logging before starting CDC. [src4]
- Ignoring CDIR initial load before CDC: Starting CDC without baseline creates a gap. Fix: Always run initial load first, then enable CDC for ongoing changes. [src4]
Diagnostic Commands
# Check Secure Agent status
curl -s -X GET "${SERVER_URL}/api/v2/runtimeEnvironment" \
-H "icSessionId: ${SESSION_ID}" \
| jq '.[] | {name: .name, agents: [.agents[] | {name: .name, active: .active}]}'
# Check IPU consumption for current billing period
curl -s -X GET "${SERVER_URL}/api/v2/meteringInfo" \
-H "icSessionId: ${SESSION_ID}" \
| jq '{consumed: .consumed, purchased: .purchased, remaining: .remaining}'
# List recent job failures
curl -s -X GET "${SERVER_URL}/api/v2/activity/activityLog?rowLimit=20&type=MAPPING_TASK&executionState=FAILED" \
-H "icSessionId: ${SESSION_ID}" \
| jq '.[] | {taskName: .taskName, startTime: .startTime, errorMsg: .errorMsg}'
# Test SAP connection
curl -s -X POST "${SERVER_URL}/api/v2/connection/test" \
-H "icSessionId: ${SESSION_ID}" -H "Content-Type: application/json" \
-d '{"id":"<sap-connection-id>"}' | jq '.status'
# Test Oracle connection
curl -s -X POST "${SERVER_URL}/api/v2/connection/test" \
-H "icSessionId: ${SESSION_ID}" -H "Content-Type: application/json" \
-d '{"id":"<oracle-connection-id>"}' | jq '.status'
Version History & Compatibility
| Release | Date | Status | Key Changes | Notes |
| Winter 2026 | 2026-01 | Current | CLAIRE AI copilot GA, MCP Server announced | MCP Server enables AI agent integration |
| Fall 2025 | 2025-10 | Supported | New OCI integrations, Oracle CDC improvements | Enhanced Oracle Autonomous DB support |
| Summer 2025 | 2025-07 | Supported | CDI-E performance, new marketplace connectors | Elastic compute auto-scaling enhancements |
| Spring 2025 | 2025-04 | Supported | CDIR CDC for SAP (GA), SAP ODP v2 | Major SAP CDC milestone |
| Winter 2025 | 2025-01 | Supported | API Center rate limiting, enhanced monitoring | API management maturity |
When to Use / When Not to Use
| Use When | Don't Use When | Use Instead |
| Need deep SAP extraction (ODP, BAPI/RFC, IDoc) at scale | Simple REST-to-REST integration | Boomi or Workato |
| Need Oracle CDC replication with minimal source impact | API-first architecture as primary focus | MuleSoft Anypoint |
| Already a PowerCenter customer migrating to cloud | SAP-to-SAP only integration | SAP Integration Suite |
| Need embedded data quality alongside integration | Budget is strictly per-connection flat rate | Boomi (connection-based pricing) |
| Data volume exceeds 100M rows/day | Need citizen integrator / low-code-first platform | Workato or Boomi |
| Need unified ETL + CDC + API + DQ + MDM platform | Need event-driven microservices architecture | MuleSoft or Apache Kafka |
Cross-System Comparison
| Capability | Informatica IICS | MuleSoft Anypoint | Boomi AtomSphere | Notes |
| Primary strength | Data integration + quality | API-led connectivity | Low-code iPaaS | IICS data-first; MuleSoft API-first |
| Pricing model | IPU consumption-based | vCore allocation | Connection + flow count | IICS most granular metering |
| Starting price | ~$10,000+/month | ~$15,000+/month | ~$3,000+/month | Boomi cheapest entry |
| SAP connector depth | Deep (ODP, BAPI/RFC, IDoc, OData, table) | Moderate (JCo, IDoc, OData) | Moderate (BAPI, IDoc) | IICS most SAP surfaces |
| Oracle CDC | Native (redo log + LogMiner) | Via partner connectors | Via partner connectors | IICS only native Oracle CDC |
| Data quality | Built-in (profiling, standardization) | None (requires partner) | None (requires partner) | IICS unique advantage |
| API management | API Center (maturing) | Full lifecycle (market leader) | API Management add-on | MuleSoft dominant in APIs |
| Low-code UX | Moderate (drag-and-drop) | Low (developer-focused) | High (citizen-friendly) | Boomi best for non-technical |
| Elastic compute | CDI-E (Spark auto-scaling) | CloudHub auto-scaling | Atom auto-scaling | IICS best for massive volumes |
| Connector count | 400+ (marketplace) | 350+ (Exchange) | 200+ (Flow) | IICS largest library |
| Legacy migration | PowerCenter conversion tool | N/A | N/A | Unique for Informatica customers |
| Gartner positioning | Leader (data integration + iPaaS) | Leader (iPaaS) | Leader (iPaaS) | IICS leads in data integration |
Important Caveats
- IPU consumption is metered at the scaler level (compute units, volume, executions), making cost prediction difficult for variable workloads — always build in 20-30% IPU buffer.
- Secure Agent deployment requires dedicated infrastructure (VM or bare metal) with specific OS requirements (RHEL 7+, Windows Server 2016+); containerized agents are available but limited.
- SAP connector capabilities vary by SAP system type — S/4HANA Cloud has different interfaces than S/4HANA on-premise or ECC 6.0. Always verify connector compatibility.
- IICS is multi-tenant SaaS — maintenance windows can temporarily disrupt real-time integrations. Plan for maintenance handling in production architectures.
- This card reflects Winter 2026 capabilities. Informatica releases quarterly updates. Always verify against current release notes.
Related Units