This card covers the architecture pattern for data archival decisions and integration cutover mechanics during ERP migration. It is system-agnostic at the pattern level but includes specific tooling references for SAP, Microsoft Dynamics 365, Oracle ERP Cloud, and NetSuite. The card does NOT cover ongoing data synchronization between two live ERPs, master data governance strategy, or specific ERP API capabilities.
| System | Role | Migration Tool | Cutover Capability |
|---|---|---|---|
| SAP S/4HANA | Target ERP | Migration Cockpit (LTMC), Migration Object Modeler (LTMOM), DMIS/SLT | Staging tables, delta load, migration objects |
| Microsoft D365 F&O | Target ERP | Data Management Framework (DMF/DIXF) | Data packages, entity-based import/export |
| Oracle ERP Cloud | Target ERP | FBDI, ADFdi | CSV/XML templates, scheduled imports |
| NetSuite | Target ERP | CSV Import Assistant, SuiteCloud | Record-type templates, SuiteTalk bulk |
| Middleware (iPaaS) | Integration router | MuleSoft, Boomi, Workato, Azure Integration Services | Traffic switching, parallel routing, canary |
Migration tools use different API surfaces than production integrations. Understanding which surface to use for each phase prevents bottlenecks.
| Phase | API Surface | Protocol | Best For | Volume Limit | Real-time? |
|---|---|---|---|---|---|
| Pre-cutover archival | Database-level export | Direct DB / RFC | Historical data extraction | Unlimited | No |
| Master data load | Vendor migration tool | Staging tables / data packages | Customers, vendors, items, COA | 50K-500K/batch | No |
| Open transaction load | Migration tool + custom scripts | REST/OData + bulk | Open POs, SOs, invoices | 10K-100K/batch | No |
| Delta data sync | CDC or incremental extract | REST, OData, Change tracking | Transactions during freeze | 1K-10K records | Near-real-time |
| Integration switching | Production API | HTTPS/JSON/XML | Live integration endpoints | Production rate limits | Yes |
| Post-cutover validation | Reporting API + direct queries | SQL, OData, REST | Reconciliation | N/A | No |
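The most common mis-routing is pushing bulk volumes through the production API, which ends in throttling (see the 429 row in the error table later in this card). The phase-to-surface mapping above can be encoded as a simple guard; a minimal sketch with illustrative names:

```python
# Recommended API surface per migration phase (from the table above).
PHASE_SURFACES = {
    "pre_cutover_archival": "database_export",
    "master_data_load": "vendor_migration_tool",
    "open_transaction_load": "migration_tool_plus_scripts",
    "delta_sync": "cdc_incremental",
    "integration_switching": "production_api",
    "post_cutover_validation": "reporting_api",
}

def check_surface(phase: str, record_count: int) -> str:
    """Flag bulk volumes aimed at the production API (rate-limit risk)."""
    surface = PHASE_SURFACES[phase]
    if surface == "production_api" and record_count > 10_000:
        return "WARN: route bulk loads through the vendor migration tool"
    return f"OK: use {surface}"
```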
| ERP System | Migration Tool | Batch Size Limit | Concurrency | Throttle Behavior |
|---|---|---|---|---|
| SAP S/4HANA | Migration Cockpit | 50,000 records/object | Sequential | Queue-based |
| D365 F&O | DMF/DIXF | 500,000 records/package | Up to 5 parallel | Batch framework throttle |
| Oracle ERP Cloud | FBDI | 250 MB/file | 3 concurrent/module | ESS job queue |
| NetSuite | CSV Import | 25,000 records/import | 1 per record type | Governance-limited |
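Batch planning against these limits is simple arithmetic worth scripting so nobody eyeballs it at 2 AM; a sketch (Oracle FBDI is file-size-limited rather than record-limited, so it is omitted here):

```python
import math

# Per-tool batch record limits, from the table above.
BATCH_LIMITS = {
    "sap_migration_cockpit": 50_000,
    "d365_dmf": 500_000,
    "netsuite_csv": 25_000,
}

def plan_batches(tool: str, total_records: int) -> int:
    """How many batches a load needs to stay within the tool's record limit."""
    return math.ceil(total_records / BATCH_LIMITS[tool])
```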
| Activity | Typical Duration | Parallelizable? | Risk if Delayed |
|---|---|---|---|
| System lockdown (freeze legacy) | 1-2 hours | No (gate) | Users transact in legacy during migration |
| Master data final delta | 2-4 hours | Partially | Stale customer/vendor records |
| Open transaction migration | 4-8 hours | Yes (by module) | Missing POs, SOs, invoices |
| Integration endpoint switching | 2-4 hours | Yes (by integration) | Data flows to wrong system |
| Validation & reconciliation | 4-8 hours | Yes (by domain) | Undetected data loss |
| Go/no-go decision | 1 hour | No (gate) | Delayed rollback decision |
| Total typical cutover | 24-48 hours | -- | -- |
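A conservative feasibility check sums the worst-case durations as if nothing runs in parallel and still demands headroom; a sketch using the upper bounds from the table above:

```python
# Upper-bound activity durations in hours, from the timeline table above.
WORST_CASE_HOURS = {
    "system_lockdown": 2, "master_data_delta": 4, "open_txn_migration": 8,
    "endpoint_switching": 4, "validation": 8, "go_no_go": 1,
}

def fits_window(window_hours: float = 48, buffer_pct: float = 0.25) -> bool:
    """True if the fully sequential worst case still fits the window with buffer."""
    total = sum(WORST_CASE_HOURS.values())
    return total * (1 + buffer_pct) <= window_hours
```

If even the no-parallelism case fits, parallelizing by module or integration is pure safety margin rather than a dependency.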
| Credential Type | New ERP Action Required | Lead Time |
|---|---|---|
| Service account (username/password) | Create integration user with equivalent roles | 2-4 weeks |
| OAuth 2.0 client credentials | Register new OAuth app, configure scopes | 2-4 weeks |
| API key / token | Generate new key, whitelist IPs | 1-2 weeks |
| Certificate-based (mTLS, JWT) | Generate new cert, install in trust store | 4-6 weeks |
| SSO/SAML federation | Add new ERP as SP in IdP, test assertion mapping | 3-4 weeks |
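Back-scheduling credential requests from the go-live date avoids the classic T-1-week discovery that a certificate takes six weeks; a sketch using the upper-bound lead times from the table above:

```python
from datetime import date, timedelta

# Upper-bound provisioning lead times in weeks, from the table above.
LEAD_WEEKS = {
    "service_account": 4,
    "oauth2_client": 4,
    "api_key": 2,
    "mtls_certificate": 6,
    "saml_federation": 4,
}

def request_by(go_live: date, cred_type: str) -> date:
    """Latest safe date to start provisioning this credential type."""
    return go_live - timedelta(weeks=LEAD_WEEKS[cred_type])
```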
START — ERP migration with integrations to cut over
+-- How many integrations?
| +-- < 10 --> Direct endpoint switching (manual reconfiguration)
| +-- 10-50 --> Middleware abstraction layer (route through iPaaS)
| +-- > 50 --> DNS/load-balancer switching + middleware routing
+-- Cutover strategy?
| +-- Big-bang (single weekend)
| | +-- All integrations switch simultaneously
| | +-- Requires: 2-3 dress rehearsals, detailed runbook, rollback plan
| | +-- Data < 50M records? --> Feasible in 48h window
| | +-- Data > 50M records? --> Pre-stage historical, cutover only delta
| +-- Phased (module-by-module or location-by-location)
| | +-- Integrations switch per module/location
| | +-- Requires: cross-system bridge for modules in different ERPs
| | +-- Budget 3-6 months total migration timeline
| +-- Parallel run (both systems live)
| +-- All integrations run against BOTH systems
| +-- Requires: 2x integration capacity, daily reconciliation
| +-- Duration: 1-4 weeks; reserve for mission-critical financial flows (e.g., GL)
+-- Historical data?
| +-- Active master data --> MIGRATE
| +-- Open transactions --> MIGRATE
| +-- Closed < retention period --> ARCHIVE
| +-- Closed > retention period --> EVALUATE
| +-- Obsolete/duplicates --> ABANDON
+-- Rollback plan?
+-- Before point-of-no-return --> Restore legacy, revert integrations
+-- After users transact --> Manual reversal (costly)
+-- No plan? --> STOP. Do not proceed.
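The first branch of the tree reduces to a threshold function; a minimal sketch:

```python
def switching_approach(integration_count: int) -> str:
    """Endpoint-switching approach by integration count (decision tree above)."""
    if integration_count < 10:
        return "direct_endpoint_switching"
    if integration_count <= 50:
        return "middleware_abstraction_layer"
    return "dns_lb_switching_plus_middleware"
```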
| Phase | Task | Owner | Duration | Validation |
|---|---|---|---|---|
| T-6 weeks | Provision integration credentials in new ERP | Integration lead | Ongoing | Test each credential |
| T-4 weeks | First dress rehearsal (full simulation) | Migration lead | 48h | Record counts + balances |
| T-3 weeks | Archive historical data to data warehouse | Data architect | 1-2 weeks | Spot-check 100 records |
| T-2 weeks | Second dress rehearsal | Migration lead | 48h | < 5 open defects |
| T-1 week | Final dress rehearsal + timing | Migration lead | 48h | Cutover fits window |
| T-0 (Fri PM) | System lockdown — freeze legacy | ERP admin | 1h | No new transactions |
| T+1h | Extract delta data | Data engineer | 2-4h | Delta count matches |
| T+3h | Load delta into new ERP | Data engineer | 4-8h | Load success > 99.9% |
| T+8h | Switch integration endpoints | Integration lead | 2-4h | Test message per integration |
| T+12h | End-to-end smoke test | QA lead | 2-4h | Critical paths pass |
| T+16h | Reconciliation | Finance / data team | 4-8h | Balances match |
| T+24h | Go/no-go decision | Steering committee | 1h | Pass or rollback |
| T+48h | Monitor and stabilize | Support team | Ongoing | Issue count trending down |
| Data Category | Example | Migrate? | Archive? | Abandon? | Rationale |
|---|---|---|---|---|---|
| Active master data | Current customers, vendors, items | Yes | No | No | Required for day-1 ops |
| Open transactions | Unpaid invoices, open POs | Yes | No | No | Must continue processing |
| GL balances | Trial balance at cutover | Yes (opening bal) | No | No | Financial continuity |
| Closed txns < 3 years | Paid invoices 2023-2025 | No | Yes | No | SOX/audit requirement |
| Closed txns 3-7 years | Historical 2019-2022 | No | Yes | No | Tax/regulatory retention |
| Closed txns > 7 years | Pre-2019 data | No | Evaluate | Maybe | Industry-dependent |
| Duplicates/obsolete | Merged records, test data | No | No | Yes | No business/legal value |
| Attachments | PDFs, scanned documents | No | Yes (link) | No | Store in DMS |
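The disposition matrix can be applied mechanically during data profiling; a sketch assuming a 7-year retention period (adjust per jurisdiction and industry):

```python
from datetime import date

def disposition(status: str, closed_date, today: date,
                retention_years: int = 7, is_obsolete: bool = False) -> str:
    """Classify a record as MIGRATE / ARCHIVE / EVALUATE / ABANDON per the matrix above."""
    if is_obsolete:
        return "ABANDON"           # merged records, test data: no business/legal value
    if status == "OPEN":
        return "MIGRATE"           # open transactions must continue processing
    age_years = (today - closed_date).days / 365.25
    return "ARCHIVE" if age_years < retention_years else "EVALUATE"
```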
Document every integration: source, target, direction, frequency, volume, and criticality (P1/P2/P3). [src3]
# integration_inventory.csv template
integration_id,name,source,target,direction,frequency,volume,protocol,criticality
INT-001,Order sync,Shopify,Legacy ERP,inbound,real-time,5000/day,REST,P1
INT-002,GL posting,Legacy ERP,Data warehouse,outbound,nightly,50000,SFTP,P2
INT-003,Inventory sync,WMS,Legacy ERP,bidirectional,15min,20000/day,SOAP,P1
Verify: wc -l integration_inventory.csv returns known integration count + 1 (the header row).
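Beyond the line count, the inventory is worth lint-checking for valid enum values before anyone builds a cutover plan on it; a minimal sketch (column names match the template above):

```python
import csv
import io

REQUIRED = {"integration_id", "name", "source", "target", "direction",
            "frequency", "volume", "protocol", "criticality"}

def validate_inventory(csv_text: str) -> list:
    """Return problems found in the inventory CSV; an empty list means clean."""
    problems = []
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = REQUIRED - set(reader.fieldnames or [])
    if missing:
        return [f"missing columns: {sorted(missing)}"]
    for row in reader:
        if row["criticality"] not in ("P1", "P2", "P3"):
            problems.append(f"{row['integration_id']}: criticality must be P1/P2/P3")
        if row["direction"] not in ("inbound", "outbound", "bidirectional"):
            problems.append(f"{row['integration_id']}: bad direction {row['direction']}")
    return problems
```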
Run data profiling against legacy ERP to categorize every table. [src4]
-- Identify active vs historical records per entity
SELECT 'customers' AS entity, COUNT(*) AS total,
SUM(CASE WHEN last_txn_date > DATEADD(year,-2,GETDATE()) THEN 1 ELSE 0 END) AS active,
SUM(CASE WHEN last_txn_date <= DATEADD(year,-2,GETDATE()) THEN 1 ELSE 0 END) AS inactive
FROM customers;
Verify: Sum of migrate + archive + abandon = total legacy records per entity.
Route integrations through switchable middleware endpoints for instant cutover. [src2]
// integration-router.js — switchable routing
const ROUTES = {
  'order-sync': {
    legacy: { url: 'https://legacy-erp/api/orders', auth: 'LEGACY_CREDS' },
    new_erp: { url: 'https://new-erp/api/salesorders', auth: 'NEW_CREDS' },
    active: 'legacy', // switch to 'new_erp' during cutover
    dual_write: false // set true for parallel run
  }
};

async function switchIntegration(id, target) {
  const previous = ROUTES[id].active;
  ROUTES[id].active = target;
  const test = await sendTestMessage(ROUTES[id][target]);
  if (!test.ok) {
    ROUTES[id].active = previous; // revert so traffic keeps flowing to the old endpoint
    throw new Error(`${id} post-switch test failed; route reverted to ${previous}`);
  }
}
Verify: Call switchIntegration('order-sync', 'new_erp') in test → confirm test message reaches new ERP.
Simulate the entire cutover weekend. Time every step against the window. [src1, src3]
#!/bin/bash
# dress-rehearsal.sh — Full cutover simulation
LOG="cutover_rehearsal_$(date +%Y%m%d).log"
echo "=== REHEARSAL START: $(date) ===" | tee -a "$LOG"
# Phase 1: Freeze legacy | Phase 2: Extract delta
# Phase 3: Load delta | Phase 4: Switch integrations
# Phase 5: Validate + reconcile
echo "=== REHEARSAL END: $(date) ===" | tee -a "$LOG"
# Target: total duration fits window with 25% buffer
Verify: Total duration fits cutover window. If not, identify bottlenecks and optimize.
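Rehearsal timings feed directly into a bottleneck ranking for the next iteration; a sketch:

```python
def bottlenecks(phase_hours: dict, window_hours: float, buffer_pct: float = 0.25):
    """Check measured rehearsal timings against the window; rank phases longest-first."""
    total = sum(phase_hours.values())
    fits = total * (1 + buffer_pct) <= window_hours
    ranked = sorted(phase_hours, key=phase_hours.get, reverse=True)
    return fits, ranked
```

Optimize the top-ranked phase first (typically the delta load or reconciliation), then re-rehearse.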
Follow rehearsed runbook exactly. Run post-cutover reconciliation. [src1, src5]
# Post-cutover reconciliation
# query_count is a helper that runs SELECT COUNT(*) against the given table
def reconcile(legacy_conn, new_conn, entities):
    for entity in entities:
        legacy_count = query_count(legacy_conn, entity['legacy_table'])
        new_count = query_count(new_conn, entity['new_table'])
        diff = abs(legacy_count - new_count)
        status = 'PASS' if diff <= entity.get('tolerance', 0) else 'FAIL'
        print(f"{entity['name']}: legacy={legacy_count} new={new_count} {status}")
# Financial: trial balance must match to the penny
Verify: All entities PASS. Any FAIL triggers investigation before go/no-go.
# Input: Integration inventory, new ERP endpoint config
# Output: Switch confirmation with health check per integration
import httpx, asyncio

async def switch_and_verify(config, client):
    # Pre-flight: do not switch if the new endpoint is unhealthy
    health = await client.get(f"{config.new_url}/health", headers=config.auth, timeout=10)
    if health.status_code >= 400:
        return {'id': config.id, 'status': 'BLOCKED', 'reason': f'Health: {health.status_code}'}
    await update_route(config.id, config.new_url)
    test = await client.post(f"{config.new_url}/api/test", headers=config.auth,
                             json={'cutover_test': True}, timeout=15)
    if test.status_code >= 400:
        await update_route(config.id, config.legacy_url)  # rollback
        return {'id': config.id, 'status': 'ROLLED_BACK'}
    return {'id': config.id, 'status': 'SWITCHED'}
-- Generate archival manifest showing migrate/archive/abandon volumes
WITH data_age AS (
SELECT 'invoices' AS entity, COUNT(*) AS total,
SUM(CASE WHEN status='OPEN' THEN 1 ELSE 0 END) AS migrate,
SUM(CASE WHEN status='CLOSED' AND dt > DATEADD(year,-7,GETDATE()) THEN 1 ELSE 0 END) AS archive,
SUM(CASE WHEN status='CLOSED' AND dt <= DATEADD(year,-7,GETDATE()) THEN 1 ELSE 0 END) AS abandon
FROM invoices
)
SELECT entity, total, migrate, archive, abandon,
ROUND(100.0*(archive+abandon)/NULLIF(total,0),1) AS pct_not_migrated
FROM data_age;
-- Expected: 60-80% not migrated = faster cutover
# Upload delta to Migration Cockpit staging tables
curl -X POST "https://s4hana.company.com/sap/opu/odata/sap/API_MIG_OBJ_STAGING_SRV/StagingFiles" \
-H "Authorization: Bearer $SAP_TOKEN" -H "x-csrf-token: $CSRF_TOKEN" \
-H "Content-Type: application/octet-stream" -H "slug: CUSTOMER_DELTA.xml" \
--data-binary @customer_delta.xml
# Trigger migration run (SIMULATE first, then MIGRATE)
curl -X POST "https://s4hana.company.com/sap/opu/odata/sap/API_MIG_OBJ_STAGING_SRV/MigrationRuns" \
-H "Authorization: Bearer $SAP_TOKEN" -H "x-csrf-token: $CSRF_TOKEN" \
-H "Content-Type: application/json" \
-d '{"MigrationObject":"CUSTOMER","RunMode":"SIMULATE"}'
| Data Flow | Source | Cutover Action | Target | Validation |
|---|---|---|---|---|
| Master data delta | Legacy DB | Cleanse + transform + load | New ERP master | Record count + sampling |
| Open POs | Legacy PO tables | Map + load | New PO module | Count + total value |
| Open invoices | Legacy AR/AP | Convert to opening balances | New sub-ledger | Balance to penny |
| GL trial balance | Legacy GL | Opening balance journal | New GL | Debits = credits |
| Integration config | Legacy endpoints | Remap to new endpoints | Middleware | Smoke test each |
| Error | Cause | Resolution |
|---|---|---|
| Migration load timeout | File too large, lock contention | Split into smaller batches |
| Duplicate key violation | Delta includes records from prior load | Use upsert or pre-check existence |
| Foreign key constraint failure | Incorrect load order | Load masters first, then transactions |
| Integration auth failure (401/403) | Credentials not provisioned | Verify credential checklist |
| Data type mismatch | Source format incompatible | Add transform step in ETL |
| Rate limit exceeded (429) | Loading too fast via API | Use vendor migration tools instead |
| Sequence number collision | ID range overlap | Start sequences above migrated max |
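When some API loading is unavoidable and 429s appear anyway, capped exponential backoff with jitter is the standard mitigation; a sketch (RateLimited is an illustrative stand-in for whatever your HTTP client raises on 429):

```python
import random
import time

class RateLimited(Exception):
    """Illustrative stand-in for an HTTP 429 error from the API client."""

def with_backoff(call, max_retries=5, base=0.5, cap=30.0, sleep=time.sleep):
    """Retry a throttled call with capped exponential backoff plus jitter."""
    for attempt in range(max_retries):
        try:
            return call()
        except RateLimited:
            # Double the delay each attempt, cap it, and add jitter to avoid sync'd retries
            sleep(min(cap, base * 2 ** attempt) * (1 + random.random()))
    raise RateLimited(f"still throttled after {max_retries} retries")
```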
Tight freeze-extract-validate loop before proceeding. [src1]
Queue all messages in middleware DLQ during cutover; replay on rollback. [src2]
Use full production-volume data in dress rehearsals. [src3]
Deploy read-only archival system before decommissioning. [src4]
Maintain cross-reference table (legacy_doc_id to new_doc_id) in archival. [src4]
Define tolerance bands; auto-dismiss timing diffs < 24h. [src5]
// BAD — migrating 10+ years of closed transactions
MIGRATE: ALL invoices 2010-2025 (5,000,000) // 80% closed, never accessed
MIGRATE: ALL POs 2010-2025 (3,000,000) // 75% closed, never accessed
MIGRATE: ALL journal entries (20,000,000) // historical, not operational
// Result: 96h cutover, degraded performance, $500K+ effort
// GOOD — archive first, migrate lean
MIGRATE: Active masters (50K) + Open transactions (23K)
MIGRATE: Trial balance as opening journal (1 entry per GL account)
ARCHIVE: Closed 2019-2025 to data warehouse
ABANDON: Pre-2019 closed + duplicates (after legal review)
// Result: 36h cutover, optimal performance, 6 weeks effort
// BAD — switch GL integration without verification
Friday 8PM: Switch GL posting to new ERP
Monday 8AM: Finance discovers GL out of balance by $2.3M
// Root cause: currency conversion logic different in new ERP
// No reconciliation, no parallel run, no smoke test
// GOOD — parallel run GL for 1-2 weeks
Week 1: GL posts to BOTH legacy and new ERP
Daily reconciliation: compare per GL account
Week 2: Discrepancies < 0.01% for 3 consecutive days
Deactivate legacy, new ERP is sole system of record
// Cost: 2 weeks dual effort. Benefit: $0 in restatements
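The daily parallel-run reconciliation is a per-account diff with a penny tolerance; a sketch:

```python
def reconcile_gl(legacy: dict, new: dict, tolerance: float = 0.01) -> list:
    """Return GL accounts whose balances differ between the two ERPs beyond tolerance."""
    mismatches = []
    for acct in sorted(set(legacy) | set(new)):
        # Missing accounts count as a zero balance, so one-sided postings surface too
        diff = abs(legacy.get(acct, 0.0) - new.get(acct, 0.0))
        if diff > tolerance:
            mismatches.append((acct, round(diff, 2)))
    return mismatches
```

Run it per account, per day; three consecutive clean days is the deactivation criterion above.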
// BAD — reconfiguring each source system individually
Hour 1: SSH into Shopify app, change endpoint, restart
Hour 2: SSH into WMS, update config file, restart
Hour 3: Log into Salesforce, update named credential
Hour 12: Discover WMS config typo — 8 hours of data lost
// GOOD — middleware handles routing; source systems unchanged
Pre-cutover: All integrations route through iPaaS to legacy
Cutover: Change iPaaS target config per integration (< 1 min each)
Each switch includes automated health check
Rollback: Change config back (< 1 min per integration)
// Source systems never change. Rollback is instant.
Mandate 2-3 full rehearsals in production-equivalent environment. [src1]
Run pre-loads close to cutover; calculate expected delta from daily rates. [src1]
Use UTC for all cutover timestamps and log entries. [src3]
Define rollback criteria before cutover; decision gate at T+24h. [src5]
Do not decommission until archival access verified; 90-day minimum verification. [src4]
Start new sequences above legacy max + buffer. [src6]
FIFO queues or handle out-of-order via timestamps. [src2]
Run end-to-end business process tests during rehearsal. [src3]
# Compare record counts: legacy vs new ERP (post-cutover)
psql -h legacy-erp -d erp -c "SELECT 'customers', COUNT(*) FROM customers WHERE status='ACTIVE'" \
&& psql -h new-erp -d erp -c "SELECT 'customers', COUNT(*) FROM bp_master WHERE status='ACTIVE'"
# Check D365 DMF import job status
curl -s "https://d365.operations.dynamics.com/data/DataManagementJobs" \
-H "Authorization: Bearer $D365_TOKEN" | jq '.value[] | {jobId:.JobId, status:.Status}'
# Check SAP Migration Cockpit errors
curl -s "https://s4hana.company.com/sap/opu/odata/sap/API_MIG_OBJ_STAGING_SRV/MigrationRuns?\$filter=RunStatus eq 'ERROR'" \
-H "Authorization: Bearer $SAP_TOKEN" | jq '.d.results[]'
# Validate trial balance match
psql -h legacy -c "SELECT SUM(debit), SUM(credit) FROM gl_entries WHERE period <= '2026-03'" \
&& psql -h new-erp -c "SELECT SUM(debit), SUM(credit) FROM gl_journal WHERE type='OPENING'"
| ERP System | Migration Tool | Version | Status | Key Capability |
|---|---|---|---|---|
| SAP S/4HANA | Migration Cockpit | 2408 | GA | Staging tables, migration objects, simulate mode |
| SAP S/4HANA | DMIS | 2011 SP29 | GA | SLT replication, CDC-based delta |
| D365 F&O | DMF | 10.0.40+ | GA | Data packages, composite entities |
| D365 F&O | Dual-write | 10.0.40+ | GA | Real-time sync for phased cutover |
| Oracle ERP Cloud | FBDI | 24B | GA | CSV templates, ESS scheduler |
| NetSuite | CSV Import | 2024.1 | GA | Record-type templates, 25K limit |
| NetSuite | SuiteCloud Data Loader | 2024.1 | GA | Bulk SOAP-based import |
| Use When | Don't Use When | Use Instead |
|---|---|---|
| Migrating from one ERP to another | Ongoing sync between two live ERPs | CDC patterns |
| Deciding what historical data to carry forward | Already decided scope, need API guidance | System-specific API card |
| Planning integration cutover for go-live | Building new integrations (no legacy) | Batch vs real-time |
| Managing rollback risk during transition | System already live and stable | Error handling & DLQs |
| Need per-ERP migration tool guidance | Need iPaaS comparison | iPaaS comparison |
| Capability | SAP Migration Cockpit | D365 DMF | Oracle FBDI | NetSuite CSV |
|---|---|---|---|---|
| Approach | Staging tables + objects | Data entities + packages | CSV/XML templates + ESS | CSV templates + Assistant |
| Max batch | 50K records/object | 500K records/package | 250 MB/file | 25K records/import |
| Simulate mode | Yes | Yes | Yes | Yes (preview) |
| Delta/incremental | Yes (DMIS/SLT) | Yes (change tracking) | Yes (differential) | No (full replace) |
| Custom objects | Yes (custom MOM) | Yes (custom entities) | Yes (custom templates) | Limited |
| Error handling | Per-record log | Per-record exceptions | ESS log + error rows | Error CSV download |
| Rollback | Manual delete | Delete via DMF | No built-in | No built-in |
| Learning curve | High | Medium | Medium | Low |