Data Archival & Integration Cutover During ERP Migration

Type: ERP Integration | System: Cross-ERP (Architecture Pattern) | Confidence: 0.83 | Sources: 7 | Verified: 2026-03-07 | Freshness: evolving

TL;DR

System Profile

This card covers the architecture pattern for data archival decisions and integration cutover mechanics during ERP migration. It is system-agnostic at the pattern level but includes specific tooling references for SAP, Microsoft Dynamics 365, Oracle ERP Cloud, and NetSuite. The card does NOT cover ongoing data synchronization between two live ERPs, master data governance strategy, or specific ERP API capabilities.

| System | Role | Migration Tool | Cutover Capability |
| --- | --- | --- | --- |
| SAP S/4HANA | Target ERP | Migration Cockpit (LTMC/LTMOM), DMIS, SLT | Staging tables, delta load, migration objects |
| Microsoft D365 F&O | Target ERP | Data Management Framework (DMF/DIXF) | Data packages, entity-based import/export |
| Oracle ERP Cloud | Target ERP | FBDI, ADFdi | CSV/XML templates, scheduled imports |
| NetSuite | Target ERP | CSV Import Assistant, SuiteCloud | Record-type templates, SuiteTalk bulk |
| Middleware (iPaaS) | Integration router | MuleSoft, Boomi, Workato, Azure Integration Services | Traffic switching, parallel routing, canary |

API Surfaces & Capabilities

Migration tools use different API surfaces than production integrations. Understanding which surface to use for each phase prevents bottlenecks.

| Phase | API Surface | Protocol | Best For | Volume Limit | Real-time? |
| --- | --- | --- | --- | --- | --- |
| Pre-cutover archival | Database-level export | Direct DB / RFC | Historical data extraction | Unlimited | No |
| Master data load | Vendor migration tool | Staging tables / data packages | Customers, vendors, items, COA | 50K-500K/batch | No |
| Open transaction load | Migration tool + custom scripts | REST/OData + bulk | Open POs, SOs, invoices | 10K-100K/batch | No |
| Delta data sync | CDC or incremental extract | REST, OData, change tracking | Transactions during freeze | 1K-10K records | Near-real-time |
| Integration switching | Production API | HTTPS/JSON/XML | Live integration endpoints | Production rate limits | Yes |
| Post-cutover validation | Reporting API + direct queries | SQL, OData, REST | Reconciliation | N/A | No |

Rate Limits & Quotas

Migration Tool Limits

| ERP System | Migration Tool | Batch Size Limit | Concurrency | Throttle Behavior |
| --- | --- | --- | --- | --- |
| SAP S/4HANA | Migration Cockpit | 50,000 records/object | Sequential | Queue-based |
| D365 F&O | DMF/DIXF | 500,000 records/package | Up to 5 parallel | Batch framework throttle |
| Oracle ERP Cloud | FBDI | 250 MB/file | 3 concurrent/module | ESS job queue |
| NetSuite | CSV Import | 25,000 records/import | 1 per record type | Governance-limited |
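These ceilings determine how many load jobs each entity needs. A minimal sketch of the arithmetic, using the record-based limits from the table (the dictionary keys are illustrative names, not vendor identifiers; Oracle FBDI's limit is per-file size, so it is deliberately left out of the record-count model):

```python
import math

# Record-based batch ceilings from the table above. Oracle FBDI's ceiling is
# size-based (250 MB/file), so it is not modeled as a record count here.
BATCH_LIMITS = {
    "sap_migration_cockpit": 50_000,   # records per migration object load
    "d365_dmf": 500_000,               # records per data package
    "netsuite_csv": 25_000,            # records per CSV import
}

def plan_batches(total_records: int, tool: str) -> int:
    """Number of load jobs needed for an entity, given the tool's batch ceiling."""
    return math.ceil(total_records / BATCH_LIMITS[tool])
```

For example, 1.2M customer records need 24 staged loads through the SAP Migration Cockpit but 48 separate NetSuite CSV imports, which is one reason batch ceilings belong in the cutover time budget.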

Cutover Window Time Budgets

| Activity | Typical Duration | Parallelizable? | Risk if Delayed |
| --- | --- | --- | --- |
| System lockdown (freeze legacy) | 1-2 hours | No (gate) | Users transact in legacy during migration |
| Master data final delta | 2-4 hours | Partially | Stale customer/vendor records |
| Open transaction migration | 4-8 hours | Yes (by module) | Missing POs, SOs, invoices |
| Integration endpoint switching | 2-4 hours | Yes (by integration) | Data flows to wrong system |
| Validation & reconciliation | 4-8 hours | Yes (by domain) | Undetected data loss |
| Go/no-go decision | 1 hour | No (gate) | Delayed rollback decision |
| Total typical cutover | 24-48 hours | -- | -- |
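A rehearsal should confirm the budget arithmetically as well: sum the worst-case durations, apply a contingency buffer, and compare against the window. A sketch using the worst-case figures from the table (the 25% buffer matches the rehearsal target later in this card; `parallel_savings_h` is an illustrative knob for hours recovered by running activities in parallel):

```python
# Worst-case hours per activity, from the time-budget table above.
WORST_CASE_HOURS = {
    "lockdown": 2,
    "master_data_delta": 4,
    "open_transactions": 8,
    "integration_switching": 4,
    "validation": 8,
    "go_no_go": 1,
}

def budget_with_buffer(buffer: float = 0.25, parallel_savings_h: float = 0.0) -> float:
    """Serial worst case, minus hours recovered by parallel work, plus buffer."""
    serial = sum(WORST_CASE_HOURS.values())
    return (serial - parallel_savings_h) * (1 + buffer)

def fits_window(window_h: float, **kwargs) -> bool:
    """True if the buffered worst case fits inside the cutover window."""
    return budget_with_buffer(**kwargs) <= window_h
```

The fully serial worst case is 27h; with the 25% buffer that is 33.75h, comfortably inside a 48h weekend but not inside a 24h window without parallelization.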

Authentication

Pre-Cutover Credential Provisioning

| Credential Type | New ERP Action Required | Lead Time |
| --- | --- | --- |
| Service account (username/password) | Create integration user with equivalent roles | 2-4 weeks |
| OAuth 2.0 client credentials | Register new OAuth app, configure scopes | 2-4 weeks |
| API key / token | Generate new key, whitelist IPs | 1-2 weeks |
| Certificate-based (mTLS, JWT) | Generate new cert, install in trust store | 4-6 weeks |
| SSO/SAML federation | Add new ERP as SP in IdP, test assertion mapping | 3-4 weeks |
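Once a go-live date is fixed, these lead times imply hard start dates. A sketch that back-computes the latest provisioning kickoff per credential type, using the worst-case weeks from the table (key names are illustrative):

```python
from datetime import date, timedelta

LEAD_TIME_WEEKS = {          # worst case, from the table above
    "service_account": 4,
    "oauth2_client": 4,
    "api_key": 2,
    "mtls_certificate": 6,
    "saml_federation": 4,
}

def latest_start_dates(go_live: date) -> dict:
    """Latest date each credential request must be opened to meet go-live."""
    return {cred: go_live - timedelta(weeks=wk) for cred, wk in LEAD_TIME_WEEKS.items()}
```

For a 2026-06-01 go-live, for example, the mTLS certificate request must be opened by 2026-04-20, well before the first dress rehearsal.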

Authentication Gotchas

Constraints

Integration Pattern Decision Tree

```text
START — ERP migration with integrations to cut over
+-- How many integrations?
|   +-- < 10 --> Direct endpoint switching (manual reconfiguration)
|   +-- 10-50 --> Middleware abstraction layer (route through iPaaS)
|   +-- > 50 --> DNS/load-balancer switching + middleware routing
+-- Cutover strategy?
|   +-- Big-bang (single weekend)
|   |   +-- All integrations switch simultaneously
|   |   +-- Requires: 2-3 dress rehearsals, detailed runbook, rollback plan
|   |   +-- Data < 50M records? --> Feasible in 48h window
|   |   +-- Data > 50M records? --> Pre-stage historical, cutover only delta
|   +-- Phased (module-by-module or location-by-location)
|   |   +-- Integrations switch per module/location
|   |   +-- Requires: cross-system bridge for modules in different ERPs
|   |   +-- Budget 3-6 months total migration timeline
|   +-- Parallel run (both systems live)
|       +-- All integrations run against BOTH systems
|       +-- Requires: 2x integration capacity, daily reconciliation
|       +-- Duration: 1-4 weeks, only for mission-critical financial
+-- Historical data?
|   +-- Active master data --> MIGRATE
|   +-- Open transactions --> MIGRATE
|   +-- Closed < retention period --> ARCHIVE
|   +-- Closed > retention period --> EVALUATE
|   +-- Obsolete/duplicates --> ABANDON
+-- Rollback plan?
    +-- Before point-of-no-return --> Restore legacy, revert integrations
    +-- After users transact --> Manual reversal (costly)
    +-- No plan? --> STOP. Do not proceed.
```
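The first two branches of the tree reduce to a couple of pure functions, which is handy for runbook tooling. A sketch (the thresholds are the ones from the tree, not vendor guidance, and the return strings are illustrative labels):

```python
def switching_pattern(integration_count: int) -> str:
    """Pick an endpoint-switching approach from the integration count."""
    if integration_count < 10:
        return "direct"                 # manual reconfiguration per integration
    if integration_count <= 50:
        return "middleware"             # route through iPaaS
    return "dns_lb_plus_middleware"     # DNS/LB switching + middleware routing

def big_bang_feasible(total_records: int, historical_prestaged: bool = False) -> bool:
    """Big-bang in a 48h window is viable under ~50M records, or when
    historical data is pre-staged so only the delta moves at cutover."""
    return total_records <= 50_000_000 or historical_prestaged
```

A 80M-record estate therefore fails the big-bang check until historical data is pre-staged, at which point only the freeze-window delta has to move.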

Quick Reference

Cutover Checklist

| Phase | Task | Owner | Duration | Validation |
| --- | --- | --- | --- | --- |
| T-6 weeks | Provision integration credentials in new ERP | Integration lead | Ongoing | Test each credential |
| T-4 weeks | First dress rehearsal (full simulation) | Migration lead | 48h | Record counts + balances |
| T-3 weeks | Archive historical data to data warehouse | Data architect | 1-2 weeks | Spot-check 100 records |
| T-2 weeks | Second dress rehearsal | Migration lead | 48h | < 5 open defects |
| T-1 week | Final dress rehearsal + timing | Migration lead | 48h | Cutover fits window |
| T-0 (Fri PM) | System lockdown (freeze legacy) | ERP admin | 1h | No new transactions |
| T+1h | Extract delta data | Data engineer | 2-4h | Delta count matches |
| T+3h | Load delta into new ERP | Data engineer | 4-8h | Load success > 99.9% |
| T+8h | Switch integration endpoints | Integration lead | 2-4h | Test message per integration |
| T+12h | End-to-end smoke test | QA lead | 2-4h | Critical paths pass |
| T+16h | Reconciliation | Finance / data team | 4-8h | Balances match |
| T+24h | Go/no-go decision | Steering committee | 1h | Pass or rollback |
| T+48h | Monitor and stabilize | Support team | Ongoing | Issue count trending down |

Data Archival Decision Matrix

| Data Category | Example | Migrate? | Archive? | Abandon? | Rationale |
| --- | --- | --- | --- | --- | --- |
| Active master data | Current customers, vendors, items | Yes | No | No | Required for day-1 ops |
| Open transactions | Unpaid invoices, open POs | Yes | No | No | Must continue processing |
| GL balances | Trial balance at cutover | Yes (opening bal) | No | No | Financial continuity |
| Closed txns < 3 years | Paid invoices 2023-2025 | No | Yes | No | SOX/audit requirement |
| Closed txns 3-7 years | Historical 2019-2022 | No | Yes | No | Tax/regulatory retention |
| Closed txns > 7 years | Pre-2019 data | No | Evaluate | Maybe | Industry-dependent |
| Duplicates/obsolete | Merged records, test data | No | No | Yes | No business/legal value |
| Attachments | PDFs, scanned documents | No | Yes (link) | No | Store in DMS |
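The matrix reduces to a small classifier that can drive the archival manifest. A sketch under stated assumptions: the field names are illustrative, the EVALUATE result for inactive masters is an assumption (the matrix does not cover them), and the 7-year boundary is the retention assumption used above.

```python
from datetime import date

def classify(record: dict, today: date, retention_years: int = 7) -> str:
    """Map one legacy record to MIGRATE / ARCHIVE / EVALUATE / ABANDON."""
    if record.get("duplicate") or record.get("obsolete"):
        return "ABANDON"                    # no business or legal value
    if record["kind"] == "master":
        return "MIGRATE" if record.get("active") else "EVALUATE"
    if record["status"] == "OPEN":
        return "MIGRATE"                    # must continue processing
    age_days = (today - record["closed_date"]).days
    return "ARCHIVE" if age_days <= retention_years * 365 else "EVALUATE"
```

Running this over the profiling output from step 2 yields per-entity migrate/archive/abandon counts that should reconcile back to the legacy totals.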

Step-by-Step Integration Guide

1. Inventory all integrations and classify by criticality

Document every integration: source, target, direction, frequency, volume, and criticality (P1/P2/P3). [src3]

```csv
# integration_inventory.csv template
integration_id,name,source,target,direction,frequency,volume,protocol,criticality
INT-001,Order sync,Shopify,Legacy ERP,inbound,real-time,5000/day,REST,P1
INT-002,GL posting,Legacy ERP,Data warehouse,outbound,nightly,50000,SFTP,P2
INT-003,Inventory sync,WMS,Legacy ERP,bidirectional,15min,20000/day,SOAP,P1
```

Verify: `wc -l integration_inventory.csv` returns the known integration count plus one (the header row).

2. Classify data into migrate/archive/abandon

Run data profiling against legacy ERP to categorize every table. [src4]

```sql
-- Identify active vs historical records per entity
SELECT 'customers' AS entity, COUNT(*) AS total,
  SUM(CASE WHEN last_txn_date > DATEADD(year,-2,GETDATE()) THEN 1 ELSE 0 END) AS active,
  SUM(CASE WHEN last_txn_date <= DATEADD(year,-2,GETDATE()) THEN 1 ELSE 0 END) AS inactive
FROM customers;
```

Verify: Sum of migrate + archive + abandon = total legacy records per entity.

3. Set up middleware abstraction layer for integration switching

Route integrations through switchable middleware endpoints for instant cutover. [src2]

```javascript
// integration-router.js — switchable routing
const ROUTES = {
  'order-sync': {
    legacy: { url: 'https://legacy-erp/api/orders', auth: 'LEGACY_CREDS' },
    new_erp: { url: 'https://new-erp/api/salesorders', auth: 'NEW_CREDS' },
    active: 'legacy',  // switch to 'new_erp' during cutover
    dual_write: false  // set true for parallel run
  }
};

async function switchIntegration(id, target) {
  const previous = ROUTES[id].active;
  ROUTES[id].active = target;
  const test = await sendTestMessage(ROUTES[id][target]);  // assumed test-message helper
  if (!test.ok) {
    ROUTES[id].active = previous;  // revert so traffic keeps flowing to the old route
    throw new Error(`${id} post-switch test failed; reverted to ${previous}`);
  }
}
```

Verify: Call switchIntegration('order-sync', 'new_erp') in test → confirm test message reaches new ERP.

4. Execute dress rehearsal (2-3 times before production)

Simulate the entire cutover weekend. Time every step against the window. [src1, src3]

```bash
#!/bin/bash
# dress-rehearsal.sh — Full cutover simulation
LOG="cutover_rehearsal_$(date +%Y%m%d).log"
echo "=== REHEARSAL START: $(date) ===" | tee -a "$LOG"
# Phase 1: Freeze legacy | Phase 2: Extract delta
# Phase 3: Load delta    | Phase 4: Switch integrations
# Phase 5: Validate + reconcile
echo "=== REHEARSAL END: $(date) ===" | tee -a "$LOG"
# Target: total duration fits window with 25% buffer
```

Verify: Total duration fits cutover window. If not, identify bottlenecks and optimize.

5. Execute production cutover and validate

Follow the rehearsed runbook exactly. Run post-cutover reconciliation. [src1, src5]

```python
# Post-cutover reconciliation
# query_count(conn, table) is an assumed helper returning SELECT COUNT(*) for the table.
def reconcile(legacy_conn, new_conn, entities):
    for entity in entities:
        legacy_count = query_count(legacy_conn, entity['legacy_table'])
        new_count = query_count(new_conn, entity['new_table'])
        diff = abs(legacy_count - new_count)
        status = 'PASS' if diff <= entity.get('tolerance', 0) else 'FAIL'
        print(f"{entity['name']}: legacy={legacy_count} new={new_count} {status}")
    # Financial: trial balance must match to the penny
```

Verify: All entities PASS. Any FAIL triggers investigation before go/no-go.

Code Examples

Python: Integration Endpoint Switcher with Health Check

```python
# Input:  Integration inventory, new ERP endpoint config
# Output: Switch confirmation with health check per integration

import httpx

# update_route(id, url) is an assumed helper that repoints the middleware route.
async def switch_and_verify(config, client):
    health = await client.get(f"{config.new_url}/health", headers=config.auth, timeout=10)
    if health.status_code >= 400:
        return {'id': config.id, 'status': 'BLOCKED', 'reason': f'Health: {health.status_code}'}
    await update_route(config.id, config.new_url)
    test = await client.post(f"{config.new_url}/api/test", headers=config.auth,
                             json={'cutover_test': True}, timeout=15)
    if test.status_code >= 400:
        await update_route(config.id, config.legacy_url)  # rollback
        return {'id': config.id, 'status': 'ROLLED_BACK'}
    return {'id': config.id, 'status': 'SWITCHED'}
```

SQL: Data Archival Pre-Migration Manifest

```sql
-- Generate archival manifest showing migrate/archive/abandon volumes
WITH data_age AS (
  SELECT 'invoices' AS entity, COUNT(*) AS total,
    SUM(CASE WHEN status='OPEN' THEN 1 ELSE 0 END) AS migrate,
    SUM(CASE WHEN status='CLOSED' AND dt > DATEADD(year,-7,GETDATE()) THEN 1 ELSE 0 END) AS archive,
    SUM(CASE WHEN status='CLOSED' AND dt <= DATEADD(year,-7,GETDATE()) THEN 1 ELSE 0 END) AS abandon
  FROM invoices
)
SELECT entity, total, migrate, archive, abandon,
  ROUND(100.0*(archive+abandon)/NULLIF(total,0),1) AS pct_not_migrated
FROM data_age;
-- Expected: 60-80% not migrated = faster cutover
```

Bash: SAP Migration Cockpit Delta Load

```bash
# Upload delta to Migration Cockpit staging tables
curl -X POST "https://s4hana.company.com/sap/opu/odata/sap/API_MIG_OBJ_STAGING_SRV/StagingFiles" \
  -H "Authorization: Bearer $SAP_TOKEN" -H "x-csrf-token: $CSRF_TOKEN" \
  -H "Content-Type: application/octet-stream" -H "slug: CUSTOMER_DELTA.xml" \
  --data-binary @customer_delta.xml

# Trigger migration run (SIMULATE first, then MIGRATE)
curl -X POST "https://s4hana.company.com/sap/opu/odata/sap/API_MIG_OBJ_STAGING_SRV/MigrationRuns" \
  -H "Authorization: Bearer $SAP_TOKEN" -H "x-csrf-token: $CSRF_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"MigrationObject":"CUSTOMER","RunMode":"SIMULATE"}'
```

Data Mapping

Cutover Data Flow Mapping

| Data Flow | Source | Cutover Action | Target | Validation |
| --- | --- | --- | --- | --- |
| Master data delta | Legacy DB | Cleanse + transform + load | New ERP master | Record count + sampling |
| Open POs | Legacy PO tables | Map + load | New PO module | Count + total value |
| Open invoices | Legacy AR/AP | Convert to opening balances | New sub-ledger | Balance to penny |
| GL trial balance | Legacy GL | Opening balance journal | New GL | Debits = credits |
| Integration config | Legacy endpoints | Remap to new endpoints | Middleware | Smoke test each |
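The trial-balance row, converting legacy GL balances into one opening journal line per account, can be sketched as follows (a minimal model, assuming balances are signed with positive meaning net debit; real charts of accounts need currency and dimension handling on top):

```python
def opening_journal(trial_balance: dict) -> list:
    """One opening-balance journal line per GL account; debits must equal credits."""
    lines = [
        {"account": acct, "debit": max(bal, 0.0), "credit": max(-bal, 0.0)}
        for acct, bal in trial_balance.items()
    ]
    total_dr = sum(line["debit"] for line in lines)
    total_cr = sum(line["credit"] for line in lines)
    if abs(total_dr - total_cr) > 0.005:           # must balance to the penny
        raise ValueError(f"unbalanced journal: dr={total_dr:.2f} cr={total_cr:.2f}")
    return lines
```

Raising on imbalance, rather than loading and reconciling later, keeps a broken extract from ever reaching the new GL.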

Data Type Gotchas

Error Handling & Failure Points

Common Cutover Errors

| Error | Cause | Resolution |
| --- | --- | --- |
| Migration load timeout | File too large, lock contention | Split into smaller batches |
| Duplicate key violation | Delta includes records from prior load | Use upsert or pre-check existence |
| Foreign key constraint failure | Incorrect load order | Load masters first, then transactions |
| Integration auth failure (401/403) | Credentials not provisioned | Verify credential checklist |
| Data type mismatch | Source format incompatible | Add transform step in ETL |
| Rate limit exceeded (429) | Loading too fast via API | Use vendor migration tools instead |
| Sequence number collision | ID range overlap | Start sequences above migrated max |
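For the 429 row specifically: when API loading is unavoidable, back off exponentially instead of hammering the endpoint. A sketch with an injectable `sleep` so the retry loop can be dry-run tested; `load_fn` is any callable returning an HTTP-style status code (both names are illustrative):

```python
import time

def backoff_delay(attempt: int, base: float = 1.0, cap: float = 60.0) -> float:
    """Exponential backoff: base * 2^attempt, capped at `cap` seconds."""
    return min(cap, base * (2 ** attempt))

def load_with_backoff(load_fn, batch, max_retries: int = 5, sleep=time.sleep):
    """Retry a batch load whenever the endpoint throttles with HTTP 429."""
    for attempt in range(max_retries):
        status = load_fn(batch)
        if status != 429:
            return status          # success or a non-throttle error to handle upstream
        sleep(backoff_delay(attempt))
    raise RuntimeError(f"still throttled after {max_retries} attempts")
```

Adding jitter to the delay is a common refinement when several load workers run in parallel against the same tenant.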

Failure Points in Production

Anti-Patterns

Wrong: Migrating all historical data into the new ERP

```text
// BAD — migrating 10+ years of closed transactions
MIGRATE: ALL invoices 2010-2025 (5,000,000)   // 80% closed, never accessed
MIGRATE: ALL POs 2010-2025 (3,000,000)        // 75% closed, never accessed
MIGRATE: ALL journal entries (20,000,000)     // historical, not operational
// Result: 96h cutover, degraded performance, $500K+ effort
```

Correct: Archive historical, migrate only active + open

```text
// GOOD — archive first, migrate lean
MIGRATE: Active masters (50K) + Open transactions (23K)
MIGRATE: Trial balance as opening journal (1 entry per GL account)
ARCHIVE: Closed 2019-2025 to data warehouse
ABANDON: Pre-2019 closed + duplicates (after legal review)
// Result: 36h cutover, optimal performance, 6 weeks effort
```

Wrong: No parallel run for financial integrations

```text
// BAD — switch GL integration without verification
Friday 8PM: Switch GL posting to new ERP
Monday 8AM: Finance discovers GL out of balance by $2.3M
// Root cause: currency conversion logic different in new ERP
// No reconciliation, no parallel run, no smoke test
```

Correct: Parallel run financial integrations with daily reconciliation

```text
// GOOD — parallel run GL for 1-2 weeks
Week 1: GL posts to BOTH legacy and new ERP
        Daily reconciliation: compare per GL account
Week 2: Discrepancies < 0.01% for 3 consecutive days
        Deactivate legacy, new ERP is sole system of record
// Cost: 2 weeks dual effort. Benefit: $0 in restatements
```
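The daily reconciliation in a parallel run is a per-account comparison between the two ledgers. A minimal sketch, assuming balance snapshots keyed by GL account and the 0.01% tolerance used above:

```python
def reconcile_gl(legacy: dict, new: dict, tolerance_pct: float = 0.01) -> dict:
    """Return accounts whose balances diverge beyond the tolerance (in percent)."""
    mismatches = {}
    for acct in sorted(set(legacy) | set(new)):
        l, n = legacy.get(acct, 0.0), new.get(acct, 0.0)
        base = max(abs(l), abs(n))
        if base and abs(l - n) / base * 100 > tolerance_pct:
            mismatches[acct] = (l, n)
    return mismatches
```

An empty result for three consecutive days is the exit criterion; any non-empty result names the exact accounts to investigate (accounts present in only one system show up with a zero on the other side).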

Wrong: Switching integrations without middleware abstraction

```text
// BAD — reconfiguring each source system individually
Hour 1: SSH into Shopify app, change endpoint, restart
Hour 2: SSH into WMS, update config file, restart
Hour 3: Log into Salesforce, update named credential
Hour 12: Discover WMS config typo — 8 hours of data lost
```

Correct: Route through middleware with single-point switching

```text
// GOOD — middleware handles routing; source systems unchanged
Pre-cutover: All integrations route through iPaaS to legacy
Cutover: Change iPaaS target config per integration (< 1 min each)
         Each switch includes automated health check
Rollback: Change config back (< 1 min per integration)
// Source systems never change. Rollback is instant.
```

Common Pitfalls

Diagnostic Commands

```bash
# Compare record counts: legacy vs new ERP (post-cutover)
psql -h legacy-erp -d erp -c "SELECT 'customers', COUNT(*) FROM customers WHERE status='ACTIVE'" \
  && psql -h new-erp -d erp -c "SELECT 'customers', COUNT(*) FROM bp_master WHERE status='ACTIVE'"

# Check D365 DMF import job status
curl -s "https://d365.operations.dynamics.com/data/DataManagementJobs" \
  -H "Authorization: Bearer $D365_TOKEN" | jq '.value[] | {jobId:.JobId, status:.Status}'

# Check SAP Migration Cockpit errors (-G + --data-urlencode keeps the OData filter URL-safe)
curl -s -G "https://s4hana.company.com/sap/opu/odata/sap/API_MIG_OBJ_STAGING_SRV/MigrationRuns" \
  --data-urlencode "\$filter=RunStatus eq 'ERROR'" \
  -H "Authorization: Bearer $SAP_TOKEN" | jq '.d.results[]'

# Validate trial balance match
psql -h legacy -c "SELECT SUM(debit), SUM(credit) FROM gl_entries WHERE period <= '2026-03'" \
  && psql -h new-erp -c "SELECT SUM(debit), SUM(credit) FROM gl_journal WHERE type='OPENING'"
```

Version History & Compatibility

| ERP System | Migration Tool | Version | Status | Key Capability |
| --- | --- | --- | --- | --- |
| SAP S/4HANA | Migration Cockpit | 2408 | GA | Staging tables, migration objects, simulate mode |
| SAP S/4HANA | DMIS | 2011 SP29 | GA | SLT replication, CDC-based delta |
| D365 F&O | DMF | 10.0.40+ | GA | Data packages, composite entities |
| D365 F&O | Dual-write | 10.0.40+ | GA | Real-time sync for phased cutover |
| Oracle ERP Cloud | FBDI | 24B | GA | CSV templates, ESS scheduler |
| NetSuite | CSV Import | 2024.1 | GA | Record-type templates, 25K limit |
| NetSuite | SuiteCloud Data Loader | 2024.1 | GA | Bulk SOAP-based import |

When to Use / When Not to Use

| Use When | Don't Use When | Use Instead |
| --- | --- | --- |
| Migrating from one ERP to another | Ongoing sync between two live ERPs | CDC patterns |
| Deciding what historical data to carry forward | Already decided scope, need API guidance | System-specific API card |
| Planning integration cutover for go-live | Building new integrations (no legacy) | Batch vs real-time |
| Managing rollback risk during transition | System already live and stable | Error handling & DLQs |
| Need per-ERP migration tool guidance | Need iPaaS comparison | iPaaS comparison |

Cross-System Comparison

Migration Tool Comparison

| Capability | SAP Migration Cockpit | D365 DMF | Oracle FBDI | NetSuite CSV |
| --- | --- | --- | --- | --- |
| Approach | Staging tables + objects | Data entities + packages | CSV/XML templates + ESS | CSV templates + Assistant |
| Max batch | 50K records/object | 500K records/package | 250 MB/file | 25K records/import |
| Simulate mode | Yes | Yes | Yes | Yes (preview) |
| Delta/incremental | Yes (DMIS/SLT) | Yes (change tracking) | Yes (differential) | No (full replace) |
| Custom objects | Yes (custom objects via LTMOM) | Yes (custom entities) | Yes (custom templates) | Limited |
| Error handling | Per-record log | Per-record exceptions | ESS log + error rows | Error CSV download |
| Rollback | Manual delete | Delete via DMF | No built-in | No built-in |
| Learning curve | High | Medium | Medium | Low |

Important Caveats

Related Units