This is an architecture pattern card covering compliance audit logging across the five major ERP systems used in regulated integrations. It maps native audit capabilities and defines the integration-layer logging that fills the gaps. The pattern applies globally with a US/EU focus (SOX is US federal, HIPAA is US healthcare, GDPR is EU).
| System | Native Audit Capability | Retention (Native) | SOX Gap | HIPAA Gap | GDPR Gap |
|---|---|---|---|---|---|
| Salesforce | Shield Event Monitoring + Field Audit Trail | 6 months (Setup), 10 years (Field Audit Trail w/ Shield) | Field Audit Trail closes gap | Requires Shield add-on | No built-in data subject log |
| SAP S/4HANA | Security Audit Log + Table Logging + Read Access Logging | Configurable (90 days default) | Requires explicit table logging config | Read Access Logging covers PHI reads | Table logging must be enabled per-table |
| D365 F&O | Database Log + Dataverse Auditing | 30 days default | Database Log covers financial changes | No PHI-specific tagging | Dataverse auditing covers access |
| NetSuite | System Notes + Audit Trail | 365 days (Compliance 360) | System Notes cover transactions | Insufficient retention (6yr required) | No built-in DSAR logging |
| Oracle ERP Cloud | Value Change History + Security Console | Configurable per object | History covers financial fields | No PHI classification in logs | Audit History covers access |
These are the audit-relevant APIs the integration layer must interact with in each system.
| ERP System | Audit API / Mechanism | Protocol | Export Format | Real-time? | SIEM Integration |
|---|---|---|---|---|---|
| Salesforce Shield | Event Monitoring API | REST/JSON | EventLogFile (CSV/JSON) | Near-real-time (hourly) | Splunk app, ELK via API |
| SAP S/4HANA | Security Audit Log (SM20) | RFC/OData | SYSLOG, CEF | Real-time (with ETD) | Sentinel, Splunk HEC |
| D365 F&O | Database Log + Activity Log API | OData v4 | JSON | Near-real-time | Azure Sentinel native |
| NetSuite | System Notes + SuiteAnalytics | SuiteTalk SOAP/REST | XML/JSON | Batch (scheduled) | Custom export to SIEM |
| Oracle ERP Cloud | Audit History REST API | REST/JSON | JSON | Near-real-time | OCI Logging Analytics |
| ERP System | Audit API Limit | Window | Impact on Compliance |
|---|---|---|---|
| Salesforce | Shared with org API limits (100K/24h Enterprise) | 24h rolling | High-volume orgs may hit limits during log export |
| SAP S/4HANA | No explicit rate limit (filesystem-bound) | Continuous | Log file rotation can cause data loss if not monitored |
| D365 F&O | Performance impact proportional to logged tables | Continuous | Logging all tables degrades throughput 10-30% |
| NetSuite | SuiteAnalytics capped at 10K rows/search | Per query | Large audit extractions require pagination |
| Oracle ERP Cloud | Subject to REST API throttling (fair use) | Per minute | Bulk audit export needs scheduled jobs |
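The pagination requirement in the table above can be sketched as a generic extraction loop. This is a minimal sketch; `run_saved_search` is a hypothetical wrapper around the ERP's search API (for NetSuite, a SuiteTalk saved-search call), not a real client method.

```python
def extract_audit_pages(run_saved_search, page_size=1000):
    """Yield audit rows page by page, staying well under per-query row caps.

    run_saved_search(offset, limit) is assumed to return a list of rows,
    empty when the extraction is exhausted.
    """
    offset = 0
    while True:
        rows = run_saved_search(offset=offset, limit=page_size)
        if not rows:
            break
        yield from rows
        offset += len(rows)
```

A page size comfortably below the cap (here 1,000 vs. NetSuite's 10K) leaves headroom for per-row expansion in the response.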
| Regulation | Typical Volume (per 1M txn/month) | Storage (7-year SOX) | Storage (6-year HIPAA) |
|---|---|---|---|
| SOX (financial only) | 5-15 GB/month | 420 GB - 1.26 TB | N/A |
| HIPAA (PHI access) | 10-30 GB/month | N/A | 720 GB - 2.16 TB |
| GDPR (personal data) | 3-8 GB/month | Varies (minimize) | Varies (minimize) |
| Multi-regulation | 15-40 GB/month | 1.26 - 3.36 TB | Combined with SOX |
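The storage figures above follow from simple arithmetic (monthly volume times retention months, before compression). A quick sketch for budgeting:

```python
def retention_storage_gb(gb_per_month, retention_years):
    """Total raw audit storage over a retention window, before compression."""
    return gb_per_month * retention_years * 12

# SOX at 5-15 GB/month over 7 years: 420 to 1260 GB, matching the table above.
sox_low = retention_storage_gb(5, 7)    # 420
sox_high = retention_storage_gb(15, 7)  # 1260
```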
Audit log infrastructure requires its own authentication layer, separate from integration service accounts, to maintain SOX segregation of duties.
| Component | Auth Method | Access Level | Rotation | Notes |
|---|---|---|---|---|
| Audit log writer | Service account + API key | Append-only | 90 days | Must NOT have read or delete access |
| Audit log reader | SSO + MFA | Read-only | Per session | SOX requires named user access |
| SIEM connector | OAuth 2.0 client credentials | Read + stream | 90 days | Dedicated connected app per ERP |
| Audit log admin | MFA + approval workflow | Full access | Emergency only | All access logged to separate audit trail |
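The segregation-of-duties rule in the table above can be enforced at the role-definition level. A minimal sketch, with illustrative role names and permission strings (not a real IAM API):

```python
# Role-to-permission map mirroring the auth table: the writer may ONLY append,
# never read or delete, so a compromised writer cannot exfiltrate or tamper.
ROLE_PERMISSIONS = {
    "audit-writer": {"append"},
    "audit-reader": {"read"},
    "siem-connector": {"read", "stream"},
}

def violates_segregation(role, requested_permission):
    """True if the requested permission is outside the role's allowed set."""
    return requested_permission not in ROLE_PERMISSIONS.get(role, set())
```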
START — Need compliant audit logging for integration
├── Which regulations apply?
│ ├── SOX only
│ │ ├── Log: all financial data changes (before/after values)
│ │ ├── Retention: 7 years immutable
│ │ └── Storage: WORM or append-only with hash chain
│ ├── HIPAA only
│ │ ├── Log: all PHI access (read, write, delete) + user identity
│ │ ├── Retention: 6 years minimum
│ │ └── Storage: encrypted at rest (AES-256), access-controlled
│ ├── GDPR only
│ │ ├── Log: processing activities, consent, DSAR fulfillment
│ │ ├── Retention: as short as defensible (1-3 years)
│ │ └── Storage: pseudonymized, separate key management
│ └── Multiple regulations
│ ├── Apply strictest requirement per field/dimension
│ ├── SOX retention (7yr) for financial fields
│ ├── HIPAA encryption for PHI fields
│ └── GDPR minimization for EU personal data
├── What log detail level? (by data classification)
│ ├── PUBLIC → minimal (timestamp, operation, status)
│ ├── INTERNAL → standard (+ user, system, correlation ID)
│ ├── CONFIDENTIAL → full SOX (+ before/after values)
│ ├── PHI → full HIPAA (+ access type, encryption status)
│ └── PII (EU) → pseudonymized GDPR (+ processing purpose)
├── Where to store?
│ ├── Cloud SIEM (Splunk Cloud, Azure Sentinel, Datadog)
│ ├── Self-managed (ELK, Graylog) with WORM backend
│ └── Hybrid hot/warm/cold (recommended for 6-7yr retention)
└── How to verify integrity?
├── Hash chain (SHA-256 linking each entry)
├── WORM storage (S3 Object Lock, Azure Immutable Blob)
└── Third-party attestation (SOC 2 Type II)
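The "apply strictest requirement" branch for multiple regulations can be sketched as a longest-retention resolver. The year values mirror this card's retention tables; note that GDPR minimization applies per field, so personal-data fields in a record keep the short period even when financial fields in the same record keep the SOX period.

```python
# Retention years per regulation, taken from this card's retention tables.
REGULATION_RETENTION_YEARS = {"SOX": 7, "HIPAA": 6, "GDPR": 2, "INTERNAL": 3}

def effective_retention_years(applicable_regulations):
    """Strictest-wins: the longest retention among applicable rules."""
    if not applicable_regulations:
        raise ValueError("at least one regulation or policy must apply")
    return max(REGULATION_RETENTION_YEARS[r] for r in applicable_regulations)
```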
| Field | Type | Required By | Example | Notes |
|---|---|---|---|---|
| event_id | UUID v4 | All | a1b2c3d4-e5f6-... | Globally unique |
| correlation_id | UUID v4 | All | x7y8z9a0-b1c2-... | Same across all systems for one transaction |
| timestamp | ISO 8601 UTC | All | 2026-03-07T14:32:01.123Z | Millisecond precision, always UTC |
| source_system | String | All | salesforce-prod | System that generated the event |
| target_system | String | All | netsuite-prod | System receiving the data |
| operation | Enum | All | CREATE \| UPDATE \| DELETE \| READ | Standardized across all systems |
| entity_type | String | All | SalesOrder | Business object type |
| entity_id | String | All | SO-2026-001234 | Business record identifier |
| user_or_service | String | SOX, HIPAA | integration-svc-001 | Named user or service account |
| data_classification | Enum | All | CONFIDENTIAL \| PHI \| PII | Drives retention and access rules |
| before_value | JSON (encrypted) | SOX | {"amount": 10000} | Previous field values |
| after_value | JSON (encrypted) | SOX | {"amount": 12000} | New field values |
| data_subject_ref | Pseudonymized ID | GDPR | ds-hash-abc123 | NOT the actual name/email |
| processing_purpose | String | GDPR | contract_fulfillment | GDPR Article 6 legal basis |
| encryption_status | Boolean | HIPAA | true | Whether PHI was encrypted in transit |
| access_type | Enum | HIPAA | READ \| WRITE \| EXPORT | How PHI was accessed |
| result | Enum | All | SUCCESS \| FAILURE \| PARTIAL | Operation outcome |
| hash_previous | SHA-256 | SOX | a4f3b2... | Hash chain for tamper detection |
| log_version | String | All | 1.0 | Schema version |
| Regulation | Retention Period | Applies To | Destruction Rules | Key Citation |
|---|---|---|---|---|
| SOX | 7 years | Financial audit records | Must not be destroyed during retention | Section 802 (18 USC 1519-1520) |
| HIPAA | 6 years | PHI access logs | Secure destruction after retention | 45 CFR 164.530(j) |
| GDPR | Minimize (1-3 years typical) | Processing activity records | Delete when no longer necessary | Article 5(1)(e) |
| PCI DSS | 1 year (immediately accessible) | Cardholder data access logs | Secure destruction | Requirement 10.7 |
| SEC 17a-4 | 6 years | Broker-dealer records | WORM storage required | 17 CFR 240.17a-4 |
Map every field to a data classification level. This determines what you log, how you encrypt it, and how long you retain it. [src1, src3]
DATA_CLASSIFICATIONS = {
"PUBLIC": {"log_detail": "minimal", "retention_years": 1},
"INTERNAL": {"log_detail": "standard", "retention_years": 3},
"CONFIDENTIAL_FINANCIAL": {"log_detail": "full_before_after", "retention_years": 7},
"PHI": {"log_detail": "full_access_tracking", "retention_years": 6},
"PII_EU": {"log_detail": "pseudonymized", "retention_years": 2}
}
Verify: Every field in your integration mappings has a classification. Unclassified fields default to the strictest applicable regulation.
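The "default to strictest" rule can be sketched as a lookup with a fallback. This repeats the DATA_CLASSIFICATIONS map above for self-containment and uses longest retention as a proxy for "strictest"; `classify_field` and the mapping argument are illustrative names.

```python
DATA_CLASSIFICATIONS = {
    "PUBLIC": {"log_detail": "minimal", "retention_years": 1},
    "INTERNAL": {"log_detail": "standard", "retention_years": 3},
    "CONFIDENTIAL_FINANCIAL": {"log_detail": "full_before_after", "retention_years": 7},
    "PHI": {"log_detail": "full_access_tracking", "retention_years": 6},
    "PII_EU": {"log_detail": "pseudonymized", "retention_years": 2},
}

def classify_field(field_name, field_mapping):
    """Return a field's classification; unclassified fields fall back to the
    strictest level (here: the one with the longest retention)."""
    if field_name in field_mapping:
        return field_mapping[field_name]
    return max(DATA_CLASSIFICATIONS,
               key=lambda c: DATA_CLASSIFICATIONS[c]["retention_years"])
```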
Create a standardized audit log entry that satisfies all three regulations simultaneously. [src2, src6]
import hashlib
import json
import uuid
from datetime import datetime, timezone

class ComplianceAuditLogger:
    def __init__(self, previous_hash="GENESIS"):
        self._previous_hash = previous_hash

    def create_entry(self, source_system, target_system, operation,
                     entity_type, entity_id, data_classification, **kwargs):
        entry = {
            "event_id": str(uuid.uuid4()),
            "correlation_id": kwargs.get("correlation_id") or str(uuid.uuid4()),
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "source_system": source_system,
            "target_system": target_system,
            "operation": operation,
            "entity_type": entity_type,
            "entity_id": entity_id,
            "data_classification": data_classification,
            "result": kwargs.get("result", "SUCCESS"),
            "log_version": "1.0"
        }
        # SOX: before/after values for financial data
        if data_classification == "CONFIDENTIAL_FINANCIAL":
            entry["before_value"] = kwargs.get("before_value")
            entry["after_value"] = kwargs.get("after_value")
        # HIPAA: access tracking for PHI
        if data_classification == "PHI":
            entry["access_type"] = kwargs.get("access_type", "WRITE")
            entry["encryption_status"] = kwargs.get("encryption_status", True)
        # GDPR: pseudonymized subject reference (never raw PII)
        if data_classification == "PII_EU":
            entry["data_subject_ref"] = kwargs.get("data_subject_ref")
            entry["processing_purpose"] = kwargs.get("processing_purpose")
        # Hash chain for tamper detection: hash is computed over the entry
        # (including hash_previous) with canonical key ordering
        entry["hash_previous"] = self._previous_hash
        entry_json = json.dumps(entry, sort_keys=True)
        entry["hash_current"] = hashlib.sha256(entry_json.encode()).hexdigest()
        self._previous_hash = entry["hash_current"]
        return entry
Verify: entry["hash_previous"] chains to the previous entry’s hash_current.
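The chaining invariant can be exercised in isolation with a minimal sketch, using plain dicts as stand-ins for full audit entries:

```python
import hashlib
import json

def chain_entry(payload, previous_hash):
    """Attach hash_previous, then hash the entry with canonical key order."""
    entry = dict(payload, hash_previous=previous_hash)
    digest = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    entry["hash_current"] = digest
    return entry

e1 = chain_entry({"event": "create"}, "GENESIS")
e2 = chain_entry({"event": "update"}, e1["hash_current"])
# e2["hash_previous"] == e1["hash_current"]: the chain links
```

Because the digest covers hash_previous, altering any earlier entry changes its hash_current and breaks every later link, which is exactly what the verifier below detects.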
The correlation ID ties a single business transaction across all systems. [src2]
# Generate at FIRST system in chain, propagate via HTTP header:
# X-Correlation-ID: a1b2c3d4-e5f6-7890-abcd-ef1234567890
# X-Audit-Classification: CONFIDENTIAL_FINANCIAL
# X-Audit-Regulations: SOX,HIPAA
Verify: Query your audit log for a correlation_id — entries from every system the transaction touched should appear in chronological order.
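The propagation rule can be sketched as a small header builder: reuse the inbound correlation ID if present, and mint one only at the first hop. `propagate_audit_headers` is an illustrative name, not a library function.

```python
import uuid

def propagate_audit_headers(incoming_headers):
    """Carry audit headers forward; generate a correlation ID only when absent."""
    corr_id = incoming_headers.get("X-Correlation-ID") or str(uuid.uuid4())
    return {
        "X-Correlation-ID": corr_id,
        "X-Audit-Classification": incoming_headers.get(
            "X-Audit-Classification", "INTERNAL"),
    }
```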
Each ERP has different native capabilities. Extract natively, supplement with integration-layer logging. [src4, src5, src7, src8]
# Salesforce Shield: Event Monitoring API (requires Shield license)
# SAP: Security Audit Log via SM20 / RSAU_READ_LOG + enable table logging via SE13
# D365: Admin > System administration > Database log > Setup
# NetSuite: Saved Search type=System Note, export via SuiteTalk
# Oracle: Audit History REST API + OCI Logging Analytics
Verify: Cross-reference native ERP entries with integration-layer entries using correlation ID.
Use hot/warm/cold tiers: 90-day hot (SIEM), 1-year warm (S3 Standard), 7-year cold (S3 Glacier with Object Lock Compliance mode). [src6]
# AWS S3 Object Lock for SOX compliance (boto3; note Object Lock must be
# enabled when the bucket is created — bucket name is illustrative)
import boto3

s3 = boto3.client("s3")
bucket_name = "audit-logs-sox"  # illustrative
s3.put_object_lock_configuration(
    Bucket=bucket_name,
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Years": 7}}
    }
)
Verify: Attempt to delete an object — should fail with AccessDenied.
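The deletion probe can be sketched without cloud credentials. This is a minimal sketch: `delete_fn` is any callable that attempts the delete and raises on denial; with boto3 the raised error would be `botocore.exceptions.ClientError` with code AccessDenied, which `PermissionError` stands in for here.

```python
def object_lock_holds(delete_fn):
    """True if the delete attempt is refused, i.e. the lock is effective."""
    try:
        delete_fn()
    except PermissionError:
        return True   # delete refused: immutability holds
    return False      # delete succeeded: compliance gap, investigate
```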
Periodically verify the hash chain. Alert on any breaks. [src6]
import hashlib
import json

def verify_hash_chain(log_entries):
    errors = []
    for i, entry in enumerate(log_entries):
        # Recompute the digest over everything except hash_current itself
        copy = {k: v for k, v in entry.items() if k != "hash_current"}
        computed = hashlib.sha256(json.dumps(copy, sort_keys=True).encode()).hexdigest()
        if computed != entry.get("hash_current"):
            errors.append({"position": i, "error": "HASH_MISMATCH"})
        if i > 0 and entry.get("hash_previous") != log_entries[i - 1].get("hash_current"):
            errors.append({"position": i, "error": "CHAIN_BREAK"})
    return {"integrity": "PASS" if not errors else "FAIL", "errors": errors}
Verify: integrity == "PASS". Any CHAIN_BREAK triggers a SOX incident report.
# Input: Inbound integration request
# Output: Enriched request with guaranteed correlation ID
import uuid
from functools import wraps

def audit_correlation(func):
    @wraps(func)
    def wrapper(request, *args, **kwargs):
        corr_id = (request.headers.get("X-Correlation-ID")
                   or request.headers.get("X-Request-ID")
                   or str(uuid.uuid4()))
        request.correlation_id = corr_id
        request.audit_headers = {
            "X-Correlation-ID": corr_id,
            "X-Audit-Classification": request.headers.get("X-Audit-Classification", "INTERNAL")
        }
        return func(request, *args, **kwargs)
    return wrapper
// Input: Integration event details
// Output: Immutable audit log entry with SHA-256 hash chain
const crypto = require("crypto");
const { v4: uuidv4 } = require("uuid"); // npm "uuid" package
class AuditLogChain {
constructor(previousHash = "GENESIS") { this.previousHash = previousHash; }
createEntry({ sourceSystem, targetSystem, operation, entityType,
entityId, dataClassification, correlationId = null }) {
const entry = {
event_id: uuidv4(), correlation_id: correlationId || uuidv4(),
timestamp: new Date().toISOString(),
source_system: sourceSystem, target_system: targetSystem,
operation, entity_type: entityType, entity_id: entityId,
data_classification: dataClassification,
hash_previous: this.previousHash, log_version: "1.0"
};
const json = JSON.stringify(entry, Object.keys(entry).sort());
entry.hash_current = crypto.createHash("sha256").update(json).digest("hex");
this.previousHash = entry.hash_current;
return entry;
}
}
# Query available event log files
curl -s -H "Authorization: Bearer $SF_ACCESS_TOKEN" \
"$SF_INSTANCE_URL/services/data/v62.0/query?q=SELECT+Id,EventType,LogDate,LogFile+FROM+EventLogFile+WHERE+LogDate=TODAY" \
| jq '.records[] | {EventType, LogDate, LogFile}'
| ERP Native Event | Normalized Operation | Normalized Entity | Classification Logic |
|---|---|---|---|
| Salesforce: ApiEvent | Derived from HTTP method | From sobjectType | Map sObject to classification table |
| SAP: AU1 (Logon) | ACCESS | Session | INTERNAL |
| SAP: AU5 (RFC call) | Derived from function module | From FM target | Map FM to classification table |
| D365: Database Log Insert | CREATE | From table name | Map table to classification table |
| NetSuite: System Note Set | UPDATE | From record type | Map record type to classification |
| Oracle: Value Change History | UPDATE | From object name | Map object to classification |
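The normalization step in the table above can be sketched as a lookup keyed by (system, native event). This is a minimal sketch with an illustrative, non-exhaustive event map; field names follow the unified schema earlier in this card.

```python
# Illustrative subset of the normalization table above.
NATIVE_EVENT_MAP = {
    ("sap", "AU1"): {"operation": "ACCESS", "entity_type": "Session"},
    ("d365", "DatabaseLogInsert"): {"operation": "CREATE"},
    ("netsuite", "SystemNoteSet"): {"operation": "UPDATE"},
}

def normalize_event(system, native_event, payload):
    """Map a native ERP audit event onto the unified schema's fields."""
    norm = dict(NATIVE_EVENT_MAP.get((system, native_event),
                                     {"operation": "UNKNOWN"}))
    norm["source_system"] = system
    norm["raw_event"] = native_event   # keep the native code for traceability
    norm.update(payload)
    return norm
```

Unknown events map to UNKNOWN rather than being dropped, so gaps in the event catalog surface in the log instead of silently disappearing.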
Native logs often contain internal field codes (e.g., PAYMTERMID), not labels — the normalization layer must map them. [src7]

| Code | Meaning | Cause | Resolution |
|---|---|---|---|
| LOG_STORAGE_FULL | Storage capacity exceeded | SIEM license limit or disk full | Tier to cold storage immediately; NEVER drop logs |
| HASH_CHAIN_BREAK | Tamper detection triggered | Log entry modified after creation | Trigger SOX incident; investigate from last known-good hash |
| ENCRYPTION_FAILURE | Cannot encrypt audit entry | Key management service unavailable | Queue in encrypted local buffer; retry with backoff |
| CORRELATION_MISSING | No correlation ID | Upstream did not propagate header | Generate new ID; flag as "correlation_gap" |
| RETENTION_VIOLATION | Log deleted before retention | Automated cleanup misconfigured | Restore from backup; review retention config |
| PHI_PLAINTEXT_DETECTED | Unencrypted PHI in log | Data classification not applied | Purge; re-log encrypted; file HIPAA incident |
- Local WAL buffer; async forward; alert if buffer exceeds 15 minutes. [src2]
- Single-writer pattern with message queue (Kafka/SQS); serialize at the hash step. [src6]
- Set rsau/max_diskspace/per_file >= 500MB; use real-time streaming via ETD or Sentinel. [src5]
- Pseudonymize personal data in SOX records; delete the pseudonymization key after the GDPR period. [src3]
- Normalize ALL timestamps to UTC at the integration layer before writing audit entries. [src2]

# BAD — violates GDPR Article 5(1)(c) and HIPAA Security Rule
audit_entry = {
"user": "[email protected]",
"record": "Patient: Jane Smith, SSN: 123-45-6789, Diagnosis: Type 2 Diabetes"
}
# This log entry is now itself regulated data
# GOOD — log contains only references, not raw data
audit_entry = {
"user_or_service": "integration-svc-order-sync",
"entity_type": "Patient",
"entity_id": "PAT-2026-7890", # Internal ID, not name
"data_subject_ref": "ds-sha256-abc", # Pseudonymized reference
"data_classification": "PHI",
"fields_changed": ["diagnosis_code"], # Field names only, no values
"encryption_status": True
}
# BAD — GDPR violation (over-retention) or SOX violation (under-retention)
RETENTION_POLICY = {"all_audit_logs": "3 years"}
# GOOD — each classification has its own retention
RETENTION_POLICIES = {
"CONFIDENTIAL_FINANCIAL": {"years": 7, "regulation": "SOX"},
"PHI": {"years": 6, "regulation": "HIPAA"},
"PII_EU": {"years": 2, "regulation": "GDPR"},
"INTERNAL": {"years": 3, "regulation": "internal"}
}
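The per-classification policy above translates into a purge check. A minimal sketch, assuming the RETENTION_POLICIES map above (repeated here for self-containment) and approximating a year as 365 days:

```python
from datetime import datetime, timedelta, timezone

RETENTION_POLICIES = {
    "CONFIDENTIAL_FINANCIAL": {"years": 7},
    "PHI": {"years": 6},
    "PII_EU": {"years": 2},
    "INTERNAL": {"years": 3},
}

def is_deletable(entry_timestamp, classification, now=None):
    """An entry may be purged only once its classification's window has passed."""
    now = now or datetime.now(timezone.utc)
    years = RETENTION_POLICIES[classification]["years"]
    return entry_timestamp + timedelta(days=365 * years) <= now
```

Any automated cleanup job should call a check like this per entry; a blanket policy reproduces the BAD case above.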
# BAD — cannot reconstruct end-to-end audit trail
# Salesforce: {"action": "create_order", "id": "SF-001"}
# NetSuite: {"action": "receive_order", "id": "NS-789"}
# How do you prove SF-001 became NS-789?
# GOOD — single ID links entire chain
correlation_id = "corr-2026-abc123"
sf_audit = {"correlation_id": correlation_id, "entity_id": "SF-001", "operation": "CREATE"}
mw_audit = {"correlation_id": correlation_id, "operation": "TRANSFORM"}
ns_audit = {"correlation_id": correlation_id, "entity_id": "NS-789", "operation": "CREATE"}
# Query: WHERE correlation_id = 'corr-2026-abc123' ORDER BY timestamp
- Design the audit schema and correlation ID propagation BEFORE building integration flows. [src2]
- Use data classification to determine log detail level per field. [src3]
- Dedicated append-only (write) and read-only service accounts for audit. [src2]
- Include audit trail reconstruction as an integration test scenario. [src2]
- Budget storage costs upfront; implement hot/warm/cold tiering; compress the cold tier (10:1 ratio). [src6]
- ERP audit config checklist validated during deployment. [src5, src7]
- Pseudonymize from day one; delete the mapping key after the GDPR period while retaining the hashed audit trail. [src3]
- Convert ALL timestamps to UTC at the integration layer before writing. [src5]

# Verify hash chain integrity
python3 -c "
import json, hashlib
logs = json.load(open('audit_logs.json'))
for i, e in enumerate(logs):
c = {k:v for k,v in e.items() if k != 'hash_current'}
h = hashlib.sha256(json.dumps(c, sort_keys=True).encode()).hexdigest()
if h != e.get('hash_current'): print(f'TAMPER at {i}: {e[\"event_id\"]}')
"
# Check correlation completeness (Elasticsearch)
curl -s 'http://elk:9200/audit-logs/_search' \
-H 'Content-Type: application/json' \
-d '{"query":{"term":{"correlation_id":"corr-2026-abc123"}},"sort":[{"timestamp":"asc"}]}' \
| jq '.hits.hits[]._source | {timestamp, source_system, operation}'
# Check for plaintext PHI (HIPAA violation scan)
curl -s 'http://elk:9200/audit-logs/_search' \
-H 'Content-Type: application/json' \
-d '{"query":{"regexp":{"before_value":"[0-9]{3}-[0-9]{2}-[0-9]{4}"}},"size":10}' \
| jq '.hits.total.value'
# Expected: 0 (no SSN patterns)
# Salesforce: check event monitoring status
curl -s -H "Authorization: Bearer $SF_TOKEN" \
"$SF_URL/services/data/v62.0/query?q=SELECT+COUNT()+FROM+EventLogFile+WHERE+LogDate=TODAY"
| Standard/Tool | Version/Date | Status | Key Changes | Impact on Logging |
|---|---|---|---|---|
| GDPR | 2016/679 + 2025 amendments | Active (enforcement intensifying) | AI processing transparency, dark patterns | Must capture AI/ML processing decisions |
| SOX | 2002 + PCAOB AS 2201 | Active | No major changes | 7-year retention unchanged |
| HIPAA | 2013 Omnibus + 2025 NPRM | Active (proposed updates) | 72-hour breach notification proposed | May require more granular access logging |
| Splunk | 9.x (2025) | Current | Federated search, WORM compliance | Better multi-tier management |
| Elasticsearch | 8.x (2025) | Current | ILM policies, searchable snapshots | Better cold-tier retention |
| AWS S3 Object Lock | GA (2019, updated 2024) | Current | Compliance mode cannot be overridden | Gold standard for SOX immutable storage |
| Use When | Don't Use When | Use Instead |
|---|---|---|
| Integration touches financial data (SOX) | Purely operational monitoring (uptime, latency) | Observability & Distributed Tracing |
| Integration processes or transmits PHI (HIPAA) | Single-system internal audit only | ERP-specific audit configuration guide |
| Integration handles EU personal data (GDPR) | Non-regulated data (product catalog sync) | Standard integration logging |
| Multiple regulations apply simultaneously | Need real-time alerting on failures | Error Handling, Retry & DLQ |
| Auditors require end-to-end transaction tracing | Need API rate limit monitoring | System-specific API card |
| Capability | Salesforce (Shield) | SAP S/4HANA | D365 F&O | NetSuite | Oracle ERP Cloud |
|---|---|---|---|---|---|
| Native audit retention | 6 months / 10 years (Shield) | Configurable (90d default) | 30 days default | 365 days | Configurable |
| Before/after values | Field Audit Trail (Shield) | Table Logging (per-table) | Database Log (per-table) | System Notes (300 char limit) | Value Change History |
| Read access logging | Event Monitoring (Shield) | Read Access Logging | Dataverse Auditing | Login audit only | Security Console |
| SIEM integration | Splunk app, API export | Sentinel, Splunk HEC | Azure Sentinel (native) | Custom export | OCI Logging Analytics |
| Immutability | Platform-managed | SAL append-only | Requires Azure Immutable Blob | No native immutability | Oracle Audit Vault (add-on) |
| Cost | Shield: ~10% of SF spend | Included | Included | Compliance 360 (extra) | Included (basic) |
| SOX readiness | High (with Shield) | High (with config) | High (with Database Log) | Moderate (retention gap) | High |
| HIPAA readiness | Moderate | High (Read Access Logging) | Low (no PHI classification) | Low | Moderate |
| GDPR readiness | Low (no DSAR logging) | Moderate | Moderate | Low | Moderate |
| Correlation ID support | Not native | Not native | Not native | Not native | Not native |