Dynamics 365 Data Management Framework (DMF/DIXF): Import/Export Capabilities

Type: ERP Integration | System: Dynamics 365 Finance & Operations (10.0.x) | Confidence: 0.92 | Sources: 8 | Verified: 2026-03-01 | Freshness: 2026-03-01

TL;DR

System Profile

Microsoft Dynamics 365 Finance & Operations (F&O) encompasses Dynamics 365 Finance, Supply Chain Management, Commerce, and Human Resources. The Data Management Framework (DMF) -- historically known as DIXF (Data Import/Export Framework) -- is the standard tool for bulk data import, export, and configuration transfer across these applications. DMF follows a continuous release cadence (version 10.0.x) with monthly platform updates. This card covers the cloud-deployed version; on-premises deployments support the Package REST API with minor differences (AD FS authentication, /namespaces/AXSF appended to the base URL). It does NOT cover Dynamics 365 Business Central.

| Property | Value |
| --- | --- |
| Vendor | Microsoft |
| System | Dynamics 365 Finance & Operations (10.0.x continuous release) |
| API Surface | DMF/DIXF -- Data entities, Data packages, Recurring Integration REST API, Package REST API |
| Current API Version | Continuous (tied to platform updates, currently PU64+) |
| Editions Covered | Finance, Supply Chain Management, Commerce, Human Resources |
| Deployment | Cloud (primary), On-Premises (supported with limitations) |
| API Docs | Data management overview |
| Status | GA -- actively maintained; exempt from service protection API throttling |

API Surfaces & Capabilities

| API Surface | Protocol | Best For | Max Records/Request | Rate Limit | Real-time? | Bulk? |
| --- | --- | --- | --- | --- | --- | --- |
| DMF Package REST API | HTTPS/JSON + data packages | Large batch import/export via external scheduling | Unlimited (package-based) | Exempt | No | Yes |
| Recurring Integration API | HTTPS/REST (enqueue/dequeue/ack) | Scheduled recurring file exchange | Unlimited (file-based) | Exempt | No | Yes |
| OData v4 REST API | HTTPS/JSON | Individual record CRUD, queries | 10,000 per page ($top) | 6,000 req/5 min | Yes | Limited |
| Dual Write | Real-time sync | Bidirectional F&O-Dataverse sync | Per-record event | N/A | Yes | No |
| Business Events | HTTPS webhook | Event-driven notifications | Per event | N/A | Yes | N/A |
| Virtual Entities | Dataverse API | Power Platform querying F&O data | Dataverse limits | Exempt from F&O limits | Yes | No |

Rate Limits & Quotas

DMF-Specific Limits

DMF and DIXF operations (including recurring integrations) are exempt from service protection API limits. OData endpoints used alongside DMF are subject to throttling. [src4]

| Limit Type | Value | Applies To | Notes |
| --- | --- | --- | --- |
| Export file size (default) | 256 MB | Data package exports | Default Azure Storage service version limitation |
| Export file size (max) | 5,000 MB (5 GB) | Data package exports | Set AzureStorageServiceVersion to 2019-12-12 in the DMFPARAMETERS table |
| Blob storage retention | 7 days | All DMF files | Files auto-deleted from Azure Blob after 7 days |
| Max batch threads per AOS | 8 (default), 16 (max recommended) | Parallel batch processing | Values above 16 require significant performance testing |
| Composite entity per package | 1 | ImportFromPackage API | A package with a composite entity can contain only that entity |

OData Service Protection Limits (for comparison)

| Limit Type | Value | Window | Notes |
| --- | --- | --- | --- |
| Requests per user per web server | 6,000 | 5-min sliding window | Per user, per application ID, per web server |
| Combined execution time per user | 1,200 seconds (20 min) | 5-min sliding window | Across all requests from that user |
| Concurrent requests per user | 52 | Instantaneous | Maximum simultaneous requests |
| Resource utilization | Dynamic threshold | Instantaneous | CPU/memory-based; returns 429 |
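
Clients that mix OData calls with DMF traffic should treat 429 as a routine signal, not an error. A minimal sketch of honoring the Retry-After header with a capped exponential fallback -- `throttle_wait` and `odata_get_with_backoff` are illustrative names, not part of any SDK:

```python
import time
import requests

def throttle_wait(retry_after, fallback_delay, cap=60):
    """Seconds to wait after a 429: prefer the server's Retry-After, else a capped fallback."""
    if retry_after is not None:
        return min(int(retry_after), cap)
    return min(fallback_delay, cap)

def odata_get_with_backoff(url, headers, max_retries=5):
    """GET an OData resource, backing off whenever service protection throttles."""
    delay = 1
    for _ in range(max_retries):
        resp = requests.get(url, headers=headers)
        if resp.status_code != 429:
            resp.raise_for_status()
            return resp.json()
        time.sleep(throttle_wait(resp.headers.get("Retry-After"), delay))
        delay *= 2  # exponential fallback when no Retry-After is given
    raise RuntimeError(f"still throttled after {max_retries} attempts")
```

DMF package and recurring-integration calls do not need this loop, since they are exempt from service protection limits.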

Authentication

| Flow | Use When | Token Lifetime | Refresh? | Notes |
| --- | --- | --- | --- | --- |
| OAuth 2.0 Client Credentials | Server-to-server (no user context) | Session-dependent | New token per request cycle | Register app in Entra ID; add under System administration > Microsoft Entra applications |
| OAuth 2.0 Authorization Code | User-context operations | Access: 1 h; Refresh: until revoked | Yes | Less common with DMF |
| AD FS | On-premises deployments | AD FS-configured | Yes | Append /namespaces/AXSF to the base URL |

Authentication Gotchas

Constraints

Integration Pattern Decision Tree

START -- User needs to exchange data with D365 F&O
|-- What's the integration pattern?
|   |-- Real-time (individual records, <1s latency)
|   |   |-- Need Dataverse bidirectional sync?
|   |   |   |-- YES -> Dual Write
|   |   |   '-- NO -> OData v4 REST API
|   |   '-- Need event notifications?
|   |       |-- YES -> Business Events
|   |       '-- NO -> OData REST API polling
|   |-- Batch/Bulk (scheduled, high volume)
|   |   |-- Scheduling managed in F&O?
|   |   |   |-- YES -> Recurring Integration API (enqueue/dequeue/ack)
|   |   |   '-- NO -> DMF Package REST API (ImportFromPackage/ExportToPackage)
|   |   |-- Need XSLT transformation?
|   |   |   |-- YES -> Recurring Integration API
|   |   |   '-- NO -> Either API works
|   |   '-- Data volume > 5 GB per file?
|   |       |-- YES -> Split into multiple packages
|   |       '-- NO -> Single package sufficient
|   |-- Configuration copy
|   |   '-- Use DMF Data Packages
|   '-- Data migration (initial load)
|       |-- < 100 entities -> Manual DMF import
|       '-- > 100 entities -> Data Task Automation via LCS
|-- Which direction?
|   |-- Inbound -> enqueue or ImportFromPackage
|   |-- Outbound -> dequeue or ExportToPackage
|   '-- Bidirectional -> separate jobs or Dual Write
'-- Need change tracking (delta only)?
    |-- YES -> Enable change tracking per entity
    '-- NO -> Full export each run

Quick Reference

DMF Package REST API Endpoints

| Operation | Method | Endpoint | Notes |
| --- | --- | --- | --- |
| Get writable blob URL | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetAzureWriteUrl | Returns BlobId and BlobUrl with SAS token |
| Import from package | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ImportFromPackage | Async version: ImportFromPackageAsync |
| Export to package | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage | Async version: ExportToPackageAsync |
| Get export URL | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExportedPackageUrl | Returns BlobUrl with SAS token |
| Check execution status | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus | Returns: NotRun, Executing, Succeeded, PartiallySucceeded, Failed, Canceled |
| Get execution errors | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionErrors | Returns JSON array of error messages |
| Get staging error file | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetImportStagingErrorFileUrl | Error file for source-to-staging failures |
| Generate target error keys | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GenerateImportTargetErrorKeysFile | Returns true if errors exist |

Recurring Integration REST Endpoints

| Operation | Method | Endpoint | Notes |
| --- | --- | --- | --- |
| Enqueue (import) | POST | /api/connector/enqueue/<activityID>?entity=<entityName> | Pass the data file as a memory stream in the body |
| Dequeue (export) | GET | /api/connector/dequeue/<activityID> | Returns a data package; must be acknowledged |
| Acknowledge | POST | /api/connector/ack/<activityID> | Include the dequeue response body; unacked messages reappear every 30 min |
| Get message status | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetMessageStatus | Statuses: Enqueued, Dequeued, Acked, Processing, Processed, etc. |
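
The dequeue/ack contract can be sketched as a drain loop. This is a hedged illustration: `drain_outbound_queue`, `queue_is_empty`, and `process_file` are hypothetical names, and it assumes the dequeue response is JSON carrying a `DownloadLocation` URL for the exported package:

```python
import requests

def queue_is_empty(status_code, content):
    """Treat HTTP 204 or an empty body as 'nothing left to dequeue'."""
    return status_code == 204 or not content

def drain_outbound_queue(base_url, activity_id, token, process_file):
    """Dequeue exported packages until the queue is empty, acking each one."""
    headers = {"Authorization": f"Bearer {token}"}
    while True:
        resp = requests.get(f"{base_url}/api/connector/dequeue/{activity_id}",
                            headers=headers)
        if queue_is_empty(resp.status_code, resp.content):
            break
        message = resp.json()
        package = requests.get(message["DownloadLocation"]).content
        process_file(package)  # only ack after download and processing succeed
        requests.post(f"{base_url}/api/connector/ack/{activity_id}",
                      headers={**headers, "Content-Type": "application/json"},
                      data=resp.content)  # ack echoes the dequeue response body
```

Acking only after `process_file` succeeds means a crash mid-processing leaves the message unacked, so it reappears on the queue within 30 minutes rather than being lost.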

Step-by-Step Integration Guide

1. Obtain an OAuth 2.0 Access Token

Register an application in Microsoft Entra ID and add it to D365 F&O via System administration > Microsoft Entra applications. [src2]

curl -X POST "https://login.microsoftonline.com/{tenantId}/oauth2/v2.0/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "client_id={clientId}&client_secret={clientSecret}&scope=https://{d365-env}.operations.dynamics.com/.default&grant_type=client_credentials"

Verify: Response contains an access_token field holding a JWT.

2. Upload Data Package to Azure Blob Storage

Get a writable blob URL with an embedded SAS token, then upload your .zip data package. [src2]

# Get writable URL
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetAzureWriteUrl" \
  -H "Authorization: Bearer {access_token}" \
  -H "Content-Type: application/json" \
  -d '{"uniqueFileName":"import-data.zip"}'

# Upload package to returned BlobUrl
curl -X PUT "{BlobUrl}" -H "x-ms-blob-type: BlockBlob" --data-binary @import-data.zip

Verify: HTTP 201 response from Azure Blob upload.

3. Trigger Import from Package

Call ImportFromPackage with the package URL and data project name. [src2]

curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ImportFromPackage" \
  -H "Authorization: Bearer {access_token}" \
  -H "Content-Type: application/json" \
  -d '{"packageUrl":"{BlobUrl}","definitionGroupId":"CustomerImport","executionId":"","execute":true,"overwrite":true,"legalEntityId":"USMF"}'

Verify: Response returns executionId string.

4. Poll Execution Status

Poll GetExecutionSummaryStatus with exponential backoff. [src2]

curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus" \
  -H "Authorization: Bearer {access_token}" \
  -H "Content-Type: application/json" \
  -d '{"executionId":"{executionId}"}'

Verify: The value field returns Succeeded when the job completes.
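
The polling step can be wrapped in a loop with capped exponential backoff. A sketch under stated assumptions: `poll_execution` and `backoff_schedule` are illustrative names, and the terminal statuses come from the GetExecutionSummaryStatus values listed in the endpoint table:

```python
import time
import requests

TERMINAL_STATUSES = {"Succeeded", "PartiallySucceeded", "Failed", "Canceled"}

def backoff_schedule(initial, cap, attempts):
    """Doubling delays, capped: e.g. 2, 4, 8, ..., cap."""
    delays, delay = [], initial
    for _ in range(attempts):
        delays.append(delay)
        delay = min(delay * 2, cap)
    return delays

def poll_execution(d365_url, token, execution_id, attempts=10, cap=60):
    """Poll GetExecutionSummaryStatus until the job reaches a terminal state."""
    headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}
    url = (f"{d365_url}/data/DataManagementDefinitionGroups/"
           "Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus")
    for delay in backoff_schedule(2, cap, attempts):
        status = requests.post(url, headers=headers,
                               json={"executionId": execution_id}).json()["value"]
        if status in TERMINAL_STATUSES:
            return status
        time.sleep(delay)
    raise TimeoutError(f"execution {execution_id} did not finish in {attempts} polls")
```

Capping the delay keeps long-running imports from being polled too rarely, while the doubling start avoids hammering the endpoint during short jobs.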

5. Retrieve Errors (if any)

Use GetExecutionErrors for general errors. [src2]

curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionErrors" \
  -H "Authorization: Bearer {access_token}" \
  -H "Content-Type: application/json" \
  -d '{"executionId":"{executionId}"}'

Verify: Empty array [] means no errors.

Code Examples

Python: Batch Import via DMF Package REST API

# Input:  D365 F&O credentials, .zip data package file
# Output: Import execution result (executionId and terminal status)

import requests, time, json

TENANT_ID = "your-tenant-id"
CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"
D365_URL = "https://your-env.operations.dynamics.com"
DMF_BASE = f"{D365_URL}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities"

def get_access_token():
    resp = requests.post(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
        data={"client_id": CLIENT_ID, "client_secret": CLIENT_SECRET,
              "scope": f"{D365_URL}/.default", "grant_type": "client_credentials"})
    resp.raise_for_status()
    return resp.json()["access_token"]

def import_package(token, package_path, definition_group, legal_entity="USMF"):
    headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}
    # 1. Get a writable blob URL (the "value" field is a JSON-encoded string)
    resp = requests.post(f"{DMF_BASE}.GetAzureWriteUrl", headers=headers,
                         json={"uniqueFileName": "import-data.zip"})
    blob_url = json.loads(resp.json()["value"])["BlobUrl"]
    # 2. Upload the .zip package to the SAS URL
    with open(package_path, "rb") as f:
        requests.put(blob_url, data=f, headers={"x-ms-blob-type": "BlockBlob"})
    # 3. Trigger the import
    execution_id = requests.post(f"{DMF_BASE}.ImportFromPackage", headers=headers,
        json={"packageUrl": blob_url, "definitionGroupId": definition_group,
              "executionId": "", "execute": True, "overwrite": True,
              "legalEntityId": legal_entity}).json()["value"]
    # 4. Poll until the job reaches a terminal status
    while True:
        status = requests.post(f"{DMF_BASE}.GetExecutionSummaryStatus", headers=headers,
                               json={"executionId": execution_id}).json()["value"]
        if status not in ("NotRun", "Executing"):
            return execution_id, status
        time.sleep(10)

cURL: Export Data Package

# Step 1: Trigger the export
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage" \
  -H "Authorization: Bearer {token}" -H "Content-Type: application/json" \
  -d '{"definitionGroupId":"CustomerExport","packageName":"export.zip","executionId":"","reExecute":true,"legalEntityId":"USMF"}'

# Step 2: Poll GetExecutionSummaryStatus with the returned executionId until Succeeded
# (same call as step 4 of the import guide above)

# Step 3: Get the download URL (returns a BlobUrl with SAS token)
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExportedPackageUrl" \
  -H "Authorization: Bearer {token}" -H "Content-Type: application/json" \
  -d '{"executionId":"{executionId}"}'

# Step 4: Download the package from the returned BlobUrl
curl -o export.zip "{BlobUrl}"

Data Mapping

DMF Architecture: Source to Target Flow

| Stage | What Happens | Format | Error Recovery |
| --- | --- | --- | --- |
| Source file | External system generates file | CSV, XML, Excel (14 formats) | Fix source file and re-upload |
| Upload to blob | File uploaded via SAS URL | .zip data package | SAS token is time-limited |
| Source to staging | SSIS extracts to staging tables | SQL staging tables | GetImportStagingErrorFileUrl |
| Staging validation | Validates types, required fields | In-database | Edit staging records directly |
| Staging to target | Data entities map to production | Production tables | GenerateImportTargetErrorKeysFile |
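
Source-to-staging failures can be retrieved programmatically through the GetImportStagingErrorFileUrl endpoint listed earlier. A hedged sketch -- `fetch_staging_errors` and `error_file_url` are illustrative names, and it assumes the endpoint returns the file URL in the response's `value` field:

```python
import requests

def error_file_url(response_json):
    """Extract the error-file URL from the response, or None if no errors were recorded."""
    return response_json.get("value") or None

def fetch_staging_errors(d365_url, token, execution_id, entity_name):
    """Ask DMF for the source-to-staging error file and download it if present."""
    headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}
    url = (f"{d365_url}/data/DataManagementDefinitionGroups/"
           "Microsoft.Dynamics.DataEntities.GetImportStagingErrorFileUrl")
    resp = requests.post(url, headers=headers,
                         json={"executionId": execution_id, "entityName": entity_name})
    resp.raise_for_status()
    file_url = error_file_url(resp.json())
    if file_url is None:
        return None
    return requests.get(file_url).content  # SAS-tokened blob URL; time-limited
```

Download the file promptly: like all DMF blobs, it is subject to the 7-day retention window, and the SAS token in the URL expires sooner.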

Entity Sequencing

| Concept | Purpose | Parallel? | Example |
| --- | --- | --- | --- |
| Execution unit | Independent processing groups | Yes | Unit 1: Tax codes; Unit 2: Tax exempt numbers |
| Level | Dependency ordering within a unit | No (sequential) | Level 1: Sales tax codes; Level 2: Sales tax groups |
| Sequence | Order within a level | Sequential | Sequence 1: Code A; Sequence 2: Code B |

Data Type Gotchas

Error Handling & Failure Points

Common Error Codes

| Error | Meaning | Cause | Resolution |
| --- | --- | --- | --- |
| Failed to insert into staging | Duplicate record | Source file contains duplicates | Remove duplicates before import |
| Data value violates integrity constraints | Field validation failure | Required field missing, wrong type, or referential integrity violation | Check mapping and source data |
| 429 Too Many Requests | Service protection limit | Too many OData calls (not DMF) | Implement Retry-After backoff |
| SSIS failure with apostrophe | SSIS parse error | Project name contains an apostrophe | Remove the apostrophe or enable the DMFExecuteSSISInProc flight |
| Entity not found | Entity missing from package | Missing entity in manifest | Verify entity name; regenerate mapping |
| Mapping mismatch | Stale mapping | Entity extended but mapping not regenerated | Use Generate source mapping |

Failure Points in Production

Anti-Patterns

Wrong: Calling ImportFromPackage in Parallel Threads

// BAD -- ImportFromPackage uses batch internally; parallel calls cause failures
Parallel.ForEach(packages, package => {
    client.ImportFromPackage(package.Url, projectName, "", true, true, "USMF");
});

Correct: Sequential ImportFromPackage with Internal Parallelism

// GOOD -- call sequentially; it handles parallelism internally via batch
foreach (var package in packages) {
    var executionId = await client.ImportFromPackageAsync(
        package.Url, projectName, "", true, true, "USMF");
    await PollUntilComplete(executionId);
}

Wrong: Ignoring Dequeue Acknowledgment

# BAD -- without ack, same message re-appears every 30 minutes
response = requests.get(f"{base_url}/api/connector/dequeue/{activity_id}", headers=headers)
process_data(response.content)
# Missing: acknowledge the message

Correct: Always Acknowledge After Successful Dequeue

# GOOD -- acknowledge immediately after successful download
response = requests.get(f"{base_url}/api/connector/dequeue/{activity_id}", headers=headers)
process_data(response.content)
requests.post(f"{base_url}/api/connector/ack/{activity_id}", headers=headers, data=response.content)

Wrong: Using DMF for Real-Time Single Record Operations

# BAD -- massive overhead for single record updates
def update_single_customer(data):
    package = create_package_for_one_record(data)  # Overkill
    upload_to_blob(package)
    import_from_package(package)
    poll_until_complete()

Correct: Use OData for Real-Time, DMF for Batch

# GOOD -- OData for single records, DMF for bulk
def update_single_customer(data, headers):
    requests.patch(f"{D365_URL}/data/CustomersV3('{data['id']}')",
                   headers=headers, json=data)

def import_bulk_customers(file_path):
    import_from_package(create_package(file_path))  # DMF for bulk

Common Pitfalls

Diagnostic Commands

# Check execution status for a specific job
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus" \
  -H "Authorization: Bearer {token}" -H "Content-Type: application/json" \
  -d '{"executionId":"{executionId}"}'

# Get error details for a failed import
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionErrors" \
  -H "Authorization: Bearer {token}" -H "Content-Type: application/json" \
  -d '{"executionId":"{executionId}"}'

# Get staging error file URL
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetImportStagingErrorFileUrl" \
  -H "Authorization: Bearer {token}" -H "Content-Type: application/json" \
  -d '{"executionId":"{executionId}","entityName":"{entityName}"}'

# Check recurring integration message status
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetMessageStatus" \
  -H "Authorization: Bearer {token}" -H "Content-Type: application/json" \
  -d '{"messageId":"{messageId}"}'

Version History & Compatibility

| Platform Update | Release | Status | Key Changes | Notes |
| --- | --- | --- | --- | --- |
| PU64+ | 2025-2026 | Current | Automatic retry for recurring jobs; SysIntegrationActivityBatch redesign | Custom code must move to the child-job pattern |
| 10.0.36 | 2024 | Supported | User-based API limits disabled; option removed | Resource-based limits remain |
| 10.0.35 | 2024 | Supported | User-based limits disabled by default | Optionally enabled |
| 10.0.19 | 2021 | EOL | Resource-based API limits introduced | First throttling layer |
| PU12 | 2018 | EOL | GetMessageStatus API; prevent upload when zero records | -- |
| PU5 | 2017 | EOL | DMF Package REST API introduced | Initial REST API |

When to Use / When Not to Use

| Use When | Don't Use When | Use Instead |
| --- | --- | --- |
| Bulk data migration (thousands to millions of records) | Real-time individual record CRUD (<1s latency) | OData v4 REST API |
| Configuration copy between environments | Bidirectional real-time sync with Dataverse | Dual Write |
| Recurring scheduled file-based integrations | Event-driven notifications on record changes | Business Events |
| Initial data load for new implementations | Power Platform apps querying F&O data | Virtual Entities |
| Multi-entity data packages with sequencing | Simple single-record lookups | OData $filter queries |
| XSLT transformation required on inbound data | Sub-second integration response times | Custom OData actions |

Important Caveats

Related Units