Microsoft Dynamics 365 Finance & Operations (F&O) encompasses Dynamics 365 Finance, Supply Chain Management, Commerce, and Human Resources. The Data Management Framework (DMF) -- historically known as DIXF (Data Import/Export Framework) -- is the standard tool for bulk data import, export, and configuration transfer across these applications. DMF operates on a continuous release cadence (version 10.0.x) with monthly updates. This card covers the cloud-deployed version. On-premises deployments support the Package REST API with minor differences (AD FS authentication, /namespaces/AXSF appended to base URL). Does NOT cover Dynamics 365 Business Central.
| Property | Value |
|---|---|
| Vendor | Microsoft |
| System | Dynamics 365 Finance & Operations (10.0.x continuous release) |
| API Surface | DMF/DIXF -- Data entities, Data packages, Recurring Integration REST API, Package REST API |
| Current API Version | Continuous (tied to platform updates, currently PU64+) |
| Editions Covered | Finance, Supply Chain Management, Commerce, Human Resources |
| Deployment | Cloud (primary), On-Premises (supported with limitations) |
| API Docs | Data management overview |
| Status | GA -- actively maintained, exempt from service protection API throttling |
| API Surface | Protocol | Best For | Max Records/Request | Rate Limit | Real-time? | Bulk? |
|---|---|---|---|---|---|---|
| DMF Package REST API | HTTPS/JSON + data packages | Large batch import/export via external scheduling | Unlimited (package-based) | Exempt | No | Yes |
| Recurring Integration API | HTTPS/REST (enqueue/dequeue/ack) | Scheduled recurring file exchange | Unlimited (file-based) | Exempt | No | Yes |
| OData v4 REST API | HTTPS/JSON | Individual record CRUD, queries | 10,000 per page ($top) | 6,000 req/5 min | Yes | Limited |
| Dual Write | Real-time sync | Bidirectional F&O-Dataverse sync | Per-record event | N/A | Yes | No |
| Business Events | HTTPS webhook | Event-driven notifications | Per event | N/A | Yes | N/A |
| Virtual Entities | Dataverse API | Power Platform querying F&O data | Dataverse limits | Exempt from F&O limits | Yes | No |
DMF and DIXF operations (including recurring integrations) are exempt from service protection API limits. OData endpoints used alongside DMF are subject to throttling. [src4]
| Limit Type | Value | Applies To | Notes |
|---|---|---|---|
| Export file size (default) | 256 MB | Data package exports | Default Azure Storage service version limitation |
| Export file size (max) | 5,000 MB (5 GB) | Data package exports | Set AzureStorageServiceVersion to 2019-12-12 in DMFPARAMETERS table |
| Blob storage retention | 7 days | All DMF files | Files auto-deleted from Azure Blob after 7 days |
| Max batch threads per AOS | 8 (default), 16 (max recommended) | Parallel batch processing | Values above 16 require significant performance testing |
| Composite entity per package | 1 | ImportFromPackage API | Package with composite entity can only contain that one entity |
| Limit Type | Value | Window | Notes |
|---|---|---|---|
| Requests per user per web server | 6,000 | 5-min sliding window | Per user, per application ID, per web server |
| Combined execution time per user | 1,200 seconds (20 min) | 5-min sliding window | Across all requests from that user |
| Concurrent requests per user | 52 | Instantaneous | Maximum simultaneous requests |
| Resource utilization | Dynamic threshold | Instantaneous | CPU/memory-based; returns 429 |
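When OData calls used alongside DMF do hit these limits, the service returns HTTP 429 with a Retry-After header. A minimal backoff sketch (stdlib only; the response object just needs `.status_code` and `.headers`, as with `requests`):

```python
import time

def call_with_backoff(send, max_retries=5):
    """Call `send()` (a zero-arg function returning a response object),
    honoring the Retry-After header on HTTP 429 and falling back to
    exponential backoff when the header is absent."""
    for attempt in range(max_retries):
        resp = send()
        if resp.status_code != 429:
            return resp
        retry_after = resp.headers.get("Retry-After")
        wait = int(retry_after) if retry_after else 2 ** attempt
        time.sleep(wait)
    raise RuntimeError(f"Still throttled after {max_retries} retries")
```

Injecting `send` as a callable keeps the retry logic independent of any particular HTTP client.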
| Flow | Use When | Token Lifetime | Refresh? | Notes |
|---|---|---|---|---|
| OAuth 2.0 Client Credentials | Server-to-server (no user context) | ~1 hour (Entra default) | Request a new token when expired | Register app in Entra ID; add to System admin > Microsoft Entra applications |
| OAuth 2.0 Authorization Code | User-context operations | Access: 1h, Refresh: until revoked | Yes | Less common with DMF |
| AD FS | On-premises deployments | AD FS-configured | Yes | Append /namespaces/AXSF to base URL |
For on-premises deployments, append /namespaces/AXSF to the base URL for all DMF Package API calls. [src2]
START -- User needs to exchange data with D365 F&O
|-- What's the integration pattern?
| |-- Real-time (individual records, <1s latency)
| | |-- Need Dataverse bidirectional sync?
| | | |-- YES -> Dual Write
| | | '-- NO -> OData v4 REST API
| | '-- Need event notifications?
| | |-- YES -> Business Events
| | '-- NO -> OData REST API polling
| |-- Batch/Bulk (scheduled, high volume)
| | |-- Scheduling managed in F&O?
| | | |-- YES -> Recurring Integration API (enqueue/dequeue/ack)
| | | '-- NO -> DMF Package REST API (ImportFromPackage/ExportToPackage)
| | |-- Need XSLT transformation?
| | | |-- YES -> Recurring Integration API
| | | '-- NO -> Either API works
| | '-- Data volume > 5 GB per file?
| | |-- YES -> Split into multiple packages
| | '-- NO -> Single package sufficient
| |-- Configuration copy
| | '-- Use DMF Data Packages
| '-- Data migration (initial load)
| |-- < 100 entities -> Manual DMF import
| '-- > 100 entities -> Data Task Automation via LCS
|-- Which direction?
| |-- Inbound -> enqueue or ImportFromPackage
| |-- Outbound -> dequeue or ExportToPackage
| '-- Bidirectional -> separate jobs or Dual Write
'-- Need change tracking (delta only)?
|-- YES -> Enable change tracking per entity
'-- NO -> Full export each run
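The decision tree above can be condensed into an illustrative chooser; this is a non-exhaustive sketch, and the flag names are assumptions introduced only for the example:

```python
def choose_integration_api(realtime: bool, dataverse_sync: bool = False,
                           events: bool = False, fo_scheduled: bool = False,
                           xslt: bool = False) -> str:
    """Condensed, non-exhaustive encoding of the decision tree above."""
    if realtime:
        if dataverse_sync:
            return "Dual Write"
        if events:
            return "Business Events"
        return "OData v4 REST API"
    # Batch/bulk path: F&O-managed scheduling or XSLT transformation
    # favors the Recurring Integration API; otherwise use the Package API.
    if fo_scheduled or xslt:
        return "Recurring Integration API"
    return "DMF Package REST API"
```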
| Operation | Method | Endpoint | Notes |
|---|---|---|---|
| Get writable blob URL | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetAzureWriteUrl | Returns BlobId and BlobUrl with SAS token |
| Import from package | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ImportFromPackage | Async version: ImportFromPackageAsync |
| Export to package | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage | Async version: ExportToPackageAsync |
| Get export URL | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExportedPackageUrl | Returns BlobUrl with SAS token |
| Check execution status | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus | Returns: NotRun, Executing, Succeeded, PartiallySucceeded, Failed, Canceled |
| Get execution errors | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionErrors | Returns JSON array of error messages |
| Get staging error file | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetImportStagingErrorFileUrl | Error file for source-to-staging failures |
| Generate target error keys | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GenerateImportTargetErrorKeysFile | Returns true if errors exist |
| Operation | Method | Endpoint | Notes |
|---|---|---|---|
| Enqueue (import) | POST | /api/connector/enqueue/<activityID>?entity=<entityName> | Pass data file as memory stream in body |
| Dequeue (export) | GET | /api/connector/dequeue/<activityID> | Returns data package; must acknowledge |
| Acknowledge | POST | /api/connector/ack/<activityID> | Include dequeue response body; unacked messages re-appear every 30 min |
| Get message status | POST | /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetMessageStatus | Statuses: Enqueued, Dequeued, Acked, Processing, Processed, etc. |
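An illustrative enqueue call matching the table above (assumes the `requests` library). The `application/zip` content type and reading the message ID from the response body are assumptions for this sketch:

```python
import requests

def enqueue_url(base_url: str, activity_id: str, entity_name: str) -> str:
    """Build the enqueue endpoint URL from the table above."""
    return f"{base_url}/api/connector/enqueue/{activity_id}?entity={entity_name}"

def enqueue_file(base_url: str, token: str, activity_id: str,
                 entity_name: str, file_path: str) -> str:
    """POST a data file to the recurring-integration enqueue endpoint.
    The response body is treated as the message ID (assumption), which
    can later be passed to GetMessageStatus."""
    with open(file_path, "rb") as f:
        resp = requests.post(
            enqueue_url(base_url, activity_id, entity_name),
            headers={"Authorization": f"Bearer {token}",
                     "Content-Type": "application/zip"},
            data=f)
    resp.raise_for_status()
    return resp.text
```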
Register an application in Microsoft Entra ID and add it to D365 F&O via System administration > Microsoft Entra applications. [src2]
curl -X POST "https://login.microsoftonline.com/{tenantId}/oauth2/v2.0/token" \
-H "Content-Type: application/x-www-form-urlencoded" \
-d "client_id={clientId}&client_secret={clientSecret}&scope=https://{d365-env}.operations.dynamics.com/.default&grant_type=client_credentials"
Verify: Response contains an access_token field (a JWT).
Get a writable blob URL with an embedded SAS token, then upload your .zip data package. [src2]
# Get writable URL
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetAzureWriteUrl" \
-H "Authorization: Bearer {access_token}" \
-H "Content-Type: application/json" \
-d '{"uniqueFileName":"import-data.zip"}'
# Upload package to returned BlobUrl
curl -X PUT "{BlobUrl}" -H "x-ms-blob-type: BlockBlob" --data-binary @import-data.zip
Verify: HTTP 201 response from Azure Blob upload.
Call ImportFromPackage with the package URL and data project name. [src2]
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ImportFromPackage" \
-H "Authorization: Bearer {access_token}" \
-H "Content-Type: application/json" \
-d '{"packageUrl":"{BlobUrl}","definitionGroupId":"CustomerImport","executionId":"","execute":true,"overwrite":true,"legalEntityId":"USMF"}'
Verify: Response returns executionId string.
Poll GetExecutionSummaryStatus with exponential backoff. [src2]
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus" \
-H "Authorization: Bearer {access_token}" \
-H "Content-Type: application/json" \
-d '{"executionId":"{executionId}"}'
Verify: value returns Succeeded when complete.
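The exponential-backoff polling described above can be sketched as follows. The `get_status` callable is an assumption standing in for the actual GetExecutionSummaryStatus call; the terminal statuses come from the endpoint table:

```python
import time

TERMINAL = {"Succeeded", "PartiallySucceeded", "Failed", "Canceled"}

def poll_until_complete(get_status, initial_delay=2.0, max_delay=60.0,
                        timeout=1800.0):
    """Poll `get_status()` (returns a GetExecutionSummaryStatus string)
    with exponential backoff until a terminal status is reached or
    `timeout` seconds of sleep have elapsed."""
    delay, waited = initial_delay, 0.0
    while waited < timeout:
        status = get_status()
        if status in TERMINAL:
            return status
        time.sleep(delay)
        waited += delay
        delay = min(delay * 2, max_delay)
    raise TimeoutError(f"Execution did not finish within {timeout:.0f} s")
```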
Use GetExecutionErrors for general errors. [src2]
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionErrors" \
-H "Authorization: Bearer {access_token}" \
-H "Content-Type: application/json" \
-d '{"executionId":"{executionId}"}'
Verify: Empty array [] means no errors.
# Input: D365 F&O credentials, .zip data package file
# Output: Import execution result (success/failure with error details)
import requests, time, json
TENANT_ID = "your-tenant-id"
CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"
D365_URL = "https://your-env.operations.dynamics.com"
def get_access_token():
    resp = requests.post(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
        data={"client_id": CLIENT_ID, "client_secret": CLIENT_SECRET,
              "scope": f"{D365_URL}/.default", "grant_type": "client_credentials"})
    resp.raise_for_status()
    return resp.json()["access_token"]

def import_package(token, package_path, project_id="CustomerImport", legal_entity="USMF"):
    headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}
    dmf = f"{D365_URL}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities"
    # 1. Get a writable blob URL (value is a JSON string holding BlobId/BlobUrl)
    blob = requests.post(f"{dmf}.GetAzureWriteUrl", headers=headers,
                         json={"uniqueFileName": "import-data.zip"}).json()
    blob_url = json.loads(blob["value"])["BlobUrl"]
    # 2. Upload the .zip package to the SAS URL
    with open(package_path, "rb") as f:
        requests.put(blob_url, headers={"x-ms-blob-type": "BlockBlob"}, data=f)
    # 3. Start the import; the response value is the executionId
    execution_id = requests.post(
        f"{dmf}.ImportFromPackage", headers=headers,
        json={"packageUrl": blob_url, "definitionGroupId": project_id,
              "executionId": "", "execute": True, "overwrite": True,
              "legalEntityId": legal_entity}).json()["value"]
    # 4. Poll until a terminal status
    while True:
        status = requests.post(f"{dmf}.GetExecutionSummaryStatus", headers=headers,
                               json={"executionId": execution_id}).json()["value"]
        if status not in ("NotRun", "Executing"):
            return execution_id, status
        time.sleep(10)
# Step 1: Trigger export
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage" \
-H "Authorization: Bearer {token}" -H "Content-Type: application/json" \
-d '{"definitionGroupId":"CustomerExport","packageName":"export.zip","executionId":"","reExecute":true,"legalEntityId":"USMF"}'
# Step 2: Poll status until Succeeded
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus" \
  -H "Authorization: Bearer {token}" -H "Content-Type: application/json" \
  -d '{"executionId":"{executionId}"}'
# Step 3: Get the download URL (returns BlobUrl with SAS token)
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExportedPackageUrl" \
  -H "Authorization: Bearer {token}" -H "Content-Type: application/json" \
  -d '{"executionId":"{executionId}"}'
# Step 4: Download the package from the returned BlobUrl
curl -o export.zip "{BlobUrl}"
| Stage | What Happens | Format | Error Recovery |
|---|---|---|---|
| Source file | External system generates file | CSV, XML, Excel (14 formats) | Fix source file and re-upload |
| Upload to blob | File uploaded via SAS URL | .zip data package | Request a new SAS URL (tokens are time-limited) |
| Source to staging | SSIS extracts to staging tables | SQL staging tables | GetImportStagingErrorFileUrl |
| Staging validation | Validates types, required fields | In-database | Edit staging records directly |
| Staging to target | Data entities map to production | Production tables | GenerateImportTargetErrorKeysFile |
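For the source-to-staging stage, error recovery starts from GetImportStagingErrorFileUrl (see the endpoint table above). A hedged sketch: the `value` wrapper on the response is an assumption, and `post` is injectable purely for testing:

```python
import requests

ACTION = ("/data/DataManagementDefinitionGroups/"
          "Microsoft.Dynamics.DataEntities.GetImportStagingErrorFileUrl")

def get_staging_error_url(base_url, token, execution_id, entity_name,
                          post=None):
    """Return the staging error file URL for a failed source-to-staging
    step, or None when no error file exists. `post` defaults to
    requests.post and may be injected for testing."""
    post = post or requests.post
    resp = post(base_url + ACTION,
                headers={"Authorization": f"Bearer {token}",
                         "Content-Type": "application/json"},
                json={"executionId": execution_id,
                      "entityName": entity_name})
    resp.raise_for_status()
    # Assumption: the unbound OData action wraps its result in "value"
    return resp.json().get("value") or None
```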
| Concept | Purpose | Parallel? | Example |
|---|---|---|---|
| Execution unit | Independent processing groups | Yes | Unit 1: Tax codes; Unit 2: Tax exempt numbers |
| Level | Dependency ordering within unit | No (sequential) | Level 1: Sales tax codes; Level 2: Sales tax groups |
| Sequence | Order within a level | Sequential | Sequence 1: Code A; Sequence 2: Code B |
| Error | Meaning | Cause | Resolution |
|---|---|---|---|
| Failed to insert into staging | Duplicate record | Source file contains duplicates | Remove duplicates before import |
| Data value violates integrity constraints | Field validation failure | Required field missing, wrong type, or referential integrity | Check mapping and source data |
| 429 Too Many Requests | Service protection limit | Too many OData calls (not DMF) | Implement Retry-After backoff |
| SSIS failure with apostrophe | SSIS parse error | Project name contains apostrophe | Remove apostrophe or enable DMFExecuteSSISInProc flight |
| Entity not found | Entity missing from package | Missing entity in manifest | Verify entity name; regenerate mapping |
| Mapping mismatch | Stale mapping | Entity extended but mapping not regenerated | Use Generate source mapping |
Enable "Process messages in order" on recurring job. [src3]
Request new SAS URL immediately before upload. [src2]
Clear staging data before reimporting. [src1]
Install OLEDB driver or use CSV format. [src1]
Enable change tracking before first export. [src3]
Download promptly; automate download pipeline. [src2]
// BAD -- ImportFromPackage uses batch internally; parallel calls cause failures
Parallel.ForEach(packages, package => {
client.ImportFromPackage(package.Url, projectName, "", true, true, "USMF");
});
// GOOD -- call sequentially; it handles parallelism internally via batch
foreach (var package in packages) {
var executionId = await client.ImportFromPackageAsync(
package.Url, projectName, "", true, true, "USMF");
await PollUntilComplete(executionId);
}
# BAD -- without ack, same message re-appears every 30 minutes
response = requests.get(f"{base_url}/api/connector/dequeue/{activity_id}", headers=headers)
process_data(response.content)
# Missing: acknowledge the message
# GOOD -- acknowledge immediately after successful download
response = requests.get(f"{base_url}/api/connector/dequeue/{activity_id}", headers=headers)
process_data(response.content)
requests.post(f"{base_url}/api/connector/ack/{activity_id}", headers=headers, data=response.content)
# BAD -- massive overhead for single record updates
def update_single_customer(data):
package = create_package_for_one_record(data) # Overkill
upload_to_blob(package)
import_from_package(package)
poll_until_complete()
# GOOD -- OData for single records, DMF for bulk
def update_single_customer(data):
requests.patch(f"{D365_URL}/data/CustomersV3('{data['id']}')", headers=h, json=data)
def import_bulk_customers(file_path):
import_from_package(create_package(file_path)) # DMF for bulk
Create the data project via F&O UI first. [src2]
Set correct execution unit, level, and sequence; use LCS data packages as reference. [src1]
Enable change tracking per entity before first export. [src3]
Keep staging enabled for complex entities. [src1]
Request fresh SAS URL before each upload. [src2]
Use entity label for enqueue; use definitionGroupId for Package API. [src1]
# Check execution status for a specific job
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus" \
-H "Authorization: Bearer {token}" -H "Content-Type: application/json" \
-d '{"executionId":"{executionId}"}'
# Get error details for a failed import
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionErrors" \
-H "Authorization: Bearer {token}" -H "Content-Type: application/json" \
-d '{"executionId":"{executionId}"}'
# Get staging error file URL
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetImportStagingErrorFileUrl" \
-H "Authorization: Bearer {token}" -H "Content-Type: application/json" \
-d '{"executionId":"{executionId}","entityName":"{entityName}"}'
# Check recurring integration message status
curl -X POST "https://{d365-env}.operations.dynamics.com/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetMessageStatus" \
-H "Authorization: Bearer {token}" -H "Content-Type: application/json" \
-d '{"messageId":"{messageId}"}'
| Platform Update | Release | Status | Key Changes | Notes |
|---|---|---|---|---|
| PU64+ | 2025-2026 | Current | Automatic retry for recurring jobs; SysIntegrationActivityBatch redesign | Custom code must update to child job pattern |
| 10.0.36 | 2024 | Supported | User-based API limits disabled; option removed | Resource-based limits remain |
| 10.0.35 | 2024 | Supported | User-based limits disabled by default | Optionally enabled |
| 10.0.19 | 2021 | EOL | Resource-based API limits introduced | First throttling layer |
| PU12 | 2018 | EOL | GetMessageStatus API; prevent upload when zero records | -- |
| PU5 | 2017 | EOL | DMF Package REST API introduced | Initial REST API |
| Use When | Don't Use When | Use Instead |
|---|---|---|
| Bulk data migration (thousands to millions of records) | Real-time individual record CRUD (<1s latency) | OData v4 REST API |
| Configuration copy between environments | Bidirectional real-time sync with Dataverse | Dual Write |
| Recurring scheduled file-based integrations | Event-driven notifications on record changes | Business Events |
| Initial data load for new implementations | Power Platform apps querying F&O data | Virtual Entities |
| Multi-entity data packages with sequencing | Simple single-record lookups | OData $filter queries |
| XSLT transformation required on inbound data | Sub-second integration response times | Custom OData actions |