Epicor DMT (Data Management Tool): Bulk Import Constraints, Limits, and Best Practices

Type: ERP Integration | System: Epicor Kinetic (2024.1+) | Confidence: 0.82 | Sources: 7 | Verified: 2026-03-02 | Freshness: 2026-03-02

TL;DR

DMT is Epicor's official bulk-load tool: it imports, updates, and deletes records from CSV files through the standard business objects, so every validation rule, BPM, and data directive fires on each record. Expect roughly 30-90 records per minute per instance. For large loads, split files into ~10,000-record chunks, run parallel DMT.exe instances (about 10 per client machine), disable non-essential BPMs first, and load tables in parent-before-child order.

System Profile

This card covers the Epicor Data Management Tool (DMT), a desktop client application bundled with Epicor ERP (version 10.2.500+) and Epicor Kinetic. DMT is the official tool for bulk importing, updating, and deleting records in Epicor ERP using CSV files. It processes data through Epicor's business objects (BOs), meaning all standard validation rules, BPMs, and data directives execute on every imported record. DMT is NOT a direct database import tool -- it does not bypass business logic. [src1, src2]

| Property | Value |
| --- | --- |
| Vendor | Epicor |
| System | Epicor Kinetic (ERP) 2024.1+ |
| API Surface | Desktop client (GUI + CLI), processes via Business Objects |
| Current Version | DMT 4.0.50.1+ (ships with Epicor Kinetic) |
| Editions Covered | All editions (Standard, Enterprise, Cloud) |
| Deployment | Hybrid (on-premise client connects to on-premise or cloud server) |
| API Docs | Epicor DMT Documentation |
| Status | GA |

API Surfaces & Capabilities

DMT is not a traditional API -- it is a client-side tool that internally calls Epicor's business objects. It exposes multiple interaction surfaces for automation. [src1]

| Surface | Interface | Best For | Automation? | Parallel? | Notes |
| --- | --- | --- | --- | --- | --- |
| DMT GUI | Windows desktop app | Small imports (<1,000 records), one-time loads | No | Manual parallel tabs | Template builder, validation preview, pause/cancel |
| DMT CLI | DMT.exe with arguments | Scheduled/automated imports | Yes (batch/PowerShell) | Yes (multiple processes) | -User, -Pass, -Source, -NoUI arguments |
| PlayLists | CSV file listing imports | Multi-table sequential loads | Yes | Sequential within playlist | CSV of CLI arguments with headers |
| PlayBooks | Text file listing playlists | Full migration orchestration | Yes | Sequential playlists | References multiple PlayList files |
| BAQ Export | -Export -BAQ= CLI args | Data extraction from Epicor | Yes | Yes (multiple processes) | Outputs CSV only |
| Server Processing | Server-side batch flag | Faster processing for supported tables | Yes | N/A | Default batch size: 200. Limited table support. |
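
Since a PlayList is just a CSV of CLI runs, it can be generated programmatically. The sketch below illustrates the idea; the column names (Table, FileName, Add, Update, Delete) are illustrative assumptions -- take the real headers from a PlayList saved by your own DMT version, since the schema varies.

```python
import csv

# Sketch: generate a DMT PlayList CSV that runs two imports in sequence.
# Column names below are assumed for illustration, not guaranteed by DMT.
rows = [
    {"Table": "Customer", "FileName": r"C:\data\customers.csv",
     "Add": "TRUE", "Update": "FALSE", "Delete": "FALSE"},
    {"Table": "Part", "FileName": r"C:\data\parts.csv",
     "Add": "TRUE", "Update": "TRUE", "Delete": "FALSE"},
]
with open("playlist.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["Table", "FileName", "Add", "Update", "Delete"])
    writer.writeheader()
    writer.writerows(rows)
```

Generating the PlayList from a manifest keeps multi-table load order in version control instead of in someone's head.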

Rate Limits & Quotas

Per-Record Processing Limits

| Limit Type | Value | Applies To | Notes |
| --- | --- | --- | --- |
| Processing throughput | 30-90 RPM (records per minute) | All DMT imports | Varies by table complexity, BPMs, server hardware [src3] |
| Throughput with BPMs disabled | 55-90 RPM | Tables with many BPMs | Disabling non-essential BPMs can roughly double speed |
| Throughput with BPMs active | 30-50 RPM | Tables with complex BPMs | Email-sending BPMs are the worst offenders |
| Server Processing batch size | 200 records (default) | Server Processing-enabled tables only | Reported ~4x faster than standard processing [src4] |

File & Data Limits

| Limit Type | Value | Applies To | Notes |
| --- | --- | --- | --- |
| Max file size | No hard limit | CSV import files | Practical limit: files >100K rows degrade performance |
| Recommended chunk size | 10,000 records/file | Parallel processing | Best results: 10 files of 10K vs one 100K file [src3] |
| File format | CSV (comma-delimited) only | Playlists and CLI | Excel must be saved as CSV first |
| File encoding | UTF-8 | All CSV files | Required for special characters [src6] |
| Date format | MM/DD/YYYY | All date fields | Non-compliant formats cause import failure |
| Decimal separator | Period (.) | All numeric fields | Remove thousand separators (commas) |
| Concurrent instances | ~10-12 per machine | Parallel imports | Limited by client CPU/RAM (85% CPU typical) [src3] |

Practical Throughput Estimates

| Record Count | Single Instance | 5 Instances | 10 Instances | Notes |
| --- | --- | --- | --- | --- |
| 1,000 | 11-33 min | 3-7 min | 2-4 min | Minimal benefit from parallelism |
| 10,000 | 1.8-5.5 hours | 22-66 min | 11-33 min | Ideal chunk size |
| 50,000 | 9-28 hours | 2-6 hours | 1-3 hours | Must split files |
| 100,000 | 18-55 hours | 4-11 hours | 2-5.5 hours | 10 files of 10K recommended [src3] |
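
These estimates are straight arithmetic on the 30-90 records-per-minute range. A small planning helper (a sketch, not part of DMT) for your own record counts:

```python
def estimate_hours(records: int, instances: int = 1,
                   rpm_low: float = 30, rpm_high: float = 90) -> tuple[float, float]:
    """Return (best-case, worst-case) wall-clock hours, assuming records
    are split evenly and each DMT instance sustains 30-90 records/min."""
    per_instance = records / instances
    return per_instance / rpm_high / 60, per_instance / rpm_low / 60

low, high = estimate_hours(100_000, instances=10)
print(f"{low:.1f} to {high:.1f} hours")  # consistent with the ~2-5.5 hour row above
```

The model ignores ramp-up and uneven chunk sizes, so treat it as a lower bound on planning time.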

Authentication

| Method | Use When | Credential Type | Refresh? | Notes |
| --- | --- | --- | --- | --- |
| GUI login | Interactive use | Username + password | N/A | Standard Epicor login dialog [src1] |
| CLI -User/-Pass | Automated/scripted | Plaintext in arguments | N/A | Visible in process list; restrict access to script files [src1] |
| Windows auth (SSO) | On-premise with AD | Domain credentials | N/A | Available when Epicor is configured for Windows authentication |
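
Because -User/-Pass are plaintext, keep them out of scripts where possible. One common mitigation -- a sketch, with arbitrary environment-variable names -- is to inject credentials at launch time; note they remain visible in the process list while DMT runs:

```python
import os
import subprocess


def build_dmt_command(source_csv: str,
                      dmt_path: str = r"C:\Epicor\ERP10\DMT.exe") -> list[str]:
    """Build the DMT CLI argument list, reading credentials from the
    environment so they never appear in the script itself."""
    user = os.environ["DMT_USER"]       # set outside the script
    password = os.environ["DMT_PASS"]   # e.g. by the scheduler's secret store
    return [dmt_path, f"-User={user}", f"-Pass={password}",
            f"-Source={source_csv}", "-Add", "-NoUI"]


# subprocess.Popen(build_dmt_command(r"C:\data\parts.csv"))
```

Pair this with Task Scheduler running under a dedicated service account whose environment carries the secrets.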

Authentication Gotchas

Constraints

Integration Pattern Decision Tree

```
START - Bulk import data into Epicor via DMT
|-- How many records?
|   |-- < 1,000 records
|   |   --> DMT GUI: interactive mode, preview validation, single file
|   |-- 1,000 - 10,000 records
|   |   --> DMT CLI: single instance, -NoUI for unattended
|   |-- 10,000 - 100,000 records
|   |   |-- Split into 10,000-record CSV files
|   |   |-- Launch 5-10 parallel DMT.exe instances
|   |   --> Disable non-essential BPMs first
|   |-- > 100,000 records
|       |-- Split into 10,000-record chunks
|       |-- Run across multiple client machines
|       --> Plan for 4-11 hours with 10 parallel instances
|-- What operation?
|   |-- Add new --> Check "Add", ensure parent records exist
|   |-- Update existing --> Check "Update", match on primary keys
|   |-- Delete --> Check "Delete", backup first
|   |-- Mixed --> Check both "Add" and "Update"
|-- One-time or recurring?
|   |-- One-time (go-live) --> Playbooks, pre-load + delta at cutover
|   |-- Recurring --> CLI + Windows Task Scheduler + error monitoring
|-- Load order (mandatory dependency sequence):
    1. Company, Sites, Plants
    2. Chart of Accounts, GL Accounts
    3. Customers, Vendors, Suppliers
    4. Part Classes, Part Groups
    5. Parts (Part Master)
    6. Part Revisions
    7. Bill of Operations (BOO)
    8. Bill of Materials (BOM)
    9. Inventory (initial balances)
    10. Open Orders, POs, Jobs
```
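
The dependency sequence can be enforced in tooling rather than by hand. A sketch that sorts import files into load order before feeding them to DMT -- the file-name keywords are an assumed naming convention on your side, not a DMT feature:

```python
# Parents must load before children; rank files by table keyword.
LOAD_ORDER = ["company", "coa", "customer", "vendor", "partclass",
              "part", "revision", "boo", "bom", "inventory", "order"]


def load_rank(filename: str) -> int:
    """Return the position of the first matching table keyword;
    files with no recognized keyword sort last."""
    name = filename.lower()
    for rank, key in enumerate(LOAD_ORDER):
        if key in name:
            return rank
    return len(LOAD_ORDER)


files = ["bom_materials.csv", "part_master.csv", "customer_list.csv"]
print(sorted(files, key=load_rank))
# -> ['customer_list.csv', 'part_master.csv', 'bom_materials.csv']
```

Substring matching is crude (order longer keywords before their prefixes); a real migration would map each file to its table explicitly, e.g. in the PlayBook.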

Quick Reference

| CLI Argument | Purpose | Example | Notes |
| --- | --- | --- | --- |
| -User= | Epicor username | -User=manager | Required for CLI mode |
| -Pass= | Epicor password | -Pass=epicor123 | Plaintext; protect script files |
| -Source= | CSV file path | -Source="C:\data\parts.csv" | Full path to import file |
| -NoUI | Suppress GUI | -NoUI | Required for unattended automation |
| -Add | Add new records | -Add | Cannot be used with -Delete |
| -Update | Update existing records | -Update | Can combine with -Add |
| -Delete | Delete records | -Delete | Use with caution |
| -Export | Export mode (BAQ) | -Export | Switches to export mode |
| -BAQ= | BAQ query name | -BAQ="PartExport" | Used with -Export |
| -Target= | Export output path | -Target="C:\export\parts.csv" | CSV output only |
| -UseFieldNames | Include headers | -UseFieldNames | Adds column headers to export |

Step-by-Step Integration Guide

1. Prepare CSV using DMT Template Builder

Open DMT, select the target table, click "Template Builder." Required fields are pre-checked. Export the empty CSV template and populate with your data. [src5]

```bat
REM Launch DMT GUI to build the template
start "" "C:\Epicor\ERP10\DMT.exe"
REM In the DMT GUI: select table > Template Builder > check fields > Save CSV
```

Verify: Click "Validate Columns" in DMT to confirm all required fields are present.

2. Clean and format data

Ensure all data meets DMT formatting requirements: MM/DD/YYYY dates, no thousand separators, UTF-8 encoding, trimmed whitespace. [src6]

```python
import pandas as pd

df = pd.read_csv("raw_data.csv", encoding="utf-8")

# Reformat date columns to MM/DD/YYYY. read_csv leaves dates as strings,
# so name them explicitly -- "OrderDate" here is a placeholder.
date_cols = ["OrderDate"]
for col in date_cols:
    df[col] = pd.to_datetime(df[col]).dt.strftime("%m/%d/%Y")

# Trim whitespace on all text fields (leading/trailing spaces break lookups).
for col in df.select_dtypes(include=["object"]).columns:
    df[col] = df[col].str.strip()

# Strip thousand separators from numeric columns only -- stripping commas
# from every string column would corrupt text like "ACME, Inc".
numeric_cols = ["OnHandQty"]  # placeholder
for col in numeric_cols:
    df[col] = df[col].astype(str).str.replace(",", "", regex=False)

df.to_csv("cleaned_for_dmt.csv", index=False, encoding="utf-8")
```

Verify: Dates are MM/DD/YYYY, no commas in numbers, no leading/trailing whitespace.

3. Split large files for parallel processing

For datasets over 10,000 records, split into chunks for parallel DMT execution. [src3]

```python
import math

import pandas as pd

df = pd.read_csv("cleaned_for_dmt.csv", encoding="utf-8")
chunk_size = 10_000
for i in range(math.ceil(len(df) / chunk_size)):
    chunk = df.iloc[i * chunk_size : (i + 1) * chunk_size]
    chunk.to_csv(f"dmt_chunk_{i + 1:03d}.csv", index=False, encoding="utf-8")
```

Verify: Each chunk has correct headers. Total rows across chunks = original file.
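
That verification step can itself be scripted. A sketch using pandas (file names are placeholders):

```python
import glob

import pandas as pd


def verify_chunks(original_csv: str, chunk_pattern: str) -> None:
    """Confirm no rows were lost in splitting: every chunk must carry the
    original headers, and chunk row counts must sum to the original."""
    original = pd.read_csv(original_csv, encoding="utf-8")
    total = 0
    for path in sorted(glob.glob(chunk_pattern)):
        chunk = pd.read_csv(path, encoding="utf-8")
        assert list(chunk.columns) == list(original.columns), f"Header mismatch: {path}"
        total += len(chunk)
    assert total == len(original), f"Row count mismatch: {total} vs {len(original)}"
```

Run it immediately after splitting, before any DMT instance is launched, so a bad split fails fast.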

4. Disable non-essential BPMs

Identify and disable BPMs that slow imports (email notifications, workflow triggers). [src3]

In Epicor Kinetic:
1. System Setup > Business Process Management
2. Identify BPMs on target table
3. Disable non-essential BPMs or add condition: "Client Type <> DMT"
4. Document which BPMs were disabled

Verify: Test 10-record import. Target speed: 55-90 RPM without BPMs.

5. Execute parallel imports via CLI

Launch multiple DMT instances simultaneously, each processing a different chunk. [src3]

```bat
@echo off
start "" "C:\Epicor\ERP10\DMT.exe" -User=dmtuser -Pass=pass -Source="C:\data\dmt_chunk_001.csv" -Add -NoUI
start "" "C:\Epicor\ERP10\DMT.exe" -User=dmtuser -Pass=pass -Source="C:\data\dmt_chunk_002.csv" -Add -NoUI
REM ... repeat for all chunks
echo All DMT instances launched. Monitor Task Manager for completion.
```

Verify: Monitor Task Manager for DMT.exe processes. Check log files for errors.

6. Validate and re-enable BPMs

After imports complete, verify data integrity and restore BPMs. [src5]

1. Run BAQ queries to count imported records
2. Compare counts against source data
3. Spot-check 10-20 records for field accuracy
4. Re-enable all disabled BPMs
5. Review DMT error logs, re-import failed records

Verify: Record counts match source. Error log shows zero or expected failures. BPMs re-enabled.
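
The count comparison (step 2 above) can be scripted against the BAQ export from step 6 of the batch, or any export of the target table. A sketch -- the key column PartNum is a placeholder for whatever primary key your table uses:

```python
import pandas as pd


def reconcile(source_csv: str, export_csv: str, key: str) -> set:
    """Compare the import source against a BAQ export of the same table
    and return keys that failed to import (in source, absent in Epicor)."""
    source_keys = set(pd.read_csv(source_csv, dtype=str)[key])
    export_keys = set(pd.read_csv(export_csv, dtype=str)[key])
    return source_keys - export_keys
```

Any keys returned are candidates for the re-import pass after reviewing the DMT error logs.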

Code Examples

Python: Automated parallel DMT execution

```python
# Input:  Directory of chunk CSV files, DMT.exe path, credentials
# Output: Parallel DMT execution with completion monitoring

import glob
import os
import subprocess
import time

DMT_PATH = r"C:\Epicor\ERP10\DMT.exe"
DATA_DIR = r"C:\data\dmt_chunks"
MAX_PARALLEL = 10

chunk_files = sorted(glob.glob(os.path.join(DATA_DIR, "dmt_chunk_*.csv")))
processes = []

for csv_file in chunk_files:
    # Throttle: wait until fewer than MAX_PARALLEL instances are running.
    while len([p for p in processes if p.poll() is None]) >= MAX_PARALLEL:
        time.sleep(5)
    cmd = [DMT_PATH, "-User=dmtuser", "-Pass=pass", f"-Source={csv_file}", "-Add", "-NoUI"]
    proc = subprocess.Popen(cmd)
    processes.append(proc)
    print(f"Launched: {os.path.basename(csv_file)} (PID: {proc.pid})")

for proc in processes:
    proc.wait()
print("All DMT imports complete.")
```

PowerShell: PlayList-based migration

```powershell
# Input:  PlayList CSV with import definitions
# Output: Sequential DMT imports

$dmtPath = "C:\Epicor\ERP10\DMT.exe"
$playlistPath = "C:\data\playlist.csv"

$startTime = Get-Date
$proc = Start-Process -FilePath $dmtPath `
    -ArgumentList "-User=dmtuser", "-Pass=pass", "-PlayList=$playlistPath", "-NoUI" `
    -Wait -PassThru

$duration = (Get-Date) - $startTime
Write-Host "Completed in $($duration.TotalMinutes) minutes. Exit: $($proc.ExitCode)"
```

Batch: BAQ Export for backup

```bat
@echo off
REM Export current data before bulk operations
start /wait "" "C:\Epicor\ERP10\DMT.exe" -User=admin -Pass=pass ^
    -Export -BAQ="PartExport" -Target="C:\backup\parts.csv" -UseFieldNames -NoUI
echo Export complete.
```

Data Mapping

DMT CSV Column Requirements

| Data Type | CSV Format | Example | Gotcha |
| --- | --- | --- | --- |
| String | Plain text | ACME Corp | Leading/trailing spaces cause lookup failures. TRIM all strings. |
| Date | MM/DD/YYYY | 01/15/2026 | Other formats cause silent failures [src6] |
| Integer | No thousand separators | 12345 | Commas cause type mismatch errors |
| Decimal | Period as separator | 123.45 | Regional comma-decimal causes failure |
| Boolean | TRUE/FALSE or 1/0 | TRUE | Do not use Yes/No |
| Lookup/FK | Exact match | CUST001 | Case-sensitive. Must match existing record. |
| Multi-value | Tilde (~) separated | VAL1~VAL2 | Epicor-specific separator |
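
These rules are mechanical enough to pre-check before handing a file to DMT. A minimal cell validator -- a sketch that covers the formats in the table; it cannot verify FK existence, which only Epicor itself can:

```python
import re

DATE_RE = re.compile(r"^(0[1-9]|1[0-2])/(0[1-9]|[12]\d|3[01])/\d{4}$")


def check_cell(value: str, dtype: str) -> bool:
    """Validate one CSV cell against the DMT formatting rules above."""
    if dtype == "date":
        return bool(DATE_RE.match(value))          # strict MM/DD/YYYY
    if dtype == "integer":
        return value.lstrip("-").isdigit()          # rejects "12,345"
    if dtype == "decimal":
        return "," not in value and bool(re.match(r"^-?\d+(\.\d+)?$", value))
    if dtype == "boolean":
        return value in {"TRUE", "FALSE", "1", "0"}  # Yes/No rejected
    if dtype == "string":
        return value == value.strip()                # no stray whitespace
    return True


print(check_cell("01/15/2026", "date"))   # True
print(check_cell("2026-01-15", "date"))   # False
print(check_cell("1,234.5", "decimal"))   # False
```

Running a pass like this over every cell before import turns silent date failures into loud pre-flight errors.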

Data Type Gotchas

Error Handling & Failure Points

Common Error Messages

| Error | Meaning | Cause | Resolution |
| --- | --- | --- | --- |
| "Field references invalid value" | Lookup field unrecognized | FK value doesn't exist in Epicor | Verify referenced record exists before importing dependents |
| "Required field is empty" | Mandatory field missing | CSV cell is blank | Populate all required fields. Use Template Builder. |
| "Record already exists" | Duplicate primary key | Adding a record that exists | Use "Update" instead of "Add" |
| "Business object validation failed" | BO rule violation | Invalid data (bad date, negative qty) | Review specific message, correct data |
| "Column not found" | CSV header mismatch | Column name doesn't match field name | Use Template Builder for correct headers |
| "Access denied" | Insufficient permissions | User lacks security role | Grant Epicor security roles to DMT user |
| BPM error messages | Custom logic failure | BPM rule rejected the record | Review BPM conditions or disable during import |

Failure Points in Production

Anti-Patterns

Wrong: Loading all 100K records in a single instance

```bat
REM BAD -- single instance for 100K records takes 18-55 hours
DMT.exe -User=admin -Pass=pass -Source="all_100k_parts.csv" -Add -NoUI
```

Correct: Split into chunks and run parallel

```bat
REM GOOD -- 10 parallel instances, each 10K records. Total: 2-5 hours.
for /L %%i in (1,1,10) do (
    start "" DMT.exe -User=admin -Pass=pass -Source="chunk_%%i.csv" -Add -NoUI
)
```

Wrong: Leaving all BPMs active during bulk import

```bat
REM BAD -- email BPM fires 100,000 times. Speed drops to 30 RPM.
REM Mail server queue backs up for hours.
```

Correct: Disable or condition BPMs first

```bat
REM GOOD -- add BPM condition: skip when CallContext.ClientType == "DMT"
REM Speed stays at 55-90 RPM. Re-enable after import.
```

Wrong: Importing child records before parent records

```bat
REM BAD -- BOMs before Parts. Every record fails.
DMT.exe -Source="bom_materials.csv" -Add -NoUI
DMT.exe -Source="parts.csv" -Add -NoUI
```

Correct: Use Playbook with dependency order

```bat
REM GOOD -- PlayBook enforces: Parts > Revisions > Operations > BOMs
DMT.exe -User=admin -Pass=pass -PlayBook="migration_playbook.txt" -NoUI
```

Common Pitfalls

Diagnostic Commands

```bat
REM Check DMT version
"C:\Epicor\ERP10\DMT.exe" -Version

REM Test connectivity with single record
"C:\Epicor\ERP10\DMT.exe" -User=test -Pass=pass -Source="single_record.csv" -Add -NoUI

REM Export data for verification
"C:\Epicor\ERP10\DMT.exe" -User=admin -Pass=pass -Export -BAQ="PartCount" -Target="verify.csv" -UseFieldNames -NoUI

REM Check DMT log files
dir "C:\Users\%USERNAME%\AppData\Local\Epicor\DMT\Logs\"

REM Monitor running DMT processes
tasklist /fi "imagename eq DMT.exe"
```

Version History & Compatibility

| Version | Release | Status | Changes | Notes |
| --- | --- | --- | --- | --- |
| DMT 4.0.50.1+ (Kinetic 2024.1) | 2024-Q1 | Current | Server Processing flag | Default batch: 200 records. Limited tables. |
| DMT (ERP 10.2.700) | 2023 | Supported | BAQ Export via CLI | -Export -BAQ= arguments added |
| DMT (ERP 10.2.500) | 2022 | Supported | Bundled with ERP | No separate installer needed |
| DMT (ERP 10.2.x) | 2020-2022 | Supported | PlayList/PlayBook support | CSV-based orchestration |
| DMT (ERP 10.1.x) | 2018-2020 | Legacy | CLI arguments | Basic automation introduced |
| DMT (ERP 9.x) | Pre-2018 | EOL | GUI-only | No CLI or automation |

When to Use / When Not to Use

| Use When | Don't Use When | Use Instead |
| --- | --- | --- |
| One-time data migration at go-live | Real-time integration requiring <1s latency | Epicor REST API with webhooks |
| Bulk master data loads >1,000 records | Continuous bidirectional sync | Epicor REST API + integration platform |
| Recurring batch updates from CSV | Complex transformation mid-import | ETL tool feeding Epicor REST API |
| Initial inventory balance loading | Direct SQL bypassing business rules | Never bypass DMT/BO for production data |
| Bulk price list updates | Mobile/web app data entry | Epicor Kinetic UI or REST API |

Important Caveats

Related Units