This card covers the Epicor Data Management Tool (DMT), a desktop client application bundled with Epicor ERP (version 10.2.500+) and Epicor Kinetic. DMT is the official tool for bulk importing, updating, and deleting records in Epicor ERP using CSV files. It processes data through Epicor's business objects (BOs), meaning all standard validation rules, BPMs, and data directives execute on every imported record. DMT is NOT a direct database import tool -- it does not bypass business logic. [src1, src2]
| Property | Value |
|---|---|
| Vendor | Epicor |
| System | Epicor Kinetic (ERP) 2024.1+ |
| API Surface | Desktop client (GUI + CLI), processes via Business Objects |
| Current Version | DMT 4.0.50.1+ (ships with Epicor Kinetic) |
| Editions Covered | All editions (Standard, Enterprise, Cloud) |
| Deployment | Hybrid (on-premise client connects to on-premise or cloud server) |
| API Docs | Epicor DMT Documentation |
| Status | GA |
DMT is not a traditional API -- it is a client-side tool that internally calls Epicor's business objects. It exposes multiple interaction surfaces for automation. [src1]
| Surface | Interface | Best For | Automation? | Parallel? | Notes |
|---|---|---|---|---|---|
| DMT GUI | Windows desktop app | Small imports (<1,000 records), one-time loads | No | Manual parallel tabs | Template builder, validation preview, pause/cancel |
| DMT CLI | DMT.exe with arguments | Scheduled/automated imports | Yes (batch/PowerShell) | Yes (multiple processes) | -User, -Pass, -Source, -NoUI arguments |
| PlayLists | CSV file listing imports | Multi-table sequential loads | Yes | Sequential within playlist | CSV of CLI arguments with headers |
| PlayBooks | Text file listing playlists | Full migration orchestration | Yes | Sequential playlists | References multiple PlayList files |
| BAQ Export | -Export -BAQ= CLI args | Data extraction from Epicor | Yes | Yes (multiple processes) | Outputs CSV only |
| Server Processing | Server-side batch flag | Faster processing for supported tables | Yes | N/A | Default batch size: 200. Limited table support. |
| Limit Type | Value | Applies To | Notes |
|---|---|---|---|
| Processing throughput | 30-90 RPM (records per minute) | All DMT imports | Varies by table complexity, BPMs, server hardware [src3] |
| Throughput with BPMs disabled | 55-90 RPM | Tables with many BPMs | Disabling non-essential BPMs can double speed |
| Throughput with BPMs active | 30-50 RPM | Tables with complex BPMs | Email-sending BPMs are worst offenders |
| Server Processing batch size | 200 records (default) | Server Processing-enabled tables only | Reported 4x faster than standard processing [src4] |
| Limit Type | Value | Applies To | Notes |
|---|---|---|---|
| Max file size | No hard limit | CSV import files | Practical limit: files >100K rows degrade performance |
| Recommended chunk size | 10,000 records/file | Parallel processing | Best results: 10 files of 10K vs one 100K file [src3] |
| File format | CSV (comma-delimited) only | Playlists and CLI | Excel must be saved as CSV first |
| File encoding | UTF-8 | All CSV files | Required for special characters [src6] |
| Date format | MM/DD/YYYY | All date fields | Non-compliant formats cause import failure |
| Decimal separator | Period (.) | All numeric fields | Remove thousand separators (commas) |
| Concurrent instances | ~10-12 per machine | Parallel imports | Limited by client CPU/RAM (85% CPU typical) [src3] |
| Record Count | Single Instance | 5 Instances | 10 Instances | Notes |
|---|---|---|---|---|
| 1,000 | 11-33 min | 3-7 min | 2-4 min | Minimal benefit from parallelism |
| 10,000 | 1.8-5.5 hours | 22-66 min | 11-33 min | Ideal chunk size |
| 50,000 | 9-28 hours | 2-6 hours | 1-3 hours | Must split files |
| 100,000 | 18-55 hours | 4-11 hours | 2-5.5 hours | 10 files of 10K recommended [src3] |
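The timings above follow from simple division: wall-clock minutes ≈ records / (RPM × instances). A quick sketch for planning (the `import_minutes` helper is illustrative, not part of DMT):

```python
def import_minutes(records: int, rpm: float, instances: int = 1) -> float:
    """Estimated wall-clock minutes: records divided by aggregate records-per-minute."""
    return records / (rpm * instances)

# 100,000 records at 30-90 RPM across 10 parallel instances
fast = import_minutes(100_000, 90, 10) / 60  # best case, in hours
slow = import_minutes(100_000, 30, 10) / 60  # worst case, in hours
print(f"{fast:.1f}-{slow:.1f} hours")  # roughly 1.9-5.6 hours
```

This matches the table: 1,000 records at 30-90 RPM on a single instance gives 11-33 minutes.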
| Method | Use When | Credential Type | Refresh? | Notes |
|---|---|---|---|---|
| GUI login | Interactive use | Username + password | N/A | Standard Epicor login dialog [src1] |
| CLI -User/-Pass | Automated/scripted | Plaintext in arguments | N/A | Visible in the process list; restrict access to script files [src1] |
| Windows auth (SSO) | On-premise with AD | Domain credentials | N/A | When Epicor configured for Windows auth |
START - Bulk import data into Epicor via DMT
|-- How many records?
| |-- < 1,000 records
| | --> DMT GUI: interactive mode, preview validation, single file
| |-- 1,000 - 10,000 records
| | --> DMT CLI: single instance, -NoUI for unattended
| |-- 10,000 - 100,000 records
| | |-- Split into 10,000-record CSV files
| | |-- Launch 5-10 parallel DMT.exe instances
| | --> Disable non-essential BPMs first
| |-- > 100,000 records
| |-- Split into 10,000-record chunks
| |-- Run across multiple client machines
| --> Plan for 4-11 hours with 10 parallel instances
|-- What operation?
| |-- Add new --> Check "Add", ensure parent records exist
| |-- Update existing --> Check "Update", match on primary keys
| |-- Delete --> Check "Delete", backup first
| |-- Mixed --> Check both "Add" and "Update"
|-- One-time or recurring?
| |-- One-time (go-live) --> Playbooks, pre-load + delta at cutover
| |-- Recurring --> CLI + Windows Task Scheduler + error monitoring
|-- Load order (mandatory dependency sequence):
1. Company, Sites, Plants
2. Chart of Accounts, GL Accounts
3. Customers, Vendors, Suppliers
4. Part Classes, Part Groups
5. Parts (Part Master)
6. Part Revisions
7. Bill of Operations (BOO)
8. Bill of Materials (BOM)
9. Inventory (initial balances)
10. Open Orders, POs, Jobs
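The dependency sequence can be enforced in a wrapper script by sorting import files against the list above before launching DMT. A minimal sketch, assuming a file-naming convention of `TableName_description.csv` (the prefix convention is an assumption, not a DMT rule):

```python
import os

# Mandatory dependency sequence from the load-order list above
LOAD_ORDER = [
    "Company", "COA", "Customer", "PartClass", "Part",
    "PartRev", "BOO", "BOM", "Inventory", "OpenOrders",
]

def sort_by_load_order(csv_files):
    """Order import files so parent tables always load before dependents.
    Assumes each file name starts with its table name followed by an
    underscore, e.g. Part_master.csv."""
    def rank(path):
        table = os.path.basename(path).split("_")[0]
        return LOAD_ORDER.index(table) if table in LOAD_ORDER else len(LOAD_ORDER)
    return sorted(csv_files, key=rank)

print(sort_by_load_order(["BOM_materials.csv", "Part_master.csv", "PartRev_revs.csv"]))
# -> ['Part_master.csv', 'PartRev_revs.csv', 'BOM_materials.csv']
```

Unknown table prefixes sort last, so a mislabeled file cannot silently jump ahead of its parents.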
| CLI Argument | Purpose | Example | Notes |
|---|---|---|---|
| -User= | Epicor username | -User=manager | Required for CLI mode |
| -Pass= | Epicor password | -Pass=epicor123 | Plaintext; restrict access to script files |
| -Source= | CSV file path | -Source="C:\data\parts.csv" | Full path to import file |
| -NoUI | Suppress GUI | -NoUI | Required for unattended automation |
| -Add | Add new records | -Add | Cannot be used with -Delete |
| -Update | Update existing records | -Update | Can combine with -Add |
| -Delete | Delete records | -Delete | Use with caution |
| -Export | Export mode (BAQ) | -Export | Switches to export mode |
| -BAQ= | BAQ query name | -BAQ="PartExport" | Used with -Export |
| -Target= | Export output path | -Target="C:\export\parts.csv" | CSV output only |
| -UseFieldNames | Include headers | -UseFieldNames | Adds column headers to export |
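Because -Pass= is plaintext and visible in the process list, one mitigation is to keep credentials in environment variables and assemble the argument list in a wrapper. A sketch (the `DMT_USER`/`DMT_PASS` variable names and path are illustrative assumptions):

```python
import os

DMT_PATH = r"C:\Epicor\ERP10\DMT.exe"  # adjust to your install path

def build_dmt_cmd(source_csv, add=True, update=False, no_ui=True):
    """Assemble a DMT CLI argument list, pulling credentials from the
    DMT_USER / DMT_PASS environment variables instead of hard-coding them."""
    cmd = [
        DMT_PATH,
        f"-User={os.environ['DMT_USER']}",
        f"-Pass={os.environ['DMT_PASS']}",
        f"-Source={source_csv}",
    ]
    if add:
        cmd.append("-Add")
    if update:
        cmd.append("-Update")
    if no_ui:
        cmd.append("-NoUI")
    return cmd

# subprocess.run(build_dmt_cmd(r"C:\data\parts.csv"))  # launch when ready
```

The password still appears in the process arguments at runtime; this only keeps it out of script files and version control.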
Open DMT, select the target table, click "Template Builder." Required fields are pre-checked. Export the empty CSV template and populate with your data. [src5]
# Launch DMT GUI to build template
start C:\Epicor\ERP10\DMT.exe
# In DMT GUI: select table > Template Builder > check fields > Save CSV
Verify: Click "Validate Columns" in DMT to confirm all required fields are present.
Ensure all data meets DMT formatting requirements: MM/DD/YYYY dates, no thousand separators, UTF-8 encoding, trimmed whitespace. [src6]
import pandas as pd

df = pd.read_csv("raw_data.csv", encoding="utf-8")

# Reformat date columns to MM/DD/YYYY. read_csv loads dates as plain
# strings, so name the date fields explicitly (example column names --
# replace with your table's actual date fields).
date_cols = []  # e.g. ["OrderDate", "NeedByDate"]
for col in date_cols:
    df[col] = pd.to_datetime(df[col]).dt.strftime("%m/%d/%Y")

# Remove thousand separators and trim whitespace in text columns
# (note: this also strips commas from free-text values like "ACME, Inc.")
for col in df.select_dtypes(include=["object"]).columns:
    df[col] = df[col].str.replace(",", "", regex=False).str.strip()

df.to_csv("cleaned_for_dmt.csv", index=False, encoding="utf-8")
Verify: Dates are MM/DD/YYYY, no commas in numbers, no leading/trailing whitespace.
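The verification step can itself be scripted with a pre-flight format check before handing the file to DMT. A sketch encoding the card's formatting rules (the regex patterns and `check_dmt_format` helper are illustrative):

```python
import re
import pandas as pd

DATE_RE = re.compile(r"^\d{2}/\d{2}/\d{4}$")  # MM/DD/YYYY

def check_dmt_format(path, date_cols=()):
    """Return a list of formatting problems: untrimmed whitespace,
    non-MM/DD/YYYY dates, and thousand separators in numeric cells."""
    df = pd.read_csv(path, encoding="utf-8", dtype=str).fillna("")
    problems = []
    for col in df.columns:
        for i, val in df[col].items():
            if val != val.strip():
                problems.append(f"row {i}, {col}: untrimmed whitespace")
            if col in date_cols and val and not DATE_RE.match(val):
                problems.append(f"row {i}, {col}: date not MM/DD/YYYY")
            if re.fullmatch(r"\d{1,3}(,\d{3})+(\.\d+)?", val):
                problems.append(f"row {i}, {col}: thousand separator in number")
    return problems
```

An empty result means the file passes all three checks; otherwise each entry names the offending row and column.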
For datasets over 10,000 records, split into chunks for parallel DMT execution. [src3]
import pandas as pd, math
df = pd.read_csv("cleaned_for_dmt.csv", encoding="utf-8")
chunk_size = 10000
for i in range(math.ceil(len(df) / chunk_size)):
    chunk = df[i * chunk_size : (i + 1) * chunk_size]
    chunk.to_csv(f"dmt_chunk_{i+1:03d}.csv", index=False, encoding="utf-8")
Verify: Each chunk has correct headers. Total rows across chunks = original file.
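The row-count check can be automated as well. A small sketch (file names follow the chunking example above):

```python
import glob
import pandas as pd

def rows(path):
    """Data row count of a CSV (header excluded)."""
    return len(pd.read_csv(path, encoding="utf-8"))

def verify_chunks(source_csv, chunk_pattern="dmt_chunk_*.csv"):
    """True when chunk row counts sum exactly to the source file's count."""
    total = sum(rows(f) for f in glob.glob(chunk_pattern))
    return total == rows(source_csv), total
```

Reading each chunk with pandas also confirms every file still parses with its header row intact.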
Identify and disable BPMs that slow imports (email notifications, workflow triggers). [src3]
In Epicor Kinetic:
1. System Setup > Business Process Management
2. Identify BPMs on target table
3. Disable non-essential BPMs or add condition: "Client Type <> DMT"
4. Document which BPMs were disabled
Verify: Test 10-record import. Target speed: 55-90 RPM without BPMs.
Launch multiple DMT instances simultaneously, each processing a different chunk. [src3]
@echo off
start "" "C:\Epicor\ERP10\DMT.exe" -User=dmtuser -Pass=pass -Source="C:\data\dmt_chunk_001.csv" -Add -NoUI
start "" "C:\Epicor\ERP10\DMT.exe" -User=dmtuser -Pass=pass -Source="C:\data\dmt_chunk_002.csv" -Add -NoUI
REM ... repeat for all chunks
echo All DMT instances launched. Monitor Task Manager for completion.
Verify: Monitor Task Manager for DMT.exe processes. Check log files for errors.
After imports complete, verify data integrity and restore BPMs. [src5]
1. Run BAQ queries to count imported records
2. Compare counts against source data
3. Spot-check 10-20 records for field accuracy
4. Re-enable all disabled BPMs
5. Review DMT error logs, re-import failed records
Verify: Record counts match source. Error log shows zero or expected failures. BPMs re-enabled.
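Steps 1-2 and 5 can be scripted by comparing the key column of a BAQ export against the source file and writing any missing rows to a re-import file. A sketch (the `PartNum` key column is an example; substitute your table's primary key):

```python
import pandas as pd

def find_missing(source_csv, export_csv, key="PartNum"):
    """Rows present in the source file but absent from the BAQ export --
    candidates for re-import after reviewing the DMT error log."""
    src = pd.read_csv(source_csv, encoding="utf-8", dtype=str)
    exp = pd.read_csv(export_csv, encoding="utf-8", dtype=str)
    return src[~src[key].isin(exp[key])]

# find_missing("cleaned_for_dmt.csv", "verify.csv").to_csv("reimport.csv", index=False)
```

Reading both files with `dtype=str` avoids false mismatches from pandas inferring different types for the key column (e.g. `001` vs `1`).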
# Input: Directory of chunk CSV files, DMT.exe path, credentials
# Output: Parallel DMT execution with completion monitoring
import subprocess, os, time, glob
DMT_PATH = r"C:\Epicor\ERP10\DMT.exe"
DATA_DIR = r"C:\data\dmt_chunks"
MAX_PARALLEL = 10
chunk_files = sorted(glob.glob(os.path.join(DATA_DIR, "dmt_chunk_*.csv")))
processes = []
for csv_file in chunk_files:
    # Throttle: wait for a free slot before launching the next instance
    while len([p for p in processes if p.poll() is None]) >= MAX_PARALLEL:
        time.sleep(5)
    cmd = [DMT_PATH, "-User=dmtuser", "-Pass=pass", f"-Source={csv_file}", "-Add", "-NoUI"]
    proc = subprocess.Popen(cmd)
    processes.append(proc)
    print(f"Launched: {os.path.basename(csv_file)} (PID: {proc.pid})")

for proc in processes:
    proc.wait()
print("All DMT imports complete.")
# Input: PlayList CSV with import definitions
# Output: Sequential DMT imports
$dmtPath = "C:\Epicor\ERP10\DMT.exe"
$playlistPath = "C:\data\playlist.csv"
$startTime = Get-Date
$proc = Start-Process -FilePath $dmtPath `
-ArgumentList "-User=dmtuser", "-Pass=pass", "-PlayList=$playlistPath", "-NoUI" `
-Wait -PassThru
$duration = (Get-Date) - $startTime
Write-Host "Completed in $($duration.TotalMinutes) minutes. Exit: $($proc.ExitCode)"
@echo off
REM Export current data before bulk operations
start /wait "" "C:\Epicor\ERP10\DMT.exe" -User=admin -Pass=pass ^
-Export -BAQ="PartExport" -Target="C:\backup\parts.csv" -UseFieldNames -NoUI
echo Export complete.
| Data Type | CSV Format | Example | Gotcha |
|---|---|---|---|
| String | Plain text | ACME Corp | Leading/trailing spaces cause lookup failures. TRIM all strings. |
| Date | MM/DD/YYYY | 01/15/2026 | Other formats cause silent failures [src6] |
| Integer | No thousand separators | 12345 | Commas cause type mismatch errors |
| Decimal | Period as separator | 123.45 | Regional comma-decimal causes failure |
| Boolean | TRUE/FALSE or 1/0 | TRUE | Do not use Yes/No |
| Lookup/FK | Exact match | CUST001 | Case-sensitive. Must match existing record. |
| Multi-value | Tilde (~) separated | VAL1~VAL2 | Epicor-specific separator |
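Formatting helpers matching the table above can keep these rules in one place. A sketch (the helper names are illustrative; TRUE/FALSE and the tilde separator come from the table):

```python
def fmt_bool(value: bool) -> str:
    """Booleans as TRUE/FALSE -- never Yes/No."""
    return "TRUE" if value else "FALSE"

def fmt_multi(values) -> str:
    """Multi-value fields use Epicor's tilde separator."""
    return "~".join(str(v).strip() for v in values)

def fmt_decimal(value: float) -> str:
    """Period as decimal separator, no thousand separators."""
    return f"{value:.2f}"

print(fmt_bool(True), fmt_multi(["VAL1", "VAL2"]), fmt_decimal(1234.5))
# TRUE VAL1~VAL2 1234.50
```

Applying these during the data-preparation step avoids the regional-format gotchas listed above.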
| Error | Meaning | Cause | Resolution |
|---|---|---|---|
| "Field references invalid value" | Lookup field unrecognized | FK value doesn't exist in Epicor | Verify referenced record exists before importing dependents |
| "Required field is empty" | Mandatory field missing | CSV cell is blank | Populate all required fields. Use Template Builder. |
| "Record already exists" | Duplicate primary key | Adding a record that exists | Use "Update" instead of "Add" |
| "Business object validation failed" | BO rule violation | Invalid data (bad date, negative qty) | Review specific message, correct data |
| "Column not found" | CSV header mismatch | Column name doesn't match field name | Use Template Builder for correct headers |
| "Access denied" | Insufficient permissions | User lacks security role | Grant Epicor security roles to DMT user |
| BPM error messages | Custom logic failure | BPM rule rejected the record | Review BPM conditions or disable during import |
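After an unattended run, the errors above can be triaged by scanning the DMT log directory for lines containing error text. A sketch -- the default log path and the plain "Error" keyword match are assumptions; adjust both to your DMT version's log format:

```python
import glob
import os

# Assumed default log location; adjust for your environment
LOG_DIR = os.path.expandvars(r"%LOCALAPPDATA%\Epicor\DMT\Logs")

def scan_logs(log_dir=LOG_DIR, keyword="Error"):
    """Collect (file, line number, text) for every log line mentioning the keyword."""
    hits = []
    for path in glob.glob(os.path.join(log_dir, "*")):
        if not os.path.isfile(path):
            continue
        with open(path, encoding="utf-8", errors="replace") as f:
            for n, line in enumerate(f, 1):
                if keyword.lower() in line.lower():
                    hits.append((os.path.basename(path), n, line.strip()))
    return hits
```

The hits can then be matched against the error table above to decide whether to fix data, adjust keys, or review BPMs.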
- Add BPM condition: skip when Client Type = DMT, or disable email BPMs during bulk loads. [src3]
- Always load in dependency order. Use Playbooks to enforce sequence. [src5]
- Set RedirectStandardOutput = false or asynchronously read stdout. [src3]
- Save as CSV UTF-8 in Excel. Verify encoding before import. [src6]
- Limit to 8-10 instances per machine. Distribute across multiple machines. [src3]

REM BAD -- single instance for 100K records takes 18-55 hours
DMT.exe -User=admin -Pass=pass -Source="all_100k_parts.csv" -Add -NoUI
REM GOOD -- 10 parallel instances, each 10K records. Total: 2-5 hours.
for /L %%i in (1,1,10) do (
start "" DMT.exe -User=admin -Pass=pass -Source="chunk_%%i.csv" -Add -NoUI
)
REM BAD -- email BPM fires 100,000 times. Speed drops to 30 RPM.
REM Mail server queue backs up for hours.
REM GOOD -- add BPM condition: skip when CallContext.ClientType == "DMT"
REM Speed stays at 55-90 RPM. Re-enable after import.
REM BAD -- BOMs before Parts. Every record fails.
DMT.exe -Source="bom_materials.csv" -Add -NoUI
DMT.exe -Source="parts.csv" -Add -NoUI
REM GOOD -- Playbook enforces: Parts > Revisions > Operations > BOMs
DMT.exe -User=admin -Pass=pass -PlayBook="migration_playbook.txt" -NoUI
- Always test with 50-100 records before the full import. [src5]
- Save As > CSV (Comma delimited) or CSV UTF-8 in Excel. [src1]
- Click the Validate Columns hyperlink after loading the CSV. [src5]
- Export via BAQ Export before any delete operation. [src1]
- Schedule during off-hours. Coordinate with Epicor admins. [src7]
- Use the DMT version matching your Epicor build. Included since 10.2.500. [src1]

REM Check DMT version
"C:\Epicor\ERP10\DMT.exe" -Version
REM Test connectivity with single record
"C:\Epicor\ERP10\DMT.exe" -User=test -Pass=pass -Source="single_record.csv" -Add -NoUI
REM Export data for verification
"C:\Epicor\ERP10\DMT.exe" -User=admin -Pass=pass -Export -BAQ="PartCount" -Target="verify.csv" -UseFieldNames -NoUI
REM Check DMT log files
dir "C:\Users\%USERNAME%\AppData\Local\Epicor\DMT\Logs\"
REM Monitor running DMT processes
tasklist /fi "imagename eq DMT.exe"
| Version | Release | Status | Changes | Notes |
|---|---|---|---|---|
| DMT 4.0.50.1+ (Kinetic 2024.1) | 2024-Q1 | Current | Server Processing flag | Default batch: 200 records. Limited tables. |
| DMT (ERP 10.2.700) | 2023 | Supported | BAQ Export via CLI | -Export -BAQ= arguments added |
| DMT (ERP 10.2.500) | 2022 | Supported | Bundled with ERP | No separate installer needed |
| DMT (ERP 10.2.x) | 2020-2022 | Supported | PlayList/PlayBook support | CSV-based orchestration |
| DMT (ERP 10.1.x) | 2018-2020 | Legacy | CLI arguments | Basic automation introduced |
| DMT (ERP 9.x) | Pre-2018 | EOL | GUI-only | No CLI or automation |
| Use When | Don't Use When | Use Instead |
|---|---|---|
| One-time data migration at go-live | Real-time integration requiring <1s latency | Epicor REST API with webhooks |
| Bulk master data loads >1,000 records | Continuous bidirectional sync | Epicor REST API + integration platform |
| Recurring batch updates from CSV | Complex transformation mid-import | ETL tool feeding Epicor REST API |
| Initial inventory balance loading | Direct SQL bypassing business rules | Never bypass DMT/BO for production data |
| Bulk price list updates | Mobile/web app data entry | Epicor Kinetic UI or REST API |