Infor Data Lake is the centralized data repository within Infor OS (the cloud platform underpinning all Infor CloudSuites). Data from Infor applications — M3, LN, CloudSuite Industrial, CloudSuite Distribution, and others — is automatically published to Data Lake via ION messaging. The Compass query engine sits atop Data Lake and provides SQL-like query capabilities for extracting this data. [src3]
This card covers the Compass v2 REST API available through the Infor Data Fabric Suite, which replaced the deprecated ION API Suite Compass endpoints (removed April 2025). It does NOT cover the Data Lake API (for hierarchical document retrieval), ION Data Lake Flows (for automated S3-based data routing), or the Compass UI within the Data Fabric application. [src1, src5]
| Property | Value |
|---|---|
| Vendor | Infor |
| System | Infor Data Lake (Data Fabric / Compass v2) |
| API Surface | REST (asynchronous job-based) |
| Current API Version | v2 (Data Fabric Suite, 2025.x) |
| Editions Covered | All Infor OS Cloud editions with Data Lake entitlement |
| Deployment | Cloud (Infor multi-tenant) |
| API Docs | Infor Data Fabric User Guide |
| Status | GA (v2 via Data Fabric Suite); v1/v2 via ION API Suite deprecated |
Infor provides multiple ways to extract data from Data Lake. The Compass API is the primary programmatic interface for SQL-based querying. [src5]
| API Surface | Protocol | Best For | Max Records/Request | Rate Limit | Real-time? | Bulk? |
|---|---|---|---|---|---|---|
| Compass v2 API (Data Fabric) | HTTPS/REST | Bulk SQL extraction, ETL | 100K rows / 10 MB per page | 100 jobs/min/tenant | No (async) | Yes |
| Compass JDBC Driver | JDBC | BI tools (Tableau, Power BI, DBeaver) | Driver-managed pagination | Shares Compass limits | No | Yes |
| Data Lake API | HTTPS/REST | Hierarchical document retrieval | Limited filtering | ION API limits | Yes | No |
| ION Data Lake Flows | ION Connect | Automated S3 export, CDC delta loads | Full table + deltas | Flow-based | No | Yes |
| Atlas UI | Browser | Interactive data browsing | N/A | N/A | Yes | No |
| Limit Type | Value | Applies To | Notes |
|---|---|---|---|
| Max rows per result page | 100,000 | Compass v2 /result endpoint | Whichever limit (rows or size) is hit first |
| Max result page size | 10 MB | Compass v2 /result endpoint | Use smaller page sizes for wide tables |
| Query timeout | 60 minutes | All Compass queries | Timed-out queries still consume compute time |
| Result expiration | ~20 hours | Completed query results | Must download before expiry |
| Content-Type header | text/plain | v2 /jobs endpoint | Required for query submission |
| Limit Type | Value | Window | Edition Differences |
|---|---|---|---|
| POST /v2/compass/jobs | 100 calls | Per minute, per tenant | Shared across all users in tenant |
| POST /v2/compass/{queryId}/status | 1,000 calls | Per minute, per tenant | --- |
| GET /v2/compass/jobs/{queryId}/result | 10,000 calls | Per minute, per tenant | --- |
| PUT /v2/compass/{queryId}/cancel | 1,000 calls | Per minute, per tenant | --- |
| Compute time | Metered | Per tenant | Subject to Infor OS subscription tier |
| Egress | Metered | Per tenant | Charged when data leaves Infor cloud |
All Compass API calls go through the ION API Gateway, which uses OAuth 2.0 for authentication. Credentials are provisioned as a .ionapi file downloaded from the Infor OS Portal. [src1, src6]
| Flow | Use When | Token Lifetime | Refresh? | Notes |
|---|---|---|---|---|
| OAuth 2.0 Resource Owner (Service Account) | Server-to-server, unattended ETL | ~2 hours | Yes (via refresh token) | Use saak/sask from .ionapi file as username/password to obtain bearer token |
| OAuth 2.0 Authorization Code | User-context, interactive tools | ~2 hours | Yes | Requires redirect URI; used by Compass JDBC and BI tools |
The .ionapi file contains ALL credentials (tenant ID, client ID/secret, service account keys, OAuth endpoints). Treat it like a private key — never commit to source control. [src6, src8]

The token endpoint URL is constructed as pu (base URL) + ot (token path). The saak goes as username and sask as password in the token request. [src6]

START — Extract data from Infor Data Lake
|-- What data format?
| |-- Flat (DSV or newline-delimited JSON)
| | |-- Data volume < 100K rows per query?
| | | |-- YES --> Compass v2 API (single page result)
| | | +-- NO --> Compass v2 API with pagination (offset/limit)
| | |-- Need SQL joins across tables?
| | | |-- YES --> Compass v2 API (supports JOINs) [src4]
| | | +-- NO --> Data Lake API may also work for simple lookups [src5]
| | +-- Need BI tool integration?
| | |-- YES --> Compass JDBC driver (DBeaver, Tableau, Power BI) [src3]
| | +-- NO --> Compass v2 REST API
| +-- Hierarchical/nested JSON
| +-- Data Lake API (not Compass) [src5]
|-- What's the extraction pattern?
| |-- One-time or ad-hoc extraction
| | +-- Compass v2 API or Compass UI [src3]
| |-- Scheduled recurring ETL
| | |-- < 60 min query time?
| | | |-- YES --> Compass v2 API with job scheduler
| | | +-- NO --> ION Data Lake Flows (S3 export) [src5]
| | +-- Need incremental/delta loads?
| | |-- YES --> Filter on infor.lastModified() property [src5]
| | +-- NO --> Full table query
| +-- Automated S3 export with CDC
| +-- ION Data Lake Flows (first load = full, subsequent = delta) [src5]
|-- Authentication approach?
| |-- Service account (unattended) --> OAuth 2.0 with saak/sask from .ionapi [src6]
| +-- User context (interactive) --> OAuth 2.0 authorization code flow
+-- Error tolerance?
|-- Zero-loss --> Implement idempotency via infor.DataObjectId tracking
+-- Best-effort --> Simple retry on FAILED status
| Operation | Method | Endpoint | Payload | Notes |
|---|---|---|---|---|
| Submit query | POST | /v2/compass/jobs | SQL query as text/plain body | Returns queryID for tracking [src1, src2] |
| Check status | POST | /v2/compass/{queryId}/status | None | Returns RUNNING, FINISHED, FAILED, or CANCELLED [src2] |
| Get results (paginated) | GET | /v2/compass/jobs/{queryId}/result?offset=0&limit=1000 | None | Supports text/csv or application/x-ndjson via Accept header [src2] |
| Cancel query | PUT | /v2/compass/{queryId}/cancel | None | Stops long-running queries [src2] |
| Get OAuth token | POST | {pu}{ot} (from .ionapi) | grant_type, username (saak), password (sask), client_id, client_secret | Returns access_token (~2h lifetime) [src6] |
Navigate to Infor OS Portal > ION API > Authorized Apps. Create or select a Backend Service authorized app. Click "Download Credentials" and select "Create Service Account." Save the .ionapi file securely. [src6, src8]
{
"ti": "TENANT_ID",
"cn": "common_name",
"ci": "client_id",
"cs": "client_secret",
"iu": "https://mingle-ionapi.inforcloudsuite.com/TENANT_ID",
"pu": "https://mingle-sso.inforcloudsuite.com:443/TENANT_ID/as/",
"oa": "authorization.oauth2",
"ot": "token.oauth2",
"or": "revoke_token.oauth2",
"saak": "SERVICE_ACCOUNT_ACCESS_KEY",
"sask": "SERVICE_ACCOUNT_SECRET_KEY"
}
Verify: Open the .ionapi file and confirm all fields are populated. The saak and sask fields must be present.
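The verify step can be automated with a minimal Python sketch that fails fast when a field the later examples rely on is missing or empty (the REQUIRED list and function name are illustrative):

```python
import json

# Field keys (ti, ci, cs, iu, pu, ot, saak, sask) follow the .ionapi
# layout shown above; REQUIRED lists the ones this card's examples use.
REQUIRED = ["ti", "ci", "cs", "iu", "pu", "ot", "saak", "sask"]

def check_ionapi(path):
    """Return the parsed credentials, or raise if a needed field is absent/empty."""
    with open(path) as f:
        creds = json.load(f)
    missing = [k for k in REQUIRED if not creds.get(k)]
    if missing:
        raise ValueError(f"{path} is missing fields: {missing}")
    return creds
```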
Use the token endpoint URL (constructed from pu + ot) with the service account credentials. [src6]
TOKEN_URL="https://mingle-sso.inforcloudsuite.com:443/TENANT_ID/as/token.oauth2"
ACCESS_TOKEN=$(curl -s -X POST "$TOKEN_URL" \
-H "Content-Type: application/x-www-form-urlencoded" \
-d "grant_type=password" \
-d "username=SERVICE_ACCOUNT_ACCESS_KEY" \
-d "password=SERVICE_ACCOUNT_SECRET_KEY" \
-d "client_id=CLIENT_ID" \
-d "client_secret=CLIENT_SECRET" \
| jq -r '.access_token')
Verify: echo $ACCESS_TOKEN should output a JWT string. Token should not be null or empty.
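Since the access token is a JWT, its payload can be decoded locally to check remaining lifetime before each request. A hedged sketch, assuming a standard three-part JWT with an `exp` claim (this does not validate the signature and must not be used for authentication decisions):

```python
import base64, json, time

def jwt_seconds_remaining(token, now=None):
    """Decode the (unverified) JWT payload and report seconds until 'exp'."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped base64 padding
    payload = json.loads(base64.urlsafe_b64decode(payload_b64))
    return payload["exp"] - (now if now is not None else time.time())
```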
POST your SQL SELECT query to the /v2/compass/jobs endpoint with Content-Type text/plain. [src1, src2]
API_BASE="https://mingle-ionapi.inforcloudsuite.com/TENANT_ID/IONSERVICES/datafabric"
QUERY_ID=$(curl -s -X POST "$API_BASE/v2/compass/jobs" \
-H "Authorization: Bearer $ACCESS_TOKEN" \
-H "Content-Type: text/plain" \
-d "SELECT CONO, ITNO, ITDS, FUDS FROM MITMAS ORDER BY CONO, ITNO" \
| jq -r '.queryId')
Verify: Response contains a queryId string (UUID format).
Check the status endpoint until the query reaches FINISHED, FAILED, or CANCELLED state. [src2]
while true; do
  STATUS=$(curl -s -X POST "$API_BASE/v2/compass/$QUERY_ID/status" \
    -H "Authorization: Bearer $ACCESS_TOKEN" | jq -r '.status')
  echo "Status: $STATUS"
  case "$STATUS" in FINISHED|FAILED|CANCELLED) break ;; esac
  sleep 5
done
Verify: Status transitions from RUNNING to FINISHED. The rowCount field shows total available rows.
Use offset and limit parameters to page through the result set. Maximum 100,000 rows or 10 MB per page. [src2]
curl -s "$API_BASE/v2/compass/jobs/$QUERY_ID/result?offset=0&limit=10000" \
-H "Authorization: Bearer $ACCESS_TOKEN" \
-H "Accept: text/csv" \
-H "Accept-Encoding: gzip" \
-o "result_page_0.csv"
Verify: Output file contains header row + data rows matching expected count.
For result sets larger than one page, iterate with increasing offsets until all rows are retrieved. [src2]
# TOTAL_ROWS is the rowCount field from the status response
PAGE_SIZE=50000; OFFSET=0; PAGE=0
while [ "$OFFSET" -lt "$TOTAL_ROWS" ]; do
  curl -s "$API_BASE/v2/compass/jobs/$QUERY_ID/result?offset=$OFFSET&limit=$PAGE_SIZE" \
    -H "Authorization: Bearer $ACCESS_TOKEN" -H "Accept: text/csv" \
    -o "result_page_${PAGE}.csv"
  OFFSET=$((OFFSET + PAGE_SIZE)); PAGE=$((PAGE + 1))
done
Verify: Total rows across all pages equals the rowCount from status response.
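The verify step can be scripted: count data rows across the downloaded pages and compare the total against rowCount from the status response. A small sketch, assuming the `result_page_*.csv` naming used by the loop above:

```python
import csv, glob

def total_csv_rows(pattern="result_page_*.csv"):
    """Count data rows (excluding each file's header) across all result pages."""
    total = 0
    for path in sorted(glob.glob(pattern)):
        with open(path, newline="") as f:
            rows = sum(1 for _ in csv.reader(f))
        total += max(rows - 1, 0)  # subtract the header row
    return total
```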
# Input: .ionapi credentials file path, SQL query
# Output: All query results as a pandas DataFrame
import json, time, requests, pandas as pd
from io import StringIO
def compass_query(ionapi_path, sql, page_size=50000):
    with open(ionapi_path) as f:
        creds = json.load(f)
    # Get token
    token_resp = requests.post(f"{creds['pu']}{creds['ot']}", data={
        'grant_type': 'password', 'username': creds['saak'],
        'password': creds['sask'], 'client_id': creds['ci'],
        'client_secret': creds['cs']})
    token = token_resp.json()['access_token']
    base = f"{creds['iu']}/IONSERVICES/datafabric"
    hdrs = {'Authorization': f'Bearer {token}'}
    # Submit query
    qid = requests.post(f"{base}/v2/compass/jobs",
                        headers={**hdrs, 'Content-Type': 'text/plain'},
                        data=sql).json()['queryId']
    # Poll status until a terminal state
    while True:
        s = requests.post(f"{base}/v2/compass/{qid}/status", headers=hdrs).json()
        if s['status'] == 'FINISHED':
            break
        if s['status'] in ('FAILED', 'CANCELLED'):
            raise RuntimeError(s)
        time.sleep(3)
    # Paginate results
    dfs, offset = [], 0
    while offset < s.get('rowCount', 0):
        r = requests.get(f"{base}/v2/compass/jobs/{qid}/result",
                         headers={**hdrs, 'Accept': 'text/csv'},
                         params={'offset': offset, 'limit': page_size})
        dfs.append(pd.read_csv(StringIO(r.text)))
        offset += page_size
    return pd.concat(dfs, ignore_index=True) if dfs else pd.DataFrame()
// Input: .ionapi file path, SQL query string
// Output: Array of result objects from Data Lake
// npm install axios@1
const axios = require('axios');
const fs = require('fs');
async function compassQuery(ionapiPath, sql, pageSize = 50000) {
  const creds = JSON.parse(fs.readFileSync(ionapiPath, 'utf-8'));
  const tokenResp = await axios.post(`${creds.pu}${creds.ot}`,
    new URLSearchParams({ grant_type: 'password', username: creds.saak,
      password: creds.sask, client_id: creds.ci, client_secret: creds.cs }));
  const token = tokenResp.data.access_token;
  const base = `${creds.iu}/IONSERVICES/datafabric`;
  const hdrs = { Authorization: `Bearer ${token}` };
  // Submit
  const { data: { queryId } } = await axios.post(`${base}/v2/compass/jobs`,
    sql, { headers: { ...hdrs, 'Content-Type': 'text/plain' } });
  // Poll
  let status = 'RUNNING', rowCount = 0;
  while (status === 'RUNNING') {
    await new Promise(r => setTimeout(r, 3000));
    const s = await axios.post(`${base}/v2/compass/${queryId}/status`, null, { headers: hdrs });
    status = s.data.status; rowCount = s.data.rowCount || 0;
  }
  if (status !== 'FINISHED') throw new Error(`Query ${status}`);
  // Paginate (NDJSON); keep the body as raw text so a single-line page
  // is not auto-parsed into an object by axios's default JSON transform
  const rows = []; let offset = 0;
  while (offset < rowCount) {
    const r = await axios.get(`${base}/v2/compass/jobs/${queryId}/result`,
      { headers: { ...hdrs, Accept: 'application/x-ndjson' },
        params: { offset, limit: pageSize },
        transformResponse: [d => d] });
    rows.push(...r.data.trim().split('\n').map(l => JSON.parse(l)));
    offset += pageSize;
  }
  return rows;
}
# Input: .ionapi credentials, SQL query
# Output: First page of query results
# Step 1: Get token
TOKEN=$(curl -s -X POST \
"https://mingle-sso.inforcloudsuite.com:443/TENANT/as/token.oauth2" \
-d "grant_type=password&username=SAAK&password=SASK&client_id=CI&client_secret=CS" \
| jq -r '.access_token')
# Step 2: Submit query
BASE="https://mingle-ionapi.inforcloudsuite.com/TENANT/IONSERVICES/datafabric"
QID=$(curl -s -X POST "$BASE/v2/compass/jobs" \
-H "Authorization: Bearer $TOKEN" -H "Content-Type: text/plain" \
-d "SELECT TOP 10 CONO, ITNO, ITDS FROM MITMAS" | jq -r '.queryId')
# Step 3: Wait and check status
sleep 10
curl -s -X POST "$BASE/v2/compass/$QID/status" \
-H "Authorization: Bearer $TOKEN" | jq .
# Step 4: Get results as CSV
curl -s "$BASE/v2/compass/jobs/$QID/result?offset=0&limit=100" \
-H "Authorization: Bearer $TOKEN" -H "Accept: text/csv"
| Data Lake Property | SQL Name | Type | Purpose | Gotcha |
|---|---|---|---|---|
| infor.DataObjectId | dl_id | String | Unique identifier per data object in Data Lake | Not the same as the source system record ID |
| infor.lastModified() | dl_document_indexed_date | Timestamp | When data object was added/updated in Data Lake | Was named dl_document_date before 2023.02 |
| infor.DataObjectSeqId | N/A | Integer | Record/line number within a data object | Useful for deduplication in multi-line objects |
| Source system fields | Original names | Varies | Business data fields (e.g., CONO, ITNO from M3) | Field names are ERP-specific |
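When an extract contains multiple Data Lake versions of the same business record, the system columns above support client-side deduplication: keep the row with the latest dl_document_indexed_date per business key. A minimal pandas sketch (function and key names are illustrative):

```python
import pandas as pd

def latest_versions(df, business_keys, ts_col="dl_document_indexed_date"):
    """Keep only the most recently indexed version of each business record.
    Assumes the extract included the ts_col system column."""
    return (df.sort_values(ts_col)
              .drop_duplicates(subset=business_keys, keep="last")
              .reset_index(drop=True))
```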
infor.lastModified() reflects when data arrived in Data Lake, NOT when the source record was last modified. [src5]

| Code | Meaning | Cause | Resolution |
|---|---|---|---|
| FAILED (job status) | Query execution error | Syntax error, invalid table/field, timeout | Retrieve error details from /result endpoint; fix query syntax |
| 401 Unauthorized | Invalid or expired token | OAuth token expired (~2h) or invalid | Refresh token or request new one using .ionapi credentials |
| 429 Too Many Requests | Rate limit exceeded | Exceeded per-minute API call quota | Implement exponential backoff; reduce polling frequency |
| Process ERROR 401 | Invalid property name | Referencing non-existent column/field | Verify field names using Atlas UI or Data Catalog |
| INTERNAL ERROR 624 | Internal Compass error | Infrastructure issue | Contact Infor Support with full error message |
| Timeout (60 min) | Query exceeded time limit | Query too complex or scanning too much data | Optimize with WHERE clauses; use LIMIT; consider Data Lake Flows |
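For the 429 case, the table recommends exponential backoff. A transport-agnostic sketch that wraps any Compass call in retry logic (names and the default throttle check are illustrative):

```python
import time, random

def with_backoff(call, max_retries=5, is_throttled=lambda r: r == 429):
    """Retry call() with exponential backoff plus jitter while it reports
    throttling (e.g. an HTTP 429 status from the ION API Gateway)."""
    delay = 1.0
    for attempt in range(max_retries):
        result = call()
        if not is_throttled(result):
            return result
        time.sleep(delay + random.uniform(0, 0.5))  # jitter avoids thundering herd
        delay = min(delay * 2, 60)                  # cap the backoff at 60s
    return result
```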
- Check token expiry before each page request. Implement proactive token refresh when remaining lifetime < 10 minutes. [src6]
- Include WHERE clauses. Use infor.lastModified() for incremental extraction. Test in Compass UI first. [src3, src7]
- Start retrieval immediately after FINISHED status. [src2]
- Migrate from /IONSERVICES/datalakeapi to /IONSERVICES/datafabric. [src1]
- Monitor for 401 errors. Store credentials in a secrets manager. [src6]

# BAD -- burns through 1,000/min status rate limit in seconds
while True:
    status = check_status(query_id)
    if status == 'FINISHED':
        break
# GOOD -- respects rate limits, starts fast and backs off
import time
delay = 2
max_delay = 30
while True:
    status = check_status(query_id)
    if status in ('FINISHED', 'FAILED', 'CANCELLED'):
        break
    time.sleep(delay)
    delay = min(delay * 1.5, max_delay)
# BAD -- wastes time, may trigger auth server rate limits
for page in range(total_pages):
    token = get_access_token(creds)  # new token per page!
    get_results(token, query_id, offset=page * page_size)
# GOOD -- token cached, only refreshed when near expiry
token = get_access_token(creds)
token_expiry = time.time() + 7000  # ~2h minus buffer
for page in range(total_pages):
    if time.time() > token_expiry:
        token = get_access_token(creds)
        token_expiry = time.time() + 7000
    get_results(token, query_id, offset=page * page_size)
-- BAD -- scans entire table every time, may timeout
SELECT * FROM MITMAS ORDER BY ITNO
-- GOOD -- only extracts records added/modified since last run
SELECT CONO, ITNO, ITDS, FUDS FROM MITMAS
WHERE dl_document_indexed_date > '2026-02-28T00:00:00Z'
ORDER BY ITNO
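The incremental pattern needs the watermark of the last successful run persisted between executions. A minimal sketch using a local JSON state file (the file name and helper names are hypothetical; a production job would keep this state in its scheduler or a database):

```python
import json, os
from datetime import datetime, timezone

STATE_FILE = "compass_watermark.json"  # hypothetical local state file

def load_watermark(default="1970-01-01T00:00:00Z"):
    """Return the timestamp of the last successful extract."""
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return json.load(f)["watermark"]
    return default

def save_watermark(ts=None):
    """Record the given (or current UTC) time after a successful extract."""
    ts = ts or datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    with open(STATE_FILE, "w") as f:
        json.dump({"watermark": ts}, f)

def incremental_sql(table, columns, watermark):
    """Build the delta query used in the GOOD example above."""
    return (f"SELECT {', '.join(columns)} FROM {table} "
            f"WHERE dl_document_indexed_date > '{watermark}'")
```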
- /IONSERVICES/datalakeapi was removed April 2025. Fix: Migrate to the Data Fabric Suite endpoint at /IONSERVICES/datafabric. [src1]
- Query submission requires Content-Type: text/plain. Sending application/json fails. Fix: Always set Content-Type: text/plain when POSTing to /v2/compass/jobs. [src1]
- For real-time needs, use ION API direct calls to the source ERP. [src5]
- Always call /result for FAILED queries to get the error message. [src2]
- Test in Compass UI first. Add WHERE clauses. Use LIMIT during dev. [src7]
- Select only needed columns. Start with smaller page size (10K) and adjust. [src2]

# Test OAuth token acquisition
curl -s -X POST "https://mingle-sso.inforcloudsuite.com:443/TENANT/as/token.oauth2" \
-d "grant_type=password&username=SAAK&password=SASK&client_id=CI&client_secret=CS" \
| jq '{access_token: .access_token[:20], expires_in: .expires_in, token_type: .token_type}'
# Submit a minimal test query
curl -s -X POST "$BASE/v2/compass/jobs" \
-H "Authorization: Bearer $TOKEN" -H "Content-Type: text/plain" \
-d "SELECT TOP 1 * FROM MITMAS" | jq .
# Check query status
curl -s -X POST "$BASE/v2/compass/$QID/status" \
-H "Authorization: Bearer $TOKEN" | jq .
# Cancel a running query
curl -s -X PUT "$BASE/v2/compass/$QID/cancel" \
-H "Authorization: Bearer $TOKEN" | jq .
| Version | Release | Status | Breaking Changes | Migration Notes |
|---|---|---|---|---|
| Compass v2 (Data Fabric Suite) | 2024 | Current | N/A | Recommended endpoint for all new integrations |
| Compass v2 (ION API Suite) | 2022 | Deprecated (removed Apr 2025) | Base URL changed | Migrate from /datalakeapi to /datafabric |
| Compass v1 (ION API Suite) | 2019 | Deprecated (removed Apr 2025) | No pagination support | Migrate to v2 for pagination and better rate limits |
| JDBC Connection String Auth | 2021-06 | Current | N/A | Simplified authentication vs file-based approach |
Infor deprecates API endpoints with advance notice through release notes and Documentation Central. The Compass v1/v2 ION API Suite endpoints were deprecated with a migration window of approximately 12-18 months before removal. Always monitor Infor's release notes for upcoming changes. [src1]
| Use When | Don't Use When | Use Instead |
|---|---|---|
| Extracting bulk data from Infor ERP products (M3, LN, CloudSuite) published to Data Lake | Need to write data back to Infor applications | ION API or BOD-based integration |
| Building ETL pipelines to external warehouses (Snowflake, Databricks, BigQuery) | Need real-time (<1 second) data from Infor | Direct ION API calls to source ERP |
| Running SQL joins across multiple Infor data objects in Data Lake | Need to query hierarchical/nested JSON documents | Data Lake API (non-Compass) |
| BI tool connectivity (Tableau, Power BI) via Compass JDBC driver | Data volume requires >60 minutes of query time | ION Data Lake Flows (S3 export) |
| Ad-hoc data analysis and exploration | Need guaranteed delivery with CDC/event-driven patterns | ION Connect with Business Events |