Workday Prism Analytics API Capabilities & Limits

Type: ERP Integration | System: Workday Prism Analytics (API v3) | Confidence: 0.85 | Sources: 7 | Verified: 2026-03-02 | Freshness: evolving

TL;DR

System Profile

Workday Prism Analytics is Workday's data hub for combining external data with native Workday HCM, Payroll, and Financial Management data. The Prism Analytics REST API (v3) provides programmatic access to create tables, define schemas, upload compressed CSV data through a bucket-based workflow, and manage data change tasks. API v3 introduced SQL-style tables, replacing the legacy v2 dataset model. Prism Analytics is included with Workday HCM and Financial Management subscriptions — there is no separate license required for the base Prism functionality, though advanced features (Data Hub, extended storage) may require additional licensing.

| Property | Value |
| --- | --- |
| Vendor | Workday |
| System | Workday Prism Analytics |
| API Surface | REST (HTTPS/JSON for metadata, HTTPS/gzip-CSV for data) |
| Current API Version | v3 |
| Editions Covered | All Workday tenants with Prism Analytics enabled |
| Deployment | Cloud (multi-tenant SaaS) |
| API Docs | Workday Prism Python Client |
| Status | GA (v3 current; v2 maintenance/legacy) |

API Surfaces & Capabilities

| API Surface | Protocol | Best For | Max Per Request | Rate Limit | Real-time? | Bulk? |
| --- | --- | --- | --- | --- | --- | --- |
| Prism Analytics REST API v3 | HTTPS/JSON + gzip-CSV | External data loads into Prism tables | 256 MB compressed per file | Tenant-level throttling (429) | No | Yes |
| Workday REST API | HTTPS/JSON | HCM worker data, financial transactions | Paginated (100-1000/page) | Tenant-level throttling | Yes | No |
| Workday SOAP API (WWS) | HTTPS/XML | Legacy integrations, metadata ops | Paginated | Shared with REST | Yes | No |
| RaaS (Report-as-a-Service) | HTTPS/CSV or JSON | Data extraction via custom reports | Full report output | Per-report timeout | No | Yes |
| EIB (Enterprise Interface Builder) | SFTP/file | Scheduled inbound/outbound file transfers | File-based | Scheduled | No | Yes |

Rate Limits & Quotas

Per-Request Limits

| Limit Type | Value | Applies To | Notes |
| --- | --- | --- | --- |
| Max file size per upload | 256 MB (compressed) | Prism Data API file upload | Files must be gzip-compressed CSV [src1, src2] |
| Max fields per table/dataset | 1,000 | Table schema definition | Applies to both base and derived datasets [src1] |
| Max field value length | 32,000 characters | Individual field values | Truncated if exceeded [src1] |
| Max row length | 500,000 characters | Single CSV row | Rows exceeding this are rejected [src1] |
| Max concurrent uploads | 10 | Per bucket | Parallel POST requests to same bucket [src1] |
| Files per POST request | 1 | Upload endpoint | Multiple files require multiple requests [src1] |
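These per-request limits can be enforced client-side before anything hits the API. A minimal pre-flight sketch (the function name and return convention are illustrative; the constants come from the table above):

```python
import gzip
import os

MAX_COMPRESSED_BYTES = 256 * 1024 * 1024  # 256 MB per uploaded file
MAX_ROW_CHARS = 500_000                   # rows longer than this are rejected

def preflight_check(path):
    """Return a list of limit violations for a gzip-compressed CSV file."""
    problems = []
    if os.path.getsize(path) > MAX_COMPRESSED_BYTES:
        problems.append("compressed file exceeds 256 MB; split before upload")
    with gzip.open(path, "rt", newline="") as f:
        for lineno, row in enumerate(f, start=1):
            if len(row) > MAX_ROW_CHARS:
                problems.append(f"row {lineno} exceeds 500,000 characters")
    return problems
```

Running a check like this before each upload turns a mid-load 413 or silent row rejection into an immediate, local error.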

Rolling / Daily Limits

| Limit Type | Value | Window | Edition Differences |
| --- | --- | --- | --- |
| Total storage across buckets | 100 GB (compressed) | Per tenant, cumulative | Applies to all active buckets across tenant [src2] |
| Bucket expiration | 24 hours | Per bucket lifecycle | Bucket and all uploaded files lost after expiration [src2] |
| API rate limiting | Tenant-level throttling | Rolling window | Returns 429 with x-ratelimit-remaining header [src7] |
| Published dataset storage | Tenant-configured | Per tenant | Contact Workday for tenant-specific limits |
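Because throttling is tenant-level and surfaces as HTTP 429, every request path should be wrapped in retry logic. A minimal exponential-backoff sketch (the `send` callable and retry parameters are illustrative, not part of the Prism API):

```python
import random
import time

def with_backoff(send, max_retries=5, base_delay=1.0):
    """Call send() until it returns a non-429 response, backing off
    exponentially (with jitter) between attempts."""
    for attempt in range(max_retries):
        resp = send()
        if resp.status_code != 429:
            return resp
        time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
    return send()  # final attempt; the caller handles a lingering 429
```

A production loader would also read the x-ratelimit-remaining header noted above and slow down before the 429 ever occurs.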

Authentication

| Flow | Use When | Token Lifetime | Refresh? | Notes |
| --- | --- | --- | --- | --- |
| OAuth 2.0 Refresh Token | All Prism API operations | Access token: ~60 min | Yes (use refresh token) | Recommended for all integrations [src3, src4] |
| OAuth 2.0 Client Credentials | Server-to-server (ISU-based) | Access token: session-based | No refresh token | Requires Integration System User [src4] |
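Since access tokens expire after roughly 60 minutes, long-running loads should refresh proactively rather than react to 401s. A sketch of a token cache built on the refresh-token flow (the class name, the `skew` margin, and the `expires_in` fallback are assumptions; the token URL follows the pattern used elsewhere in this document):

```python
import time

class TokenManager:
    """Caches a Prism access token and refreshes it before the ~60-minute
    expiry, rather than waiting for a 401 mid-upload."""

    def __init__(self, hostname, tenant, client_id, client_secret,
                 refresh_token, skew=300):
        self.url = f"https://{hostname}/ccx/oauth2/{tenant}/token"
        self.auth = (client_id, client_secret)
        self.refresh_token = refresh_token
        self.skew = skew            # refresh this many seconds early
        self._token = None
        self._expires_at = 0.0

    def token(self):
        if self._token is None or time.time() >= self._expires_at - self.skew:
            self._refresh()
        return self._token

    def _refresh(self):
        import requests  # deferred so the class is importable without it
        resp = requests.post(self.url, auth=self.auth,
                             data={"grant_type": "refresh_token",
                                   "refresh_token": self.refresh_token})
        resp.raise_for_status()
        body = resp.json()
        self._token = body["access_token"]
        self._expires_at = time.time() + body.get("expires_in", 3600)
```

Call `token()` before every request; a long bucket upload then never fails mid-stream with an expired token.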

Authentication Gotchas

Constraints

Integration Pattern Decision Tree

START — User needs to load external data into Workday Prism Analytics
├── Data size per load?
│   ├── < 256 MB compressed → single file upload to bucket
│   ├── 256 MB - 100 GB compressed → split into files ≤ 256 MB, up to 10 concurrent
│   └── > 100 GB compressed → exceeds tenant cap; request increase or rolling batches
├── How often?
│   ├── One-time migration → Prism API with manual orchestration
│   ├── Daily/weekly scheduled → iPaaS (Boomi, SnapLogic) or Workday Studio
│   └── Near-real-time → NOT recommended for Prism (use Workday REST API)
├── Data change operation?
│   ├── Full refresh → TruncateAndInsert
│   ├── Append new records → Insert
│   ├── Update existing by key → Update or Upsert
│   └── Remove records → Delete
└── Integration platform?
    ├── Custom code → prism-python library or direct REST API
    ├── iPaaS → Boomi, SnapLogic, Jitterbit, Workato (native Prism connectors)
    └── Workday-native → Workday Studio + EIB
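For loads between 256 MB and 100 GB, the tree above calls for splitting into files of at most 256 MB each. One way to sketch that split, repeating the CSV header in every gzip chunk so each file is independently loadable (the uncompressed-byte budget is a conservative stand-in for the compressed limit, since gzip typically shrinks CSV well below it; all names here are illustrative):

```python
import gzip

def split_csv_gzip(src_path, out_prefix, max_uncompressed_bytes=200 * 1024 * 1024):
    """Split a CSV into gzip-compressed chunks, each starting with the
    header row. Returns the list of chunk paths."""
    chunks = []
    out, written, idx = None, 0, 0
    with open(src_path, newline="") as src:
        header = src.readline()
        for line in src:
            # Roll to a new chunk when the budget would be exceeded.
            if out is None or written + len(line) > max_uncompressed_bytes:
                if out is not None:
                    out.close()
                idx += 1
                path = f"{out_prefix}{idx:03d}.csv.gz"
                out = gzip.open(path, "wt", newline="")
                out.write(header)
                written = len(header)
                chunks.append(path)
            out.write(line)
            written += len(line)
    if out is not None:
        out.close()
    return chunks
```

The resulting chunks can then be uploaded to one bucket with up to 10 concurrent POSTs, as the limits above allow.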

Quick Reference

Prism Analytics API v3 Endpoint Reference

| Operation | Method | Endpoint | Payload | Notes |
| --- | --- | --- | --- | --- |
| List tables | GET | /api/prismAnalytics/v3/{tenant}/tables | N/A | Returns all visible Prism tables |
| Create table | POST | /api/prismAnalytics/v3/{tenant}/tables | JSON schema | Defines table name, fields, types |
| Get table | GET | /api/prismAnalytics/v3/{tenant}/tables/{id} | N/A | Returns schema and metadata |
| Create bucket | POST | /api/prismAnalytics/v3/{tenant}/tables/{id}/buckets | JSON config | Specifies operation type |
| Upload file | POST | /api/prismAnalytics/v3/{tenant}/buckets/{id}/files | gzip CSV (binary) | One file per request, max 256 MB |
| Complete bucket | POST | /api/prismAnalytics/v3/{tenant}/buckets/{id}/complete | N/A | Triggers data processing |
| Get bucket status | GET | /api/prismAnalytics/v3/{tenant}/buckets/{id} | N/A | Check processing status |
| Delete table data | POST | /api/prismAnalytics/v3/{tenant}/tables/{id}/dataChanges | JSON | Data change task with Delete |

Table Operations

| Operation | Behavior | Use When |
| --- | --- | --- |
| TruncateAndInsert | Deletes all existing rows, inserts new data | Full refresh / daily reload |
| Insert | Appends new rows (no dedup) | Adding net-new records |
| Update | Updates existing rows by primary key | Modifying existing records |
| Upsert | Inserts new rows, updates existing by key | Mixed new + updated records |
| Delete | Removes rows matching criteria | Data cleanup / GDPR compliance |

Step-by-Step Integration Guide

1. Register API Client and Obtain Credentials

Navigate to Workday Tenant Setup > API Clients for Integrations. Register a new API Client, note the Client ID and Client Secret. Create an ISU with required Prism security domains, then generate a Refresh Token. [src3, src4]

# Test OAuth token retrieval
curl -X POST "https://{hostname}/ccx/oauth2/{tenant}/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -u "{client_id}:{client_secret}" \
  -d "grant_type=refresh_token&refresh_token={refresh_token}"

Verify: Response contains access_token and token_type: "Bearer".

2. Create a Prism Table with Schema

Define your table schema as JSON with field names, types, and ordinals. [src3]

curl -X POST "https://{hostname}/api/prismAnalytics/v3/{tenant}/tables" \
  -H "Authorization: Bearer {access_token}" \
  -H "Content-Type: application/json" \
  -d '{"name":"External_Sales_Data","fields":[...]}'

Verify: Response includes "id" for the new table.

3. Create a Bucket for Data Upload

Create a bucket specifying the data change operation. [src1, src2]

curl -X POST "https://{hostname}/api/prismAnalytics/v3/{tenant}/tables/{table_id}/buckets" \
  -H "Authorization: Bearer {access_token}" \
  -H "Content-Type: application/json" \
  -d '{"name":"daily_load","operation":{"id":"TruncateAndInsert"}}'

Verify: Response includes bucket "id" with "state": "New". Bucket expires in 24 hours.

4. Upload Gzip-Compressed CSV File(s)

Compress your CSV data with gzip, then upload. Max 256 MB per file, 10 concurrent uploads. [src1]

gzip -k sales_data.csv
curl -X POST "https://{hostname}/api/prismAnalytics/v3/{tenant}/buckets/{bucket_id}/files" \
  -H "Authorization: Bearer {access_token}" \
  -H "Content-Type: application/octet-stream" \
  --data-binary @sales_data.csv.gz

Verify: HTTP 200 response.

5. Complete the Bucket

Signal that all files are uploaded; this triggers Prism data processing. [src2]

curl -X POST "https://{hostname}/api/prismAnalytics/v3/{tenant}/buckets/{bucket_id}/complete" \
  -H "Authorization: Bearer {access_token}"

Verify: Poll bucket status until "state": "Success".
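The poll-until-done step is worth factoring into a helper. A sketch (terminal state names other than "Success", plus the timeout and interval defaults, are assumptions; `get_state` would typically wrap the GET on /buckets/{id} and read `json()["state"]`):

```python
import time

def wait_for_bucket(get_state, timeout=1800, interval=15):
    """Poll until the bucket reaches a terminal state, or raise on timeout.
    `get_state` is any callable returning the bucket's current state string."""
    terminal = {"Success", "Failed"}
    deadline = time.time() + timeout
    while time.time() < deadline:
        state = get_state()
        if state in terminal:
            return state
        time.sleep(interval)
    raise TimeoutError("bucket did not finish processing within the timeout")
```

Polling at a modest interval also keeps the loop well clear of tenant-level throttling.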

6. Verify Data in Prism Table

Check the table to confirm rows were loaded. [src3]

curl -s "https://{hostname}/api/prismAnalytics/v3/{tenant}/tables/{table_id}" \
  -H "Authorization: Bearer {access_token}" | jq '.rowCount'

Verify: Row count matches expected record count.

Code Examples

Python: Load Data Using Official prism-python Client

# Input:  CSV file path, Workday credentials
# Output: Table ID, upload status

import prism

p = prism.Prism(
    base_url="https://wd2-impl-services1.workday.com",
    tenant_name="your_tenant",
    client_id="your_client_id",
    client_secret="your_client_secret",
    refresh_token="your_refresh_token"
)

table = prism.tables_create(p, table_name="External_Data", file="schema.json")
prism.upload_file(p, file="data.csv.gz", table_id=table["id"], operation="TruncateAndInsert")

Python: Direct REST API Upload (No Client Library)

# Input:  Gzip CSV file, Workday OAuth credentials
# Output: Bucket completion status

import requests, gzip, shutil

# Get token
token = requests.post(
    "https://{hostname}/ccx/oauth2/{tenant}/token",
    auth=("client_id", "client_secret"),
    data={"grant_type": "refresh_token", "refresh_token": "token"}
).json()["access_token"]

headers = {"Authorization": f"Bearer {token}"}
BASE = "https://{hostname}/api/prismAnalytics/v3/{tenant}"
table_id = "your_table_id"  # from the create-table response (or GET /tables)

# Create bucket, upload, complete
bucket = requests.post(f"{BASE}/tables/{table_id}/buckets",
    headers={**headers, "Content-Type": "application/json"},
    json={"name": "load", "operation": {"id": "TruncateAndInsert"}}).json()

with open("data.csv.gz", "rb") as f:
    requests.post(f"{BASE}/buckets/{bucket['id']}/files",
        headers={**headers, "Content-Type": "application/octet-stream"}, data=f)

requests.post(f"{BASE}/buckets/{bucket['id']}/complete", headers=headers)

cURL: Quick API Test — List Prism Tables

# Input:  Valid OAuth credentials
# Output: JSON array of Prism Analytics tables

TOKEN=$(curl -s -X POST "https://{hostname}/ccx/oauth2/{tenant}/token" \
  -u "{client_id}:{client_secret}" \
  -d "grant_type=refresh_token&refresh_token={refresh_token}" | jq -r '.access_token')

curl -s "https://{hostname}/api/prismAnalytics/v3/{tenant}/tables" \
  -H "Authorization: Bearer $TOKEN" | jq '.data[] | {id, name}'

Data Mapping

Prism Analytics Field Type Reference

| Prism Type ID | Type Name | Source Format | Gotcha |
| --- | --- | --- | --- |
| f9e5bff9...541da7 | Text | String (max 32K chars) | Exceeding 32K chars silently truncates |
| 48fbad81...7d3f6 | Numeric | Decimal (precision + scale) | Precision/scale mismatch causes load failure |
| bdd201eb...83a9 | Date | ISO 8601 (YYYY-MM-DD) | Non-ISO dates rejected |
| d6cce1d0...4d3c9 | Boolean | true/false | Must be lowercase; "True" or "1" rejected |
| 1f01ef53...b7a60 | Instance | Workday WID reference | Must match existing Workday reference data |
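The gotchas in this table are cheap to catch before upload rather than after a failed load. A per-value validation sketch (type names as in the table; the function and its return convention are illustrative):

```python
import re

ISO_DATE = re.compile(r"^\d{4}-\d{2}-\d{2}$")

def check_value(value, prism_type):
    """Return None if `value` is safe to load, else a description of the
    problem, based on the field-type gotchas above."""
    if prism_type == "Boolean" and value not in ("true", "false"):
        return f"boolean must be lowercase true/false, got {value!r}"
    if prism_type == "Date" and not ISO_DATE.match(value):
        return f"date must be YYYY-MM-DD, got {value!r}"
    if prism_type == "Text" and len(value) > 32_000:
        return "text exceeds 32,000 characters and will be silently truncated"
    return None
```

Run over each CSV cell, this surfaces rejections ("True", "03/02/2026") and silent truncations before the bucket is ever created.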

Data Type Gotchas

Error Handling & Failure Points

Common Error Codes

| Code | Meaning | Cause | Resolution |
| --- | --- | --- | --- |
| 401 | Unauthorized | Expired token or revoked refresh token | Re-authenticate; regenerate refresh token if needed |
| 403 | Forbidden | Missing Prism security domains on ISU | Add all 6 required security domains |
| 404 | Not Found | Wrong table/bucket ID or expired bucket | Verify IDs; create new bucket if expired |
| 409 | Conflict | Upload on completed/expired bucket | Create a new bucket (single-use) |
| 413 | Payload Too Large | File exceeds 256 MB compressed | Split into chunks < 256 MB |
| 422 | Unprocessable Entity | CSV schema doesn't match table schema | Verify column names, order, and types |
| 429 | Too Many Requests | Rate limit exceeded | Exponential backoff; check x-ratelimit-remaining |

Failure Points in Production

Anti-Patterns

Wrong: Uploading uncompressed CSV directly

# BAD — Prism API only accepts gzip-compressed files
with open("data.csv", "rb") as f:
    requests.post(url, headers=headers, data=f)  # Will fail

Correct: Always gzip-compress before upload

# GOOD — Compress CSV before uploading
import gzip, shutil
with open("data.csv", "rb") as f_in:
    with gzip.open("data.csv.gz", "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)
with open("data.csv.gz", "rb") as f:
    requests.post(url, headers={**headers, "Content-Type": "application/octet-stream"}, data=f)

Wrong: Loading >256 MB as a single file

# BAD — Single file exceeds 256 MB limit, returns 413
with open("huge_dataset.csv.gz", "rb") as f:  # 500 MB
    requests.post(url, headers=headers, data=f)

Correct: Split large datasets into chunks < 256 MB

# GOOD — Split and upload concurrently (max 10 per bucket)
import concurrent.futures
chunks = ["chunk1.csv.gz", "chunk2.csv.gz", "chunk3.csv.gz"]
# upload_chunk(path, bucket_id) POSTs one gzip file to the bucket's /files endpoint
with concurrent.futures.ThreadPoolExecutor(max_workers=10) as pool:
    futures = [pool.submit(upload_chunk, c, bucket_id) for c in chunks]
    concurrent.futures.wait(futures)  # all uploads done before completing the bucket

Wrong: Reusing expired buckets

# BAD — Bucket created yesterday, now expired (404 or 409)
requests.post(f"{BASE}/buckets/{old_bucket_id}/files", headers=headers, data=data)

Correct: Always create a fresh bucket per load cycle

# GOOD — Create new bucket immediately before upload
from datetime import datetime
bucket = requests.post(f"{BASE}/tables/{table_id}/buckets",
    headers={**headers, "Content-Type": "application/json"},
    json={"name": f"load_{datetime.now():%Y%m%d_%H%M}", "operation": {"id": "TruncateAndInsert"}}).json()
# Upload immediately — 24h clock starts now

Common Pitfalls

Diagnostic Commands

# Test OAuth authentication
curl -s -X POST "https://{hostname}/ccx/oauth2/{tenant}/token" \
  -u "{client_id}:{client_secret}" \
  -d "grant_type=refresh_token&refresh_token={refresh_token}" | jq '.access_token'

# List all Prism tables (verify API access)
curl -s "https://{hostname}/api/prismAnalytics/v3/{tenant}/tables" \
  -H "Authorization: Bearer {token}" | jq '.total'

# Check specific table schema
curl -s "https://{hostname}/api/prismAnalytics/v3/{tenant}/tables/{table_id}" \
  -H "Authorization: Bearer {token}" | jq '.fields'

# Check bucket status (after upload)
curl -s "https://{hostname}/api/prismAnalytics/v3/{tenant}/buckets/{bucket_id}" \
  -H "Authorization: Bearer {token}" | jq '{state, fileCount, errorMessage}'

# Monitor rate limit headers
curl -sv "https://{hostname}/api/prismAnalytics/v3/{tenant}/tables" \
  -H "Authorization: Bearer {token}" 2>&1 | grep -i 'x-ratelimit'

Version History & Compatibility

| API Version | Release Date | Status | Breaking Changes | Migration Notes |
| --- | --- | --- | --- | --- |
| v3 | 2023 | Current | SQL-style tables replace datasets; new endpoints | Migrate from /datasets to /tables; schema format changed |
| v2 | 2020 | Maintenance/Legacy | N/A (original GA) | Still functional; new features only in v3 |
| v1 | 2018 | Deprecated | N/A | Do not use; no longer supported |

When to Use / When Not to Use

| Use When | Don't Use When | Use Instead |
| --- | --- | --- |
| Loading external data into Workday for blended analytics | Need real-time transactional integration | Workday REST API |
| Scheduled batch data loads (daily, weekly) | Need sub-second latency | Workday REST API or SOAP API |
| Building composite dashboards (Workday + external) | Need to extract data FROM Workday | RaaS or Workday REST API |
| Data migration or one-time bulk loads | Dataset has >1,000 columns | Split into multiple related tables |
| Automating manual browser file uploads | Data exceeds 100 GB compressed | Contact Workday for increase or rolling batch |

Important Caveats

Related Units