Workday Prism Analytics is Workday's data hub for combining external data with native Workday HCM, Payroll, and Financial Management data. The Prism Analytics REST API (v3) provides programmatic access to create tables, define schemas, upload compressed CSV data through a bucket-based workflow, and manage data change tasks. API v3 introduced SQL-style tables, replacing the legacy v2 dataset model. Prism Analytics is included with Workday HCM and Financial Management subscriptions — there is no separate license required for the base Prism functionality, though advanced features (Data Hub, extended storage) may require additional licensing.
| Property | Value |
|---|---|
| Vendor | Workday |
| System | Workday Prism Analytics |
| API Surface | REST (HTTPS/JSON for metadata, HTTPS/gzip-CSV for data) |
| Current API Version | v3 |
| Editions Covered | All Workday tenants with Prism Analytics enabled |
| Deployment | Cloud (multi-tenant SaaS) |
| API Docs | Workday Prism Python Client |
| Status | GA (v3 current; v2 maintenance/legacy) |
| API Surface | Protocol | Best For | Max Per Request | Rate Limit | Real-time? | Bulk? |
|---|---|---|---|---|---|---|
| Prism Analytics REST API v3 | HTTPS/JSON + gzip-CSV | External data loads into Prism tables | 256 MB compressed per file | Tenant-level throttling (429) | No | Yes |
| Workday REST API | HTTPS/JSON | HCM worker data, financial transactions | Paginated (100-1000/page) | Tenant-level throttling | Yes | No |
| Workday SOAP API (WWS) | HTTPS/XML | Legacy integrations, metadata ops | Paginated | Shared with REST | Yes | No |
| RaaS (Report-as-a-Service) | HTTPS/CSV or JSON | Data extraction via custom reports | Full report output | Per-report timeout | No | Yes |
| EIB (Enterprise Interface Builder) | SFTP/file | Scheduled inbound/outbound file transfers | File-based | Scheduled | No | Yes |
| Limit Type | Value | Applies To | Notes |
|---|---|---|---|
| Max file size per upload | 256 MB (compressed) | Prism Data API file upload | Files must be gzip-compressed CSV [src1, src2] |
| Max fields per table/dataset | 1,000 | Table schema definition | Applies to both base and derived datasets [src1] |
| Max field value length | 32,000 characters | Individual field values | Truncated if exceeded [src1] |
| Max row length | 500,000 characters | Single CSV row | Rows exceeding this are rejected [src1] |
| Max concurrent uploads | 10 | Per bucket | Parallel POST requests to same bucket [src1] |
| Files per POST request | 1 | Upload endpoint | Multiple files require multiple requests [src1] |
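Most of the per-request caps above can be checked locally before spending an upload. A minimal validation sketch, assuming the 500,000-character row cap can be approximated as the sum of field lengths; `validate_upload` is a hypothetical helper, not part of any Workday SDK:

```python
import csv
import gzip
import os

MAX_FILE_BYTES = 256 * 1024 * 1024  # 256 MB compressed, per file
MAX_FIELD_CHARS = 32_000
MAX_ROW_CHARS = 500_000

def validate_upload(path_gz):
    """Return a list of limit violations for a gzip'd CSV (empty list = OK)."""
    problems = []
    if os.path.getsize(path_gz) > MAX_FILE_BYTES:
        problems.append("file exceeds 256 MB compressed")
    with gzip.open(path_gz, "rt", newline="") as f:
        for lineno, row in enumerate(csv.reader(f), start=1):
            # Approximation: row length taken as the sum of field lengths
            if sum(len(v) for v in row) > MAX_ROW_CHARS:
                problems.append(f"row {lineno} exceeds 500,000 characters")
            if any(len(v) > MAX_FIELD_CHARS for v in row):
                problems.append(f"row {lineno} has a field over 32,000 characters")
    return problems
```

Running this before creating the bucket is cheap insurance: a rejected row discovered mid-load wastes part of the bucket's 24-hour window.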
| Limit Type | Value | Window | Edition Differences |
|---|---|---|---|
| Total storage across buckets | 100 GB (compressed) | Per tenant, cumulative | Applies to all active buckets across tenant [src2] |
| Bucket expiration | 24 hours | Per bucket lifecycle | Bucket and all uploaded files lost after expiration [src2] |
| API rate limiting | Tenant-level throttling | Rolling window | Returns 429 with x-ratelimit-remaining header [src7] |
| Published dataset storage | Tenant-configured | Per tenant | Contact Workday for tenant-specific limits |
| Flow | Use When | Token Lifetime | Refresh? | Notes |
|---|---|---|---|---|
| OAuth 2.0 Refresh Token | All Prism API operations | Access token: ~60 min | Yes — use refresh token | Recommended for all integrations [src3, src4] |
| OAuth 2.0 Client Credentials | Server-to-server (ISU-based) | Access token: session-based | No refresh token | Requires Integration System User [src4] |
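Because the access token lives roughly 60 minutes, long-running loads should refresh proactively rather than react to 401s. A sketch with the fetch call injected; the `TokenCache` class and the 5-minute refresh skew are assumptions, not anything Workday provides:

```python
import time

class TokenCache:
    """Return a cached access token, refreshing before the ~60-minute expiry."""

    def __init__(self, fetch_token, lifetime_s=3600, skew_s=300):
        self._fetch = fetch_token    # callable that POSTs to /ccx/oauth2/{tenant}/token
        self._lifetime = lifetime_s  # assumed ~60 min lifetime (see table above)
        self._skew = skew_s          # refresh 5 minutes early (assumption)
        self._token = None
        self._expires_at = 0.0

    def token(self):
        if time.monotonic() >= self._expires_at:
            self._token = self._fetch()
            self._expires_at = time.monotonic() + self._lifetime - self._skew
        return self._token
```

Each API call then asks the cache for a token instead of holding one for the life of the job.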
The token endpoint is https://{hostname}/ccx/oauth2/{tenant}/token. Using the wrong tenant name or hostname silently returns 401. [src4]

START — User needs to load external data into Workday Prism Analytics
├── Data size per load?
│ ├── < 256 MB compressed → single file upload to bucket
│ ├── 256 MB - 100 GB compressed → split into files ≤ 256 MB, up to 10 concurrent
│ └── > 100 GB compressed → exceeds tenant cap; request increase or rolling batches
├── How often?
│ ├── One-time migration → Prism API with manual orchestration
│ ├── Daily/weekly scheduled → iPaaS (Boomi, SnapLogic) or Workday Studio
│ └── Near-real-time → NOT recommended for Prism (use Workday REST API)
├── Data change operation?
│ ├── Full refresh → TruncateAndInsert
│ ├── Append new records → Insert
│ ├── Update existing by key → Update or Upsert
│ └── Remove records → Delete
└── Integration platform?
├── Custom code → prism-python library or direct REST API
├── iPaaS → Boomi, SnapLogic, Jitterbit, Workato (native Prism connectors)
└── Workday-native → Workday Studio + EIB
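For loads between 256 MB and 100 GB, the tree above says to split into files of 256 MB or less. A row-count-based splitter sketch; sizing chunks by row count rather than compressed bytes is an approximation, and whether each chunk should repeat the header depends on how your bucket is configured to parse files, so it is parameterized here:

```python
import csv
import gzip

def split_csv_gz(src_csv, out_prefix, rows_per_chunk, repeat_header=True):
    """Split a CSV into gzip'd chunks of at most rows_per_chunk data rows each.

    Returns the list of chunk paths. Tune rows_per_chunk so each output
    stays safely under the 256 MB compressed cap.
    """
    paths = []
    with open(src_csv, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        writer, out, count = None, None, 0
        for row in reader:
            if writer is None or count >= rows_per_chunk:
                if out:
                    out.close()
                path = f"{out_prefix}_{len(paths)}.csv.gz"
                paths.append(path)
                out = gzip.open(path, "wt", newline="")
                writer = csv.writer(out)
                if repeat_header:
                    writer.writerow(header)
                count = 0
            writer.writerow(row)
            count += 1
        if out:
            out.close()
    return paths
```

The resulting chunks can then be fed to an upload pool of at most 10 concurrent workers, matching the per-bucket concurrency limit.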
| Operation | Method | Endpoint | Payload | Notes |
|---|---|---|---|---|
| List tables | GET | /api/prismAnalytics/v3/{tenant}/tables | N/A | Returns all visible Prism tables |
| Create table | POST | /api/prismAnalytics/v3/{tenant}/tables | JSON schema | Defines table name, fields, types |
| Get table | GET | /api/prismAnalytics/v3/{tenant}/tables/{id} | N/A | Returns schema and metadata |
| Create bucket | POST | /api/prismAnalytics/v3/{tenant}/tables/{id}/buckets | JSON config | Specifies operation type |
| Upload file | POST | /api/prismAnalytics/v3/{tenant}/buckets/{id}/files | gzip CSV (binary) | One file per request, max 256 MB |
| Complete bucket | POST | /api/prismAnalytics/v3/{tenant}/buckets/{id}/complete | N/A | Triggers data processing |
| Get bucket status | GET | /api/prismAnalytics/v3/{tenant}/buckets/{id} | N/A | Check processing status |
| Delete table data | POST | /api/prismAnalytics/v3/{tenant}/tables/{id}/dataChanges | JSON | Data change task with Delete |
| Operation | Behavior | Use When |
|---|---|---|
| TruncateAndInsert | Deletes all existing rows, inserts new data | Full refresh / daily reload |
| Insert | Appends new rows (no dedup) | Adding net-new records |
| Update | Updates existing rows by primary key | Modifying existing records |
| Upsert | Inserts new rows, updates existing by key | Mixed new + updated records |
| Delete | Removes rows matching criteria | Data cleanup / GDPR compliance |
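The operation is fixed at bucket creation time, inside the JSON body of the create-bucket call. A minimal payload builder as a sketch; the payload shape mirrors the walkthrough in this document, and the guard set is simply the five operations from this table:

```python
VALID_OPERATIONS = {"TruncateAndInsert", "Insert", "Update", "Upsert", "Delete"}

def bucket_payload(name, operation):
    """Build the JSON body for POST /tables/{id}/buckets."""
    if operation not in VALID_OPERATIONS:
        raise ValueError(f"unknown Prism data change operation: {operation}")
    return {"name": name, "operation": {"id": operation}}
```

Failing fast on a typo here is cheaper than discovering it after the bucket's 24-hour clock has started.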
Navigate to Workday Tenant Setup > API Clients for Integrations. Register a new API Client and note the Client ID and Client Secret. Create an ISU with the required Prism security domains, then generate a Refresh Token. [src3, src4]
# Test OAuth token retrieval
curl -X POST "https://{hostname}/ccx/oauth2/{tenant}/token" \
-H "Content-Type: application/x-www-form-urlencoded" \
-u "{client_id}:{client_secret}" \
-d "grant_type=refresh_token&refresh_token={refresh_token}"
Verify: Response contains access_token and token_type: "Bearer".
Define your table schema as JSON with field names, types, and ordinals. [src3]
curl -X POST "https://{hostname}/api/prismAnalytics/v3/{tenant}/tables" \
-H "Authorization: Bearer {access_token}" \
-H "Content-Type: application/json" \
-d '{"name":"External_Sales_Data","fields":[...]}'
Verify: Response includes "id" for the new table.
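A header/schema mismatch surfaces only as a 422 after upload, so it is worth diffing the local CSV header against the table's fields first. A sketch, assuming the GET /tables/{id} response carries a list of field objects with a "name" key matching the schema you just POSTed:

```python
import csv

def header_mismatches(csv_path, table_fields):
    """Compare a CSV header row to the table's field list (order-sensitive).

    table_fields: the "fields" list from GET /tables/{table_id} (assumed shape).
    Returns [] when they match, else a human-readable difference.
    """
    with open(csv_path, newline="") as f:
        header = next(csv.reader(f))
    expected = [fld["name"] for fld in table_fields]
    if header == expected:
        return []
    return [f"CSV has {header!r}, table expects {expected!r}"]
```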
Create a bucket specifying the data change operation. [src1, src2]
curl -X POST "https://{hostname}/api/prismAnalytics/v3/{tenant}/tables/{table_id}/buckets" \
-H "Authorization: Bearer {access_token}" \
-H "Content-Type: application/json" \
-d '{"name":"daily_load","operation":{"id":"TruncateAndInsert"}}'
Verify: Response includes bucket "id" with "state": "New". Bucket expires in 24 hours.
Compress your CSV data with gzip, then upload. Max 256 MB per file, 10 concurrent uploads. [src1]
gzip -k sales_data.csv
curl -X POST "https://{hostname}/api/prismAnalytics/v3/{tenant}/buckets/{bucket_id}/files" \
-H "Authorization: Bearer {access_token}" \
-H "Content-Type: application/octet-stream" \
--data-binary @sales_data.csv.gz
Verify: HTTP 200 response.
Signal that all files are uploaded; this triggers Prism data processing. [src2]
curl -X POST "https://{hostname}/api/prismAnalytics/v3/{tenant}/buckets/{bucket_id}/complete" \
-H "Authorization: Bearer {access_token}"
Verify: Poll bucket status until "state": "Success".
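Processing is asynchronous: the complete call only queues the work, and the load has actually landed when the bucket reaches a terminal state. A polling sketch with the status fetch injected; "Success" follows the walkthrough's expected state, while treating "Failed" as the other terminal state is an assumption:

```python
import time

def wait_for_bucket(get_state, timeout_s=1800, poll_s=15):
    """Poll until the bucket reaches a terminal state or timeout_s elapses.

    get_state: callable returning the bucket's current "state" string,
    e.g. by GETting /buckets/{bucket_id} and reading .state.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        state = get_state()
        if state in ("Success", "Failed"):
            return state
        time.sleep(poll_s)
    raise TimeoutError("bucket did not reach a terminal state in time")
```

In production, `get_state` would wrap the bucket-status curl shown above; the injectable callable keeps the loop testable.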
Check the table to confirm rows were loaded. [src3]
curl -s "https://{hostname}/api/prismAnalytics/v3/{tenant}/tables/{table_id}" \
-H "Authorization: Bearer {access_token}" | jq '.rowCount'
Verify: Row count matches expected record count.
# Input: CSV file path, Workday credentials
# Output: Table ID, upload status
import prism
p = prism.Prism(
base_url="https://wd2-impl-services1.workday.com",
tenant_name="your_tenant",
client_id="your_client_id",
client_secret="your_client_secret",
refresh_token="your_refresh_token"
)
table = prism.tables_create(p, table_name="External_Data", file="schema.json")
prism.upload_file(p, file="data.csv.gz", table_id=table["id"], operation="TruncateAndInsert")
# Input: Gzip CSV file, Workday OAuth credentials
# Output: Bucket completion status
import requests
# Get token
token = requests.post(
"https://{hostname}/ccx/oauth2/{tenant}/token",
auth=("client_id", "client_secret"),
data={"grant_type": "refresh_token", "refresh_token": "token"}
).json()["access_token"]
headers = {"Authorization": f"Bearer {token}"}
BASE = "https://{hostname}/api/prismAnalytics/v3/{tenant}"
# Create bucket, upload, complete
bucket = requests.post(f"{BASE}/tables/{table_id}/buckets",
headers={**headers, "Content-Type": "application/json"},
json={"name": "load", "operation": {"id": "TruncateAndInsert"}}).json()
with open("data.csv.gz", "rb") as f:
requests.post(f"{BASE}/buckets/{bucket['id']}/files",
headers={**headers, "Content-Type": "application/octet-stream"}, data=f)
requests.post(f"{BASE}/buckets/{bucket['id']}/complete", headers=headers)
# Input: Valid OAuth credentials
# Output: JSON array of Prism Analytics tables
TOKEN=$(curl -s -X POST "https://{hostname}/ccx/oauth2/{tenant}/token" \
-u "{client_id}:{client_secret}" \
-d "grant_type=refresh_token&refresh_token={refresh_token}" | jq -r '.access_token')
curl -s "https://{hostname}/api/prismAnalytics/v3/{tenant}/tables" \
-H "Authorization: Bearer $TOKEN" | jq '.data[] | {id, name}'
| Prism Type ID | Type Name | Source Format | Gotcha |
|---|---|---|---|
| f9e5bff9...541da7 | Text | String (max 32K chars) | Exceeding 32K chars silently truncates |
| 48fbad81...7d3f6 | Numeric | Decimal (precision + scale) | Precision/scale mismatch causes load failure |
| bdd201eb...83a9 | Date | ISO 8601 (YYYY-MM-DD) | Non-ISO dates rejected |
| d6cce1d0...4d3c9 | Boolean | true/false | Must be lowercase — "True" or "1" rejected |
| 1f01ef53...b7a60 | Instance | Workday WID reference | Must match existing Workday reference data |
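Two of the gotchas above, lowercase booleans and ISO 8601 dates, are easy to enforce when writing the CSV rather than debugging after a failed load. A formatting sketch; `to_prism_value` is a hypothetical helper, and collapsing datetimes to their date part is a choice made here, not a documented Prism rule:

```python
from datetime import date, datetime

def to_prism_value(value):
    """Render a Python value as Prism-compatible CSV text."""
    if value is None:
        return ""
    if isinstance(value, bool):          # check before int: bool subclasses int
        return "true" if value else "false"
    if isinstance(value, datetime):      # drop the time part (assumption)
        return value.date().isoformat()
    if isinstance(value, date):
        return value.isoformat()         # YYYY-MM-DD
    return str(value)
```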
| Code | Meaning | Cause | Resolution |
|---|---|---|---|
| 401 | Unauthorized | Expired token or revoked refresh token | Re-authenticate; regenerate refresh token if needed |
| 403 | Forbidden | Missing Prism security domains on ISU | Add all 6 required security domains |
| 404 | Not Found | Wrong table/bucket ID or expired bucket | Verify IDs; create new bucket if expired |
| 409 | Conflict | Upload on completed/expired bucket | Create a new bucket (single-use) |
| 413 | Payload Too Large | File exceeds 256 MB compressed | Split into chunks < 256 MB |
| 422 | Unprocessable Entity | CSV schema doesn't match table schema | Verify column names, order, and types |
| 429 | Too Many Requests | Rate limit exceeded | Exponential backoff; check x-ratelimit-remaining |
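The 429 row above calls for exponential backoff. A retry wrapper sketch with the HTTP call injected; the status-code handling follows this table, and `base_s` exists only so the loop can be exercised without real sleeps:

```python
import random
import time

def call_with_backoff(do_request, max_tries=5, base_s=1.0):
    """Call do_request() until it stops returning HTTP 429.

    do_request: callable returning (status_code, body).
    Sleeps base_s * 2**attempt plus jitter between retries.
    """
    for attempt in range(max_tries):
        status, body = do_request()
        if status != 429:
            return status, body
        time.sleep(base_s * (2 ** attempt) + random.uniform(0, base_s))
    raise RuntimeError(f"still rate limited after {max_tries} attempts")
```

Pairing this with a check of `x-ratelimit-remaining` on every response (pause when it drops low) avoids hitting the 429 in the first place.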
- Monitor elapsed time; complete the partial load and start a new bucket for remaining files. [src2]
- Validate CSV headers against the table schema before uploading. [src1]
- Pre-validate field lengths; add length assertions. [src1]
- Monitor for 401 on token refresh; alert immediately. [src3]
- Implement an upload queue with at most 10 parallel workers. [src1]
- Use standard gzip; test with Python's gzip module. [src2]

# BAD — Prism API only accepts gzip-compressed files
with open("data.csv", "rb") as f:
requests.post(url, headers=headers, data=f) # Will fail
# GOOD — Compress CSV before uploading
import gzip, shutil
with open("data.csv", "rb") as f_in:
with gzip.open("data.csv.gz", "wb") as f_out:
shutil.copyfileobj(f_in, f_out)
with open("data.csv.gz", "rb") as f:
requests.post(url, headers={**headers, "Content-Type": "application/octet-stream"}, data=f)
# BAD — Single file exceeds 256 MB limit, returns 413
with open("huge_dataset.csv.gz", "rb") as f: # 500 MB
requests.post(url, headers=headers, data=f)
# GOOD — Split and upload concurrently (max 10)
import concurrent.futures
chunks = ["chunk1.csv.gz", "chunk2.csv.gz", "chunk3.csv.gz"]
with concurrent.futures.ThreadPoolExecutor(max_workers=10) as pool:
futures = [pool.submit(upload_chunk, c, bucket_id) for c in chunks]
# BAD — Bucket created yesterday, now expired (404 or 409)
requests.post(f"{BASE}/buckets/{old_bucket_id}/files", headers=headers, data=data)
# GOOD — Create new bucket immediately before upload
from datetime import datetime
bucket = requests.post(f"{BASE}/tables/{table_id}/buckets",
headers={**headers, "Content-Type": "application/json"},
json={"name": f"load_{datetime.now():%Y%m%d_%H%M}", "operation": {"id": "TruncateAndInsert"}}).json()
# Upload immediately — 24h clock starts now
- Always gzip-compress RFC 4180-compliant CSV files. [src1]
- Create buckets immediately before uploading. [src2]
- Compare CSV headers against the table schema endpoint before uploading. [src1]
- Retrieve type definitions dynamically or maintain a version-pinned mapping. [src1]
- Grant all 6 Prism security domains. [src1]
- Read the rate-limit header on every response; pause below 25. [src7]

# Test OAuth authentication
curl -s -X POST "https://{hostname}/ccx/oauth2/{tenant}/token" \
-u "{client_id}:{client_secret}" \
-d "grant_type=refresh_token&refresh_token={refresh_token}" | jq '.access_token'
# List all Prism tables (verify API access)
curl -s "https://{hostname}/api/prismAnalytics/v3/{tenant}/tables" \
-H "Authorization: Bearer {token}" | jq '.total'
# Check specific table schema
curl -s "https://{hostname}/api/prismAnalytics/v3/{tenant}/tables/{table_id}" \
-H "Authorization: Bearer {token}" | jq '.fields'
# Check bucket status (after upload)
curl -s "https://{hostname}/api/prismAnalytics/v3/{tenant}/buckets/{bucket_id}" \
-H "Authorization: Bearer {token}" | jq '{state, fileCount, errorMessage}'
# Monitor rate limit headers
curl -sv "https://{hostname}/api/prismAnalytics/v3/{tenant}/tables" \
-H "Authorization: Bearer {token}" 2>&1 | grep -i 'x-ratelimit'
| API Version | Release Date | Status | Breaking Changes | Migration Notes |
|---|---|---|---|---|
| v3 | 2023 | Current | SQL-style tables replace datasets; new endpoints | Migrate from /datasets to /tables; schema format changed |
| v2 | 2020 | Maintenance/Legacy | N/A (original GA) | Still functional; new features only in v3 |
| v1 | 2018 | Deprecated | N/A | Do not use; no longer supported |
| Use When | Don't Use When | Use Instead |
|---|---|---|
| Loading external data into Workday for blended analytics | Need real-time transactional integration | Workday REST API |
| Scheduled batch data loads (daily, weekly) | Need sub-second latency | Workday REST API or SOAP API |
| Building composite dashboards (Workday + external) | Need to extract data FROM Workday | RaaS or Workday REST API |
| Data migration or one-time bulk loads | Dataset has >1,000 columns | Split into multiple related tables |
| Automating manual browser file uploads | Data exceeds 100 GB compressed | Contact Workday for increase or rolling batch |