Salesforce Platform Events, Change Data Capture, and Pub/Sub API

Type: ERP Integration
System: Salesforce Platform (API v62.0)
Confidence: 0.90
Sources: 8
Verified: 2026-03-01
Freshness: 2026-03-01

TL;DR

System Profile

Salesforce's event-driven architecture centers on the Event Bus, a unified message backbone that carries custom platform events, Change Data Capture events, and standard platform events. External consumers can subscribe via the legacy CometD-based Streaming API or the newer gRPC-based Pub/Sub API (GA since Spring '22). Since Spring '23, all new custom platform events are high-volume; legacy standard-volume events can no longer be created. [src1, src4]

| Property | Value |
| --- | --- |
| Vendor | Salesforce |
| System | Salesforce Platform (API v62.0, Spring '26) |
| API Surface | Platform Events, Change Data Capture, Pub/Sub API, Streaming API |
| Current API Version | v62.0 |
| Editions Covered | Enterprise, Performance, Unlimited, Developer |
| Deployment | Cloud |
| API Docs | Platform Events Developer Guide |
| Status | GA (Pub/Sub API GA since Spring '22; Streaming API GA, not deprecated) |

API Surfaces & Capabilities

| API Surface | Protocol | Best For | Event Retention | Rate Limit | Real-time? | Direction |
| --- | --- | --- | --- | --- | --- | --- |
| Platform Events (High-Volume) | Event Bus | Custom event messages between systems | 72 hours | 250K publishes/hour (Perf/Unlim) | Yes | Publish + Subscribe |
| Change Data Capture (CDC) | Event Bus | Automatic record-change notifications | 72 hours (3 days) | Shared with PE delivery allocation | Yes | Subscribe only |
| Pub/Sub API | gRPC / HTTP/2 / Avro | External subscription, high throughput | N/A (consumes from Event Bus) | 1,000 concurrent gRPC streams/channel | Yes | Publish + Subscribe |
| Streaming API | CometD / Bayeux / HTTP/1.1 / JSON | Legacy external subscription | N/A (consumes from Event Bus) | 2,000 concurrent CometD clients/org | Yes | Subscribe only |

[src1, src2, src3, src4]

Rate Limits & Quotas

Per-Request Limits

| Limit Type | Value | Applies To | Notes |
| --- | --- | --- | --- |
| Max events per FetchRequest | 100 | Pub/Sub API Subscribe RPC | Maximum num_requested value per request [src2] |
| Max event message size | 1 MB | Pub/Sub API PublishRequest | Per individual event in a batch [src2] |
| Recommended batch size | 200 events | Pub/Sub API PublishRequest | Total batch < 3 MB recommended [src2] |
| Hard gRPC message limit | 4 MB | Pub/Sub API PublishRequest | Exceeding causes publish failure and stream closure [src5] |
| Concurrent gRPC streams | 1,000 | Pub/Sub API per channel | Active RPC calls on same gRPC channel [src5] |
| Max managed subscriptions | 200 | Pub/Sub API per org | Unique managed subscriptions per Salesforce org [src2] |
| Max custom fields per PE | 25 | Platform Event definition | Per platform event definition [src1] |
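The batch-size recommendations above (at most 200 events and under 3 MB per PublishRequest, to stay clear of the 4 MB hard gRPC limit) can be enforced with a small chunking helper. This is a sketch; the function name is illustrative and each event is assumed to be an already-encoded bytes payload:

```python
def batch_events(events, max_count=200, max_bytes=3 * 1024 * 1024):
    """Chunk encoded events into PublishRequest-sized batches:
    at most max_count events and max_bytes total bytes per batch."""
    batch, size = [], 0
    for ev in events:
        # Start a new batch if adding this event would breach either limit
        if batch and (len(batch) >= max_count or size + len(ev) > max_bytes):
            yield batch
            batch, size = [], 0
        batch.append(ev)
        size += len(ev)
    if batch:
        yield batch
```

Each yielded batch would then become the events list of one PublishRequest.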

Rolling / Daily Limits

| Limit Type | Value | Window | Edition Differences |
| --- | --- | --- | --- |
| Platform Event hourly publishing | 250,000 | 1 hour | Performance/Unlimited: 250K; add-on adds +25K/hour [src1] |
| Platform Event + CDC daily delivery | 50,000 base | 24 hours | Performance/Unlimited: 50K; add-on adds +100K/day (shifts to 3M/month) [src1] |
| CDC entity selection | 5 objects | Ongoing | All editions: 5 entities without add-on; add-on removes limit [src3] |
| Concurrent CometD clients | 2,000 | Per org | Shared across all Streaming API subscriptions [src7] |
| Event retention (high-volume PE) | 72 hours | Rolling | Events beyond 72h are permanently deleted [src6] |
| Event retention (CDC) | 72 hours (3 days) | Rolling | Same retention window as high-volume platform events [src3] |
| Event retention (legacy standard-volume PE) | 24 hours | Rolling | Standard-volume events can no longer be created [src1] |

Delivery Counting Methodology

Delivery allocations are consumed per subscriber, not per published event. Publishing 1,000 events to 10 subscribers consumes 10,000 deliveries (1,000 × 10) from the daily allocation. This is the single most impactful limit for fan-out architectures. [src1, src7]
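The per-subscriber counting above can be sketched as a quick budget check (plain Python; the 50,000 default is the base daily delivery allocation from the limits table, and the function names are illustrative):

```python
def deliveries_consumed(events_published, subscriber_count):
    """Deliveries are counted per subscriber: each published event
    is delivered once to every active subscriber."""
    return events_published * subscriber_count

def fits_daily_allocation(events_published, subscriber_count, allocation=50_000):
    """True if the fan-out stays within the daily delivery allocation."""
    return deliveries_consumed(events_published, subscriber_count) <= allocation

# 1,000 events fanned out to 10 subscribers consume 10,000 deliveries
assert deliveries_consumed(1_000, 10) == 10_000
```

This is why consolidating external consumers behind one middleware subscriber stretches the allocation much further than direct fan-out.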

Authentication

| Flow | Use When | Token Lifetime | Refresh? | Notes |
| --- | --- | --- | --- | --- |
| OAuth 2.0 JWT Bearer | Server-to-server Pub/Sub API connections | Session timeout (default 2h) | New JWT per request | Recommended for unattended integrations [src5] |
| OAuth 2.0 Web Server | User-context event publishing | Access: 2h, Refresh: until revoked | Yes | Requires callback URL |
| Connected App + Client Credentials | Service-to-service without user context | Configurable | Yes | Available since Winter '23 |
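As a sketch of the JWT bearer flow, the assertion below carries the four claims Salesforce expects; in practice you would sign it with the connected app's private key (RS256, e.g. via PyJWT) and POST it to the org's /services/oauth2/token endpoint with grant_type=urn:ietf:params:oauth:grant-type:jwt-bearer. The function name is illustrative:

```python
import time

def build_jwt_claims(consumer_key, username,
                     audience="https://login.salesforce.com"):
    """Claims for an OAuth 2.0 JWT bearer assertion (sketch).
    Use https://test.salesforce.com as the audience for sandboxes."""
    return {
        "iss": consumer_key,            # connected app consumer key
        "sub": username,                # Salesforce username the token acts as
        "aud": audience,                # login or test endpoint
        "exp": int(time.time()) + 180,  # short expiry; mint a fresh JWT per request
    }
```

Because the assertion is short-lived, unattended integrations mint a new JWT for every token request rather than refreshing, which matches the "New JWT per request" row above.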

Authentication Gotchas

Constraints

Integration Pattern Decision Tree

START -- Need event-driven integration with Salesforce
|-- What triggers the event?
|   |-- Custom business logic (order placed, status changed)
|   |   |-- Need custom event schema?
|   |   |   |-- YES --> Platform Events (define __e custom event)
|   |   |   |-- NO --> Standard platform events (LoginEvent, etc.)
|   |-- Any record change on specific objects (create/update/delete/undelete)
|   |   |-- YES --> Change Data Capture (CDC)
|   |   |   |-- Tracking <= 5 objects?
|   |   |   |   |-- YES --> CDC works without add-on license
|   |   |   |   |-- NO --> Requires Platform Events add-on license
|   |   |-- Need field-level change tracking?
|   |       |-- YES --> CDC (delivers changed fields + ChangeEventHeader)
|   |       |-- NO --> Platform Events (custom schema, publish explicitly)
|-- Where is the subscriber?
|   |-- External system (outside Salesforce)
|   |   |-- Greenfield / new integration?
|   |   |   |-- YES --> Pub/Sub API (gRPC) -- modern, efficient, recommended
|   |   |   |-- NO (existing CometD) --> Streaming API still works, migrate when ready
|   |   |-- Need > 2,000 concurrent subscribers?
|   |       |-- YES --> Fan-out via middleware (Heroku, Kafka, EventBridge)
|   |       |-- NO --> Direct Pub/Sub API subscription
|   |-- Inside Salesforce (Apex, Flow, LWC)
|       |-- Apex trigger on platform event (after insert)
|       |-- Flow: Platform Event-Triggered
|       |-- LWC: empApi for real-time UI updates
|-- Volume and reliability?
    |-- < 250K events/hour, < 50K deliveries/day?
    |   |-- YES --> Standard allocation sufficient
    |   |-- NO --> Purchase Platform Events add-on
    |-- Need guaranteed delivery?
        |-- YES --> Store replay ID, resubscribe on disconnect
        |-- NO --> Fire-and-forget acceptable

Quick Reference

Platform Events vs CDC vs Pub/Sub API

| Capability | Platform Events | Change Data Capture | Pub/Sub API |
| --- | --- | --- | --- |
| What it does | Custom event messages you define and publish | Automatic events on record CUD operations | gRPC interface to subscribe/publish events |
| Event schema | Custom (__e object definition) | Automatic (mirrors object fields) | Consumes PE + CDC events |
| Publishing | Apex, API, Flow, Process Builder | Automatic (system-generated) | gRPC PublishRequest |
| Subscribing | Apex trigger, Flow, empApi, CometD, gRPC | Apex trigger, Flow, empApi, CometD, gRPC | gRPC Subscribe/ManagedSubscribe RPC |
| Retention | 72h (high-volume) | 72h (3 days) | N/A (reads from Event Bus) |
| Replay support | Yes (replay ID) | Yes (replay ID) | Yes (replay ID or custom preset) |
| Payload format (external) | JSON (CometD) / Avro (Pub/Sub) | JSON (CometD) / Avro (Pub/Sub) | Avro binary only |
| Protocol | Via Streaming API or Pub/Sub API | Via Streaming API or Pub/Sub API | gRPC / HTTP/2 |
| Ordering | Within single Apex transaction | Per object per transaction | Preserved from source |

Streaming API (CometD) vs Pub/Sub API (gRPC)

| Feature | Streaming API (CometD) | Pub/Sub API (gRPC) |
| --- | --- | --- |
| Protocol | HTTP/1.1, Bayeux/CometD | HTTP/2, gRPC |
| Payload format | JSON | Apache Avro (binary, compressed) |
| Delivery model | Push (server pushes to client) | Pull (client requests N events) |
| Flow control | None (server controls pace) | Client-controlled (num_requested) |
| Efficiency | Higher bandwidth, larger payloads | ~7-10x faster; compressed byte buffers |
| Language support | JavaScript (CometD client) | 11+ languages (Python, Java, Node, C++, Go) |
| Publishing | Not supported (subscribe only) | Supported (PublishRequest RPC) |
| Concurrent limit | 2,000 CometD clients/org | 1,000 streams/gRPC channel (no org-wide cap) |
| Managed subscriptions | No | Yes (up to 200/org) |
| Deprecation status | Not deprecated, no new features | Actively developed, recommended |

[src4, src5]

Step-by-Step Integration Guide

1. Define a Platform Event (if using custom events)

Navigate to Setup > Platform Events > New Platform Event. Define fields for your event payload. The API name ends with __e. For CDC, skip this step. [src1]

Setup > Integrations > Platform Events > New Platform Event
- Label: Order Placed
- API Name: Order_Placed__e
- Publish Behavior: Publish After Commit
- Add custom fields: Order_Id__c, Amount__c, Status__c

Verify: GET /services/data/v62.0/sobjects/Order_Placed__e/describe — returns field definitions.

2. Enable CDC for target objects (if using CDC)

Navigate to Setup > Integrations > Change Data Capture. Select up to 5 objects (without add-on). [src3]

Setup > Integrations > Change Data Capture
- Select objects: Account, Opportunity, Contact, Lead, Case
- Save -- CDC begins immediately
- Channel name: /data/AccountChangeEvent, /data/OpportunityChangeEvent, etc.

Verify: Update a record on a selected object — the change event should appear on the corresponding channel within seconds.

3. Connect via Pub/Sub API (gRPC)

Establish a gRPC connection to api.pubsub.salesforce.com:7443. Use OAuth access token in gRPC metadata. [src5]

import grpc
# Stub module generated from pubsub_api.proto (Salesforce Pub/Sub API repository)
import pubsub_api_pb2_grpc as pb2_grpc

PUBSUB_ENDPOINT = "api.pubsub.salesforce.com:7443"

channel_credentials = grpc.ssl_channel_credentials()
channel = grpc.secure_channel(PUBSUB_ENDPOINT, channel_credentials)
stub = pb2_grpc.PubSubStub(channel)

# Every RPC call must carry these three headers in its call metadata:
metadata = (
    ("accesstoken", "YOUR_OAUTH_ACCESS_TOKEN"),
    ("instanceurl", "https://yourinstance.my.salesforce.com"),
    ("tenantid", "YOUR_ORG_ID"),
)

Verify: Call GetTopic RPC with your event channel name — returns topic metadata and schema ID.

4. Subscribe to events and handle replay

Use the Subscribe RPC to receive events. Store the replay ID after each batch. Max 100 events per FetchRequest. [src2, src6]

def fetch_requests(topic_name, replay_id=None):
    # First request: resume from a stored replay ID (CUSTOM preset)
    # or start at the tip of the stream (LATEST).
    first = {"topic_name": topic_name, "num_requested": 100}  # 100 = max per FetchRequest
    if replay_id:
        first["replay_preset"] = "CUSTOM"
        first["replay_id"] = replay_id
    else:
        first["replay_preset"] = "LATEST"
    yield first
    while True:
        # Flow control is client-driven: keep requesting the next batch.
        # (Real clients gate this on events actually being processed.)
        yield {"topic_name": topic_name, "num_requested": 100}

def subscribe_to_events(stub, metadata, topic_name, replay_id=None):
    for response in stub.Subscribe(fetch_requests(topic_name, replay_id),
                                   metadata=metadata):
        for event in response.events:
            payload = decode_avro(event.event.payload, response.schema_id)
            save_replay_id(topic_name, event.replay_id)  # persist for reconnection

Verify: Publish a test event — subscriber receives it within seconds. Store replay_id and verify reconnection resumes correctly.
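Persisting the replay ID must survive process restarts, so an in-memory variable is not enough. A minimal durable store using stdlib sqlite3 is sketched below; the class and table names are illustrative:

```python
import sqlite3

class ReplayStore:
    """Tiny durable store for the last-seen replay ID per topic (sketch)."""

    def __init__(self, path=":memory:"):  # use a real file path in production
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS replay (topic TEXT PRIMARY KEY, replay_id BLOB)"
        )

    def save(self, topic, replay_id):
        # Upsert: one row per topic, always holding the newest replay ID
        self.db.execute(
            "INSERT INTO replay(topic, replay_id) VALUES(?, ?) "
            "ON CONFLICT(topic) DO UPDATE SET replay_id = excluded.replay_id",
            (topic, replay_id),
        )
        self.db.commit()

    def load(self, topic):
        row = self.db.execute(
            "SELECT replay_id FROM replay WHERE topic = ?", (topic,)
        ).fetchone()
        return row[0] if row else None
```

On reconnect, load() feeds the stored ID back into the first FetchRequest; a None result means no prior position, so subscribe with a preset instead.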

Code Examples

JavaScript/Node.js: Subscribe to Platform Events via CometD

// Input:  Salesforce OAuth token, event channel name
// Output: Real-time event notifications via CometD long-polling

const jsforce = require("jsforce"); // v2.x

const conn = new jsforce.Connection({
  loginUrl: "https://login.salesforce.com",
  accessToken: process.env.SF_ACCESS_TOKEN,
  instanceUrl: process.env.SF_INSTANCE_URL,
});

// Subscribe to Platform Event
conn.streaming.topic("/event/Order_Placed__e").subscribe((message) => {
  console.log("Event received:", JSON.stringify(message, null, 2));
  const replayId = message.event.replayId;
  // Store replayId persistently for reconnection
});

// Subscribe to CDC
conn.streaming.topic("/data/AccountChangeEvent").subscribe((message) => {
  const header = message.payload.ChangeEventHeader;
  console.log(`Change: ${header.changeType}, IDs: ${header.recordIds}`);
});

cURL: Publish a Platform Event via REST API

# Publish a single platform event
curl -X POST \
  "https://yourinstance.my.salesforce.com/services/data/v62.0/sobjects/Order_Placed__e" \
  -H "Authorization: Bearer ${SF_ACCESS_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{"Order_Id__c":"ORD-2026-001","Amount__c":1500.00,"Status__c":"Confirmed"}'
# Response: {"id":"e01xx0000000001AAA","success":true,"errors":[]}

# Check platform event usage limits
curl -s "https://yourinstance.my.salesforce.com/services/data/v62.0/limits" \
  -H "Authorization: Bearer ${SF_ACCESS_TOKEN}" \
  | jq '.HourlyPublishedPlatformEvents, .DailyDeliveredPlatformEvents'

Data Mapping

CDC ChangeEventHeader Fields

| Field | Type | Description | Gotcha |
| --- | --- | --- | --- |
| changeType | String | CREATE, UPDATE, DELETE, UNDELETE, GAP_* | GAP_ types indicate missed events — trigger full reconciliation [src3] |
| changedFields | String[] | API names of changed fields | Empty for CREATE (all fields in body) [src3] |
| recordIds | String[] | IDs of affected records | Can contain multiple IDs when batched [src3] |
| commitTimestamp | Long | Epoch timestamp of DML commit | Not the publish time — delivery delay is possible [src3] |
| transactionKey | String | Unique ID for the transaction | Use to group events from same Apex transaction [src3] |
| sequenceNumber | Integer | Order within transaction | Events with same transactionKey ordered by this [src3] |
| entityName | String | SObject API name | Use to route from compound channels [src3] |
| commitUser | String | User ID who made the change | May be automation user [src3] |
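The transactionKey/sequenceNumber semantics above can be applied directly when replaying a batch of decoded CDC events: group by transaction, order within each group. A sketch (the function name is illustrative; each event is assumed to be a decoded dict containing a ChangeEventHeader):

```python
from itertools import groupby

def order_cdc_events(events):
    """Group decoded CDC events by transactionKey and order each group
    by sequenceNumber, per ChangeEventHeader semantics."""
    tx_key = lambda e: e["ChangeEventHeader"]["transactionKey"]
    # Sort by (transaction, sequence) so groupby sees contiguous groups
    ordered = sorted(
        events,
        key=lambda e: (tx_key(e), e["ChangeEventHeader"]["sequenceNumber"]),
    )
    return {tx: list(group) for tx, group in groupby(ordered, key=tx_key)}
```

Applying a group's events in sequence order preserves the original Apex transaction's intra-transaction ordering on the target system.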

Data Type Gotchas

Error Handling & Failure Points

Common Error Codes

| Code | Meaning | Cause | Resolution |
| --- | --- | --- | --- |
| OPERATION_TOO_LARGE | Event payload exceeds 1 MB | Single event too large | Reduce payload; split into multiple events [src2] |
| UNAVAILABLE (gRPC) | Server temporarily unavailable | Service disruption | Exponential backoff: 1s, 2s, 4s, 8s, max 60s [src5] |
| UNAUTHENTICATED (gRPC) | Invalid or expired token | OAuth token expired | Refresh token and reconnect [src5] |
| RESOURCE_EXHAUSTED (gRPC) | Rate limit exceeded | Exceeds concurrent stream or publishing limit | Reduce streams or publishing rate [src2] |
| GAP_OVERFLOW | CDC events lost | Event volume exceeded delivery capacity | Full data reconciliation required [src3] |
| INVALID_REPLAY_ID | Replay ID outside retention window | Replay ID older than 72 hours | Use EARLIEST preset or LATEST; full sync may be needed [src6] |
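The backoff schedule recommended for UNAVAILABLE can be generated with a few lines of Python (a sketch; production clients typically add random jitter on top to avoid reconnect stampedes):

```python
def backoff_delays(base=1.0, max_delay=60.0):
    """Yield the retry schedule from the table above:
    1s, 2s, 4s, 8s, ... doubling each attempt, capped at 60s."""
    delay = base
    while True:
        yield min(delay, max_delay)
        delay *= 2
```

Each time the stream drops with UNAVAILABLE, sleep for the next value before resubscribing from the stored replay ID; reset the generator after a successful reconnect.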

Failure Points in Production

Anti-Patterns

Wrong: Polling REST API for changes instead of using CDC

# BAD -- Queries all records every 5 minutes to find changes
# Wastes API calls, misses changes between polls
while True:
    results = sf.query(f"SELECT Id, Name FROM Opportunity WHERE SystemModstamp > {last_poll}")
    process_changes(results)
    time.sleep(300)

Correct: Subscribe to CDC for real-time change notifications

# GOOD -- CDC delivers changes in real-time, zero API call overhead
def handle_cdc_event(event):
    header = event["ChangeEventHeader"]
    if header["changeType"] in ("CREATE", "UPDATE"):
        sync_to_target(header["recordIds"], event)
    elif header["changeType"] == "DELETE":
        delete_from_target(header["recordIds"])
    elif header["changeType"].startswith("GAP_"):
        trigger_full_reconciliation(header["entityName"])

Wrong: One CometD subscription per downstream consumer

// BAD -- Each CometD client counts against 2,000 org limit
services.forEach((svc) => {
  const client = new CometDClient(sfCredentials);
  client.subscribe("/event/Order_Placed__e", svc.handler);
});

Correct: Single subscriber with middleware fan-out

// GOOD -- One subscriber, fan-out via Kafka
const pubsubClient = new SalesforcePubSubClient(credentials);
pubsubClient.subscribe("/event/Order_Placed__e", async (event) => {
  await kafkaProducer.send({
    topic: "salesforce.order-placed",
    messages: [{ value: JSON.stringify(event) }],
  });
});

Wrong: Ignoring replay IDs on reconnect

# BAD -- Always subscribes from LATEST, losing events during downtime
subscriber.subscribe(topic="/data/AccountChangeEvent", replay_preset="LATEST")

Correct: Persisting replay IDs for gap-free reconnection

# GOOD -- Stores replay ID, resumes from last position
last_id = load_replay_id(topic)
if last_id:
    subscriber.subscribe(topic=topic, replay_id=last_id)
else:
    subscriber.subscribe(topic=topic, replay_preset="EARLIEST")

for event in subscriber.events():
    process(event)
    save_replay_id(topic, event.replay_id)

Common Pitfalls

Diagnostic Commands

# Check platform event usage / remaining limits
curl -s "https://${SF_INSTANCE}/services/data/v62.0/limits" \
  -H "Authorization: Bearer ${SF_ACCESS_TOKEN}" \
  | jq '{HourlyPublishedPlatformEvents, DailyDeliveredPlatformEvents}'

# List all platform event definitions in the org
curl -s "https://${SF_INSTANCE}/services/data/v62.0/sobjects" \
  -H "Authorization: Bearer ${SF_ACCESS_TOKEN}" \
  | jq '[.sobjects[] | select(.name | endswith("__e")) | {name, label}]'

# Describe a specific platform event schema
curl -s "https://${SF_INSTANCE}/services/data/v62.0/sobjects/Order_Placed__e/describe" \
  -H "Authorization: Bearer ${SF_ACCESS_TOKEN}" \
  | jq '{name, fields: [.fields[] | {name, type, length}]}'

# Test publishing a platform event
curl -X POST "https://${SF_INSTANCE}/services/data/v62.0/sobjects/Order_Placed__e" \
  -H "Authorization: Bearer ${SF_ACCESS_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{"Order_Id__c":"TEST-001","Amount__c":100,"Status__c":"Test"}'

Version History & Compatibility

| API/Feature | Release | Status | Key Changes | Notes |
| --- | --- | --- | --- | --- |
| Pub/Sub API | Spring '22 (v54.0) | GA | Initial GA — gRPC Subscribe + Publish | Replaced need for external CometD libraries [src8] |
| Pub/Sub Managed Subscriptions | Spring '24 (v60.0) | Beta | Server-managed subscription state | Salesforce tracks replay position [src2] |
| High-Volume PE as default | Spring '23 (v57.0) | GA | All new PE definitions are high-volume | Standard-volume creation disabled [src1] |
| Change Data Capture | Winter '19 (v44.0) | GA | Initial GA for selected standard objects | Entity limit: 5 without add-on [src3] |
| Streaming API (CometD) | API v21.0 | GA | Long-standing, stable | Not deprecated; Pub/Sub API preferred [src4] |
| Platform Events | Winter '17 (v38.0) | GA | Initial GA | Apex triggers, Flow, API publishing [src1] |

When to Use / When Not to Use

| Use When | Don't Use When | Use Instead |
| --- | --- | --- |
| Need real-time notification of Salesforce record changes | Need to bulk-export historical data | Bulk API 2.0 |
| External system needs to react to Salesforce events | Need to query current record state | REST API SOQL query |
| Building event-driven microservice architecture | Need request-response pattern | REST API or Composite API |
| Tracking field-level changes on specific objects (CDC) | Tracking > 5 objects without add-on | Platform Events with Apex triggers |
| Need guaranteed delivery with replay (within 72h) | Need event retention > 72 hours | External broker (Kafka, AWS SQS) |
| Fan-out to single middleware subscriber | Fan-out to > 2,000 direct subscribers | Middleware hub-and-spoke pattern |

Important Caveats

Related Units