Important
For use with the Heroku Integration and Heroku Eventing pilots only
This sample seamlessly delegates the processing of large amounts of data with significant compute requirements to Heroku Worker processes. It also demonstrates the use of the Unit of Work aspect of the SDK (JavaScript only for the pilot) for easier utilization of the Salesforce Composite APIs.
The scenario used in this sample illustrates a basis for processing large volumes of Salesforce data using elastically scalable Heroku worker processes that execute complex compute calculations. In this case Opportunity data is read and calculated pricing data is stored in an associated Quote. Calculating quote information from opportunities can become quite intensive, especially when large multinational businesses have complex rules that impact pricing related to region, products, and discount thresholds. It's also possible that such code already exists, and there is a desire to reuse it within a Salesforce context.
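For illustration only, the kind of per-line-item rule such a job might apply could look like the sketch below; the region multipliers and discount thresholds are invented for the example and are not the sample's actual pricing logic.

```javascript
// Illustrative only – not the sample's pricing logic. A stand-in for the kind of
// per-line-item calculation a worker could run when pricing a Quote.
function calculateLineItemPrice({ listPrice, quantity, region }) {
  const regionMultiplier = { EMEA: 1.1, APAC: 1.05, AMER: 1.0 }[region] ?? 1.0; // regional uplift
  const volumeDiscount = quantity >= 100 ? 0.85 : quantity >= 10 ? 0.95 : 1.0;  // discount thresholds
  return listPrice * quantity * regionMultiplier * volumeDiscount;
}
```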
This sample includes two process types, `web` and `worker`, both of which can be scaled vertically and horizontally to speed up processing. The `web` process receives API calls from Salesforce and the `worker` process executes the jobs asynchronously. A Heroku Key Value Store (Heroku Redis) is used as the means of communication between the two processes.
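As a rough sketch of this flow (assuming the `ioredis` client and a Redis stream named `jobsChannel`; the sample's actual implementation lives in `server/index.js` and `server/worker.js`), the `web` process could enqueue a job and hand back its ID while the `worker` process blocks waiting for new entries:

```javascript
// Minimal sketch, not the sample's actual implementation: web enqueues, worker consumes.
import Redis from 'ioredis';
import { randomUUID } from 'node:crypto';

const redis = new Redis(process.env.REDIS_URL);

// web process: publish a job to the shared stream and hand the job ID back to the caller
export async function enqueueJob(type, payload) {
  const jobId = randomUUID();
  await redis.xadd('jobsChannel', '*', 'job', JSON.stringify({ jobId, type, payload }));
  return jobId; // returned to Salesforce as {"jobId": "..."}
}

// worker process: block on the stream and dispatch each job by its type
export async function consumeJobs(handlers) {
  let lastId = '$'; // only read entries that arrive after the worker starts
  for (;;) {
    const results = await redis.xread('BLOCK', 0, 'STREAMS', 'jobsChannel', lastId);
    if (!results) continue;
    for (const [, entries] of results) {
      for (const [id, fields] of entries) {
        lastId = id;
        const job = JSON.parse(fields[1]);
        await handlers[job.type](job); // e.g. handlers.create, handlers.executeBatch, handlers.delete
      }
    }
  }
}
```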
Note
This sample can be considered an alternative to using Batch Apex if your data volumes and/or compute complexity require it. In addition, Heroku worker processes scale elastically and can thus avoid the queue wait times that can impact processing time with Batch Apex. For further information, see Technical Information below.
- Heroku login
- Heroku Integration Pilot enabled
- Heroku CLI installed
- Heroku Integration Pilot CLI plugin installed
- Salesforce CLI installed
- Login information for one or more Scratch, Development or Sandbox orgs
- Watch the Introduction to the Heroku Integration Pilot for Developers video
As with other samples (see below), this section focuses on how to develop and test locally before deploying to Heroku and testing from within a Salesforce org. Using the `heroku local` command (part of the Heroku CLI), we can launch both the `web` and `worker` processes defined in the `Procfile` with a single command. Run the following commands to run the sample locally against a remotely provisioned Heroku Key Value Store.
Important
If you have deployed the application as described below and want to return to local development, you may want to destroy it with `heroku destroy` to avoid race conditions, since both will share the same job queue. In a real situation you would use a separate queue store for development vs. production.
Start with the following commands to create an empty application and provision within Heroku a key value store this sample uses to manage the job queue:
# Create app and add Redis
heroku create
heroku addons:create heroku-redis:mini --wait
# Copy Heroku Redis URL to local .env file
heroku config --shell > .env
# Install dependencies
pnpm install
# Run locally (starts both web and worker as defined in Procfile)
heroku local -f Procfile.local web=1,worker=1
Open a new terminal window and enter the following command to start a job that generates sample data:
# Use the invoke script (replace my-org with your org alias)
# Target: POST /api/data/create (Default: 10 Opportunities)
./bin/invoke.sh my-org 'http://localhost:5000/api/data/create' '{}'
# To specify a different number (e.g., 100), use the query parameter:
./bin/invoke.sh my-org 'http://localhost:5000/api/data/create?numberOfOpportunities=100' '{}'
This will respond with a job Id, as shown in the example below:
Response from server:
{"jobId":"b7bfb6bd-8db8-4e4f-b0ad-c98966e91dde"}
Review the log output from the `heroku local` process and you will see output similar to the following (timestamps and specific IDs will vary):
web.1 | Job published to Redis channel jobsChannel...
worker.1 | Worker received job with ID: b63e2cbd-cb6a-4be9-b2e1-0b1ab928938b for data operation: create
worker.1 | Starting data creation via Bulk API v2 for Job ID: b63e2cbd-cb6a-4be9-b2e1-0b1ab928938b, Count: 10
worker.1 | Preparing Bulk API v2 Opportunity creation job for Job ID: b63e2cbd-cb6a-4be9-b2e1-0b1ab928938b
worker.1 | Submitted Bulk API v2 Opportunity creation job with ID: 750am00000Q3m1BAAR...
worker.1 | Polling Bulk API v2 job status for Job ID: 750am00000Q3m1BAAR...
worker.1 | Bulk API v2 Job 750am00000Q3m1BAAR status: UploadComplete
worker.1 | Bulk API v2 Job 750am00000Q3m1BAAR status: InProgress
worker.1 | Bulk API v2 Job 750am00000Q3m1BAAR status: JobComplete
worker.1 | Bulk API v2 Job 750am00000Q3m1BAAR processing complete.
worker.1 | Opportunity creation job 750am00000Q3m1BAAR completed. State: JobComplete, Processed: 10, Failed: 0
worker.1 | Extracted 10 successful Opportunity IDs for Job ID: b63e2cbd-cb6a-4be9-b2e1-0b1ab928938b
worker.1 | Preparing Bulk API v2 OLI creation job for 10 Opportunities...
worker.1 | Submitted Bulk API v2 OLI creation job with ID: 750am00000Q3zmNAAR...
worker.1 | Polling Bulk API v2 job status for Job ID: 750am00000Q3zmNAAR...
worker.1 | Bulk API v2 Job 750am00000Q3zmNAAR status: InProgress
worker.1 | Bulk API v2 Job 750am00000Q3zmNAAR status: JobComplete
worker.1 | Bulk API v2 Job 750am00000Q3zmNAAR processing complete.
worker.1 | OLI creation job 750am00000Q3zmNAAR completed. State: JobComplete, Processed: 20, Failed: 0
worker.1 | Job processing completed for Job ID: b63e2cbd-cb6a-4be9-b2e1-0b1ab928938b
Finally, navigate to the Opportunities tab in your Salesforce org and you should see the newly created sample Opportunities.
Run the following command to execute a batch job to generate Quote records from the Opportunity records created above.
# Target: POST /api/executebatch
./bin/invoke.sh my-org http://localhost:5000/api/executebatch '{"soqlWhereClause": "Name LIKE '\''Sample Opportunity%'\''"}'
Observe the log output from `heroku local` and you will see output similar to the following:
web.1 | Job published to Redis channel jobsChannel...
worker.1 | Worker received job with ID: 778412d8-f56f-4a11-ad62-09174339e5f9 for SOQL WHERE clause: Name LIKE 'Sample Opportunity%'
worker.1 | Processing 10 Opportunities
worker.1 | Submitting UnitOfWork to create 10 Quotes and 20 Line Items
worker.1 | Job processing completed for Job ID: 778412d8-f56f-4a11-ad62-09174339e5f9. Results: 10 succeeded, 0 failed.
Navigate to the Quotes tab in your org to review the generated records.
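Behind the `/api/executebatch` call, the `web` process simply accepts the request, queues the job for the `worker`, and replies with a job ID. A rough sketch (not the actual route code in `server/index.js`, reusing the hypothetical `enqueueJob` helper from the earlier Redis sketch) might look like this:

```javascript
// Rough sketch of the web-side route, not the sample's actual code in server/index.js.
import Fastify from 'fastify';
import { enqueueJob } from './queue.js'; // hypothetical module from the earlier Redis sketch

const fastify = Fastify({ logger: true });

fastify.post('/api/executebatch', async (request, reply) => {
  const { soqlWhereClause } = request.body ?? {};
  if (!soqlWhereClause) {
    return reply.code(400).send({ error: 'soqlWhereClause is required' });
  }
  // Queue the job for the worker process and reply immediately
  const jobId = await enqueueJob('executeBatch', { soqlWhereClause });
  return reply.code(202).send({ jobId }); // surfaced in Apex as executeBatch(request).Code202.jobId
});

await fastify.listen({ port: Number(process.env.PORT) || 5000, host: '0.0.0.0' });
```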
Next we will deploy the application and import it into a Salesforce org to allow jobs to be started from Apex, Flow or Agentforce.
Important
Check you are not still running the application locally. If you want to start over at any time, use `heroku destroy` to delete your app.
The steps below also leverage the `sf` CLI, so please ensure you have already authenticated with your org. If not, you can use this command:
sf org login web --alias my-org
First, if you have not done so above, create the application and provision a key value store to manage the job queue.
heroku create
heroku addons:create heroku-redis:mini --wait
Next, deploy the application and scale both the `web` and `worker` processes to run on a single dyno each.
# Deploy code via Git
git push heroku main
# Scale processes (defined in Procfile)
heroku ps:scale web=1 worker=1
Next, configure the Heroku Integration add-on and import the application into your Salesforce org as follows:
heroku addons:create heroku-integration
heroku buildpacks:add https://github.com/heroku/heroku-buildpack-heroku-integration-service-mesh
heroku salesforce:connect my-org --store-as-run-as-user
heroku salesforce:import api-docs.yaml --org-name my-org --client-name GenerateQuoteJob
Trigger an application rebuild to install the Heroku Integration buildpack:
git commit --allow-empty -m "empty commit"
git push heroku main
Once imported, grant permissions to users to invoke your code using the following `sf` command:
sf org assign permset --name GenerateQuoteJob -o my-org
Once imported, you can see the `executeBatch` operation, which takes a SOQL WHERE clause to select the Opportunity records to process. Note that the `datacreate` and `datadelete` operations are also exposed, since they are declared in the `api-docs.yaml` generated from the Java annotations within `PriceEngineService.java`.
As noted in the Extending Apex, Flow and Agentforce sample, you can now invoke these operations from Apex, Flow or Agentforce. Here is some basic Apex code to start the job that creates the sample data (if you have not done so earlier):
echo \
"ExternalService.GenerateQuoteJob service = new ExternalService.GenerateQuoteJob();" \
"System.debug('Quote Id: ' + service.datacreate().Code202.jobId);" \
| sf apex run -o my-org
Note
Run the `heroku logs --tail` command to monitor the logs and confirm the job completed.
Here is some basic Apex code you can run from the command line to start the Quote generation job:
echo \
"ExternalService.GenerateQuoteJob service = new ExternalService.GenerateQuoteJob();" \
"ExternalService.GenerateQuoteJob.executeBatch_Request request = new ExternalService.GenerateQuoteJob.executeBatch_Request();" \
"ExternalService.GenerateQuoteJob_BatchExecutionRequest body = new ExternalService.GenerateQuoteJob_BatchExecutionRequest();" \
"body.soqlWhereClause = 'Name LIKE \\\\'Sample Opportunity%\\\\'';" \
"request.body = body;" \
"System.debug('Quote Id: ' + service.executeBatch(request).Code202.jobId);" \
| sf apex run -o my-org
Note
Run the `heroku logs --tail` command to monitor the logs of the `web` and `worker` processes as you did when running locally.
Navigate to the Quotes tab in your org, or open one of the sample Opportunities, to review the generated Quotes. You can re-run this operation as many times as you like; it will simply keep adding Quotes to the sample Opportunities created.
If you are running the application locally, run the following command to execute a batch job that deletes the sample Opportunity and Quote records.
# Target: POST /api/data/delete
./bin/invoke.sh my-org http://localhost:5000/api/data/delete '{}'
If you have deployed the application, run the following:
echo \
"ExternalService.GenerateQuoteJob service = new ExternalService.GenerateQuoteJob();" \
"System.debug('Quote Id: ' + service.datadelete().Code202.jobId);" \
| sf apex run -o my-org
Observe the log output from the `heroku local` or `heroku logs --tail` commands and you will see output similar to the following:
web.1 | Job published to Redis channel jobsChannel...
worker.1 | Worker received job with ID: 55610381-55f8-4a05-8550-158f6410663b for data operation: delete
worker.1 | Starting data deletion via Bulk API v2 for Job ID: 55610381-55f8-4a05-8550-158f6410663b
worker.1 | Found 10 Opportunities to delete for Job ID: 55610381-55f8-4a05-8550-158f6410663b
worker.1 | Preparing Bulk API v2 Opportunity deletion job for Job ID: 55610381-55f8-4a05-8550-158f6410663b
worker.1 | Submitted Bulk API v2 Deletion job with ID: 750am00000Q3uV1AAJ...
worker.1 | Polling Bulk API v2 job status for Job ID: 750am00000Q3uV1AAJ...
worker.1 | Bulk API v2 Job 750am00000Q3uV1AAJ status: UploadComplete
worker.1 | Bulk API v2 Job 750am00000Q3uV1AAJ status: InProgress
worker.1 | Bulk API v2 Job 750am00000Q3uV1AAJ status: JobComplete
worker.1 | Bulk API v2 Job 750am00000Q3uV1AAJ processing complete.
55610381-55f8-4a05-8550-158f6410663b
- The Heroku Redis add-on is used to manage job queuing via a single Redis Stream named `jobsChannel`. While there are logically two types of jobs (quote generation and sample data management), both are sent to this same stream. The worker process reads from this stream and dispatches jobs to the appropriate service based on the job type included in the message payload. The `mini` tier of this add-on is suitable for this sample. Redis connection details are managed via environment variables (typically set in `.env` locally or via Heroku Config Vars).
- Node.js and the `Procfile` define the `web` and `worker` process types. The `web` process runs the Fastify server (`server/index.js`), handling API requests and publishing jobs to the Redis stream, while the `worker` process (`server/worker.js`) listens to the stream and executes jobs using the appropriate service (`server/services/quote.js` or `server/services/data.js`).
- The quote generation logic in `server/services/quote.js` uses the AppLink SDK's Data API and Unit of Work pattern (`org.dataApi.newUnitOfWork`, `commitUnitOfWork`) to insert Quote and QuoteLineItem records together within a single transaction, ensuring atomicity (see the sketch after this list).
- The `invoke.sh` script relies on the `x-client-context` header being correctly passed for authentication when running locally. The main `Procfile` is used for deployment, which incorporates the Heroku Integration service mesh.
- The main worker process (`server/worker.js`) receives job messages via Redis, extracts and initializes the Salesforce context, and then delegates the core processing logic (using that context) to handlers defined in `server/services/quote.js` and `server/services/data.js`.
- The Heroku Connect add-on can be used as an alternative way to read and/or write org data via Heroku Postgres. This is an option to consider if your use case does not fit within the Salesforce API limitations. In this case, note that there will be some lag between data changes and updates in the Salesforce org due to the nature of the synchronization pattern used by Heroku Connect. If this is acceptable, this option will further increase performance. Of course, a hybrid approach that uses the Salesforce API for certain data access needs and Heroku Connect for others is also possible.
- This sample uses the Salesforce API Query More pattern implicitly via the AppLink SDK's `org.dataApi.query` method when retrieving large datasets.
- To create sample data (`handleDataMessage` in `server/services/data.js`), the AppLink SDK's Bulk API v2 support (`org.bulkApi`) is used for efficient handling of potentially large volumes.
- An informal execution time comparison: the pricing calculation logic is intentionally simple so that the technical aspects of using the Heroku Integration in this context remain clear. As the compute requirements fit within Apex limits, it was possible to create an Apex version of the job logic (originally included in the Java version's `/src-org` folder). While not a formal benchmark, execution over 5000 opportunities took ~24 seconds using the Heroku job approach vs ~150 seconds with Batch Apex, roughly six times faster. During testing (of the Java version) it was observed that this was largely due to the longer dequeue times with Batch Apex vs being near instant with a Heroku worker.
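As a rough sketch of the Unit of Work usage described above (assumed field names and a simplified input shape; the sample's actual logic lives in `server/services/quote.js`), inserting a Quote and its QuoteLineItems in a single transaction might look like this:

```javascript
// Minimal sketch of the Unit of Work pattern, assuming an initialized AppLink SDK
// `org` context. Field names and the input shape are simplified for illustration.
async function createQuoteForOpportunity(org, opportunity, lineItems) {
  const uow = org.dataApi.newUnitOfWork();

  // Register the Quote first so its reference ID can be used by the line items
  const quoteRef = uow.registerCreate({
    type: 'Quote',
    fields: {
      Name: `Quote for ${opportunity.fields.Name}`,
      OpportunityId: opportunity.id,
      Pricebook2Id: opportunity.fields.Pricebook2Id // assumed to be present on the Opportunity
    }
  });

  // Register each calculated line item against the (not yet committed) Quote
  for (const item of lineItems) {
    uow.registerCreate({
      type: 'QuoteLineItem',
      fields: {
        QuoteId: quoteRef, // reference resolved when the unit of work commits
        PricebookEntryId: item.pricebookEntryId,
        Quantity: item.quantity,
        UnitPrice: item.calculatedUnitPrice
      }
    });
  }

  // All registered records are sent to the Composite API as a single transaction
  const results = await org.dataApi.commitUnitOfWork(uow);
  return results.get(quoteRef).id; // ID of the newly created Quote
}
```

Because everything is committed in one Composite API request, either the Quote and all of its line items are created or none of them are.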
Sample | What it covers? |
---|---|
Salesforce API Access | This sample application showcases how to extend a Heroku web application by integrating it with Salesforce APIs, enabling seamless data exchange and automation across multiple connected Salesforce orgs. It also includes a demonstration of the Salesforce Bulk API, which is optimized for handling large data volumes efficiently. |
Extending Apex, Flow and Agentforce | This sample demonstrates importing a Heroku application into an org to enable Apex, Flow, and Agentforce to call out to Heroku. For Apex, both synchronous and asynchronous invocation are demonstrated, along with securely elevating Salesforce permissions for processing that requires additional object or field access. |
Scaling Batch Jobs with Heroku | This sample seamlessly delegates the processing of large amounts of data with significant compute requirements to Heroku Worker processes. It also demonstrates the use of the Unit of Work aspect of the SDK (JavaScript only for the pilot) for easier utilization of the Salesforce Composite APIs. |
Using Eventing to drive Automation and Communication | This sample extends the batch job sample by adding the ability to use eventing to start the work and notify users once it completes using Custom Notifications. These notifications are sent to the user's desktop or mobile device running Salesforce Mobile. Flow is used in this sample to demonstrate how processing can be handed off to low-code tools such as Flow. |