- Synchronous vs. asynchronous decision: whether the response of the API callout needs to be processed in the same transaction in real time, or can be handled in near real time.
- Have an error-logging framework to capture errors, along with a retry mechanism.
- Request and Reply
- Point-and-click solution using Lightning Flow; blend clicks and code.
- External Services maps an external API into Salesforce so its operations can be invoked as if they were native to the platform.
- Needs an API schema file in Interagent format, which acts as the contract from which the Apex actions are auto generated.
- Create a Named Credential: set the URL, Identity Type = "Named Principal", Authentication Protocol = "Password Authentication", and enter the username and password.
- Quick Find ==> Integrations ==> External Services
- Point to the schema file here.
- Create a new flow. Under the palette's Apex section you will see the actions auto generated from the External Service configured above. Drag and drop one into the flow, then assign the API's input and output parameters to flow variables.
- A user can initiate a SOAP/HTTP callout on click of a quick action by backing it with a flow. From the flow, use Action ==> Apex Action to call an @InvocableMethod Apex method (see the sketch after this list).
- Batch Apex invoking external services: the execute method in Batch Apex gets refreshed governor limits for every batch; however, there are still governor limits on the total number of callouts per transaction (see the Batch Apex sketch below).
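A minimal sketch of an @InvocableMethod action that a flow can call to make a REST callout. The Named Credential name (My_Named_Credential), the /orders path, and the response handling are assumptions for illustration, not the actual API:

```apex
public with sharing class OrderStatusAction {

    // Flows always pass inputs as a list, even when invoked for a single record.
    @InvocableMethod(label='Get Order Status' description='Calls an external order API')
    public static List<String> getOrderStatus(List<String> orderIds) {
        List<String> results = new List<String>();
        for (String orderId : orderIds) {
            HttpRequest req = new HttpRequest();
            // 'My_Named_Credential' is a placeholder for the Named Credential created earlier.
            req.setEndpoint('callout:My_Named_Credential/orders/' + orderId);
            req.setMethod('GET');
            HttpResponse res = new Http().send(req);
            results.add(res.getStatusCode() == 200 ? res.getBody() : 'ERROR: ' + res.getStatus());
        }
        return results;
    }
}
```

In the flow this shows up under Action ==> Apex Action; map a flow variable to the input list and another to the returned list.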
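And a minimal Batch Apex sketch for the same kind of callout; the class must implement Database.AllowsCallouts. The Sync_Pending__c field, endpoint, and payload are assumptions:

```apex
public with sharing class AccountSyncBatch implements Database.Batchable<SObject>, Database.AllowsCallouts {

    public Database.QueryLocator start(Database.BatchableContext bc) {
        // Sync_Pending__c is a hypothetical flag field used to select records to push.
        return Database.getQueryLocator('SELECT Id, Name FROM Account WHERE Sync_Pending__c = true');
    }

    // Each execute() gets fresh governor limits, but is still capped at 100 callouts,
    // so keep the batch size at or below that limit.
    public void execute(Database.BatchableContext bc, List<SObject> scope) {
        for (Account acc : (List<Account>) scope) {
            HttpRequest req = new HttpRequest();
            req.setEndpoint('callout:My_Named_Credential/accounts'); // placeholder Named Credential
            req.setMethod('POST');
            req.setHeader('Content-Type', 'application/json');
            req.setBody(JSON.serialize(new Map<String, Object>{ 'id' => acc.Id, 'name' => acc.Name }));
            new Http().send(req);
        }
    }

    public void finish(Database.BatchableContext bc) {}
}
// Usage: Database.executeBatch(new AccountSyncBatch(), 50);
```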
- Fire and Forget
- Salesforce platform events can be published using point and click or code, and the target system can listen on a channel. A platform event can be published from Process Builder, Flow, or Apex.
- To define a platform event: Quick Find ==> Platform Events. Create a new platform event and add the required fields.
- You can publish a platform event using Apex code (see the publish sketch below).
- To publish a platform event from Process Builder or Flow, use the action type "Create a Record".
- Platform events can also be published using Salesforce APIs.
- To subscribe to the platform event in Apex, write an after insert trigger on the event object (see the trigger sketch below).
- You can also subscribe to platform event notifications using a Lightning component. Help guide: https://developer.salesforce.com/docs/atlas.en-us.platform_events.meta/platform_events/platform_events_subscribe_lc.htm
- You can also subscribe to platform events from Process Builder or Flow.
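A minimal sketch of publishing from Apex with EventBus.publish; Order_Shipped__e and its field are hypothetical names used for illustration:

```apex
// Order_Shipped__e is a hypothetical platform event with one custom text field.
Order_Shipped__e evt = new Order_Shipped__e(Order_Number__c = '12345');
Database.SaveResult sr = EventBus.publish(evt);
if (!sr.isSuccess()) {
    for (Database.Error err : sr.getErrors()) {
        System.debug(LoggingLevel.ERROR, 'Publish failed: ' + err.getMessage());
    }
}
```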
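And a minimal Apex subscriber for the same hypothetical event; platform event triggers support only after insert, and the follow-up task is just an illustrative reaction:

```apex
trigger OrderShippedTrigger on Order_Shipped__e (after insert) {
    List<Task> followUps = new List<Task>();
    for (Order_Shipped__e evt : Trigger.new) {
        // React to each event message, e.g. by creating a follow-up task.
        followUps.add(new Task(
            Subject = 'Order shipped: ' + evt.Order_Number__c,
            OwnerId = evt.CreatedById
        ));
    }
    insert followUps;
}
```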
- Workflow-based outbound messages: Salesforce will retry in case of failure. Limitation: an outbound message can send data from only one object.
- Apex-based callouts (SOAP or REST): use when multiple related objects need to be synced (see the fire-and-forget callout sketch below).
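A minimal fire-and-forget sketch of such a callout using a future method; the Named Credential, path, and error handling are assumptions:

```apex
public with sharing class OrderSyncService {

    // Runs asynchronously after the calling transaction commits; the caller
    // neither waits for nor receives the response.
    @future(callout=true)
    public static void pushOrder(Id orderId, String payload) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:My_Named_Credential/orders'); // placeholder Named Credential
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        req.setBody(payload);
        HttpResponse res = new Http().send(req);
        if (res.getStatusCode() >= 400) {
            // In a real implementation, write to an error-log object and retry (see the note above).
            System.debug(LoggingLevel.ERROR, 'Callout failed for ' + orderId + ': ' + res.getStatus());
        }
    }
}
```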
- For Large Data Volume Batch Integration
- Change Data Capture (CDC): Salesforce publishes change events that represent changes to records (insert/update/delete/undelete). Events are kept on the event bus for 72 hours.
- For example, when an account is created in Salesforce, a change event is published on the bus, and subscribers such as AWS or SAP can create the account in their own systems.
- CDC helps with data replication, back-office systems reacting to data changes in Salesforce, and archiving changes for an extended period of time.
- Quick Find ==> Change Data Capture, then select the objects for which change capture is required. Subscribe to the channel, e.g. "/data/CaseChangeEvent" for Case change events (see the CDC trigger sketch below).
- CDC events are only published on transaction commit.
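A minimal sketch of an Apex subscriber for Case change events; once Change Data Capture is enabled for Case, a CaseChangeEvent object is available to write a trigger on. What is done with the header is illustrative:

```apex
trigger CaseChangeEventTrigger on CaseChangeEvent (after insert) {
    for (CaseChangeEvent event : Trigger.new) {
        EventBus.ChangeEventHeader header = event.ChangeEventHeader;
        // changeType is CREATE, UPDATE, DELETE, or UNDELETE;
        // recordIds lists the Case records affected by this change.
        System.debug(header.changeType + ' on ' + String.join(header.recordIds, ', '));
        // A real subscriber would enqueue replication or archival work here.
    }
}
```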
- Using an ETL tool: extract, transform, and load using the Bulk or SOAP API.
- Salesforce invokes the external system using SOAP/REST, and the external system invokes Salesforce APIs.
- The Streaming API is built mostly for real-time UI updates, while CDC handles higher volumes for back-end synchronization cases.
- Four Streaming APIs
- Architectural diagram on what is supported
- Remote Call In
- A valid login and session ID are required to perform an API call.
- For bulk data, use the Bulk API.
- Testing using Workbench (an Apex sketch of the equivalent REST calls follows these steps).
- The following would be the response on clicking "Execute".
- Add data to the job: select PUT with the URI /services/data/v48.0/jobs/ingest/jobID/batches.
- Workbench would look the following way; the Content-Type header should be text/csv.
- Click on Execute.
- To let Salesforce know that it's time to process the data, change the job state from Open to UploadComplete: use the URI /services/data/v48.0/jobs/ingest/7507F00000M4tatQAB with the PATCH method, set the Content-Type header to application/json, and send the following request body:
- { "state" : "UploadComplete" }
- Workbench would look the following way
- At this point, Salesforce starts processing the job.
- You can check the status of the job via Quick Find ==> Bulk Data Load Jobs.
- You can also check via the API: GET /services/data/v48.0/jobs/ingest/7507F00000M4tatQAB
- Get the job results the following way, using the GET method:
- URI: /services/data/v48.0/jobs/ingest/7507F00000M4tPbQAJ/successfulResults
- To get the failed results, use the URI /services/data/v48.0/jobs/ingest/7507F00000M4tPbQAJ/failedResults
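For reference, a minimal sketch of the job-creation and status-check calls issued as anonymous Apex against the org's own REST endpoint; in practice an external client or ETL tool would make these calls. It assumes the org's My Domain URL has been added as a Remote Site Setting, and the job Id is just the example value from the steps above:

```apex
String base = URL.getOrgDomainUrl().toExternalForm() + '/services/data/v48.0/jobs/ingest';
String auth = 'Bearer ' + UserInfo.getSessionId();

// Create an ingest job (the step whose "Execute" response is shown above).
HttpRequest createJob = new HttpRequest();
createJob.setEndpoint(base);
createJob.setMethod('POST');
createJob.setHeader('Authorization', auth);
createJob.setHeader('Content-Type', 'application/json');
createJob.setBody('{ "object" : "Account", "operation" : "insert", "contentType" : "CSV", "lineEnding" : "LF" }');
System.debug(new Http().send(createJob).getBody()); // response contains the job Id and state "Open"

// Check the status of an existing job (state becomes JobComplete or Failed when processing ends).
HttpRequest status = new HttpRequest();
status.setEndpoint(base + '/7507F00000M4tatQAB'); // example job Id from above
status.setMethod('GET');
status.setHeader('Authorization', auth);
System.debug(new Http().send(status).getBody());
```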
- Platform events can also be published from outside Salesforce by calling the REST/SOAP/Bulk APIs.
- The Enterprise WSDL is tightly coupled to a specific org's schema; the Partner WSDL is more generic and can be used across orgs.
- Data Virtualization
- View/Modify data stored outside SF.
- Use Salesforce Connect to pull data from another system in real time using point and click.
- Create an Auth. Provider in the target org.
- Copy the Callback URL from the Auth. Provider and paste it into the connected app created in the source org.
- Create a connected app in the source org.
- Create an external data source in the target org. Select the type as the Salesforce Connect cross-org adapter, Identity Type = "Named Principal", Scope = "full refresh_token". Click "Validate and Sync".
- Select the objects from the source org to be pulled and click Sync.
- Go to the external object in the target org and create a new tab for it (see the SOQL sketch at the end of this section).
- This gives bidirectional access to the data across both orgs (the data itself stays in the source org).
- Diagrammatic format:
- You cannot have triggers on external objects.
- You can report on external objects.
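A minimal sketch of reading such an external object from Apex; Order__x and its Status__c field are assumptions (external object API names end in __x, and ExternalId/DisplayUrl are their standard fields):

```apex
// Order__x is a hypothetical external object created by the cross-org sync above.
List<Order__x> orders = [
    SELECT ExternalId, DisplayUrl, Status__c
    FROM Order__x
    LIMIT 10
];
for (Order__x o : orders) {
    // Each query fetches rows from the source org at run time; nothing is copied locally.
    System.debug(o.ExternalId + ' -> ' + o.Status__c);
}
```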