6-Step Guide to Integrating Salesforce with Amazon DynamoDB Bi-directionally Using Amazon AppFlow
The architecture described in this blog post is intended to improve the efficiency of updating configuration data in a DynamoDB table by allowing authorized business users to make updates directly from Salesforce, rather than going through a manual, ticket-based support process.
To integrate Salesforce with Amazon DynamoDB, the architecture uses two Amazon Web Services (AWS) products: Amazon AppFlow and Amazon EventBridge. Together, these services connect Salesforce Lightning with DynamoDB bi-directionally, allowing updates to flow in either direction.
Amazon AppFlow is a fully managed integration service that enables data transfer between SaaS applications, such as Salesforce, and cloud storage or databases, such as DynamoDB. It can be used to easily set up data flows between these systems, allowing data to be transferred and transformed in near real time.
Amazon EventBridge is a fully managed event bus that makes it easy to connect applications using data from your own applications, SaaS applications, and AWS services. It can be used to send data from one application to another, or to trigger an action in one application in response to an event in another.
By using these two services together, you can build an event-driven, serverless microservice that allows authorized business users to update configuration data in DynamoDB directly from the Salesforce screen, without requiring access to the AWS Command Line Interface or AWS Management Console. This can help streamline the process of updating configuration data and improve the efficiency of the contact center application.
Here is a high-level overview of the steps you might follow to integrate Salesforce Lightning with DynamoDB using Amazon AppFlow and Amazon EventBridge:
Step 1: Set up AWS and DynamoDB
Set up your AWS account and create a DynamoDB table to store your configuration data.
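For example, a minimal table for configuration items could be created with the AWS CLI. The table name (ConfigData) and key attribute (ConfigId) below are placeholders; adjust them to match your data model:

# Create a simple key-value configuration table (names are illustrative)
aws dynamodb create-table \
    --table-name ConfigData \
    --attribute-definitions AttributeName=ConfigId,AttributeType=S \
    --key-schema AttributeName=ConfigId,KeyType=HASH \
    --billing-mode PAY_PER_REQUEST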
Step 2: Set up your Salesforce account and Salesforce Lightning application
Set up your Salesforce account and create a Salesforce Lightning application to manage your configuration data.
Step 3: Set up Amazon AppFlow
Set up Amazon AppFlow by creating a flow that transfers data between Salesforce and DynamoDB. You will need to specify the source and destination for the data, as well as any transformations or mapping that need to be applied to the data.
To set up Amazon AppFlow and create a flow that transfers data between Salesforce and DynamoDB, you will need to follow these steps:
With AWS Management Console:
Go to the Amazon AppFlow homepage in the AWS Management Console and sign in to your account.
In the AppFlow dashboard, click the “Create flow” button.
On the “Select source and destination” page, choose Salesforce as the source and DynamoDB as the destination.
On the “Configure source” page, select the Salesforce object that you want to use as the source for the data transfer. You can also choose to filter the data by specifying a query or a specific record.
On the “Configure destination” page, select the DynamoDB table that you want to use as the destination for the data transfer.
On the “Map fields” page, you can specify any transformations or mapping that need to be applied to the data as it is transferred from Salesforce to DynamoDB. You can use the mapping editor to drag and drop fields from the source to the destination, or you can use the expression editor to specify more complex transformations.
On the “Schedule” page, you can specify the frequency at which the flow should run, as well as any advanced options such as retry behavior or error handling.
Review the flow settings and click “Create flow” to complete the setup process.
Once the flow is set up, Amazon AppFlow will automatically transfer the data between Salesforce and DynamoDB according to the schedule you specified. You can monitor the status of the flow and view any errors or issues in the AppFlow dashboard.
With AWS CLI:
Create the flow using the create-flow command. If you need to apply transformations or mapping to the data being transferred, you specify them in the flow's task list (the --tasks parameter) rather than with a separate command.
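A minimal sketch follows. The flow name is a placeholder, and the source, destination, and task (mapping) definitions are supplied as local JSON files whose exact contents depend on your Salesforce object and destination connector configuration:

# Create the flow; mappings and transformations go in the tasks file (all names illustrative)
aws appflow create-flow \
    --flow-name salesforce-to-dynamodb \
    --trigger-config '{"triggerType": "OnDemand"}' \
    --source-flow-config file://source-config.json \
    --destination-flow-config-list file://destination-config.json \
    --tasks file://tasks.json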
This will set up the flow to transfer data from Salesforce to DynamoDB, applying any specified transformations or mapping along the way. You can monitor the progress of the flow using the describe-flow-execution-records command.
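Because the sketch above uses an OnDemand trigger, the flow runs only when you start it explicitly:

aws appflow start-flow --flow-name salesforce-to-dynamodb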
Step 4: Set up Amazon EventBridge
Set up Amazon EventBridge by creating an event bus and defining the events that should trigger data transfer between Salesforce and DynamoDB. You can also specify any rules that should be applied to filter or transform the data before it is transferred.
With AWS Management Console:
To set up Amazon EventBridge using the AWS Management Console, you will need to perform the following steps:
Sign in to the AWS Management Console and navigate to the Amazon EventBridge page.
Click the “Create event bus” button.
Enter a name for the event bus and click the “Create event bus” button.
Click the “Create rule” button.
Enter a name for the rule and select the “Event pattern” option.
In the “Event pattern preview” field, enter a JSON object that defines the events that should trigger data transfer between Salesforce and DynamoDB. For example:
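A sketch of such a pattern is shown below. It matches the AppFlow Export Succeeded event referenced later in this post; the exact source and detail-type values depend on how AppFlow events arrive on your bus:

{
  "source": ["aws.appflow"],
  "detail-type": ["AppFlow Export Succeeded"]
}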
(Optional) If you want to apply additional filtering or transformation to the data, you can specify a target for the rule by clicking the “Add target” button and selecting a target type (e.g., AWS Lambda, Amazon SNS).
This will set up Amazon EventBridge to trigger the data transfer whenever the specified AppFlow event arrives on your event bus, applying the rule's filtering and invoking any targets you added. You can monitor progress by inspecting the events received on the bus and the results of the triggered targets.
With AWS CLI:
Create a new event bus using the create-event-bus command, then attach a rule to it with the put-rule command:
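For example, using the MyEventBus and MyRule names referenced below (the event pattern mirrors the one from the console steps and is a sketch, not a definitive schema):

aws events create-event-bus --name MyEventBus

# Attach a rule to the new bus that matches the AppFlow event
aws events put-rule \
    --name MyRule \
    --event-bus-name MyEventBus \
    --event-pattern '{"source": ["aws.appflow"], "detail-type": ["AppFlow Export Succeeded"]}'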
(Optional) If you want to apply additional filtering or transformation to the data, you can specify a target for the rule using the put-targets command:
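A sketch pointing the rule at a Lambda function; the function name and ARN are placeholders for whatever target processes the event:

aws events put-targets \
    --rule MyRule \
    --event-bus-name MyEventBus \
    --targets "Id"="1","Arn"="arn:aws:lambda:us-east-1:123456789012:function:ProcessAppFlowEvent"

Note that a Lambda target also needs a resource-based permission (added with lambda add-permission) allowing events.amazonaws.com to invoke it.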
This will set up Amazon EventBridge to trigger data transfer between Salesforce and DynamoDB whenever an AppFlow Export Succeeded event is received on the MyEventBus event bus. The data will be filtered or transformed according to the rules defined in the MyRule rule, and any additional targets specified for the rule. You can monitor the progress of the data transfer by inspecting the events received on the event bus and the results of any targets that were triggered.
Step 5: Test the integration
Test the integration by making updates to the configuration data in either Salesforce or DynamoDB and verifying that the changes are reflected in the other system.
With AWS Management Console:
To test the integration between Salesforce and DynamoDB using the AWS Management Console, you will need to perform the following steps:
Sign in to the AWS Management Console and navigate to the Amazon AppFlow page.
Locate the flow that you set up to transfer data between Salesforce and DynamoDB and verify that it is active.
In Salesforce, make updates to the configuration data that you want to transfer to DynamoDB.
In the AWS Management Console, navigate to the Amazon DynamoDB page and verify that the updates to the configuration data have been reflected in the DynamoDB table.
Alternatively, you can make updates to the configuration data in DynamoDB and verify that the changes are reflected in Salesforce.
This will confirm that the integration between Salesforce and DynamoDB is working as expected and that data is being transferred correctly between the two systems.
With AWS CLI:
To test the integration between Salesforce and DynamoDB using the AWS CLI, you will need to perform the following steps:
In Salesforce, make updates to the configuration data that you want to transfer to DynamoDB.
Use the describe-flow-execution-records command to check the status of the flow and verify that data is being transferred from Salesforce to DynamoDB:
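For example, using the placeholder flow name from Step 3; each record in the response includes an executionStatus field indicating whether the run succeeded:

aws appflow describe-flow-execution-records --flow-name salesforce-to-dynamodb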
Alternatively, you can make updates to the configuration data directly in DynamoDB and verify that the changes are reflected in Salesforce.
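For the DynamoDB side of the test, you can write an item directly from the CLI. The table name, key, and attribute values below reuse the placeholders from Step 1:

# Update a configuration item in place (values are illustrative)
aws dynamodb put-item \
    --table-name ConfigData \
    --item '{"ConfigId": {"S": "greeting-message"}, "Value": {"S": "Hello from DynamoDB"}}'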
As with the console-based test, this confirms that the integration is working as expected and that data is transferred correctly between the two systems.
Step 6: Deploy the integration
Deploy the integration to production and make it available to authorized business users.
With AWS Management Console:
To deploy the integration between Salesforce and DynamoDB to production using the AWS Management Console, you will need to perform the following steps:
Sign in to the AWS Management Console and navigate to the Amazon AppFlow page.
Locate the flow that you set up to transfer data between Salesforce and DynamoDB and verify that it is active.
(Optional) If you have not already done so, set up Amazon EventBridge to trigger data transfer between Salesforce and DynamoDB based on specific events.
(Optional) If you want to enable authorized business users to access the integration, you can do so by adding them as users in the AWS Management Console or by granting them access to the relevant resources through IAM policies.
This will make the integration between Salesforce and DynamoDB available to authorized business users in production. You can monitor the progress of the data transfer and the status of the integration using the Amazon AppFlow and Amazon EventBridge pages in the AWS Management Console.
With AWS CLI:
To deploy the integration between Salesforce and DynamoDB to production using the AWS CLI, you will need to perform the following steps:
Verify that the flow that you set up to transfer data between Salesforce and DynamoDB is active using the describe-flow command:
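For example, with the placeholder flow name from Step 3; the flowStatus field in the response should read Active:

aws appflow describe-flow --flow-name salesforce-to-dynamodb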
This will make the integration between Salesforce and DynamoDB available to authorized business users in production. You can monitor the progress of the data transfer with the describe-flow-execution-records command and confirm the event bus is in place with the list-event-buses command.
Conclusion
This is just a high-level overview of the process, and there may be additional steps or considerations depending on your specific needs and requirements. It is also worth noting that Amazon AppFlow and Amazon EventBridge are just two options for integrating Salesforce and DynamoDB; there are other approaches and tools that you could use as well.