May the force be with you

Project Description


For this project the client engaged App Sparq to improve their daily email campaigns by integrating directly with Salesforce Marketing Cloud (SFMC). Our goal was to export customer data from the client's database to SFMC, and then import any updated customer data from SFMC back into the client's database. It sounds simple enough, but a lot of hard work, planning, and testing went into this project!

The Challenge

The core challenge of this project was to ensure that the client's database and the SFMC database could exchange data without getting out of sync. This was certainly not an easy task! The client's database held a large amount of customer data, and we had to determine what was needed and what wasn't. The next step was to figure out the best way to transfer that data to SFMC, and then to identify which data was essential to bring back. We also had to come up with a solution that could repeat this whole process on a daily basis. On top of all this, we inherited a legacy solution that the client had already implemented with the help of another vendor using SSIS (SQL Server Integration Services) packages, and we were asked to re-use the existing solution.

The Solution

We realized early on that the legacy solution built on SSIS (SQL Server Integration Services) packages was not viable long term. Because of that, we decided to split this project into two phases. The first phase would stand up the ETL process between the client and SFMC so that the daily email campaigns could get up and running as quickly as possible. The second phase would replace the legacy solution with a more modern one that would be easier to maintain and evolve in order to support future campaigns.

Phase 1:

In this phase we first analyzed the client's data to understand what the daily campaigns required. Then we stripped the existing SSIS packages down so that only the required data was being exported and imported. We also updated the packages to upload and download files directly from the SFMC server rather than going through a middle layer, which meant understanding the file format that SFMC accepts and produces; once that was clear, we updated the SSIS packages to export and import files in that format. We also needed a way to keep the two databases in sync on a daily basis. Originally only the delta was being exported and imported, but this had allowed bad data to accumulate on the client side. Instead, we decided to export a rolling two-year customer data file that would overwrite the SFMC data every day. This way the SFMC job simply started fresh each day and did not have to worry about keeping the delta in sync.
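As an illustration of the rolling-window approach, here is a Python sketch (with hypothetical field names; the actual export was implemented in SSIS against SQL Server) of how the daily full snapshot replaces delta bookkeeping:

```python
from datetime import date, timedelta

def build_rolling_export(customers, today, window_days=730):
    """Select customers active within a rolling two-year window.

    The full snapshot overwrites the SFMC data each day, so no
    delta tracking (and no accumulated bad data) is needed.
    """
    cutoff = today - timedelta(days=window_days)
    return [c for c in customers if c["last_activity"] >= cutoff]

# Hypothetical sample rows:
customers = [
    {"email": "a@example.com", "last_activity": date(2024, 1, 5)},
    {"email": "b@example.com", "last_activity": date(2019, 6, 1)},
]
snapshot = build_rolling_export(customers, today=date(2024, 3, 1))
```

Because the snapshot is regenerated in full every day, a record that ages out of the window simply drops off the next export, with no extra cleanup logic.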

We also had to implement error detection and correction. This meant enforcing proper rules when exporting from the client database and importing into the SFMC database, and likewise in the other direction. We then had to test each of the daily emails being sent. This testing took about 8 weeks, since we had to cover a huge array of cases and combinations, and it uncovered many points in the logic that had to be corrected or re-thought. Once all the use cases had passed and we were sure our logic was sound on both sides, we deployed the solution.
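A minimal sketch of the kind of row-level rule checks involved (Python, with hypothetical field names; the real rules lived in the SSIS packages):

```python
import re

# Deliberately simple pattern for illustration only.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_row(row, required=("customer_id", "email")):
    """Return a list of validation errors for one export row (empty = valid)."""
    errors = []
    for field in required:
        if not row.get(field):
            errors.append(f"missing {field}")
    email = row.get("email", "")
    if email and not EMAIL_RE.match(email):
        errors.append("malformed email")
    return errors
```

Rows that fail validation can then be logged and excluded from the export rather than silently propagating bad data to the other side.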

Phase 2:

This phase was about making sure that the ETL process between the client and SFMC would stand the test of time. The client had a vision for future campaigns with more sophisticated logic and emails, and the existing SSIS packages could not support that complexity; they were also very difficult to test, maintain, and deploy. We therefore proposed replacing the packages with a custom service written in .NET Core. We designed the service as a set of sub-services, each responsible for a specific task. This gave us separation of concerns as well as the flexibility to run multiple sub-services in parallel. We chose .NET Core for another reason too: it let us deploy the solution to any platform and in any form, meaning we could deploy it as a Windows service now and as a Web service later, if and when the client decided to move their apps to the cloud.
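A rough sketch of the sub-service idea (in Python for brevity; the actual service was written in .NET Core), showing independent, named tasks run in parallel:

```python
from concurrent.futures import ThreadPoolExecutor

class SubService:
    """One self-contained pipeline task (e.g. export or import).

    Hypothetical structure for illustration; each sub-service owns
    exactly one responsibility, which keeps concerns separated.
    """
    def __init__(self, name, task):
        self.name = name
        self.task = task

    def run(self):
        return self.name, self.task()

def run_parallel(sub_services):
    """Run independent sub-services concurrently and collect their results."""
    with ThreadPoolExecutor() as pool:
        return dict(pool.map(lambda s: s.run(), sub_services))

# Hypothetical sub-services standing in for the real export/import tasks:
results = run_parallel([
    SubService("export-customers", lambda: "exported"),
    SubService("import-responses", lambda: "imported"),
])
```

Because each sub-service is isolated behind the same small interface, new ones can be added (or run in parallel) without touching the others.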

Building a custom solution also let us incorporate more complex rules around the daily email campaigns. We designed these rules to be configurable so that the client can easily change any of the parameters in the future. Our architecture also supports adding more sub-services to export other types of data, should the client wish to do so later. We also built recovery into the design, so the service can re-attempt if it fails for any reason. The new solution was deployed recently, and we have already seen a boost in the number of daily email campaigns.
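The re-attempt behavior can be sketched as a simple retry wrapper (a Python illustration with assumed parameters, not the production .NET Core code):

```python
import time

def run_with_retry(task, attempts=3, delay_seconds=0.01):
    """Re-attempt a failing task, waiting between tries, before giving up."""
    for attempt in range(1, attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == attempts:
                raise  # exhausted all attempts; surface the failure
            time.sleep(delay_seconds * attempt)  # simple linear backoff

# Hypothetical flaky task: fails twice (e.g. a transient connection
# error reaching SFMC), then succeeds on the third attempt.
calls = {"n": 0}

def flaky_export():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "exported"

result = run_with_retry(flaky_export)
```

A transient failure is retried transparently; only a failure that persists through every attempt is surfaced to the operator.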

Effort & Tech:

Phase 1 of this project took approximately 4 months to complete; Phase 2 was much quicker, at about 2 months. The technologies used during this project included:

  • SSIS (SQL Server Integration Services)
  • .NET Core
  • SQL Server
  • NUnit

Project Details

  • Duration: 6 months
  • ETL: SSIS (SQL Server Integration Services)
  • Back-End: .NET Core
  • Database: MS SQL Server
  • Testing: NUnit

Let's Get in Touch


Send us a quick note to get the conversation started!