Data Migration

Web Automation Service

Project Description

One of our clients is a financial services company that offers affordable payment plans to patients seeking quality care from a preferred provider. Over the course of its history, it has helped more than 50,000 patients receive medical treatment by offering them loans to cover their medical fees.

Problem Statement
As a financial institution, this client depends heavily on its technical systems to keep records of all its clients and loans, and of the institutions that provide the medical services. Keeping those systems secure and up to date is therefore a necessity.

With all this in mind, our client decided to migrate its data from its legacy system to Salesforce, a CRM platform that makes managing clients easier. The main problem was that all the information lived in the legacy system, and we had neither database access nor any other way to consume it.

Since there was no straightforward way to get the data, we resorted to Web Scraping (also known as Web Harvesting or Web Data Extraction): extracting data from websites either through a web browser or directly over the Hypertext Transfer Protocol (HTTP).

Scraping can be done manually by a user, but it performs far better when implemented as an automated process using a bot or web crawler. The data is gathered and copied from the web, typically into a local database or spreadsheet, for later retrieval or analysis.
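As a simplified sketch of the direct-HTTP path mentioned above, the extraction step boils down to fetching a page and pulling structured records out of its markup. In the actual project a headless browser (Selenium WebDriver driving PhantomJS) fetched the rendered pages; the HTML snippet and field names below are invented for illustration:

```javascript
// Minimal sketch: extract loan records from a legacy HTML table.
// In the real pipeline a headless browser fetched the rendered pages;
// here the HTML is a hard-coded, hypothetical sample.
const sampleHtml = `
  <table id="loans">
    <tr><td>Jane Doe</td><td>5000</td></tr>
    <tr><td>John Roe</td><td>7500</td></tr>
  </table>`;

// Pull each <tr> and its <td> cells with simple regexes — enough for a
// fixed legacy layout, not a substitute for a general HTML parser.
function extractRows(html) {
  const rows = [];
  const rowRe = /<tr>(.*?)<\/tr>/gs;
  const cellRe = /<td>(.*?)<\/td>/g;
  let rowMatch;
  while ((rowMatch = rowRe.exec(html)) !== null) {
    const cells = [];
    let cellMatch;
    while ((cellMatch = cellRe.exec(rowMatch[1])) !== null) {
      cells.push(cellMatch[1]);
    }
    rows.push({ patient: cells[0], amount: Number(cells[1]) });
  }
  return rows;
}

console.log(extractRows(sampleHtml));
// → [ { patient: 'Jane Doe', amount: 5000 },
//     { patient: 'John Roe', amount: 7500 } ]
```

In practice the browser-driven route was required because the legacy site rendered its data behind logins and scripted pages, which is exactly what Selenium WebDriver automates.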

For this particular project, the data was scraped directly from the client’s legacy website, then extracted and exported to a CSV (Comma-Separated Values) file, which Salesforce accepts as a data source.

Technology Stack
For this project, the following stack was used:

  • Node.js
  • Selenium WebDriver
  • PhantomJS
  • ECMAScript 6
  • Git

Project Details

  • Date: January 23, 2017
  • Tags: Automation