The increasing TCO of the traditional BI environment, combined with a lack of agility and a long time to market, forced a key player in the Automotive Insurance and Breakdown Services market to consider new options to collect, integrate and process data in a cost-efficient and agile fashion. Read on to see how we created a new solution that not only meets those requirements but also helps our customer leverage their data in the Cloud on a whole new level.

What was the challenge that we were approached with?

The increasing TCO of the traditional BI environment, combined with a lack of agility and a long time to market, forced our customer to consider new options to collect, integrate and process data in a cost-efficient and agile fashion.

That’s why they came to the experts at Lucy to help them implement a new, Cloud-based enterprise data warehouse that meets the following requirements:

  •       Build a new, agile canonical data platform
  •       Integrate data from legacy systems (such as SAP) together with data from new sources (including Salesforce) in a transparent way
  •       Dramatically reduce overall BI/Big data TCO
  •       Set up the tools to move on to Advanced Analytics and Machine Learning capabilities

Our solution: a Cloud-based enterprise data warehouse powered by AWS

We implemented a new data lake and data warehouse in AWS, using Data Vault 2.0 as the methodology to accelerate the data integration process while ensuring full traceability and auditability of the data. We also implemented our own DV 2.0 engine on standard AWS technology to automate DWH ingestion and integration.
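One cornerstone of Data Vault 2.0 traceability is the deterministic hash key: the business key from the source is normalized and hashed, so the same source record always maps to the same warehouse key regardless of which load produced it. A minimal sketch of that convention (the normalization rules and key format here are illustrative assumptions, not the customer's actual implementation):

```python
import hashlib

def hash_key(business_key: str) -> str:
    """Compute a Data Vault 2.0-style hash key from a business key.

    Normalization (trim + uppercase) is an assumed convention here;
    real implementations agree on one normalization scheme up front
    so every load produces identical keys for identical records.
    """
    normalized = business_key.strip().upper()
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Differently formatted inputs for the same customer yield the same key:
print(hash_key("  cust-1001 ") == hash_key("CUST-1001"))
```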

The production environment is remotely supported by the experts at Lucy to ensure data is delivered on time while costs stay optimized.

Architecture based on key AWS data services

Data lake

Amazon S3 serves as the central inbound layer and provides long-term durability. Aging data is pushed to Amazon Glacier to optimize costs.
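The aging-to-Glacier step can be handled entirely by an S3 lifecycle rule rather than custom code. A sketch of such a rule, expressed as the configuration dictionary boto3 expects (the bucket, prefix, and 90-day threshold are illustrative assumptions):

```python
# Hypothetical S3 lifecycle rule: transition inbound objects to
# Glacier after 90 days. Names and thresholds are illustrative.
lifecycle_config = {
    "Rules": [
        {
            "ID": "archive-aging-inbound-data",
            "Filter": {"Prefix": "inbound/"},
            "Status": "Enabled",
            "Transitions": [
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
        }
    ]
}

# With boto3 this would be applied as:
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="example-datalake-inbound",
#     LifecycleConfiguration=lifecycle_config,
# )
print(lifecycle_config["Rules"][0]["Transitions"][0]["StorageClass"])
```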

Data Warehouse

The Data Warehouse runs on Amazon Redshift and follows the Data Vault 2.0 methodology. Data Vault objects follow strict modelling rules, which allows a high degree of standardization and automation. The data model is generated from metadata stored in an Amazon Aurora (RDS) database.

The automation engine itself is orchestrated using AWS Step Functions and Lambda functions.
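A Step Functions state machine is defined in the Amazon States Language; each load stage becomes a Task state invoking a Lambda function. A sketch of what such an orchestration could look like for a Data Vault load (state names, function names, account ID, and region are all illustrative assumptions):

```python
import json

# Hypothetical Amazon States Language definition chaining the
# DV load stages: hubs, then links, then satellites.
state_machine = {
    "Comment": "Illustrative Data Vault load orchestration",
    "StartAt": "LoadHubs",
    "States": {
        "LoadHubs": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:eu-west-1:123456789012:function:load-hubs",
            "Next": "LoadLinks",
        },
        "LoadLinks": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:eu-west-1:123456789012:function:load-links",
            "Next": "LoadSatellites",
        },
        "LoadSatellites": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:eu-west-1:123456789012:function:load-satellites",
            "End": True,
        },
    },
}

print(json.dumps(state_machine, indent=2))
```

Hubs load before links and satellites because both reference hub hash keys; Step Functions makes that ordering explicit and retries or failure handling can be attached per state.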

And much more!

The Data Platform we have built for this key player in the Automotive Insurance & Breakdown Services market is growing fast and enables them to address all sorts of data-related use cases.