Democratize data and reveal insights with a modern data estate

Snapshot

A national entertainment and restaurant chain needed to know more about their customers to drive interest and revenue growth. They had raw data in various disparate systems, but were unable to aggregate, analyze and unlock its true value until they engaged Blueprint. Blueprint created a centralized data architecture allowing the company to tap into terabytes of data that had been underutilized for years.

Our work

The problem

A global leader in the entertainment and restaurant industry had one key data goal: know the customer in order to build the best possible relationship and optimize their experience at every level. The company recognized the value its own data held but was unable to realize it because of siloed, outdated infrastructure and the sheer volume of data involved. The company wanted to create customer profiles based on three main data sources:

  • Point-of-sale data from a 15 TB SQL Server database that was never tuned for big-data workloads. Each transaction created a new row, and the resulting 375 million rows made the database effectively unqueryable.

  • Transaction data from mobile devices, stored in a Postgres database on AWS and in Google Analytics.

  • Streaming customer experience data from in-store Wi-Fi and the corporate website. These streams carry important demographic data and organize it into individual customer profiles.

“They were operating as if it were the early ’90s as far as their IT department was concerned. Everything was a SQL database. Everything was siloed. Every piece of data was in a different database,” a Blueprint business development director said. “Every time they wanted to integrate data from these different silos, it was a big project.”

The Blueprint Way

Following a Project Definition Workshop with the client, Blueprint outlined a 30/60/90-day roadmap focused on rapidly building the foundations of a Modern Data Estate and gathering more than 20 TB of data from the disparate sources into a central location in the cloud.

“We created and used the Azure Data Factory metadata-driven pattern that we have spearheaded at Blueprint and steered all that data over to a Data Lake,” Blueprint Director of Solutions Development Eric Vogelpohl said.
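The case study does not publish the pattern itself, but the general shape of a metadata-driven ingestion loop is sketched below in PySpark: a small control list describes each source object, and a single parameterized job lands every object in the lake. In Azure Data Factory the same idea is typically expressed as a Lookup activity feeding a ForEach of parameterized Copy activities. All table names, connection strings and paths here are illustrative assumptions, not the client's configuration.

```python
# Hypothetical sketch of a metadata-driven ingestion loop (PySpark equivalent
# of an ADF Lookup + ForEach + parameterized Copy pattern).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Control metadata: one entry per source object to land in the lake.
# In practice this would live in a control table or config store, not in code.
sources = [
    {"name": "pos_transactions", "type": "sqlserver", "object": "dbo.Transactions"},
    {"name": "mobile_orders",    "type": "postgres",  "object": "public.orders"},
]

jdbc_urls = {
    "sqlserver": "jdbc:sqlserver://<pos-server>;databaseName=<db>",  # placeholder
    "postgres":  "jdbc:postgresql://<aws-host>:5432/<db>",           # placeholder
}

for src in sources:
    # Read the source table over JDBC...
    df = (
        spark.read.format("jdbc")
        .option("url", jdbc_urls[src["type"]])
        .option("dbtable", src["object"])
        .option("user", "<user>")          # placeholder credentials
        .option("password", "<password>")
        .load()
    )
    # ...and land it as a Delta table in the raw zone of the Data Lake.
    df.write.format("delta").mode("overwrite").save(f"/mnt/datalake/raw/{src['name']}")
```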

Blueprint migrated the point-of-sale, mobile transaction and customer experience data into an Azure Data Lake and then implemented Databricks Delta Lake to form a “Lakehouse.” Azure Event Hubs was configured to land the streaming customer experience data in the Data Lake. All historical and real-time streaming data is now organized in one location, where it can be queried and analyzed together for the first time. Queries that once took 8 to 9 minutes now complete in less than 10 seconds, and access to data has been democratized across the organization.
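The exact streaming configuration is not part of the case study, but one common way to land Event Hubs data in a Delta table is through the service's Kafka-compatible endpoint with Spark Structured Streaming, roughly as sketched below. The namespace, event hub name and storage paths are placeholders.

```python
# Minimal sketch: stream customer experience events from Azure Event Hubs
# (via its Kafka-compatible endpoint) into a Delta table in the Data Lake.
# Namespace, event hub name, connection string and paths are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

conn = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "<namespace>.servicebus.windows.net:9093")
    .option("subscribe", "<event-hub-name>")
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option(
        "kafka.sasl.jaas.config",
        'org.apache.kafka.common.security.plain.PlainLoginModule required '
        f'username="$ConnectionString" password="{conn}";',
    )
    .load()
)

# Persist the raw events to a Delta table so they can be queried alongside
# the historical batch data already in the lake. start() returns a streaming
# query that runs continuously, checkpointing its progress.
(
    stream.selectExpr("CAST(value AS STRING) AS body", "timestamp")
    .writeStream.format("delta")
    .option("checkpointLocation", "/mnt/datalake/_checkpoints/cx_stream")
    .start("/mnt/datalake/raw/customer_experience")
)
```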

With all their data in a single, manageable and stable location, the company can now make business decisions with a complete picture of its customers. This data is being used to build customer behavior metrics and profiles for marketing, predictive analysis and layout planning. In addition, by adopting the Databricks Lakehouse pattern, the company can comply with the data erasure requirements of privacy laws such as GDPR and CCPA.
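The erasure benefit comes from Delta Lake's support for ACID DELETE and VACUUM: an individual customer's records can be removed transactionally, and the underlying data files are cleaned up afterward. A minimal sketch, with an assumed table and column name:

```python
# Minimal sketch of a right-to-erasure (GDPR/CCPA) request against a Delta table.
# The table name, column name and id value are assumed for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

customer_id = "c-12345"  # identifier supplied with the erasure request

# Delta Lake supports ACID DELETE, so the customer's rows disappear from the
# current version of the table in a single transaction.
spark.sql(f"DELETE FROM customer_profiles WHERE customer_id = '{customer_id}'")

# VACUUM then physically removes data files that are no longer referenced,
# once they fall outside the table's configured retention window.
spark.sql("VACUUM customer_profiles")
```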


