Author: Bobby Huang

Conduit: Making Data Virtualization Simple & Low-Cost

August 20, 2019 | Conduit

We think connecting to your data should be fast, simple, and reasonably priced.

As the world continues toward digital transformation, one of the chief concerns is how legacy businesses survive. How do companies that didn't grow up in the digital age evolve to compete in a market increasingly shaped by new entrants immersed in data-driven decision-making? How do they transform departments such as Supply Chain, Marketing, and Support to adopt advanced analytics that drive their success? The answer isn't easy; transformation doesn't come packaged in a box. Every organization pursuing digital transformation must identify the strategic, culture- and values-based changes it wants, alongside modernizing its data management and establishing analytics disciplines. The tactics and targets of these efforts vary widely by organization, but one thing is certain: none of it is possible without access to data.

Data Virtualization can lead the way in a digital transformation journey

What is Data Virtualization?

Data Virtualization was born out of a business need for unified, real-time access to data against the backdrop of a complex data landscape: data silos, latency, and a range of technologies that don't natively connect to one another.

How does it work?

At its core, Data Virtualization is a form of data integration. It presents disparate sources in one environment and exposes them through a consistent query language without duplicating the data on disk. This provides read access with minimal impact on source systems and avoids the cost of implementing and maintaining a second copy of the data. Furthermore, because Data Virtualization is deployed as a unifying architectural layer, it consolidates privacy, security, and governance protocols into a single point of control, which greatly reduces database administration time and overhead.
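
To make the mechanics concrete, here is a minimal Python sketch of the federation idea, using two in-memory SQLite databases as stand-ins for separate source systems. It is illustrative only: a real virtualization engine adds query pushdown, caching, security, and governance on top of this pattern, and you would not write this plumbing yourself.

    # Minimal sketch of data virtualization: two live "source systems"
    # queried through one unified view, with no copy of the data on disk.
    import sqlite3

    # Stand-in source systems (in-memory, nothing is persisted or duplicated).
    sales_db = sqlite3.connect(":memory:")
    sales_db.execute("CREATE TABLE sales (product_id INTEGER, amount REAL)")
    sales_db.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 120.0), (2, 75.5)])

    inventory_db = sqlite3.connect(":memory:")
    inventory_db.execute("CREATE TABLE inventory (product_id INTEGER, on_hand INTEGER)")
    inventory_db.executemany("INSERT INTO inventory VALUES (?, ?)", [(1, 40), (2, 12)])

    def unified_view():
        """Query each source live and join the results in memory."""
        sales = dict(sales_db.execute("SELECT product_id, amount FROM sales"))
        stock = dict(inventory_db.execute("SELECT product_id, on_hand FROM inventory"))
        return [
            {"product_id": pid, "amount": sales[pid], "on_hand": stock.get(pid)}
            for pid in sales
        ]

    for row in unified_view():
        print(row)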

Give me an example

If a retail organization has sales data in MS SQL, inventory in SAP, CRM data in Oracle, and marketing data in the cloud on PostgreSQL, a Data Virtualization tool lets all of these sources be analyzed and reported on in one place, through one unified view. Depending on configuration, the end user can refresh the data in real time, allowing critical decisions to be made at the pace of business.
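
From the end user's point of view, this looks like a single connection and a single query. The sketch below assumes the virtualization layer exposes an ODBC endpoint (as most do) and uses Python's pyodbc; the DSN and view names are hypothetical and are not any specific vendor's interface.

    # Hypothetical end-user view: one ODBC connection to the virtualization
    # layer, one SQL statement spanning views backed by different systems.
    import pyodbc

    conn = pyodbc.connect("DSN=virtual_layer")  # single entry point for all sources
    cursor = conn.cursor()
    cursor.execute("""
        SELECT s.product_id,
               SUM(s.amount)  AS total_sales,   -- sales view, backed by MS SQL
               MAX(i.on_hand) AS units_on_hand  -- inventory view, backed by SAP
        FROM   sales s
        JOIN   inventory i ON i.product_id = s.product_id
        GROUP BY s.product_id
    """)
    for row in cursor.fetchall():
        print(row)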

How can I do this?

There are many popular Data Virtualization tools on the market; Denodo, Informatica, and Dremio are among the most well-known. All of these solutions deliver the benefits described above and implement a Data Servicing Layer (DSL) where governance and structure can be applied to the virtualized data. However, that functionality takes a long time to implement and manage, which translates to high costs.

We think connecting to your data should be fast, simple, and reasonably priced. That is why we built Conduit, a secure, lightweight virtualization tool that accelerates digital transformation rather than delaying it. Conduit delivers the same functionality but lets data servicing happen in the analytics or visualization tool rather than in the architecture. A Conduit user can build a connection and access data in less than five minutes! Furthermore, there are no implementation or setup costs with Conduit, whereas average implementation costs for Denodo or Informatica can run upward of $240k. Lastly, Conduit's pricing is a fraction of the competitors' and far simpler to understand.

So, if you’re considering data virtualization or need access to datasets you can’t connect to, there’s no better tool than Conduit. Feel free to reach out to me directly at bobby@bpcs.com or fill out the form below.

Get Conduit today

Get a Conduit demo, or install it in your Azure environment today. Unleash the power of your data now.