Accelerate analytics up to 100X, answer business questions faster, and increase operational efficiency by 40%


Capture and organize massive amounts of data from multiple disparate sources to gather essential information and make critical business decisions in near real time, saving money, resources, and time.

Industry: SecOps Data Intelligence, Information Security - Financial Sector
Areas of Expertise: Modern Stack Data Services and BI
Tools: Ultralake, Data Lake Query Engine, Tableau, and Excel

Customer Challenges

The SecOps business analyst team relied heavily on its technical organization to manually run individual reports. Prioritization was subjective, and a significant backlog of requests produced outdated, stagnant information unsuitable for real-time business decisions. The data was old and of questionable quality, and business users could not access it without an added layer of technical dependence.

The primary obstacle was that the OLAP cube could not efficiently process the vast amounts of data collected from various devices and applications. Data from these systems was first stored on NAS storage, and processing took several hours as it passed through multiple ETL pipelines and scripts before landing in the data warehouse and, eventually, the OLAP cube. Several technical staff were engaged just to support the business team's reporting requirements. This reliance on highly skilled human effort created a huge backlog, long and costly wait times, and poor use of expensive resources.

The team's initial answer to the challenging loading process of their existing OLAP was to hire several contractors, extending the estimated project time to completion from 12 months to 18 months. The expensive new talent, industry-leading tools, and expanded resources immediately drove up project cost while stalling production further, as attention shifted to planning. To contain the rising costs and meet the immediate demands of the business, the DataOps team adopted XtraLeap's solution along with a leading Data Lake Query Engine platform to accelerate their time to insights.

The proposed solution was to reduce time to insights by automating the data acquisition, cleansing, and delivery process, giving the business team self-service analytics.

The Solution

The proposed solution scales from a few gigabytes to several petabytes and minimizes the effort needed to re-architect as demand grows. It embraces open architecture and aligns with modern data architecture standards implemented and recommended by experts at massive data-consuming powerhouses such as Netflix, Expedia, and Adobe.

The solution consists of three components. Ultralake subscribes to data from the various devices, cleanses it, and stores it on object storage in a tabular format. The open-source version of Data Lake Query Engine sits on top of the data lake and acts as the query engine, returning results to the client with its lightning-fast query processing. Queries that previously took minutes to hours now finish with sub-second response times, an improvement of up to 100X for complex, critical business questions. Tableau serves as the self-service analytics platform, connecting to the Data Lake Query Engine and firing live queries against it. With the new solution, Tableau produces insights at a speed never seen before, enabling the business team to visually explore, drill down, and download reports on their own, far faster and without constantly engaging the technical team.
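The ingest-cleanse-store-query flow described above can be sketched in miniature. This is an illustrative stand-in only: the actual Ultralake and Data Lake Query Engine APIs are not public in this case study, so Python's built-in sqlite3 plays the role of the SQL query engine, and an in-memory table stands in for tabular data on object storage.

```python
import sqlite3

# Stand-in for the pipeline: (1) subscribe to raw device records,
# (2) cleanse them, (3) store them in tabular form, (4) serve live
# SQL queries to a BI tool. sqlite3 (stdlib) substitutes for the
# distributed Data Lake Query Engine used in the real solution.

raw_events = [
    {"device": "fw-01", "severity": "HIGH", "count": "12"},
    {"device": "fw-02", "severity": " high ", "count": "3"},
    {"device": None, "severity": "LOW", "count": "7"},  # unusable: no device
]

def cleanse(event):
    """Normalize fields; return None for unusable records."""
    if not event.get("device"):
        return None
    return (event["device"], event["severity"].strip().upper(), int(event["count"]))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (device TEXT, severity TEXT, count INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [row for row in (cleanse(e) for e in raw_events) if row is not None],
)

# A BI tool (Tableau in the case study) fires live SQL like this:
high_counts = conn.execute(
    "SELECT device, SUM(count) FROM events "
    "WHERE severity = 'HIGH' GROUP BY device ORDER BY device"
).fetchall()
print(high_counts)  # [('fw-01', 12), ('fw-02', 3)]
```

The key design point the sketch mirrors is that the BI layer issues plain SQL directly against the cleansed tabular store; there is no intermediate OLAP cube or warehouse load step in the query path.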

“XtraLeap took a completely different approach which we didn't think of. We were struggling because we were moving the data a lot. XtraLeap recommended an efficient method of eliminating unnecessary heavy-lifting jobs that pushed data to the data warehouse, processed the OLAP cube, and so on. The XtraLeap solution subscribed to data from the sources, switched storage from NAS to elastic, high-IOPS object storage, and placed a distributed Data Lake Query Engine on top of it. MAGICAL. No changes were made to the BI layer. We experienced speeds up to 100X without rewriting any of the reports or modifying the dashboard logic. We have instant access to all of our data at unprecedented speed. No pushing of data to OLAP, no complex ETL scripts, no data warehouse, and no pushing of data to another expensive memory-based product or cache on the BI tool or data warehouse,” said a technical expert from the client.

The new solution eliminated the proprietary licensing costs associated with premier database vendors, performed up to 100X faster, and delivered a 40% gain in operational efficiency along with cost savings.

Successful Outcomes

  • Faster and better decision making, with performance gains of up to 100X.

  • Eliminated proprietary software and licensing costs.

  • From siloed data to a unified data service layer.

  • Self-service BI reports reduced long wait times and the request backlog for business users.