Building data pipelines, Customer stories

Incorta at Gartner D&A: Smart lakehouse technology to simplify the analysis process

We all have that purchase in the back of our closets. It may be a shirt or a pair of pants or shoes. They looked amazing online. They look amazing on the hanger (or sitting on the shelf). They look far less amazing when you put them on. So they sit in the back of the closet. You look at them and hope they change. You try them on from time to time and hope your requirements change. The reality is that the one-size-fits-all approach does not work. It does not work in clothing, and it does not work in technology. Clothing designed to fit your specific needs continues to be expensive. Technology solutions designed to fit your specific needs, conversely, are getting less expensive.

The Data & Analytics (D&A) space is moving through a phase similar to the industrial revolution. The advancement of Machine Learning (ML) and Artificial Intelligence (AI) capabilities, in concert with simplicity advancements, presents the greatest challenge in decades to practitioners in the data space:

  • Your D&A teams are expected to be adaptable to these new technologies and responsive to users (who are rapidly adopting the new technologies).
  • Leaders are expected to truly understand the analytic needs of a company and to design and build a cohesive ecosystem to meet those needs.
  • Anything built, or any process designed, must be able to readily adapt to the stunning efficiencies and complex privacy and ethics scenarios posed by ML and AI technologies.

The one thing that was clear at last week’s 2023 Gartner Data & Analytics (D&A) Summit in Orlando is that innovation and disruption are the norms moving forward. Success for D&A teams revolves around how they enable adaptation across their organization and business partners.

Disruption is moving at exponential speed across all parts of the data and analysis lifecycle. There is no one vendor or one approach for all circumstances. The one-size-fits-all approach is as much (if not more) of a myth in software as it is in clothing sales. The key is to drive efficiencies on a process-by-process basis. Take, for example, ETL processes. The historic approach of building a custom set of ETLs based on highly dynamic business requirements is no longer tenable.

For years, the energy in the D&A space has moved toward the data lakehouse. Land your information, in its raw form, in a central location, and then provide an adaptive framework to enable its use in support of informed decision making. Delivering data into this framework is the beachhead for all information strategies moving forward. Technologies that drive value must simplify:

  • Delivering information to the lakehouse.
  • Creating semantic structures to support business needs.
  • Delivering information the last mile to the user.
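The three steps above can be sketched in miniature. This is a hypothetical illustration using SQLite as a stand-in for a lakehouse engine, not Incorta's actual implementation; the table, view, and column names are invented for the example.

```python
# A minimal sketch (hypothetical, not Incorta's engine) of the three
# simplification steps: land raw data centrally, define a semantic
# structure on top of it, and deliver it the last mile to users.
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for a central lakehouse store

# 1. Land raw information as-is, without upfront transformation.
conn.execute("CREATE TABLE raw_orders (order_id, region, amount)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "EMEA", 120.0), (2, "AMER", 75.5), (3, "EMEA", 42.0)],
)

# 2. Create a semantic structure (a business-friendly view) on top.
conn.execute("""
    CREATE VIEW revenue_by_region AS
    SELECT region, SUM(amount) AS revenue
    FROM raw_orders
    GROUP BY region
""")

# 3. Last-mile delivery: users query the semantic layer, not the raw data.
for region, revenue in conn.execute(
    "SELECT region, revenue FROM revenue_by_region ORDER BY region"
):
    print(region, revenue)
```

The point of the sketch is the ordering: raw data lands first and unchanged, and the business logic lives in a thin semantic layer that can be revised without re-running a custom ETL pipeline.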

All of this must happen in alignment with the security and ethics policies of an organization. Anything that falls outside of these tenets is secondary, a nice-to-have (why we bought that extra pair of shoes in the first place). Success in this world is not going to be found in visuals or clever summaries. You need to execute the hard, sometimes boring, work of building out a foundational process that enables agile information use. Do not over-govern; do not remove access from the users; simplify the process and align it to your needs.

Incorta is one of a cohort of technologies built around simplifying the process of delivering and publishing data for use, allowing agility to respond to internal and external analysis demands. It’s an approach Donald Clarke from Zeus Industrial Products detailed during his presentation at the conference, explaining how Zeus turned to Incorta to modernize all their reporting and analytics while integrating data from many disparate sources.

An analytics hub for near real-time data

Pre-Incorta, Donald’s D&A team faced several data challenges that might sound familiar to you. They, for example:

  • Couldn’t optimize their legacy code
  • Slowed down Oracle EBS significantly when running concurrent, request-based reports
  • Were forced to use serialized scheduling
  • Couldn’t get ahead of the increasing number of reporting requests they received every month (and knew that number would keep growing)
  • Couldn’t deliver accurate, efficient interdepartmental reporting

With Incorta, Zeus finally consolidated ERP and supporting operational data into one platform and realized true, near real-time analysis. Users have access (both published and on-demand) to new transactional information almost instantaneously. As a D&A team, they are able to respond to new needs in just hours. Donald said some departments are now completely self-sufficient when it comes to reporting, all the way from schema to dashboard. In the first six months alone, the number of analytics users at Zeus exploded from 25 to 850, the number of reports grew from 50 to 550, and Donald’s team became proactive instead of reactive — they’re now moving beyond basic analytics use cases toward predictive AI & ML analytics.

What Incorta did for Zeus, we can do for you, too. And we can prove it. At a Gartner D&A Vendor Showdown, Incorta faced off against SAP Analytics Cloud and Lumen. We demonstrated how Incorta’s unique approach to enterprise analytics not only addressed the scenario provided by Gartner but also made that data available to other platforms (like Power BI and Tableau). We showed the simplicity and integration on stage. Incorta combined data from tens of thousands of files, applied master data concepts, enriched the data with industry-standard ML capabilities, showed scenario-based analytics (as required by Gartner), and then exposed the information to Power BI (demonstrated by live queries against more than 2 billion rows of data).

This is not a one-platform approach to everything. One-size-fits-all does not work. The Incorta approach is about enabling simplicity. The road forward for D&A professionals is paved with simple fundamentals.