
Zero Gravity: Building Data Pipelines for Agility and Freedom

That’s a wrap! Zero Gravity — the industry’s first and only modern cloud data pipeline event — is in the books and we are proud to report that it was a tremendous success!

On May 26th, more than 7,000 data engineers, architects, analysts, and data scientists from 140+ countries joined us virtually to learn from each other and explore new ways to build and manage data pipelines in the cloud. We heard from Google Cloud CEO Thomas Kurian, Microsoft Corporate VP of Azure Data Rohan Kumar, Apache Arrow and Substrait co-creator Jacques Nadeau, and many other builders and visionaries.

Did you miss any of the sessions at Zero Gravity? The full conference is now available on demand here. You can access the keynotes, as well as the sessions in all three tracks: data engineering, data architecture, and use cases. 

The big takeaways from the day:

  • There is more than one way to build a data pipeline
  • The technology landscape is changing dramatically
  • There is A LOT of interest in changing the status quo

“The movement of data to and within the cloud is fundamentally changing how we think about data architectures,” said Incorta CEO Scott Jones in his opening remarks. “There is no one-size-fits-all when it comes to managing data pipelines in the cloud.” 

If there was any doubt that the days of ETL, vendor lock-in, and walled gardens are numbered, it was laid to rest by Google’s Thomas Kurian as he elaborated on Google’s approach to today’s data ecosystem in his fireside chat with Jones.

“If you can make all of your data — structured and unstructured — available in real time; if you can open the platform to different styles of analysis and allow people to access it no matter where it is stored, we think that’s the core of making data the asset that it can be to every organization around the world,” he said.

Microsoft Azure’s Rohan Kumar echoed a similar message in his closing keynote, noting that decisions about data platforms are now being made at the board level. He also spoke about the challenges facing data architects and engineers today: “A big trend we see today is that decision making needs to happen in real time. So the question [for data engineers] is ‘How do we significantly simplify what it means to get the data in the right place, in the right form, in order to enable decision making?’ ”

Kumar noted that while there’s been a lot of innovation in the analytics space, “if you look at the jobs being done by data engineers, data scientists, and business analysts, that has not fundamentally changed. Yes, we are dealing with very large data and very complex systems, but foundationally what they do hasn’t become very easy and that’s the paradigm shift that’s coming.” 

Technical presentations from data architects and engineers who are building and managing data pipelines were at the heart of the conference. Highlights included:

  • When One Size Does Not Fit All: General Purpose vs. Purpose-Built Data Pipelines, with Shane Collins, Data Engineer at Meta 

  • Activating Metadata to Manage Data Pipeline Chaos, with Prukalpa Sankar, Co-founder of Atlan

  • What Does It Take to Modernize Your Operational Reporting Platform?, with Bharath Natarajan, Head of Business Intelligence and Intelligent Automation, Keysight Technologies

  • Pipelines in Practice: How Practitioners Operate Using the Modern Data Stack, with Margaret Francis, Chief Product Officer, dbt Labs

At the end of the day, we heard from Jacques Nadeau, co-creator of Sundeck, Apache Arrow, Dremio, and Substrait, who co-presented with Incorta co-founder and EVP of Product Matthew Halliday on the topic of deliberate data engineering.

Nadeau, who is an advisor to Incorta, spoke about how innovation goes in waves from general purpose to specialized tools. “Ten years ago, you basically had to choose among general solutions,” he said. “These allow you to address a wide range of use cases, but typically require a lot of work to fit to purpose. Now there’s a lot more opportunity to specialize solutions. That means you spend a lot less of your resources trying to bridge the gap between what the generalized solution can do and what you need.”

Halliday closed the conference with a call to action to explore these new tools and approaches, as well as to reevaluate old approaches that were developed during a different era. 

“In the old ETL paradigm, every question requires a data transformation before you can answer it. Instead of supporting your business users, you feel like you’re weighing them down. Zero gravity is about helping everyone find ways to operate with more freedom and agility.”
