
2024 Trends to Watch: Navigating the Impact of GenAI in Data and Analytics 

2023 was the year that belonged to artificial intelligence (AI), and the preamble to what can arguably be called a technology super-cycle on par with the internet and the smartphone.

AI dominated the headlines, with coverage spanning the entire spectrum from excitement about its promised benefits to fears of the demise of human civilization at the hands of intelligent machines.

While it is still very early, and the data landscape around AI will continue to evolve, we can confidently bet that data and compute will remain the ingredients that underpin the future of generative AI and machine learning use cases. In this post, we'll examine four key trends that enterprises must consider in 2024 to ensure success when implementing AI-led projects.

Enterprises will operationalize AI and machine learning

Generative AI tools such as ChatGPT and Bard wowed everyone in 2023 with their ability to collect, curate and respond to natural language prompts with exceptional speed and accuracy. And while increased productivity is an obvious and immediate benefit of artificial intelligence and machine learning, look for intelligent applications that help organizations proactively and accurately analyze, predict and forecast future outcomes from their data, mitigating previously unidentified risks and exposing new opportunities.

As these new AI-first applications come online, enterprises will look to their data and analytics leaders for selection, integration and the data – lots of high-quality data – that enable tuning, modeling and production rollout across a variety of use cases. Automating access to data and making it immediately available, blending and curating data from multiple sources to feed AI and machine learning models, and enforcing security and governance protocols will all be critical to the success of enterprise AI projects as organizations broaden the scope of their data and analytics strategy.

The effective deployment of these applications will rely heavily on the availability and timely delivery of rich datasets at scale, combined with the granularity that minimizes bias and ensures the recommendations derived from this data can be trusted for decision-making. Data and analytics teams will look for solutions that can keep pace with the responsiveness and immediacy of insights generated by artificial intelligence.

Organizations will see a cultural shift in data literacy with the democratization of GenAI 

One of the most promising benefits of generative AI is how it has broken the technical barrier between workers and data-driven insights. With the help of large language models (LLMs) and a simple interface, tens of millions of non-technical users at all levels will be able to instantly find answers and insights in data. The mass adoption of AI and ML tools in day-to-day work will fundamentally shift how data is valued across every part of the enterprise, and create a push for higher levels of data literacy organization-wide.

Organizations whose data and analytics strategy prioritizes training and an understanding of the data available to teams across functional areas and domains will be the first to cross the data knowledge chasm and realize higher returns on their AI investments. As teams better understand how the underlying data leads to machine-generated recommendations, they will be more willing to adopt, use and even demand data-integrated workflows powered by artificial intelligence.

This secular shift, in which users at any level can freely interact with, explore and even argue with data, will raise the data literacy bar for organizations that look to rapidly adopt AI-driven applications.

Cloud cost management comes after data

In 2023, we already saw companies increase the time and investment they devote to controlling their cloud costs. Cloud cost containment is a key theme for CIOs scrutinizing their budgets.

In 2024, that focus will shift to the infrastructure and applications that support generative AI use cases. Businesses will ask even more pointedly "what am I paying for" and why it is being used. Teams will be tasked with justifying their spend on cloud deployments and the value it delivers to the business. For many, it will be hard to pick apart complex, granular cloud service bills and report a return on investment. Whether a company uses a cloud service provider such as AWS, Google or Microsoft, or a cloud-hosted service such as Snowflake, uncomfortable conversations around escalating consumption will drive home the need for demonstrable value from these solutions.

Currently, analysts are divorced from the costs of the queries they run and the dashboards they build. What would happen if every report you built needed a cost center attached to it? Would you still run that query if it cost your department $5, $10 or $100 each time? What if no one opened your reports or dashboards?
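To make the question concrete, here is a minimal, back-of-the-envelope sketch of per-report cost attribution. The hourly rate, query log fields and report names are invented for illustration; in practice, the inputs would come from your cloud provider's or warehouse's billing and query metadata.

```python
# Illustrative sketch only: attributing compute spend to the teams that run reports.
# The rate and the query log below are assumptions for the example, not real figures.
from collections import defaultdict

WAREHOUSE_RATE_PER_HOUR = 16.00  # assumed blended $/hour for the compute tier in use

# Hypothetical query log: (cost_center, report_name, runtime_seconds, opens_last_30_days)
query_log = [
    ("finance",   "daily_revenue_dashboard", 540,  220),
    ("finance",   "legacy_reconciliation",   1800, 0),
    ("marketing", "campaign_attribution",    300,  45),
]

cost_by_center = defaultdict(float)
for cost_center, report, runtime_s, opens in query_log:
    run_cost = (runtime_s / 3600) * WAREHOUSE_RATE_PER_HOUR  # cost of one refresh
    cost_by_center[cost_center] += run_cost
    flag = "  <- consumes spend with no readers" if opens == 0 else ""
    print(f"{cost_center:<10} {report:<26} ${run_cost:6.2f} per run{flag}")

for cost_center, total in cost_by_center.items():
    print(f"{cost_center}: ${total:.2f} per full refresh cycle")
```

Even a rough model like this makes it obvious which reports consume spend without corresponding usage, which is exactly the "trimming the fat" exercise described next.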

Once again, an obsessive focus on value will help finance and technology leaders streamline their operational expenses around data and analytics: trimming the fat on reports that consume significant resources without corresponding usage, while elevating the smaller, modular analytic assets that form the core of transformational business applications and deliver material savings through process optimization. Here, organizations will prefer platforms with a predictable cost structure for their AI and machine learning initiatives as they look to scale up the benefits of generative AI.

Data and analytics leaders will be the heroes… or villains

In 2024, with every business looking to capitalize on GenAI to gain a competitive advantage, data and analytics teams will face the challenge of growing demand to integrate the latest AI-based applications with their existing organizational data. There will be dozens of projects, all of which will require the organization's data. Any data strategy that does not offer a flexible, reliable and governed approach to accessing data from the hundreds of business applications and data sources will become the bottleneck of the AI transformation.

For teams overly invested in the theoretical "modern data stack" and in building data pipeline integrations across dozens of niche analytics components, such complex solution architectures will draw the attention of senior leaders once their total cost of ownership (both capital and operational) outweighs the ongoing business benefit.

For most organizations, digital transformation starts with digitization and then moves to the automation of repetitive manual processes. Delivering analytic applications that target specific business processes (such as the order-to-cash or procure-to-pay cycles in finance) streamlines operations and frees talent to interpret and react to changing economic headwinds within a single, simple user experience.

Firms with an overly complex data and analytics stack find that their solutions become increasingly fragile when overnight loads fail to deliver data, or when users struggle to work with a different specialist tool for every new question.

As we look ahead to a new year and this new phase of technology, organizations that recognize the urgency, embrace these trends, invest in the transformative benefits of AI, up-level the data literacy of their workforce and optimize their data infrastructure to support a data-centric approach will find themselves with a measurable lead in the market.