Analyst Perspectives


Data Pipelines Integrate Data Processing and Enable AI



The development, testing and deployment of data pipelines is a fundamental accelerator of data-driven strategies. Data pipelines enable enterprises to extract data from the operational applications and data platforms used to run the business, and to load, integrate and transform it into the analytic data platforms and tools used to analyze the business. As I explained in our recent Data Pipelines Buyers Guide, data pipelines are essential to generating intelligence from data. Healthy data pipelines are necessary to ensure data is integrated and processed in the sequence required to generate business intelligence (BI) and support the development and deployment of applications driven by artificial intelligence (AI).
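To make the pattern concrete, here is a minimal sketch of a batch ETL step in Python, using SQLite as a stand-in for both the operational source and the analytic target; the table and column names are hypothetical, not drawn from any specific product.

```python
# Minimal batch ETL sketch: extract from an operational store,
# transform in the pipeline, load into an analytic store.
import sqlite3

source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, region TEXT)")
source.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                   [(1, 42.004, "emea"), (2, 19.5, "amer")])

target = sqlite3.connect(":memory:")
target.execute("CREATE TABLE orders_fact (order_id INTEGER, amount REAL, region TEXT)")

# Extract: pull raw rows from the operational system.
rows = source.execute("SELECT order_id, amount, region FROM orders").fetchall()

# Transform: apply business rules in the pipeline, before loading.
cleaned = [(oid, round(amt, 2), region.upper()) for oid, amt, region in rows]

# Load: write the conformed rows to the analytic platform.
target.executemany("INSERT INTO orders_fact VALUES (?, ?, ?)", cleaned)
target.commit()
print(target.execute("SELECT * FROM orders_fact").fetchall())
```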

Traditionally, data pipelines have involved batch extract, transform and load (ETL) processes, but the need for real-time data processing is driving demand for continuous data processing and more agile data pipelines that are adaptable to changing business conditions and requirements, including the increased reliance on streaming data and events. I assert that through 2026, approaches to data operations (DataOps) will continue to evolve as enterprises adapt their utilization of data processing pipelines to reflect increased adoption of event-driven architecture and microservices.
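As a simplified illustration of the continuous, event-driven model, the sketch below processes hypothetical events one at a time as they arrive, updating state immediately rather than waiting for a scheduled batch window. In practice the stream would come from an event broker such as a message queue or log; the in-memory generator here is only a stand-in.

```python
# Event-driven processing sketch: state is updated per event,
# not recomputed in a nightly batch. Event shapes are hypothetical.
from typing import Iterator

def event_stream() -> Iterator[dict]:
    # Stand-in for a consumer subscribed to an event bus or topic.
    yield from [
        {"type": "order_placed", "order_id": 1, "amount": 42.0},
        {"type": "order_placed", "order_id": 2, "amount": 19.5},
        {"type": "order_cancelled", "order_id": 1},
    ]

open_orders: dict[int, float] = {}
for event in event_stream():
    # Each event is applied as it arrives, keeping results current.
    if event["type"] == "order_placed":
        open_orders[event["order_id"]] = event["amount"]
    elif event["type"] == "order_cancelled":
        open_orders.pop(event["order_id"], None)
    print(f"open order value: {sum(open_orders.values()):.2f}")
```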

More than two-thirds of participants in our Analytics and Data Benchmark Research cite preparing data for analysis as consuming most of the time spent analyzing data. As such, the benefits of accelerating data pipelines can be considerable, and there are multiple approaches to increasing their agility. For example, we see an increased focus on extract, load and transform (ELT) processes, which reduce upfront delays by pushing transformation execution to the target data platform. I also recently discussed the emergence of zero-ETL approaches, which can be seen as a form of ELT that automates extraction and loading and has the potential to remove the need for transformation in some use cases. Additionally, reverse ETL tools can improve responsiveness by extracting transformed and integrated data from the analytic data platforms and loading it back into operational systems, where it can be acted upon.
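The difference between ETL and ELT is easiest to see in code. In this sketch, raw data is landed in the target first and the transformation is then pushed down to the target platform as SQL; SQLite again stands in for the analytic data platform, and the schema is hypothetical.

```python
# ELT sketch: load raw data as-is, then transform inside the target.
import sqlite3

target = sqlite3.connect(":memory:")
target.execute("CREATE TABLE raw_orders (order_id INTEGER, amount REAL, region TEXT)")

# Load step: land the data with no upfront transformation.
target.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 42.004, "emea"), (2, 19.5, "amer")],
)

# Transform step: executed by the target platform, after loading.
target.execute("""
    CREATE TABLE orders_clean AS
    SELECT order_id, ROUND(amount, 2) AS amount, UPPER(region) AS region
    FROM raw_orders
""")
print(target.execute("SELECT * FROM orders_clean").fetchall())
```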

Both ETL and ELT approaches can be accelerated using change data capture (CDC) techniques, which reduce complexity and increase agility by synchronizing only changed data rather than the entire dataset. We also see the application of generative AI (GenAI) to automatically generate or recommend data pipelines in response to natural language descriptions of desired outcomes. The development of agile data pipelines is an important aspect of DataOps, which focuses on the application of agile development, DevOps and lean manufacturing principles by data engineering professionals in support of data production.
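As an illustration of the incremental principle behind CDC, the sketch below uses a high-watermark timestamp column to extract only rows changed since the last synchronization. Production CDC tools typically read the database transaction log instead; this timestamp-based variant is a simplified stand-in, and the schema is hypothetical.

```python
# Watermark-based incremental sync sketch: only rows modified after
# the previous run's watermark are extracted and synchronized.
import sqlite3

source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, updated_at TEXT)")
source.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 42.0, "2024-05-01T10:00:00"), (2, 19.5, "2024-05-02T09:30:00")],
)

last_synced = "2024-05-01T12:00:00"  # watermark saved by the previous run

changed = source.execute(
    "SELECT order_id, amount, updated_at FROM orders WHERE updated_at > ?",
    (last_synced,),
).fetchall()
print(changed)  # only order 2 changed since the watermark
```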

Agile and collaborative practices were a core component of the Capabilities criteria we used to assess data pipeline tools in our Data Pipelines Buyers Guide, alongside the functionality required to support data pipeline development, deployment and test automation, as well as integration with the wider ecosystem of DevOps, data management, DataOps, BI and AI tools and applications.
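As a small example of what pipeline test automation can look like in practice, the hypothetical sketch below factors a transformation rule into a plain function so it can be exercised by a standard test runner such as pytest; it illustrates the principle rather than any particular vendor's tooling.

```python
# Pipeline test automation sketch: transformation logic is a plain,
# testable function, so regressions are caught before deployment.
def normalize_region(code: str) -> str:
    # Business rule under test: region codes are trimmed and upper-cased.
    return code.strip().upper()

def test_normalize_region():
    assert normalize_region(" emea ") == "EMEA"
    assert normalize_region("amer") == "AMER"
```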

The development, testing and deployment of data pipelines is just one aspect of improving the use of data within an enterprise. DataOps also encompasses data orchestration and data observability, which I will explore in greater detail in forthcoming Analyst Perspectives. In the meantime, I recommend that all enterprises explore how the development and deployment of agile data pipelines can help improve data-driven decision-making.

Regards,

Matt Aslett
Director of Research, Analytics and Data

Matt Aslett leads the software research and advisory for Analytics and Data at ISG Software Research, covering software that improves the utilization and value of information. His focus areas of expertise and market coverage include analytics, data intelligence, data operations, data platforms, and streaming and events.


