Ventana Research Analyst Perspectives

Tame Telemetry Data with Mezmo Observability Pipeline

As engagement with customers, suppliers and partners is increasingly conducted through digital channels, ensuring that infrastructure and applications are performing as expected is not just important but mission critical. My colleague, David Menninger, recently explained the increasing importance of observability to enable organizations to ensure that their systems and applications are operating efficiently. Observability has previously been the domain of the IT department but is increasingly important for business decision-makers as organizations combine machine-generated telemetry data with business event data to understand the impact of a system outage or application performance degradation on their ability to conduct digital business. Companies such as Mezmo are responding with observability platforms designed to facilitate the integration of machine and business data and encourage collaboration between business and IT professionals.

Mezmo was founded in 2015. Initially known as LogDNA and focused on log management and analytics, the company rebranded as Mezmo in 2022 to reflect its broader focus and expanded functionality to address observability. Log management remains a core capability of the company’s observability platform, enabling the processing and analysis of log data from servers, networking equipment, internet of things (IoT) devices and applications. However, logs are just one form of machine-generated telemetry data, alongside traces and metrics, all of which can be used to identify application and infrastructure problems that impact quality of service.

The primary benefit of observability lies in using telemetry data to reduce the mean time to detection (MTTD) of IT infrastructure issues as well as the mean time to resolution (MTTR), the time it takes to make the necessary changes to resolve them. The importance of machine-generated data is increasingly being recognized outside the IT operations department, given the potential benefits for development and DevOps, as well as security and compliance, of correlating telemetry data with business event data to better understand the business risks associated with IT incidents. Almost one-third of participants (31%) in Ventana Research’s Analytics and Data Benchmark Research believe machine data is important for their organization’s analytics activities, while a similar proportion of participants (32%) in Ventana Research’s Data Governance Benchmark Research are managing or planning to manage machine data with their data governance policies. Generating value from telemetry data is therefore not simply a matter of combining it into a single, centralized platform for analysis by IT professionals.

Mezmo’s Telemetry Pipeline provides an environment for ingesting telemetry data from multiple sources before connecting, transforming and enriching it, and then routing the data to multiple destinations, including storage, full-stack cloud-observability platforms and analytics applications.
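The ingest-enrich-route pattern described above can be sketched in a few lines of Python. This is a hypothetical illustration of the general technique, not Mezmo's actual API; the source names, enrichment fields and destinations are all invented for the example.

```python
# Hypothetical telemetry pipeline sketch: merge events from multiple
# sources, enrich each with business context, then route it to one or
# more destinations. All names and fields are invented.

def ingest(*sources):
    """Merge events from multiple sources into a single stream."""
    for source in sources:
        yield from source

def enrich(events, context):
    """Attach business context (here, a service owner) to each event."""
    for event in events:
        yield {**event, **context.get(event["service"], {})}

def route(event):
    """Choose destinations: errors also go to the observability
    platform, and everything is archived to low-cost storage."""
    destinations = ["archive"]
    if event["level"] == "error":
        destinations.append("observability")
    return destinations

server_logs = [{"service": "checkout", "level": "error", "message": "payment timeout"}]
app_logs = [{"service": "search", "level": "info", "message": "reindex complete"}]
owners = {"checkout": {"owner": "payments-team"}, "search": {"owner": "discovery-team"}}

for event in enrich(ingest(server_logs, app_logs), owners):
    print(route(event), event["owner"], event["message"])
```

In a real pipeline the sources would be network listeners or agents and the destinations would be sinks with their own delivery guarantees, but the shape of the dataflow is the same.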

The complexity of modern IT infrastructure means telemetry data needs to be ingested and analyzed from an enormous range of computing equipment, sensors and applications, all of which are distributed across on-premises and cloud computing environments. Increasing the volume of telemetry data to be collected and analyzed might lead to greater insight, but it is also likely to lead to increased cost, complexity and management overhead. Mezmo introduced its observability pipeline offering, now known as Telemetry Pipeline, specifically to address this trade-off by enabling more intelligent and efficient processing of telemetry data. Observability pipelines are designed to improve time to detection and resolution by automating the centralization of telemetry data from multiple sources, with the additional benefit of transforming data prior to routing it to the observability platform to reduce unnecessary costs and time delays.
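The detection and resolution metrics that these pipelines aim to improve are simple averages over incident timestamps. A minimal sketch with invented incident data follows; note that definitions vary, and MTTR is measured here from detection to resolution rather than from the original fault.

```python
from datetime import datetime

FMT = "%Y-%m-%dT%H:%M"

def minutes(start, end):
    """Elapsed minutes between two timestamp strings."""
    return (datetime.strptime(end, FMT) - datetime.strptime(start, FMT)).total_seconds() / 60

# Invented incident records: when the fault occurred, when monitoring
# detected it, and when the fix landed.
incidents = [
    {"occurred": "2023-03-01T02:00", "detected": "2023-03-01T02:25", "resolved": "2023-03-01T03:05"},
    {"occurred": "2023-03-04T14:10", "detected": "2023-03-04T14:15", "resolved": "2023-03-04T15:40"},
]

# MTTD: average time from occurrence to detection.
mttd = sum(minutes(i["occurred"], i["detected"]) for i in incidents) / len(incidents)
# MTTR: average time from detection to resolution.
mttr = sum(minutes(i["detected"], i["resolved"]) for i in incidents) / len(incidents)

print(f"MTTD: {mttd:.1f} min, MTTR: {mttr:.1f} min")  # MTTD: 15.0 min, MTTR: 62.5 min
```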

Observability pipelines also allow data to be routed to other destinations such as data lakes, cloud data warehouses or business intelligence (BI) tools for further analysis and visualization. While many observability pipelines are stateless and simply control the flow of data, Mezmo’s Telemetry Pipeline is designed to deliver additional benefits through the unification and enrichment of data as well as the masking of sensitive data, and it can be used in conjunction with the company’s Log Analysis functionality, which provides alerting and visualization capabilities to accelerate actionable insight. The ability to remove duplicate or low-value data in flight, in combination with routing rules that automatically send higher-value data to observability and security platforms and lower-value data to low-cost storage, can help reduce the cost of managing telemetry data. I assert that through 2025, three-quarters of organizations utilizing telemetry data will have invested in observability pipelines to improve time to detection and resolution based on machine logs, traces and metrics. Mezmo provides integration with multiple data sources, platforms and tools. Its Mezmo Exporter for OpenTelemetry, introduced in June 2022 and based on the Cloud Native Computing Foundation’s open-source OpenTelemetry format, represents an integral aspect of the company’s strategy to enable customers to gain greater business insight from their telemetry data.
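The in-flight processing steps described above (masking sensitive values, dropping duplicates, and routing by value) can be illustrated with a short sketch. This is a generic example of the technique, not Mezmo's implementation; the redaction pattern, event fields and destination names are invented.

```python
import hashlib
import re

# Matches long digit runs such as payment-card numbers (illustrative only).
SENSITIVE = re.compile(r"\b\d{13,16}\b")

def mask(event):
    """Redact sensitive values in flight, before data leaves the pipeline."""
    return {**event, "message": SENSITIVE.sub("[REDACTED]", event["message"])}

def dedupe(events):
    """Drop events whose content has already been seen."""
    seen = set()
    for event in events:
        key = hashlib.sha256(event["message"].encode()).hexdigest()
        if key not in seen:
            seen.add(key)
            yield event

def destination(event):
    """Send higher-value events to the observability platform and
    lower-value events to low-cost storage."""
    return "observability-platform" if event["level"] in ("error", "warn") else "cold-storage"

events = [
    {"level": "error", "message": "charge failed for card 4111111111111111"},
    {"level": "error", "message": "charge failed for card 4111111111111111"},  # duplicate
    {"level": "debug", "message": "cache hit ratio 0.93"},
]

for event in dedupe(map(mask, events)):
    print(destination(event), "<-", event["message"])
```

Masking before routing means the sensitive value never reaches any downstream system, which is the point of doing this work in the pipeline rather than at the destination.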

Observability pipelines also potentially provide a foundation for the combination of telemetry data with business events. Translating telemetry data into business decisions requires close cooperation between technology engineers, who have the specialist skills required to understand the dependencies between infrastructure equipment, separate the signal from the noise, and interpret and act upon it, and business analysts and executives, who have the specialist business expertise required to understand the impact on business operations. Mezmo could facilitate the consumption of telemetry data by employees outside the IT department through the inclusion of workflows and templates that utilize best practices to further lower time to insight. Machine-generated data is increasingly critical to business decision-making. I recommend that organizations investing in machine-generated telemetry data evaluate the potential advantages of observability pipelines and include Mezmo in their assessments.


Matt Aslett
Director of Research, Analytics and Data

Matt Aslett leads the software research and advisory for Analytics and Data at Ventana Research, now part of ISG, covering software that improves the utilization and value of information. His focus areas of expertise and market coverage include analytics, data intelligence, data operations, data platforms, and streaming and events.


